SmollerLM 20M Instruct Pruned Sft5 by mehmetkeremturkcan


SmollerLM 20M Instruct Pruned Sft5 is an open-source language model by mehmetkeremturkcan. Key figures: 20M parameters, 0.1 GB VRAM, 8K context, instruction-based, LLM Explorer Score: 0.18.
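The 0.1 GB VRAM figure is consistent with the parameter count and data type listed further down: 20M float32 parameters take 4 bytes each, so the weights alone occupy roughly 0.08 GB, with the listed 0.1 GB leaving room for runtime overhead. A back-of-the-envelope sketch:

```python
# Rough VRAM estimate for the model weights alone (inference, no KV cache).
# Figures taken from this listing: ~20M parameters stored as float32.
params = 20_000_000
bytes_per_param = 4  # float32
weight_gb = params * bytes_per_param / 1e9
print(f"{weight_gb:.2f} GB")  # weights only; activations and cache add more
```

This is an estimate from the listed figures, not a measured number; actual usage depends on batch size, sequence length, and framework overhead.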


SmollerLM 20M Instruct Pruned Sft5 Benchmarks

nn.n% indicates how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

SmollerLM 20M Instruct Pruned Sft5 Parameters and Internals

LLM Name: SmollerLM 20M Instruct Pruned Sft5
Repository 🤗: https://huggingface.co/mehmetkeremturkcan/SmollerLM-20M-Instruct-Pruned-sft5
Model Name: SmollerLM-20M-Instruct-Pruned-sft5
Base Model(s): mehmetkeremturkcan/SmollerLM-20M-Instruct-Pruned-Base
Model Size: 20M
Required VRAM: 0.1 GB
Updated: 2025-09-17
Maintainer: mehmetkeremturkcan
Model Type: llama
Instruction-Based: Yes
Model Files: 0.1 GB, 0.0 GB
Model Architecture: LlamaForCausalLM
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.48.1
Tokenizer Class: GPT2Tokenizer
Padding Token: <|im_end|>
Vocabulary Size: 49152
Torch Data Type: float32
Errors: replace
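Given the <|im_end|> padding token above, the model most likely follows the ChatML conversation format used across the SmolLM family. A minimal sketch of building such a prompt (the role names and template here are assumptions based on that token, not confirmed by this listing; in practice, prefer the tokenizer's own chat template from the repository):

```python
def build_chatml_prompt(system: str, user: str) -> str:
    # ChatML-style template assumed from the <|im_end|> token in the listing;
    # with transformers installed, tokenizer.apply_chat_template is the
    # authoritative way to format a conversation for this model.
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt("You are a helpful assistant.", "Hello!")
print(prompt)
```

The generated text should then be cut at the first <|im_end|> the model emits, which doubles as its stop token under this format.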

Best Alternatives to SmollerLM 20M Instruct Pruned Sft5

Best Alternatives                    Context / RAM     Downloads / Likes
Bagel 20B V04 Llama                  32K / 39.6 GB     207
Bagel DPO 20B V04 Llama              32K / 39.6 GB     53
...lm2 1.7B Distilled GPT Oss 20B    8K / 0 GB         252
...runed Base PostPruneRetraining    8K / 0 GB         50
Rose 20B                             4K / 39.9 GB      74345
...I Llama2 Ko En Instruct 20B V1    2K / 40.4 GB      50
Rose 20B GPTQ                        4K / 10.5 GB      5973
Rose 20B AWQ                         4K / 10.9 GB      41
Note: a green score (e.g. "73.2") means that the model is better than mehmetkeremturkcan/SmollerLM-20M-Instruct-Pruned-sft5.

Rank the SmollerLM 20M Instruct Pruned Sft5 Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.