Smollm2 135M Pretrained 1000K Fineweb Uncovai Selected by FlofloB


Smollm2 135M Pretrained 1000K Fineweb Uncovai Selected is an open-source language model by FlofloB. Key facts: 135M parameters, 0.5 GB VRAM required, 8K context length, apache-2.0 license, LLM Explorer score 0.19.


Smollm2 135M Pretrained 1000k Fineweb Uncovai Selected Benchmarks

Scores shown as nn.n% indicate how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

Smollm2 135M Pretrained 1000K Fineweb Uncovai Selected Parameters and Internals

LLM Name: Smollm2 135M Pretrained 1000k Fineweb Uncovai Selected
Repository: https://huggingface.co/FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_selected
Base Model(s): FlofloB/smollm2-135M_pretrained_800k_fineweb_uncovai_selected
Model Size: 135M
Required VRAM: 0.5 GB
Updated: 2026-01-18
Maintainer: FlofloB
Model Type: llama
Model Files: 0.5 GB, 0.0 GB
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.44.2
Tokenizer Class: GPT2Tokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 49152
Torch Data Type: float32
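The listed VRAM requirement is consistent with the parameter count and data type above: 135M float32 weights occupy roughly 0.5 GB before any activations or KV cache. A quick back-of-the-envelope check:

```python
# Rough memory needed just to hold the weights on the GPU:
# 135M parameters x 4 bytes each (float32, per the table above).
params = 135_000_000
bytes_per_param = 4  # torch float32

weights_gib = params * bytes_per_param / 1024**3
print(f"~{weights_gib:.2f} GiB")  # ~0.50 GiB
```

Actual runtime usage will be somewhat higher, since activations and the KV cache grow with batch size and context length.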

Best Alternatives to Smollm2 135M Pretrained 1000K Fineweb Uncovai Selected

Best Alternatives            | Context / RAM | Downloads | Likes
SmolLM2 135M                 | 8K / 0.3 GB   | 988892    | 172
SmolLM2 135M Instruct        | 8K / 0.3 GB   | 587488    | 295
Smollm2 Helpbot 135M         | 8K / 0.3 GB   | 1         | 1
BokantLM0.1 135M Deepseek    | 8K / 0.5 GB   | 6         | 0
SmolLM2 135M Mofbandgap      | 8K / 0.3 GB   | 5         | 0
SmolLM2 Rethink 135M         | 8K / 0.5 GB   | 11        | 1
SmolLM2 135M Instruct Ita    | 8K / 0.1 GB   | 24        | 0
SmolLM2 FT MyDataset         | 8K / 0.5 GB   | 65        | 0
Sft Output                   | 8K / 0.5 GB   | 31        | 0
SmolLM2 135M Eagle           | 8K / 0.3 GB   | 7         | 3
Note: a green score (e.g. "73.2") means the model outperforms FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_selected.

Rank the Smollm2 135M Pretrained 1000K Fineweb Uncovai Selected Capabilities

Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs.

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.