Gemma 3 270M is an open-weights language model from Google, released under the Gemma license. Key figures: 270M parameters, 0.5 GB required VRAM, 32K context length, LLM Explorer Score 0.39.
| Attribute | Value |
|---|---|
| LLM Name | Gemma 3 270M |
| Repository 🤗 | https://huggingface.co/google/gemma-3-270m |
| Model Size | 270M |
| Required VRAM | 0.5 GB |
| Updated | 2026-04-15 |
| Maintainer | google |
| Model Type | gemma3_text |
| Model Architecture | Gemma3ForCausalLM |
| License | gemma |
| Context Length | 32768 |
| Model Max Length | 32768 |
| Transformers Version | 4.55.0.dev0 |
| Tokenizer Class | GemmaTokenizer |
| Padding Token | <pad> |
| Vocabulary Size | 262144 |
| Torch Data Type | bfloat16 |
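The 0.5 GB VRAM figure in the table is consistent with the parameter count and the bfloat16 storage type. A minimal sanity check in Python, assuming 2 bytes per bf16 parameter and ignoring activation and KV-cache overhead:

```python
# Back-of-the-envelope VRAM estimate for the model weights alone.
# Assumptions: exactly 270M parameters, bfloat16 (2 bytes each),
# no overhead for activations, KV cache, or framework buffers.
params = 270_000_000
bytes_per_param = 2  # bfloat16

weights_gb = params * bytes_per_param / 1024**3
print(f"Weights: ~{weights_gb:.2f} GB")  # ~0.50 GB, matching the table
```

Actual memory use at inference time will be somewhat higher than this, since the estimate covers only the weight tensors.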
| Model | Likes | Downloads | VRAM |
|---|---|---|---|
| ...ma 3 270M Qat Q4 0 Unquantized | 8 | 139 | 0 GB |
| Gemma 3 270M 8bit | 1 | 22 | 0 GB |
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Functiongemma 270M It | 32K / 0.5 GB | 36641 | 941 |
| Gemma 3 270M It | 32K / 0.5 GB | 110713 | 569 |
| Gemma 3 270M Sindhi | 32K / 0.6 GB | 6840 | 0 |
| Episodic Nothink | 32K / 0.5 GB | 314 | 0 |
| Fullseq Lora Merged | 32K / 0.5 GB | 368 | 0 |
| Gemma 3 110M English Only | 32K / 0.4 GB | 138 | 0 |
| Episodic Lora3 Grpo2 Merged | 32K / 0.5 GB | 728 | 0 |
| Functiongemma 270M It | 32K / 0.5 GB | 9775 | 12 |
| Episodic Lora3 Grpo4 Merged | 32K / 0.5 GB | 507 | 0 |
| Episodic Lora3 Grpo Merged | 32K / 0.5 GB | 420 | 1 |