2 PRYMMAL ECE 2B SLERP V1 is an open-source language model maintained by Lil-R. Key specs: 2B parameters, 15.8 GB required VRAM, 8K context, Apache-2.0 license, merged model.
| Property | Value |
|---|---|
| LLM Name | 2 PRYMMAL ECE 2B SLERP V1 |
| Repository 🤗 | https://huggingface.co/Alyon-AI/2_PRYMMAL-ECE-2B-SLERP-V1 |
| Base Model(s) | |
| Merged Model | Yes |
| Model Size | 2b |
| Required VRAM | 15.8 GB |
| Updated | 2026-04-23 |
| Maintainer | Lil-R |
| Model Type | gemma2 |
| Model Files | |
| Model Architecture | Gemma2ForCausalLM |
| License | apache-2.0 |
| Context Length | 8192 |
| Model Max Length | 8192 |
| Transformers Version | 4.46.1 |
| Tokenizer Class | GemmaTokenizer |
| Padding Token | <pad> |
| Vocabulary Size | 256000 |
| Torch Data Type | bfloat16 |
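The table above lists the checkpoint's data type (bfloat16) and required VRAM. As a rough sanity check, raw weight size can be estimated as parameter count × bytes per parameter. The sketch below assumes ~2.6B parameters, a typical count for Gemma 2 "2B"-class models; the exact count is not stated in the card. Note that actual VRAM usage exceeds raw weight size because of activations and KV cache.

```python
# Rough memory-footprint estimate for a "2B"-class model.
# The 2.6B parameter count is an assumption, not taken from the card.
BYTES_PER_PARAM = {"float32": 4, "bfloat16": 2, "float16": 2, "int8": 1}

def estimate_weights_gb(num_params: int, dtype: str) -> float:
    """Approximate size of the raw weights in GB (1 GB = 1e9 bytes)."""
    return num_params * BYTES_PER_PARAM[dtype] / 1e9

params = 2_600_000_000  # assumed parameter count
print(f"bfloat16 weights: ~{estimate_weights_gb(params, 'bfloat16'):.1f} GB")
print(f"float32 weights:  ~{estimate_weights_gb(params, 'float32'):.1f} GB")
```

Under these assumptions the bfloat16 weights come to roughly 5.2 GB, which matches the file sizes of the similar 2B models listed below; the 15.8 GB "Required VRAM" figure leaves headroom for activations, KV cache, and framework overhead.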
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| SJT 2B | 128K / 5.2 GB | 5 | 0 |
| SILMA Kashif 2B Instruct V1.0 | 12K / 5.2 GB | 927 | 23 |
| ...mma 2 2B Jpn It Finetuning Sft | 8K / 5.2 GB | 5 | 0 |
| Gemma 2 2B It | 8K / 5.2 GB | 379516 | 1322 |
| Gemma 2 2B It | 8K / 5.2 GB | 7 | 0 |
| Gemma 2 2B | 8K / 10.5 GB | 475407 | 636 |
| Gemma 2 2B Jpn It | 8K / 5.2 GB | 24326 | 215 |
| Gemma2Slerp2 2.6B | 8K / 5.3 GB | 115 | 2 |
| ...emma 2 2B It Chinese Kyara DPO | 8K / 15.7 GB | 1926 | 14 |
| ... 2B It Alpaca Cleaned SFT 1024 | 8K / 5.2 GB | 274 | 0 |