| LLM Name | Distil Mistral 1.5B Instruct V0.2 Cosmo 100K |
|---|---|
| Repository 🤗 | https://huggingface.co/sanchit-gandhi/distil-mistral-1.5B-Instruct-v0.2-cosmo-100k |
| Model Size | 1.5B |
| Required VRAM | 3.1 GB | 
| Updated | 2025-09-23 | 
| Maintainer | sanchit-gandhi | 
| Model Type | mistral | 
| Instruction-Based | Yes | 
| Model Files | |
| Model Architecture | MistralForCausalLM | 
| Context Length | 32768 | 
| Model Max Length | 32768 | 
| Transformers Version | 4.40.0.dev0 | 
| Tokenizer Class | LlamaTokenizer | 
| Padding Token | </s> | 
| Vocabulary Size | 32000 | 
| Torch Data Type | float32 | 
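
Given the metadata above (MistralForCausalLM architecture, a LlamaTokenizer with `</s>` as padding token, float32 weights, and a 32,768-token context), a minimal loading-and-generation sketch with 🤗 Transformers could look like the following. The prompt and generation settings are illustrative assumptions, not values taken from the card.

```python
# Minimal sketch: load the model and run one instruction through its chat
# template. Assumes a recent transformers release (the card lists
# 4.40.0.dev0) and enough memory for full float32 weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sanchit-gandhi/distil-mistral-1.5B-Instruct-v0.2-cosmo-100k"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # LlamaTokenizer per the card
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float32,  # mirrors the "Torch Data Type" row
)

# The model is instruction-tuned, so format the input with the chat template
# rather than as a raw string. The question below is a hypothetical example.
messages = [{"role": "user", "content": "Explain knowledge distillation in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

with torch.no_grad():
    output_ids = model.generate(input_ids, max_new_tokens=128)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
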
| Best Alternatives | Context / RAM | Downloads | Likes | 
|---|---|---|---|
| RakutenAI 2.0 Mini Instruct | 128K / 3.1 GB | 629 | 27 | 