| LLM Name | DeepSeek R1 Distill Qwen 1.5B Multilingual |
| Repository 🤗 | https://huggingface.co/lightblue/DeepSeek-R1-Distill-Qwen-1.5B-Multilingual |
| Model Size | 1.5b |
| Required VRAM | 3.5 GB |
| Updated | 2025-09-23 |
| Maintainer | lightblue |
| Model Type | qwen2 |
| Model Files | |
| Supported Languages | am ar bn zh cs nl en fr de el ha he hi id it ja jv km ko lo ms mr fa pl pt ro ru es sw sv tl ta te th tr uk ur vi |
| Model Architecture | Qwen2ForCausalLM |
| License | apache-2.0 |
| Context Length | 131072 |
| Model Max Length | 131072 |
| Transformers Version | 4.48.1 |
| Tokenizer Class | LlamaTokenizer |
| Padding Token | <|end▁of▁sentence|> |
| Vocabulary Size | 151936 |
| Torch Data Type | bfloat16 |
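
Below is a minimal sketch of how this checkpoint could be loaded with 🤗 Transformers, based on the metadata above (Qwen2ForCausalLM, bfloat16, ~3.5 GB VRAM). The prompt text is illustrative only and not taken from the model card.

```python
# Minimal usage sketch for lightblue/DeepSeek-R1-Distill-Qwen-1.5B-Multilingual
# (assumes transformers >= 4.48.1 and a device with ~3.5 GB of free VRAM).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lightblue/DeepSeek-R1-Distill-Qwen-1.5B-Multilingual"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the Torch Data Type listed above
    device_map="auto",           # place weights on GPU if one is available
)

# Chat-style prompt; example question is hypothetical.
messages = [{"role": "user", "content": "Explain gravity in one sentence, in French."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```
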
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Jais Inception 70B V1.2 | 128K / 80.1 GB | 2203 | 2 |
| Jais Inception 70B V1.1 | 128K / 146 GB | 2215 | 0 |
| T1 1.5B | 32K / 3.6 GB | 9 | 1 |