DeepSeek R1 Distill Qwen 1.5B by deepseek-ai


DeepSeek R1 Distill Qwen 1.5B is an open-source language model by deepseek-ai. Key specs: 1.5B parameters, required VRAM 3.5 GB, 128K context, MIT license, LLM Explorer Score 0.44.

Tags: Arxiv:2501.12948 · Conversational · Endpoints compatible · Qwen2 · Region: us · Safetensors

DeepSeek R1 Distill Qwen 1.5B Benchmarks

Benchmark scores are reported as percentages showing how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").

DeepSeek R1 Distill Qwen 1.5B Parameters and Internals

LLM Name: DeepSeek R1 Distill Qwen 1.5B
Repository 🤗: https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B
Model Size: 1.5b
Required VRAM: 3.5 GB
Updated: 2026-01-08
Maintainer: deepseek-ai
Model Type: qwen2
Model Files: 3.5 GB
Model Architecture: Qwen2ForCausalLM
License: mit
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.44.0
Tokenizer Class: LlamaTokenizerFast
Beginning of Sentence Token: <|begin▁of▁sentence|>
End of Sentence Token: <|end▁of▁sentence|>
Vocabulary Size: 151936
Torch Data Type: bfloat16
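The listed specs are internally consistent: at bfloat16 (2 bytes per parameter), roughly 1.75 billion parameters occupy about the listed 3.5 GB, and the 131072-token context length is exactly 128K. A quick back-of-envelope check in Python (the 1.75e9 parameter figure is inferred here from the file size, not an official count, and real VRAM usage adds activations and KV cache on top of the weights):

```python
def weights_size_gb(n_params: float, bytes_per_param: float) -> float:
    """Rough size of the model weights alone, in decimal gigabytes."""
    return n_params * bytes_per_param / 1e9

# Inferred parameter count: 3.5 GB of bfloat16 weights / 2 bytes per parameter.
n_params = 1.75e9

print(weights_size_gb(n_params, 2.0))  # bfloat16 (2 bytes/param) -> 3.5, matching "Required VRAM"
print(131072 // 1024)                  # context length expressed in "K" tokens -> 128
```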

Quantized Models of the DeepSeek R1 Distill Qwen 1.5B

Model · Likes · Downloads · VRAM
...Seek R1 Distill Qwen 1.5B GGUF131386840 GB
...Seek R1 Distill Qwen 1.5B GGUF612791 GB
OpenThink08801 GB
...ill Qwen 1.5B Unsloth Bnb 4bit1765941 GB
... R1 Distill Qwen 1.5B Bnb 4bit1114971 GB
Dhanishtha4583 GB
Atlas Flash 1.5B Preview21063 GB
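The roughly 1 GB figures for the 4-bit variants above line up with simple quantization arithmetic: cutting from 16 bits to about 4 bits per weight shrinks the 3.5 GB bfloat16 footprint to roughly a quarter. A hedged sketch (the parameter count is inferred from the bf16 file size, and real quantized files also carry metadata and some higher-precision tensors, so actual sizes vary):

```python
def quantized_size_gb(n_params: float, bits_per_param: float) -> float:
    """Approximate weight storage in decimal GB at a given bit width."""
    return n_params * bits_per_param / 8 / 1e9

n_params = 1.75e9  # inferred from 3.5 GB of bfloat16 weights

for bits in (16, 8, 4):
    # 16-bit baseline -> 3.5 GB; 4-bit (e.g. bnb 4bit) -> 0.875 GB, i.e. ~1 GB as listed
    print(bits, quantized_size_gb(n_params, bits))
```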

Best Alternatives to DeepSeek R1 Distill Qwen 1.5B

Best Alternatives · Context / RAM · Downloads · Likes
ReaderLM V2500K / 3.1 GB2591766
Reader Lm 1.5B250K / 3.1 GB1411607
Qwen2.5 1.5B128K / 3.1 GB888741150
AceInstruct 1.5B128K / 3.5 GB7688020
DeepScaleR 1.5B Preview128K / 7.1 GB11774573
...n Research Reasoning Qwen 1.5B128K / 7.1 GB5716221
Qwen2 1.5B128K / 3.1 GB8373497
Stella En 1.5B V5128K / 6.2 GB581890211
OpenReasoning Nemotron 1.5B128K / 3.1 GB518847
...1 Distill Qwen 1.5B GSPO Basic128K / 3.5 GB18060



Original data from HuggingFace, OpenCompass and various public git repos.