DeepSeek R1 Distill Qwen 1.5B by deepseek-ai


DeepSeek R1 Distill Qwen 1.5B is an open-source language model by deepseek-ai. Key facts: 1.5B parameters, required VRAM 3.5 GB, 128K context, MIT license, LLM Explorer Score 0.38.

Tags: Arxiv:2501.12948 · Conversational · Endpoints compatible · Qwen2 · Region:us · Safetensors

DeepSeek R1 Distill Qwen 1.5B Benchmarks

Benchmark scores (nn.n%) show how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").

DeepSeek R1 Distill Qwen 1.5B Parameters and Internals

LLM Name: DeepSeek R1 Distill Qwen 1.5B
Repository: https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B
Model Size: 1.5b
Required VRAM: 3.5 GB
Updated: 2026-04-01
Maintainer: deepseek-ai
Model Type: qwen2
Model Files: 3.5 GB
Model Architecture: Qwen2ForCausalLM
License: mit
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.44.0
Tokenizer Class: LlamaTokenizerFast
Beginning of Sentence Token: <|begin▁of▁sentence|>
End of Sentence Token: <|end▁of▁sentence|>
Vocabulary Size: 151936
Torch Data Type: bfloat16
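The repository, dtype, and tokenizer details above map directly onto a standard Hugging Face transformers loading call. A minimal sketch (the model id comes from the Repository field; generation settings are illustrative, and the heavy imports are deferred inside the function so the file imports even without transformers installed):

```python
MODEL_ID = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"

def load(device_map: str = "auto"):
    """Load tokenizer and model in bfloat16, matching the listed torch dtype
    and keeping weights near the 3.5 GB VRAM figure above."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,
        device_map=device_map,
    )
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load()
    # The chat template wraps the prompt in the special
    # <|begin▁of▁sentence|> / <|end▁of▁sentence|> tokens listed above.
    inputs = tokenizer.apply_chat_template(
        [{"role": "user", "content": "What is 12 * 12?"}],
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    out = model.generate(inputs, max_new_tokens=128)
    print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Since this is a reasoning ("R1") distill, expect the output to contain a chain-of-thought block before the final answer.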

Quantized Models of the DeepSeek R1 Distill Qwen 1.5B

| Model | Likes / Downloads / VRAM |
|-------|--------------------------|
| ...Seek R1 Distill Qwen 1.5B GGUF | 133879100 GB |
| OpenThink | 08801 GB |
| ...ill Qwen 1.5B Unsloth Bnb 4bit | 1769071 GB |
| ... R1 Distill Qwen 1.5B Bnb 4bit | 1130141 GB |
| ...Seek R1 Distill Qwen 1.5B GGUF | 61021 GB |
| Dhanishtha | 52423 GB |
| Atlas Flash 1.5B Preview | 21063 GB |
| Atlas Flash 1.5B Preview | 2693 GB |
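Several of the quantized repos above are bitsandbytes 4-bit variants; the same effect can also be obtained by quantizing the base repo at load time. A hedged sketch using transformers' `BitsAndBytesConfig` (requires the `bitsandbytes` package and a CUDA GPU; the NF4 settings here are illustrative choices, not taken from the repos above):

```python
def load_4bit(model_id: str = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"):
    """Quantize the base weights to 4-bit NF4 while loading, cutting VRAM
    from ~3.5 GB (bfloat16) to roughly 1 GB, in line with the table above."""
    import torch
    from transformers import AutoModelForCausalLM, BitsAndBytesConfig

    quant_config = BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_quant_type="nf4",           # normal-float 4-bit quantization
        bnb_4bit_compute_dtype=torch.bfloat16,  # matmuls still run in bf16
    )
    return AutoModelForCausalLM.from_pretrained(
        model_id,
        quantization_config=quant_config,
        device_map="auto",
    )
```

The GGUF repos listed above target llama.cpp-style runtimes instead and are loaded with those tools rather than with transformers.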

Best Alternatives to DeepSeek R1 Distill Qwen 1.5B

| Best Alternatives | Context / RAM | Downloads / Likes |
|-------------------|---------------|-------------------|
| ReaderLM V2 | 500K / 3.1 GB | 14849779 |
| Reader Lm 1.5B | 250K / 3.1 GB | 566608 |
| VibeThinker 1.5B | 128K / 3.5 GB | 2025520 |
| Ko R1 1.5B Preview | 128K / 7.1 GB | 159 |
| Qwen2.5 1.5B | 128K / 3.1 GB | 914961176 |
| AceInstruct 1.5B | 128K / 3.5 GB | 7688020 |
| Qwen2 1.5B | 128K / 3.1 GB | 127243102 |
| 1.5B Cold Start SFT | 128K / 3.5 GB | 45030 |
| OpenReasoning Nemotron 1.5B | 128K / 3.1 GB | 793456 |
| DeepScaleR 1.5B Preview | 128K / 7.1 GB | 7129583 |



Original data from HuggingFace, OpenCompass and various public git repos.