DeepSeek R1 by deepseek-ai


Tags: Arxiv:2501.12948 · Autotrain compatible · Conversational · Custom code · Deepseek v3 · Endpoints compatible · FP8 · Region: us · Safetensors · Sharded · Tensorflow

DeepSeek R1 Benchmarks

nn.n% — how the model scores relative to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

DeepSeek R1 Parameters and Internals

LLM Name: DeepSeek R1
Repository 🤗: https://huggingface.co/deepseek-ai/DeepSeek-R1
Model Size: 684.5B parameters
Required VRAM: 688.4 GB
Updated: 2025-07-25
Maintainer: deepseek-ai
Model Type: deepseek_v3
Model Files: 163 sharded safetensors files (total ≈688 GB). Most shards are 4.3 GB; shards 1 and 160 are 5.2 GB, the final shard (163) is 6.6 GB, and a few shards (12, 34, 56, 78, 100, 122, 141) are smaller, at 1.3–3.1 GB.
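Because the checkpoint ships as 163 shards, a partial download is easy to miss. A minimal sketch for checking shard completeness from a directory listing (the helper name and the exact Hugging Face-style filename pattern are assumptions, not official tooling):

```python
import re

def missing_shards(filenames, total=163):
    """Return the sorted shard indices absent from a list of shard filenames.

    Assumes names like 'model-00007-of-00163.safetensors'.
    """
    present = set()
    for name in filenames:
        m = re.search(r"(\d+)-of-\d+", name)
        if m:
            present.add(int(m.group(1)))
    return sorted(set(range(1, total + 1)) - present)

# Example: everything downloaded except shard 12.
files = [f"model-{i:05d}-of-00163.safetensors" for i in range(1, 164) if i != 12]
print(missing_shards(files))  # [12]
```

In practice the authoritative shard list lives in the repository's `model.safetensors.index.json`, so a real check would compare against that file rather than a hard-coded total.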
Model Architecture: DeepseekV3ForCausalLM
License: mit
Context Length: 163840
Model Max Length: 163840
Transformers Version: 4.46.3
Tokenizer Class: LlamaTokenizerFast
Beginning of Sentence Token: <|begin▁of▁sentence|>
End of Sentence Token: <|end▁of▁sentence|>
Vocabulary Size: 129280
Torch Data Type: bfloat16
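The 688.4 GB VRAM figure is consistent with 684.5B parameters stored at roughly one byte each (the repository is tagged FP8). A back-of-the-envelope sketch of weight memory at different precisions, ignoring KV cache, activations, and framework overhead:

```python
def weight_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    """Weight memory only: parameter count times bytes per parameter.

    1e9 params times bytes, divided by 1e9 bytes per decimal GB, cancels out,
    so the estimate is simply params_billion * bytes_per_param.
    """
    return params_billion * bytes_per_param

print(weight_memory_gb(684.5, 1))    # FP8:  684.5 GB, close to the listed 688.4 GB
print(weight_memory_gb(684.5, 2))    # BF16: 1369.0 GB
print(weight_memory_gb(684.5, 0.5))  # 4-bit: 342.25 GB
```

The small gap between 684.5 GB and the listed 688.4 GB is plausibly tensors (embeddings, norms, quantization scales) kept at higher precision, though that breakdown is not stated on this page.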

Quantized Models of the DeepSeek R1

Model | Likes | Downloads | VRAM
DeepSeek R1 GGUF | 1088 | 286 | 780 GB
DeepSeek R1 AWQ | 84 | 5762 | 225 GB
DeepSeek R1 GGUF UD | 16 | 17 | 300 GB
DeepSeek R1 AWQ | 83 | 3980 | 225 GB
DeepSeek R1 2bit | 8 | 131 | 238 GB
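Quantized checkpoint sizes in the table above can be sanity-checked from bits per weight. A rough sketch (raw weight bytes only; real quantized repos add scale factors and keep some tensors at higher precision, which is why listed VRAM exceeds the raw estimate):

```python
def quant_size_gb(params_billion: float, bits_per_weight: float) -> float:
    """Raw quantized weight size: params * bits / 8 bytes each, in decimal GB."""
    return params_billion * bits_per_weight / 8

for bits in (8, 4, 2):
    print(f"{bits}-bit: {quant_size_gb(684.5, bits)} GB")
# 8-bit -> 684.5 GB, 4-bit -> 342.25 GB, 2-bit -> 171.125 GB, before
# per-group scales, metadata, and mixed-precision tensors are added.
```

For example, a pure 2-bit encoding of 684.5B weights is about 171 GB, so a listed 2-bit variant in the 200+ GB range implies substantial overhead or mixed precision.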

Best Alternatives to DeepSeek R1

Best Alternatives | Context / RAM | Downloads | Likes
DeepSeek V3 | 160K / 171.8 GB | 443865 | 3926
DeepSeek R1 0528 | 160K / 180.4 GB | 422994 | 2325
DeepSeek V3.0324 | 160K / 184.7 GB | 549198 | 3004
DeepSeek TNG R1T2 Chimera | 160K / 184.7 GB | 5858 | 220
DeepSeek V3 Base | 160K / 171.8 GB | 116664 | 1662
DeepSeek R1T Chimera | 160K / 189 GB | 1453 | 261
DeepSeek R1 Zero | 160K / 688.4 GB | 2150 | 931
DeepSeek R1 0528 BF16 | 160K / 358.8 GB | 304 | 28
DeepSeek R1 0528 | 160K / 176.1 GB | 1626 | 16
DeepSeek TNG R1T2 Chimera | 160K / 180.4 GB | 70 | 6



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124