DeepSeek R1 0528 Qwen3 8B by deepseek-ai


Tags: Arxiv:2501.12948, Autotrain compatible, Conversational, Endpoints compatible, Qwen3, Region: us, Safetensors, Sharded, Tensorflow

DeepSeek R1 0528 Qwen3 8B Benchmarks

nn.n% — how the model scores against the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

DeepSeek R1 0528 Qwen3 8B Parameters and Internals

LLM Name: DeepSeek R1 0528 Qwen3 8B
Repository: 🤗 https://huggingface.co/deepseek-ai/DeepSeek-R1-0528-Qwen3-8B
Model Size: 8B
Required VRAM: 16.4 GB
Updated: 2025-07-30
Maintainer: deepseek-ai
Model Type: qwen3
Model Files: 8.6 GB (1-of-2), 7.8 GB (2-of-2)
Model Architecture: Qwen3ForCausalLM
License: MIT
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.51.0
Tokenizer Class: LlamaTokenizerFast
Beginning of Sentence Token: <|begin▁of▁sentence|>
End of Sentence Token: <|end▁of▁sentence|>
Vocabulary Size: 151936
Torch Data Type: bfloat16
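The 16.4 GB "Required VRAM" figure follows directly from the parameter count and the bfloat16 weight width (2 bytes per parameter). A minimal sketch of that arithmetic, assuming a parameter count of roughly 8.19 billion (Qwen3 8B's published count; the page itself only says "8B"):

```python
# Estimate weight memory for DeepSeek R1 0528 Qwen3 8B in bfloat16.
# PARAMS = 8.19e9 is an assumption (Qwen3-8B's published parameter count);
# this page only lists the size class "8B".
PARAMS = 8.19e9
BYTES_PER_PARAM = 2  # bfloat16 = 16 bits = 2 bytes

weight_gb = PARAMS * BYTES_PER_PARAM / 1e9
print(f"{weight_gb:.1f} GB")  # prints "16.4 GB", matching the Required VRAM row
```

Note this counts weights only; activations and the KV cache come on top of it at inference time.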

Quantized Models of the DeepSeek R1 0528 Qwen3 8B

Model | Likes | Downloads | VRAM
DeepSeek R1 0528 Qwen3 8B GGUF | 2677776522 GB
DeepSeek R1 0528 Qwen3 8B GGUF | 51863793 GB
...0528 Qwen3 8B Unsloth Bnb 4bit | 11253687 GB
...Seek R1 0528 Qwen3 8B Bnb 4bit | 7279616 GB
...Seek R1 Qwen3 0528 8B 4bit AWQ | 484914 GB
...Seek R1 0528 Qwen3 8B 4bit DWQ | 813624 GB
DeepSeek R1 0528 Qwen3 8B 4bit | 412084 GB
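A 4-bit quantization stores weights at a quarter of the bfloat16 width, which is why the 4bit entries above sit around 4 GB against the full model's 16.4 GB. A rough sketch, again assuming ~8.19 billion parameters (not stated on this page):

```python
# Rough weight-memory estimate for a 4-bit quantization of an 8B model.
# PARAMS = 8.19e9 is an assumption; real quants add overhead for
# quantization scales and for layers kept in higher precision.
PARAMS = 8.19e9
BITS = 4

weight_gb = PARAMS * BITS / 8 / 1e9
print(f"~{weight_gb:.1f} GB")  # ~4.1 GB, in line with the 4bit entries above
```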

Best Alternatives to DeepSeek R1 0528 Qwen3 8B

Best Alternatives | Context / RAM | Downloads | Likes
...n3 8B 320K Context 10X Massive | 320K / 16.4 GB | 300
Qwen3 8B 256K Context 8X Grand | 256K / 16.4 GB | 1290
...wen3 8B 192K Context 6X Larger | 192K / 16.4 GB | 240
DeepSeek R1 0528 Qwen3 8B | 128K / 16.4 GB | 414314
...1 0528 Qwen3 8B Abliterated V1 | 128K / 16.4 GB | 96926
...1 Qwen3 8B ArliAI RpR V4 Small | 128K / 16.4 GB | 100516
RPT DeepSeek R1 0528 Qwen3 8B | 128K / 16.4 GB | 4432
DeepSeek R1 0528 Qwen3 8B Bf16 | 128K / 16.3 GB | 10782
OPC R1 8B | 128K / 16.4 GB | 3353
Qwen3 EZO 8B YOYO Karcher 128K | 128K / 16.4 GB | 111
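The extended-context variants above (192K–320K) report the same ~16.4 GB weight footprint, because context length does not change weight memory; what grows with context is the KV cache. A sketch of that cost at this model's 131072-token context, where the layer/head/dimension values are assumptions (typical Qwen3 8B config: 36 layers, 8 KV heads, head dim 128), not figures from this page:

```python
# KV-cache size grows linearly with context length; weights stay constant.
# LAYERS/KV_HEADS/HEAD_DIM are assumed Qwen3-8B config values, not from this page.
LAYERS, KV_HEADS, HEAD_DIM = 36, 8, 128
BYTES = 2  # bfloat16 cache entries

per_token = 2 * LAYERS * KV_HEADS * HEAD_DIM * BYTES  # factor 2: K and V
ctx = 131072  # this model's listed context length

print(f"{per_token * ctx / 1e9:.1f} GB")  # KV cache at full context, on top of weights
```

Under these assumptions the cache alone reaches roughly 19 GB at full context, which is why long-context use typically pairs these models with a quantized KV cache or a shorter effective window.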

Rank the DeepSeek R1 0528 Qwen3 8B Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

What open-source LLMs or SLMs are you in search of? 50230 models in total.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124