Deepseek Coder 33B Base by deepseek-ai


Deepseek Coder 33B Base is an open-source code-generation language model by deepseek-ai. Key figures: 33B parameters, required VRAM: 66.5 GB, context: 16K, license: other, LLM Explorer Score: 0.13, HumanEval: 52.5.

Tags: Codegen, Deploy:azure, Endpoints compatible, Llama, Pytorch, Region:us, Safetensors, Sharded, Tensorflow

Deepseek Coder 33B Base Benchmarks

Deepseek Coder 33B Base Parameters and Internals

Model Type 
code generation, code completion
Use Cases 
Areas:
Research, Commercial applications
Applications:
Project-level code completion, Infilling tasks
Primary Use Cases:
Code generation, Code completion, Repository-level code completion
Limitations:
Not specified
Supported Languages 
English (High proficiency), Chinese (High proficiency)
Training Details 
Data Sources:
Project-level code corpus
Data Volume:
2 trillion tokens
Methodology:
Grouped-Query Attention
Context Length:
16384
Input Output 
Accepted Modalities:
Text
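Since the listing names infilling as a primary use case: base code models of this kind typically take fill-in-the-middle (FIM) prompts, where the text before and after a hole is wrapped in sentinel tokens and the model generates the missing middle. The sketch below only builds such a prompt string; the sentinel spellings here are placeholders, not the model's actual special tokens, so check the tokenizer's vocabulary before using them.

```python
# Sketch of a fill-in-the-middle (infilling) prompt for a code model.
# The sentinel strings below are hypothetical placeholders; verify the
# real spellings against the model's tokenizer before use.
FIM_BEGIN = "<|fim_begin|>"  # assumption, not confirmed by the listing
FIM_HOLE = "<|fim_hole|>"    # assumption
FIM_END = "<|fim_end|>"      # assumption

def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Arrange code before/after the hole around a hole marker."""
    return f"{FIM_BEGIN}{prefix}{FIM_HOLE}{suffix}{FIM_END}"

# Ask the model to fill in the body of a function.
prompt = build_fim_prompt("def add(a, b):\n    return ", "\n")
```

The completion the model returns for such a prompt is the text that belongs at the hole marker, which the caller splices back between prefix and suffix.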
LLM Name: Deepseek Coder 33B Base
Repository: https://huggingface.co/deepseek-ai/deepseek-coder-33b-base
Model Size: 33b
Required VRAM: 66.5 GB
Updated: 2026-04-09
Maintainer: deepseek-ai
Model Type: llama
Model Files: 9.7 GB: 1-of-7, 9.9 GB: 2-of-7, 9.9 GB: 3-of-7, 9.8 GB: 4-of-7, 9.9 GB: 5-of-7, 9.9 GB: 6-of-7, 7.4 GB: 7-of-7
Generates Code: Yes
Model Architecture: LlamaForCausalLM
License: other
Context Length: 16384
Model Max Length: 16384
Transformers Version: 4.34.1
Tokenizer Class: LlamaTokenizerFast
Beginning of Sentence Token: <|begin▁of▁sentence|>
End of Sentence Token: <|end▁of▁sentence|>
Vocabulary Size: 32256
Torch Data Type: bfloat16
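The listed VRAM figure follows from the weight format: dense-weight inference memory is roughly parameter count times bytes per parameter, and 33B parameters in bfloat16 (2 bytes each) land near the listed 66.5 GB; the seven shard files above sum to exactly that. A quick arithmetic check (the 33e9 parameter count is an approximation read off the "33b" size label):

```python
# Rough VRAM check: parameters x bytes-per-parameter.
params = 33e9          # approximate, from the "33b" size label
bytes_per_param = 2    # bfloat16
vram_gb = params * bytes_per_param / 1e9  # ~66 GB before overhead

# The seven listed shard files should sum to the listed 66.5 GB.
shard_sizes_gb = [9.7, 9.9, 9.9, 9.8, 9.9, 9.9, 7.4]
total_shards_gb = sum(shard_sizes_gb)
```

The small gap between the 66 GB estimate and 66.5 GB of shards is consistent with embeddings and per-tensor metadata; actual runtime usage is higher still once the KV cache and activations are counted.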

Quantized Models of the Deepseek Coder 33B Base

Model | Likes | Downloads | VRAM
Deepseek Coder 33B Base GGUF | 8 | 1355 | 14 GB
Deepseek Coder 33B Base GPTQ | 2 | 42 | 17 GB
Deepseek Coder 33B Base AWQ | 3 | 9 | 18 GB

Best Alternatives to Deepseek Coder 33B Base

Best Alternatives | Context / RAM | Downloads | Likes
ReflectionCoder DS 33B | 16K / 67 GB | 974 | 24
Deepseek Coder 33B Instruct | 16K / 66.5 GB | 10247 | 566
Deepseek Wizard 33B Slerp | 16K / 35.3 GB | 10 | 0
WhiteRabbitNeo 33B V1 | 16K / 67 GB | 973 | 90
ValidateAI 3 33B Ties | 16K / 66.5 GB | 8 | 0
ValidateAI 2 33B AT | 16K / 66.5 GB | 5 | 0
Everyone Coder 33B Base | 16K / 66.5 GB | 117 | 21
Fortran2Cpp | 16K / 67.3 GB | 4 | 4
F2C Translator | 16K / 67.3 GB | 0 | 1
Llm4decompile 33B | 16K / 66.5 GB | 1 | 8



Original data from HuggingFace, OpenCompass and various public git repos.