| Property | Value |
|---|---|
| LLM Name | MiniCPM 2B DPO Bf16 |
| Repository 🤗 | https://huggingface.co/openbmb/MiniCPM-2B-dpo-bf16 |
| Model Size | 2B |
| Required VRAM | 5.5 GB |
| Updated | 2025-10-27 |
| Maintainer | openbmb |
| Supported Languages | en, zh |
| Model Architecture | MiniCPMForCausalLM |
| Context Length | 4096 |
| Model Max Length | 4096 |
| Transformers Version | 4.36.0 |
| Tokenizer Class | LlamaTokenizer |
| Vocabulary Size | 122753 |
| Torch Data Type | bfloat16 |
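
A minimal loading sketch based on the properties above. The prompt text, generation parameters, and `device_map="auto"` choice are illustrative; `trust_remote_code=True` is assumed to be needed because `MiniCPMForCausalLM` is a custom architecture shipped with the repository rather than a built-in Transformers class.

```python
# Minimal sketch: load openbmb/MiniCPM-2B-dpo-bf16 with the settings listed above.
# Assumes transformers >= 4.36.0 (plus accelerate for device_map) and that the
# repository's custom MiniCPMForCausalLM code is trusted.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "openbmb/MiniCPM-2B-dpo-bf16"

tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,   # matches the Torch Data Type row (~5.5 GB VRAM)
    device_map="auto",
    trust_remote_code=True,       # custom MiniCPMForCausalLM architecture
)

# Illustrative prompt; keep inputs within the 4096-token context length.
inputs = tokenizer("Write a short introduction to MiniCPM.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```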
| Best Alternatives | Context / Required VRAM | Downloads | Likes |
|---|---|---|---|
| MiniCPM 2B 128K | 64K / 6 GB | 1094 | 43 | 
| MiniCPM 2B Sft Bf16 | 4K / 5.5 GB | 35500 | 121 | 
| Sparsing Law 0.1B Relu | 4K / 0.4 GB | 6 | 2 | 
| MiniCPM 2B Sft Fp32 | 4K / 10.9 GB | 1407 | 296 | 
| MiniCPM MoE 8x2B | 4K / 27.7 GB | 1280 | 44 | 
| ...iCPM 2B RAFT Lora Hotpotqa Dev | 4K / 5.5 GB | 8 | 0 | 
| MiniCPM Duplex | 4K / 5.5 GB | 1 | 3 | 
| ...iniCPM 2B DPO Fp32 Safetensors | 4K / 10.9 GB | 14 | 1 | 
| ...iniCPM 2B Sft Fp32 Safetensors | 4K / 10.9 GB | 13 | 1 | 
| ...iniCPM 2B DPO Bf16 Safetensors | 4K / 5.5 GB | 12 | 1 | 