DeepSeek V3 0324 4bit is an open-source language model released by mlx-community. Key specs: 104.9B parameters, VRAM: 198.7 GB, context window: 160K tokens, license: MIT, quantized, LLM Explorer Score: 0.22.
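As a rough orientation for specs like these, the weight-only memory of a quantized model can be estimated as parameters × bits per weight ÷ 8. The sketch below is illustrative (the helper name and figures are assumptions, using the 104.9B parameter count from the listing); actual VRAM requirements are higher because they also cover the KV cache, activations, and runtime overhead.

```python
def estimate_weight_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    """Weight-only lower bound on memory, in GB.

    1B parameters at 1 byte each is ~1 GB, so GB = billions * bits / 8.
    Real VRAM usage adds KV cache, activations, and framework overhead,
    which is why listed requirements exceed this estimate.
    """
    return params_billions * bits_per_weight / 8


# Illustrative: 104.9B parameters at 4-bit precision
print(estimate_weight_memory_gb(104.9, 4))
```

This explains only the weight storage; the gap between such an estimate and a listed VRAM figure is the runtime overhead budget.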
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| DeepSeek V3.1 4bit | 160K / 190.1 GB | 3723 | 6 |
| Kimi K2 Thinking NVFP4 | 256K / 210 GB | 26355 | 30 |
| Kimi K2 Thinking | 256K / 383.2 GB | 115 | 4 |
| Kimi K2 Thinking BF16 | 256K / 220 GB | 86 | 6 |
| Kimi K2 Instruct 0905 BF16 | 256K / 1399.1 GB | 155 | 4 |
| Kimi K2 Instruct 0905 | 256K / 667.4 GB | 35 | 7 |
| DeepSeek V3 Base | 160K / 171.8 GB | 2345 | 1684 |
| DeepSeek V3.1 Terminus | 160K / 171.8 GB | 202 | 4 |
| DeepSeek V3.1 | 160K / 180.4 GB | 147941 | 819 |
| DeepSeek V3.1 | 160K / 176.1 GB | 47 | 3 |