DeepSeek V3.1 4bit is an open-source large language model published by mlx-community. Key specs: 671B parameters, 190.1 GB VRAM required, 160K context window, MIT license, 4-bit quantized, LLM Explorer Score: 0.29.
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| DeepSeek V3.0324 4bit | 160K / 198.7 GB | 1576 | 38 |
| Kimi K2 Thinking NVFP4 | 256K / 210 GB | 15086 | 30 |
| Kimi K2 Thinking BF16 | 256K / 220 GB | 86 | 6 |
| Kimi K2 Thinking | 256K / 383.2 GB | 30 | 4 |
| Kimi K2 Instruct 0905 BF16 | 256K / 1399.1 GB | 156 | 4 |
| DeepSeek V3.1 | 160K / 180.4 GB | 150107 | 819 |
| DeepSeek V3.1 Terminus | 160K / 171.8 GB | 703 | 4 |
| DeepSeek V3.1 Base | 160K / 180.4 GB | 10644 | 1010 |
| DeepSeek V3.1 | 160K / 176.1 GB | 44 | 3 |
| DeepSeek V3 AWQ | 160K / 351.9 GB | 1809 | 34 |