DeepSeek R1 AWQ is an open-source large language model published by cognitivecomputations, quantized with AWQ. Key specs: 671B parameters, 225 GB VRAM required, 160K context window, MIT license, LLM Explorer Score: 0.21.
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| DeepSeek V3 AWQ | 160K / 351.9 GB | 1809 | 34 |
| DeepSeek V3 AWQ | 160K / 351.9 GB | 1039 | 35 |
| DeepSeek R1 0528 AWQ | 160K / 205 GB | 8496 | 15 |
| DeepSeek V3.0324 AWQ | 160K / 351.9 GB | 4061 | 22 |
| DeepSeek R1 AWQ | 160K / 225 GB | 3980 | 83 |
| DeepSeek V3.1 AWQ W4AFP8 | 160K / 225 GB | 15 | 6 |
| DeepSeek V3.1 4bit | 160K / 190.1 GB | 28555 | 6 |
| DeepSeek V3.0324 4bit | 160K / 198.7 GB | 1576 | 38 |
| Kimi K2 Thinking NVFP4 | 256K / 210 GB | 4820 | 29 |
| Kimi K2 Thinking BF16 | 256K / 220 GB | 86 | 6 |