DeepSeek V3.1 AWQ W4AFP8 is an open-source language model released by TMElyralab. Key specs: type: LLM; VRAM: 225 GB; context: 160K; license: MIT; quantization: AWQ W4AFP8; LLM Explorer score: 0.21.
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| DeepSeek V3 AWQ | 160K / 351.9 GB | 1809 | 34 |
| DeepSeek V3 AWQ | 160K / 351.9 GB | 1028 | 35 |
| DeepSeek R1 0528 AWQ | 160K / 205 GB | 8496 | 15 |
| DeepSeek V3.0324 AWQ | 160K / 351.9 GB | 4061 | 22 |
| DeepSeek R1 AWQ | 160K / 225 GB | 3980 | 83 |
| DeepSeek R1 AWQ | 160K / 225 GB | 1077 | 86 |
| DeepSeek V3.1 4bit | 160K / 190.1 GB | 3723 | 6 |
| DeepSeek V3.0324 4bit | 160K / 198.7 GB | 3096 | 38 |
| Kimi K2 Thinking NVFP4 | 256K / 210 GB | 26355 | 30 |
| Kimi K2 Thinking | 256K / 383.2 GB | 115 | 4 |
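As a rough sanity check on the sizes in the table, weight footprint scales with bits per parameter. A minimal sketch, assuming DeepSeek V3's publicly stated ~671B total parameters; the listed 351.9 GB for the AWQ checkpoints exceeds the raw 4-bit figure because quantization scales, embeddings, and some layers are kept in higher precision (an assumption about the split, not a measured breakdown):

```python
# Approximate checkpoint/VRAM size from parameter count and bit width.
# Assumes ~671B total parameters for DeepSeek V3 (public figure); real
# checkpoints add overhead for scales and non-quantized layers.

def approx_weight_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate weight size in GB: params * bits / 8, in decimal GB."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

fp8_gb = approx_weight_gb(671, 8)   # native FP8 weights: ~671 GB
awq4_gb = approx_weight_gb(671, 4)  # 4-bit AWQ weights: ~335 GB raw

print(f"FP8 weights: ~{fp8_gb:.0f} GB")
print(f"4-bit weights: ~{awq4_gb:.1f} GB")
```

The gap between the ~335 GB raw 4-bit figure and the 351.9 GB listed for the AWQ variants is consistent with modest higher-precision overhead.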
🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟