Kimi K2 Thinking 4bit is an open-source language model released by mlx-community. Features: 1026.4B-parameter LLM, VRAM: 151.2 GB, Context: 256K, License: other, Quantized.
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Kimi K2 Instruct 4bit | 128K / 143.2 GB | 39626 | 13 |
| ...i K2 Instruct 0905 Mlx DQ3 K M | 256K / 207.8 GB | 474 | 5 |
| Kimi K2 Instruct 0905 BF16 | 256K / 1399.1 GB | 93 | 3 |