WavGPT 2 is an open-source language model by Hack337. Features: 7B parameters, 14.8 GB VRAM, 32K context, Apache-2.0 license, instruction-based, LLM Explorer Score 0.18.
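As a rough sanity check on the 14.8 GB VRAM figure, a 7B-parameter model stored in fp16 needs about 2 bytes per weight, plus some runtime overhead for activations and the KV cache. A minimal sketch (the overhead factor is an assumption, not a published spec):

```python
def fp16_vram_gib(n_params: float, overhead: float = 1.0) -> float:
    """Rough VRAM estimate: 2 bytes per parameter in fp16,
    scaled by an assumed overhead factor for activations/KV cache."""
    return n_params * 2 * overhead / 1024**3

# Weights alone for a 7B model come to roughly 13 GiB in fp16,
# which is consistent with the ~14.8 GB figure above once
# runtime overhead is added.
print(round(fp16_vram_gib(7e9), 1))
```

The same estimate explains why the 7B alternatives in the table below all cluster around 15 GB.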
| Best Alternatives | Context (tokens) / RAM | Downloads | Likes |
|---|---|---|---|
| Qwen2.5 7B Instruct 1M | 986K / 15.4 GB | 101838 | 363 |
| Hush Qwen2.5 7B RP V1.4 1M | 986K / 15.2 GB | 15 | 2 |
| Qwen2.5 7B RRP 1M | 986K / 15.2 GB | 14 | 6 |
| Qwen2.5 7B CelestialHarmony 1M | 986K / 14.8 GB | 10 | 7 |
| Q2.5 Instruct 1M Harmony | 986K / 15.2 GB | 4 | 2 |
| Impish QWEN 7B 1M | 986K / 15.2 GB | 7 | 6 |
| COCO 7B Instruct 1M | 986K / 15.2 GB | 21 | 4 |
| Apollo 7B | 986K / 15.2 GB | 6 | 0 |
| Graph R1 7B | 986K / 15.3 GB | 5 | 3 |
| Qwen2.5 7B Sky R1 Mini | 986K / 15.2 GB | 7 | 0 |