| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| MixTAO 19B Pass | 32K / 38.1 GB | 3 | 2 |
| Lorge 2x7B UAMM | 32K / 38.2 GB | 16 | 0 |
| Multimerge 19B Pass | 32K / 38 GB | 10 | 0 |
| TaoPassthrough 15B S | 32K / 38.4 GB | 5 | 0 |
| Raccoon Small | 32K / 38.4 GB | 5 | 1 |
| Mixtral 11Bx2 MoE 19B | 4K / 38.4 GB | 352 | 38 |
| Truthful DPO MoE 19B | 4K / 38.4 GB | 1731 | 1 |
| Venus DPO 50 | 4K / 38.4 GB | 1714 | 1 |
| SOLAR Math 2x10.7B | 4K / 38.4 GB | 1742 | 0 |
| SOLAR Math 2x10.7B V0.2 | 4K / 38.4 GB | 1161 | 4 |