Best Alternatives | Context / RAM | Downloads | Likes
---|---|---|---
Meta Llama 3.1 405B | 128K / 186 GB | 521433 | 808 |
Meta Llama 3.1 405B Instruct | 128K / 186 GB | 55654 | 473 |
Llama 3.1 405B Instruct | 128K / 183.1 GB | 36217 | 570 |
Llama 3.1 405B | 128K / 183.1 GB | 9838 | 934 |
Shisa V2 Llama3.1 405B | 128K / 191.2 GB | 282 | 16 |
Meta Llama 3.1 405B FP8 | 128K / 197.6 GB | 131499 | 94 |
...ta Llama 3.1 405B Instruct FP8 | 128K / 197.6 GB | 55370 | 165 |
Llama 3.1 Tulu 3 405B | 128K / 191.2 GB | 383 | 106 |
Llama 3.1 405B Instruct FP8 | 128K / 209.2 GB | 6037 | 10 |
Llama 3.1 405B Instruct FP8 | 128K / 193.4 GB | 2229 | 188 |