Mixtral AI SwahiliTron 7B is an open-source language model by LeroyDyer. Key specifications: 7B-parameter LLM, 14.4 GB required VRAM, 32K context length, MIT license, HF Score 61.1, LLM Explorer Score 0.17. Benchmark results: ARC 57.1, HellaSwag 81.6, MMLU 58.5, TruthfulQA 60.7, WinoGrande 75.5, GSM8K 33.5.
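The listed VRAM requirement follows directly from weight storage: a 7B-parameter model in bfloat16 needs about 2 bytes per parameter. A quick sanity check (the 7.2B parameter count is an assumption based on typical Mistral-7B sizing; the card says only "7b"):

```python
# Back-of-the-envelope VRAM estimate for the weights alone.
# 7.2e9 parameters is assumed (typical Mistral-7B sizing), not stated on the card.
params = 7.2e9
bytes_per_param = 2  # bfloat16 = 16 bits = 2 bytes
weights_gb = params * bytes_per_param / 1e9
print(f"~{weights_gb:.1f} GB for weights")  # ~14.4 GB, matching the listed requirement
```

Actual usage will be higher once the KV cache and activations are counted, especially near the full 32K context.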
| Property | Value |
|---|---|
| LLM Name | Mixtral AI SwahiliTron 7b |
| Repository 🤗 | https://huggingface.co/LeroyDyer/Mixtral_AI_SwahiliTron_7b |
| Model Size | 7B |
| Required VRAM | 14.4 GB |
| Updated | 2024-08-30 |
| Maintainer | LeroyDyer |
| Model Type | mistral |
| Supported Languages | en, sw |
| Model Architecture | MistralForCausalLM |
| License | mit |
| Context Length | 32768 |
| Model Max Length | 32768 |
| Transformers Version | 4.41.2 |
| Tokenizer Class | LlamaTokenizer |
| Padding Token | `<unk>` |
| Vocabulary Size | 32000 |
| Torch Data Type | bfloat16 |
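The table above gives everything needed to load the model with Hugging Face transformers. A minimal sketch, assuming the standard AutoModelForCausalLM / AutoTokenizer API (the prompt and generation settings are illustrative, not from the model card):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "LeroyDyer/Mixtral_AI_SwahiliTron_7b"

# The LlamaTokenizer class is resolved automatically from the repo's config.
tokenizer = AutoTokenizer.from_pretrained(repo)

# bfloat16 matches the listed torch dtype (~14.4 GB of VRAM for the weights);
# device_map="auto" requires the accelerate package.
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# English and Swahili are the supported languages; this prompt is illustrative.
prompt = "Tafsiri kwa Kiingereza: Habari za asubuhi."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Prompts can run up to the 32,768-token context limit; anything longer must be truncated or chunked.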
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| ...Nemo Instruct 2407 Abliterated | 1000K / 24.5 GB | 254 | 20 |
| MegaBeam Mistral 7B 512K | 512K / 14.4 GB | 8908 | 50 |
| SpydazWeb AI HumanAI RP | 512K / 14.4 GB | 14 | 1 |
| SpydazWeb AI HumanAI 002 | 512K / 14.4 GB | 18 | 1 |
| ...daz Web AI ChatML 512K Project | 512K / 14.5 GB | 12 | 0 |
| MegaBeam Mistral 7B 300K | 282K / 14.4 GB | 3779 | 16 |
| MegaBeam Mistral 7B 300K | 282K / 14.4 GB | 8082 | 16 |
| Hebrew Mistral 7B 200K | 256K / 30 GB | 1316 | 15 |
| Astral 256K 7B V2 | 250K / 14.4 GB | 5 | 0 |
| Astral 256K 7B | 250K / 14.4 GB | 5 | 0 |