Mixtral 11Bx2 MoE 19B is an open-source mixture-of-experts (MoE) language model by cloudyu. Key figures: 19.2B parameters, 38.4 GB required VRAM, 4K context, cc-by-nc-4.0 license. Benchmarks: HF Score 74.4, LLM Explorer Score 0.19, ARC 71.2, HellaSwag 88.5, MMLU 66.3, TruthfulQA 72, WinoGrande 83.3, GSM8K 65.3.
| LLM Name | Mixtral 11Bx2 MoE 19B |
| Repository 🤗 | https://huggingface.co/cloudyu/Mixtral_11Bx2_MoE_19B |
| Model Size | 19.2b |
| Required VRAM | 38.4 GB |
| Updated | 2026-02-25 |
| Maintainer | cloudyu |
| Model Type | mixtral |
| Model Files | |
| Model Architecture | MixtralForCausalLM |
| License | cc-by-nc-4.0 |
| Context Length | 4096 |
| Model Max Length | 4096 |
| Transformers Version | 4.36.2 |
| Tokenizer Class | LlamaTokenizer |
| Padding Token | <s> |
| Vocabulary Size | 32000 |
| Torch Data Type | float16 |
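The repository can be loaded with the Hugging Face transformers library (version 4.36.2 or later, per the table above). Below is a minimal sketch, assuming roughly 38.4 GB of GPU memory for the float16 weights; the prompt and generation settings are illustrative and not taken from the model card.

```python
# Minimal sketch: loading cloudyu/Mixtral_11Bx2_MoE_19B with transformers.
# Assumes enough GPU memory for the float16 weights (~38.4 GB per the card);
# the prompt and generation settings below are illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cloudyu/Mixtral_11Bx2_MoE_19B"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # LlamaTokenizer per the card
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the card's Torch Data Type
    device_map="auto",          # spread layers across available GPUs if needed
)

prompt = "Explain mixture-of-experts models in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)  # context limit is 4096 tokens
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```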
| Model | Likes | Downloads | VRAM |
|---|---|---|---|
| Mixtral 11Bx2 MoE 19B GGUF | 19 | 831 | 6 GB |
| Mixtral 11Bx2 MoE 19B AWQ | 5 | 13 | 10 GB |
| Mixtral 11Bx2 MoE 19B GPTQ | 5 | 7 | 10 GB |
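The GGUF variant listed above runs on far less memory via llama.cpp. Below is a minimal sketch using llama-cpp-python, assuming you have already downloaded a GGUF file from the quantized repository; the local filename is hypothetical.

```python
# Minimal sketch: running a downloaded GGUF quantization with llama-cpp-python.
# The filename below is hypothetical; substitute the actual file you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="mixtral_11bx2_moe_19b.Q4_K_M.gguf",  # hypothetical local path
    n_ctx=4096,       # matches the model's context length
    n_gpu_layers=-1,  # offload all layers to GPU if available; set 0 for CPU-only
)

result = llm("Q: What is a mixture-of-experts model?\nA:", max_tokens=128)
print(result["choices"][0]["text"])
```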
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| MixTAO 19B Pass | 32K / 38.1 GB | 3 | 2 |
| Lorge 2x7B UAMM | 32K / 38.2 GB | 16 | 0 |
| Multimerge 19B Pass | 32K / 38 GB | 10 | 0 |
| Mistralmath 15B Pass | 32K / 38.5 GB | 11 | 0 |
| TaoPassthrough 15B S | 32K / 38.4 GB | 5 | 0 |
| Raccoon Small | 32K / 38.4 GB | 74 | 1 |
| Truthful DPO MoE 19B | 4K / 38.4 GB | 1731 | 1 |
| SOLAR Math 2x10.7B | 4K / 38.4 GB | 1742 | 0 |
| SOLAR Math 2x10.7B V0.2 | 4K / 38.4 GB | 1161 | 4 |
| ...oundary Solar Chat 2x10.7B MoE | 4K / 38 GB | 123 | 1 |