Mixtral 11Bx2 MoE 19B by cloudyu

Mixtral 11Bx2 MoE 19B is an open-source language model by cloudyu. Features: 19.2B-parameter LLM, VRAM: 38.4 GB, Context: 4K, License: cc-by-nc-4.0, MoE, HF Score: 74.4, LLM Explorer Score: 0.19, ARC: 71.2, HellaSwag: 88.5, MMLU: 66.3, TruthfulQA: 72, WinoGrande: 83.3, GSM8K: 65.3.
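
The VRAM figure follows directly from the parameter count and weight precision: 19.2B parameters stored as float16 take 2 bytes each, i.e. 38.4 GB for the weights alone, before KV cache and activations. A quick sanity check:

```python
# Back-of-the-envelope VRAM check: weights only, excluding KV cache and activations.
params = 19.2e9        # parameter count from this card
bytes_per_param = 2    # float16 weights (see "Torch Data Type" below)
print(f"{params * bytes_per_param / 1e9:.1f} GB")  # -> 38.4 GB, matching the card
```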

Tags: Conversational, Endpoints compatible, Mixtral, Model-index, MoE, Region: us, Safetensors, Sharded, Tensorflow

Mixtral 11Bx2 MoE 19B Benchmarks

Mixtral 11Bx2 MoE 19B (cloudyu/Mixtral_11Bx2_MoE_19B)

Mixtral 11Bx2 MoE 19B Parameters and Internals

Model Type: text-generation
Input Format: Prompt text in accepted format
Accepted Modalities: text
Output Format: Generated text
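
A minimal inference sketch with Hugging Face transformers, using the repository id and float16 dtype from this card; the prompt, sampling settings, and device placement are illustrative assumptions:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "cloudyu/Mixtral_11Bx2_MoE_19B"

tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,  # matches the card's float16 weights
    device_map="auto",          # needs accelerate; ~38.4 GB of GPU memory in fp16
)

prompt = "Explain what a mixture-of-experts language model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```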
LLM Name: Mixtral 11Bx2 MoE 19B
Repository 🤗: https://huggingface.co/cloudyu/Mixtral_11Bx2_MoE_19B
Model Size: 19.2b
Required VRAM: 38.4 GB
Updated: 2026-02-25
Maintainer: cloudyu
Model Type: mixtral
Model Files: 9.9 GB (1-of-4), 10.0 GB (2-of-4), 10.0 GB (3-of-4), 8.5 GB (4-of-4)
Model Architecture: MixtralForCausalLM
License: cc-by-nc-4.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.36.2
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Torch Data Type: float16
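
Most of these fields can be checked against the repository metadata without downloading the 38.4 GB of weights; a sketch using transformers' AutoConfig and AutoTokenizer, which fetch only the small config and tokenizer files:

```python
from transformers import AutoConfig, AutoTokenizer

repo = "cloudyu/Mixtral_11Bx2_MoE_19B"
config = AutoConfig.from_pretrained(repo)       # downloads config.json only
tokenizer = AutoTokenizer.from_pretrained(repo)

print(config.model_type)               # "mixtral"
print(config.max_position_embeddings)  # 4096 (context length)
print(config.vocab_size)               # 32000
print(config.torch_dtype)              # torch.float16
print(tokenizer.pad_token)             # "<s>" per this card
```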

Quantized Models of the Mixtral 11Bx2 MoE 19B

Model | Likes | Downloads | VRAM
Mixtral 11Bx2 MoE 19B GGUF | 19 | 83 | 16 GB
Mixtral 11Bx2 MoE 19B AWQ | 5 | 13 | 10 GB
Mixtral 11Bx2 MoE 19B GPTQ | 5 | 7 | 10 GB
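
The GGUF quantization is typically run with llama.cpp rather than transformers; a minimal sketch via llama-cpp-python, assuming a GGUF file has already been downloaded (the local path and quantization level are placeholders):

```python
from llama_cpp import Llama

# Placeholder path: point at a downloaded GGUF quantization of this model;
# the quantization level (e.g. Q4_K_M) determines the actual memory footprint.
llm = Llama(model_path="./mixtral_11bx2_moe_19b.Q4_K_M.gguf", n_ctx=4096)

out = llm("Explain what a mixture-of-experts language model is.", max_tokens=128)
print(out["choices"][0]["text"])
```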

Best Alternatives to Mixtral 11Bx2 MoE 19B

Best Alternatives | Context / RAM | Downloads | Likes
MixTAO 19B Pass | 32K / 38.1 GB | 3 | 2
Lorge 2x7B UAMM | 32K / 38.2 GB | 16 | 0
Multimerge 19B Pass | 32K / 38 GB | 10 | 0
Mistralmath 15B Pass | 32K / 38.5 GB | 11 | 0
TaoPassthrough 15B S | 32K / 38.4 GB | 5 | 0
Raccoon Small | 32K / 38.4 GB | 74 | 1
Truthful DPO MoE 19B | 4K / 38.4 GB | 173 | 11
SOLAR Math 2x10.7B | 4K / 38.4 GB | 174 | 20
SOLAR Math 2x10.7B V0.2 | 4K / 38.4 GB | 116 | 14
...oundary Solar Chat 2x10.7B MoE | 4K / 38 GB | 123 | 1

Original data from HuggingFace, OpenCompass and various public git repos.