Mixtral 7B 8expert by DiscoResearch


Tags: Autotrain compatible, Custom code, De, En, Endpoints compatible, Es, Fr, It, Mistral, Pytorch, Region:us, Sharded

Mixtral 7B 8expert Benchmarks

Benchmark scores are shown as percentages indicating how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

Mixtral 7B 8expert Parameters and Internals

Model Type: text generation
Additional Notes: This is a preliminary HuggingFace implementation of the newly released Mixtral model. Make sure to load it with `trust_remote_code=True` (a hedged loading sketch follows the specification list below).
Supported Languages: en, fr, it, es, de (all supported)
LLM Name: Mixtral 7B 8expert
Repository: https://huggingface.co/DiscoResearch/mixtral-7b-8expert
Model Size: 7b
Required VRAM: 93.6 GB
Updated: 2025-06-09
Maintainer: DiscoResearch
Model Type: mistral
Model Files (19 shards): 4.9 GB (1-of-19), 5.0 GB (2-of-19), 5.0 GB (3-of-19), 4.9 GB (4-of-19), 5.0 GB (5-of-19), 5.0 GB (6-of-19), 4.9 GB (7-of-19), 5.0 GB (8-of-19), 5.0 GB (9-of-19), 4.9 GB (10-of-19), 5.0 GB (11-of-19), 5.0 GB (12-of-19), 5.0 GB (13-of-19), 4.9 GB (14-of-19), 5.0 GB (15-of-19), 5.0 GB (16-of-19), 4.9 GB (17-of-19), 5.0 GB (18-of-19), 4.2 GB (19-of-19)
Supported Languages: en, fr, it, es, de
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.36.0.dev0
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32000
Torch Data Type: float16
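As a quick consistency check, the 19 shard sizes listed above account for the Required VRAM figure: 6 × 4.9 GB + 12 × 5.0 GB + 4.2 GB = 93.6 GB.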
Mixtral 7B 8expert (DiscoResearch/mixtral-7b-8expert)
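Since the Additional Notes call for `trust_remote_code=True`, below is a minimal loading sketch, assuming the standard Transformers `AutoTokenizer`/`AutoModelForCausalLM` API and the repository ID listed above. The dtype and device mapping follow the float16 and roughly 93.6 GB VRAM figures from the specification, `device_map="auto"` additionally assumes the `accelerate` package is installed, and the prompt is purely illustrative.

```python
# Minimal loading sketch (assumes transformers >= 4.36 and enough GPU memory
# for the ~93.6 GB of float16 weights listed above).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DiscoResearch/mixtral-7b-8expert"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,     # required: the repo ships custom modeling code
    torch_dtype=torch.float16,  # matches the checkpoint's Torch Data Type
    device_map="auto",          # spread the 19 weight shards across available GPUs
)

# Illustrative generation call; the prompt is a placeholder.
prompt = "Mixture-of-experts models route each token to"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```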

Best Alternatives to Mixtral 7B 8expert

| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Multimaster 7B V6 | 32K / 142.5 GB | 6212 | 1 |
| MultiverseBuddy 15B MoE | 32K / 25.8 GB | 9 | 0 |
| Mini Mixtral V0.2 | 32K / 25.8 GB | 15 | 4 |
| Lumina 2 | 32K / 37.1 GB | 47 | 0 |
| OpenMistral MoE | 32K / 48.3 GB | 1218 | 0 |
| Merged Model MoE | 32K / 53.3 GB | 19 | 1 |
| RogerWizard 12B MoE | 32K / 25.8 GB | 16 | 1 |
| Rawr | 32K / 93.5 GB | 1302 | 0 |
| StarlingMaths 12B MoE | 32K / 25.8 GB | 14 | 0 |
| WestLakeLaser 12B MoE | 32K / 25.8 GB | 51 | 0 |

Rank the Mixtral 7B 8expert Capabilities

Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference!

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124