Mini Mixtral V0.2 by NeuralNovel


Mini Mixtral V0.2 is an open-source mixture-of-experts language model by NeuralNovel. Features: 7B LLM, VRAM: 25.8 GB, Context: 32K, License: apache-2.0, Instruction-Based. Scores: HF Score 64, LLM Explorer Score 0.14, ARC 61.3, HellaSwag 84.1, MMLU 63.8, TruthfulQA 50.4, WinoGrande 78.9, GSM8K 45.6.
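For orientation, here is a minimal usage sketch with the Hugging Face transformers library, assuming a GPU setup with roughly 26 GB of free memory for the bfloat16 weights. The model ID comes from the repository listed below; the prompt and generation settings are illustrative.

```python
# Minimal sketch: load Mini Mixtral V0.2 with transformers and generate text.
# Assumes ~26 GB of free GPU memory for the bfloat16 weights (see VRAM above).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NeuralNovel/Mini-Mixtral-v0.2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the checkpoint's published dtype
    device_map="auto",           # spread layers across available devices
)

prompt = "Explain mixture-of-experts routing in two sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```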

Tags: Arxiv:2101.03961 · Autotrain compatible · Base model:merge:mistralai/mis... · Base model:merge:unsloth/mistr... · Base model:mistralai/mistral-7... · Base model:unsloth/mistral-7b-... · Endpoints compatible · Frankenmoe · Instruct · Lazymergekit · Merge · Mergekit · Mistralai/mistral-7b-instruct-... · Mixtral · MoE · Region:us · Safetensors · Sharded · Tensorflow · Unsloth/mistral-7b-v0.2

Mini Mixtral V0.2 Benchmarks

Mini Mixtral V0.2 (NeuralNovel/Mini-Mixtral-v0.2)

Mini Mixtral V0.2 Parameters and Internals

Model Type: Mixture of Experts (MoE)
Additional Notes: This model is a fusion of unsloth/mistral-7b-v0.2 and mistralai/Mistral-7B-Instruct-v0.2, created with LazyMergekit (a sketch of a typical merge config follows this table).
LLM Name: Mini Mixtral V0.2
Repository: https://huggingface.co/NeuralNovel/Mini-Mixtral-v0.2
Base Model(s): unsloth/mistral-7b-v0.2, mistralai/Mistral-7B-Instruct-v0.2
Model Size: 7b
Required VRAM: 25.8 GB
Updated: 2025-09-24
Maintainer: NeuralNovel
Model Type: mixtral
Instruction-Based: Yes
Model Files: 13 safetensors shards (1.9 GB for 1-of-13 and 13-of-13, 2.0 GB each for 2-of-13 through 12-of-13; 25.8 GB in total)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.39.1
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Torch Data Type: bfloat16
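The page states the merge was produced with LazyMergekit but does not reproduce the recipe. Purely as an illustration, here is a sketch of what a two-expert mergekit MoE config for these base models could look like; the gate_mode and positive_prompts values are assumed placeholders, not the maintainer's actual settings.

```python
# Hypothetical sketch of a mergekit MoE config for a 2x7B merge like this one.
# The actual LazyMergekit recipe is not shown on this page; gate_mode and all
# positive_prompts below are illustrative assumptions.
import yaml

moe_config = {
    "base_model": "mistralai/Mistral-7B-Instruct-v0.2",
    "dtype": "bfloat16",    # matches the published Torch Data Type
    "gate_mode": "hidden",  # route via hidden-state similarity to the prompts
    "experts": [
        {
            "source_model": "unsloth/mistral-7b-v0.2",
            "positive_prompts": ["creative writing", "storytelling"],
        },
        {
            "source_model": "mistralai/Mistral-7B-Instruct-v0.2",
            "positive_prompts": ["follow the instructions", "answer the question"],
        },
    ],
}

with open("mini_mixtral_moe.yaml", "w") as f:
    yaml.safe_dump(moe_config, f, sort_keys=False)

# The config would then be passed to mergekit's MoE entry point, e.g.:
#   mergekit-moe mini_mixtral_moe.yaml ./Mini-Mixtral-v0.2
```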

Best Alternatives to Mini Mixtral V0.2

Best Alternatives                  | Context / RAM  | Downloads | Likes
Multilingual Mistral               | 32K / 93.5 GB  | 724       | 2
OpenMistral MoE                    | 32K / 48.3 GB  | 1218      | 0
Magiq 3                            | 32K / 37.1 GB  | 20        | 3
FNCARL9000                         | 32K / 48.3 GB  | 21        | 0
CollAIborate4x7B                   | 32K / 48.7 GB  | 7         | 1
Bigstral 12B 32K 8xMoE             | 32K / 163.3 GB | 7         | 2
CollAIborate4x7B                   | 32K / 48.7 GB  | 1         | 1
OpenMistral MoE                    | 32K / 48.3 GB  | 26        | 0
...t V0.2 2x7B MoE 6.0bpw H6 EXL2  | 32K / 9.9 GB   | 14        | 1

