Mistral 22B V0.2 by Vezora


Autotrain compatible · Endpoints compatible · Mistral · Region: us · Safetensors · Sharded · Tensorflow
Model Card on HF 🤗: https://huggingface.co/Vezora/Mistral-22B-v0.2

Mistral 22B V0.2 Benchmarks

Mistral 22B V0.2 (Vezora/Mistral-22B-v0.2)

Mistral 22B V0.2 Parameters and Internals

Model Type 
Dense model (MoE-to-Dense conversion)
Use Cases 
Applications:
Coding, Math proficiency
Limitations:
This model is highly uncensored
Additional Notes 
The model requires the Guanaco prompt format for optimal results. It is still experimental; v0.3 is currently in training.
Training Details 
Data Volume:
8x more data than v0.1
Methodology:
Experimental MoE-to-Dense conversion
Context Length:
32000
Input Output 
Input Format:
Guanaco prompt format
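The card names the Guanaco prompt format but does not spell it out. A minimal sketch, assuming the common "### Human: … / ### Assistant:" convention used by Guanaco-tuned models (the helper function name is hypothetical, not from the card):

```python
def guanaco_prompt(user_message: str, system: str = "") -> str:
    """Build a Guanaco-style prompt string (assumed convention, not
    confirmed by the model card)."""
    parts = []
    if system:
        parts.append(f"### System: {system}")
    parts.append(f"### Human: {user_message}")
    parts.append("### Assistant:")  # model completes after this marker
    return "\n".join(parts)

print(guanaco_prompt("Write a Python function that reverses a string."))
```

The trailing `### Assistant:` line is left open so the model's completion begins as the assistant turn.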
Release Notes 
Version:
v0.2
Date:
April 13, 2024
Notes:
Handcrafted experimental model, trained on more data, capabilities improved
LLM Name: Mistral 22B V0.2
Repository 🤗: https://huggingface.co/Vezora/Mistral-22B-v0.2
Model Size: 22B
Required VRAM: 44.7 GB
Updated: 2025-09-14
Maintainer: Vezora
Model Type: mistral
Model Files (9 safetensors shards): 4.9 GB (1-of-9), 5.0 GB (2-of-9), 5.0 GB (3-of-9), 4.9 GB (4-of-9), 5.0 GB (5-of-9), 5.0 GB (6-of-9), 4.9 GB (7-of-9), 5.0 GB (8-of-9), 5.0 GB (9-of-9)
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 65536
Model Max Length: 65536
Transformers Version: 4.40.0.dev0
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32000
Torch Data Type: bfloat16
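The 44.7 GB VRAM figure is consistent with the nine shard sizes listed above and with bfloat16 storage (2 bytes per parameter). A quick arithmetic sanity check, using only numbers from the card:

```python
# Shard sizes (GB) as listed on the card, files 1-of-9 through 9-of-9.
shards_gb = [4.9, 5.0, 5.0, 4.9, 5.0, 5.0, 4.9, 5.0, 5.0]

# Total weight storage matches the "Required VRAM" entry.
total_gb = round(sum(shards_gb), 1)
print(total_gb)  # 44.7

# bfloat16 uses 2 bytes per parameter, so the weights imply roughly
# 22.35 billion parameters, in line with the "22B" model size.
params_billion = total_gb / 2
print(params_billion)  # 22.35
```

Note this counts weights only; actual inference needs additional headroom for the KV cache and activations, especially at long context lengths.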

Quantized Models of the Mistral 22B V0.2

Model                      Likes   Downloads   VRAM
Mistral 22B V0.2 GGUF      0       38          8 GB
Mistral 22B V0.2 AWQ       2       6           12 GB

Best Alternatives to Mistral 22B V0.2

Best Alternatives                      Context / RAM      Downloads   Likes
MS Schisandra 22B V0.2                 128K / 44.7 GB     5           9
...ntheon RP Pure 1.6.2 22B Small      128K / 44.7 GB     10          32
MS Meadowlark 22B                      128K / 44.7 GB     79          15
...rker The Final Abomination 22B      128K / 44.7 GB     7           6
...er The Final Transgression 22B      128K / 44.7 GB     10          3
...Darker The Final Directive 22B      128K / 44.7 GB     7           0
The Omega Directive M 22B V1.0         128K / 44.7 GB     9           2
Retrograde Omega M 22B V1.0            128K / 44.7 GB     8           0
Beeper King 22B                        128K / 44.7 GB     8           7
... V4x1.6.2RP Cydonia VXXX 22B 8      128K / 44.7 GB     7           5
Note: a green score (e.g., "73.2") indicates the model outperforms Vezora/Mistral-22B-v0.2.



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124