Mixtral 8x22B V0.1 Bnb 4bit Smashed by PrunaAI


Mixtral 8x22B V0.1 Bnb 4bit Smashed is an open-source language model published by PrunaAI: a bitsandbytes 4-bit quantized ("smashed") build of Mixtral 8x22B v0.1. Features: 72.7B parameters, required VRAM 80.2 GB, 64K context, mixture of experts (MoE), quantized, LLM Explorer Score: 0.13.

Tags: 4-bit, 4bit, Bitsandbytes, Mixtral, Moe, Pruna-ai, Quantized, Region:us, Safetensors, Sharded, Tensorflow
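
Because the checkpoint is stored as pre-quantized bitsandbytes 4-bit safetensors, it can be loaded directly through the Hugging Face transformers stack. The following is a minimal sketch, assuming a recent transformers release with bitsandbytes and accelerate installed and enough GPU memory for the roughly 80 GB of shards; the prompt text is illustrative only:

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "PrunaAI/Mixtral-8x22B-v0.1-bnb-4bit-smashed"

    tokenizer = AutoTokenizer.from_pretrained(repo)
    # The shards already contain bitsandbytes 4-bit weights, so no extra
    # quantization_config is passed here; device_map="auto" spreads the
    # ~80 GB of shards across the available GPUs.
    model = AutoModelForCausalLM.from_pretrained(
        repo,
        device_map="auto",
        torch_dtype=torch.float16,
    )

    prompt = "Mixture-of-experts models route each token to"  # illustrative prompt
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))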

Mixtral 8x22B V0.1 Bnb 4bit Smashed Benchmarks

nn.n% indicates how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

Mixtral 8x22B V0.1 Bnb 4bit Smashed Parameters and Internals

Model Type: Causal Language Model

Use Cases
Areas: Research, Commercial Applications

Additional Notes
PrunaAI intends to make AI models cheaper, smaller, faster, and greener.

Training Details
Data Sources: WikiText
Methodology: Compression with llm-int8
Hardware Used: NVIDIA A100-PCIE-40GB

Input Output
Performance Tips: Efficiency gains can differ across hardware and settings, so it is recommended to test under the conditions of your specific use case (a quick way to do this is sketched after this list).

Release Notes
Version: 1.0
Notes: The smashed model uses the safetensors format and includes efficiency improvements.
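
Following the performance tip above, a rough sketch of how throughput and peak VRAM could be checked on your own hardware. It assumes the model and tokenizer from the loading example earlier are already in memory, and it only reports memory for the default CUDA device:

    import time
    import torch

    def benchmark(prompt, max_new_tokens=128):
        inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
        torch.cuda.reset_peak_memory_stats()
        torch.cuda.synchronize()
        start = time.perf_counter()
        out = model.generate(**inputs, max_new_tokens=max_new_tokens)
        torch.cuda.synchronize()
        elapsed = time.perf_counter() - start
        # Count only the newly generated tokens when computing throughput.
        new_tokens = out.shape[-1] - inputs["input_ids"].shape[-1]
        peak_gb = torch.cuda.max_memory_allocated() / 1e9
        print(f"{new_tokens / elapsed:.1f} tok/s, peak VRAM {peak_gb:.1f} GB on device 0")

    benchmark("Summarize the advantages of mixture-of-experts models.")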
LLM Name: Mixtral 8x22B V0.1 Bnb 4bit Smashed
Repository: 🤗 https://huggingface.co/PrunaAI/Mixtral-8x22B-v0.1-bnb-4bit-smashed
Model Size: 72.7b
Required VRAM: 80.2 GB
Updated: 2025-09-28
Maintainer: PrunaAI
Model Type: mixtral
Model Files: 17 safetensors shards (5.0 GB each for shards 1-15, 4.8 GB for shard 16, 0.4 GB for shard 17; 80.2 GB total)
Quantization Type: 4bit
Model Architecture: MixtralForCausalLM
Context Length: 65536
Model Max Length: 65536
Transformers Version: 4.40.0.dev0
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32000
Torch Data Type: float16
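
The architecture fields in the table above (context length, vocabulary size, data type, expert count) can be confirmed from the repository's config alone, without downloading the weight shards. A small sketch using the standard AutoConfig API; the expert-count field name comes from the usual MixtralConfig and the commented values reflect the table above:

    from transformers import AutoConfig

    cfg = AutoConfig.from_pretrained("PrunaAI/Mixtral-8x22B-v0.1-bnb-4bit-smashed")
    print(cfg.model_type)               # "mixtral"
    print(cfg.max_position_embeddings)  # 65536 (Context Length / Model Max Length)
    print(cfg.vocab_size)               # 32000
    print(cfg.torch_dtype)              # float16
    print(cfg.num_local_experts)        # experts per MoE layer (8 for Mixtral)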



Original data from HuggingFace, OpenCompass and various public git repos.