SauerkrautLM Mixtral 8x7B Instruct GGUF by TheBloke


SauerkrautLM Mixtral 8x7B Instruct GGUF is an open-source language model created by VAGO solutions and quantized to GGUF by TheBloke. Features: LLM, VRAM: 15.6 GB, License: apache-2.0, MoE, Quantized, Instruction-Based, LLM Explorer Score: 0.11.

Tags: Augmentation, Base model:quantized:vagosolut..., Base model:vagosolutions/sauer..., Conversational, Dataset:argilla/distilabel-mat..., De, Dpo, En, Es, Finetuned, Fr, German, Gguf, Instruct, It, Mistral, Mixtral, Moe, Quantized, Region:us

SauerkrautLM Mixtral 8x7B Instruct GGUF Benchmarks

Benchmark scores are reported as percentages relative to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").

SauerkrautLM Mixtral 8x7B Instruct GGUF Parameters and Internals

Model Type 
mixtral
Use Cases 
Primary Use Cases:
text generation
Supported Languages 
English, German, French, Italian, Spanish (proficiency levels not reported)
Training Details 
Data Sources:
argilla/distilabel-math-preference-dpo
Methodology:
Mixture of Experts (MoE) Model
Model Architecture:
based on mistralai/Mixtral-8x7B-Instruct-v0.1
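
The Mixture-of-Experts design mentioned above can be illustrated with a minimal top-2 routing sketch. This is a toy in the spirit of Mixtral's 8-expert, 2-active-per-token layout, not the actual implementation; all names and dimensions here are made up for illustration:

```python
import numpy as np

# Toy Mixture-of-Experts layer: a gate scores 8 "experts" per token and
# only the top-2 are evaluated, so compute scales with the number of
# active experts rather than total parameters. Illustrative only.
rng = np.random.default_rng(0)
NUM_EXPERTS, TOP_K, DIM = 8, 2, 16

experts = [rng.standard_normal((DIM, DIM)) * 0.1 for _ in range(NUM_EXPERTS)]
gate_w = rng.standard_normal((DIM, NUM_EXPERTS)) * 0.1

def moe_forward(x):
    logits = x @ gate_w                        # gate score for each expert
    top = np.argsort(logits)[-TOP_K:]          # indices of the top-2 experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                   # softmax over selected experts only
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(DIM)
out = moe_forward(token)
print(out.shape)  # (16,)
```

Only two of the eight expert matrices are multiplied per token, which is why an MoE model can have far more parameters than its per-token compute cost suggests.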
Input Output 
Input Format:
[INST] {prompt} [/INST]
Accepted Modalities:
text
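
The `[INST]` template above can be applied with a small helper (the function name is mine; the template itself is the one stated on this page):

```python
def format_prompt(prompt: str) -> str:
    """Wrap a user message in the instruction template this model expects:
    [INST] {prompt} [/INST]"""
    return f"[INST] {prompt} [/INST]"

# German example, since the model is tuned for en/de/fr/it/es:
text = format_prompt("Wie heißt die Hauptstadt von Deutschland?")
print(text)  # [INST] Wie heißt die Hauptstadt von Deutschland? [/INST]
```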
LLM Name: SauerkrautLM Mixtral 8x7B Instruct GGUF
Repository: 🤗 https://huggingface.co/TheBloke/SauerkrautLM-Mixtral-8x7B-Instruct-GGUF
Model Name: SauerkrautLM Mixtral 8X7B Instruct
Model Creator: VAGO solutions
Base Model(s): VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct
Required VRAM: 15.6 GB
Updated: 2026-04-18
Maintainer: TheBloke
Model Type: mixtral
Instruction-Based: Yes
Model Files: 15.6 GB, 20.4 GB, 26.4 GB, 26.4 GB, 32.2 GB, 32.2 GB, 38.4 GB, 49.6 GB
Supported Languages: en, de, fr, it, es
GGUF Quantization: Yes
Quantization Type: gguf
Model Architecture: AutoModel
License: apache-2.0
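
The eight file sizes listed above correspond to different quantization levels of the same model. A minimal sketch of picking the largest (highest-quality) file that fits a memory budget follows; the quant-name mapping is my assumption based on TheBloke's usual GGUF release layout and is not stated on this page:

```python
# Assumed mapping of the listed file sizes (GB) to typical GGUF quant
# levels in TheBloke's releases; the page itself only lists the sizes.
QUANT_FILES_GB = {
    "Q2_K": 15.6, "Q3_K_M": 20.4, "Q4_0": 26.4, "Q4_K_M": 26.4,
    "Q5_0": 32.2, "Q5_K_M": 32.2, "Q6_K": 38.4, "Q8_0": 49.6,
}

def pick_quant(budget_gb: float):
    """Return the largest quant file that fits the budget, or None."""
    fitting = {q: s for q, s in QUANT_FILES_GB.items() if s <= budget_gb}
    return max(fitting, key=fitting.get) if fitting else None

print(pick_quant(24.0))  # Q3_K_M
print(pick_quant(10.0))  # None
```

Larger files preserve more of the original weights' precision; the 15.6 GB "Required VRAM" figure on this page matches the smallest quant, so higher-quality quants need proportionally more memory.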

Best Alternatives to SauerkrautLM Mixtral 8x7B Instruct GGUF

Best Alternatives                          Context / RAM    Downloads   Likes
...ixtral 8x7B Instruct V0.1 GGUF          0K / 15.6 GB     20684       655
Dolphin 2.5 Mixtral 8x7b GGUF              0K / 15.6 GB     15978       308
Dolphin 2.7 Mixtral 8x7b GGUF              0K / 15.6 GB     9237        147
Phi 3 Mini 4K Instruct GGUF                0K / 1.4 GB      1554        17
Spydaz Web AI                              0K / 4.4 GB      154         3
Autocomplete Model                         0K / 4.1 GB      34          0
Phi 3 Mini 4K Instruct                     0K / 2.3 GB      8           0
Dre Phi Gguf                               0K / 2.3 GB      6           0
...3 Mini 4k OLScience 4bitQ Gguf          0K / 2.3 GB      8           0
...i 3 Mini 4K Instruct V0.3 GGUF          0K / 1.4 GB      227         5
Note: green Score (e.g. "73.2") means that the model is better than TheBloke/SauerkrautLM-Mixtral-8x7B-Instruct-GGUF.


Original data from HuggingFace, OpenCompass and various public git repos.