Openbuddy Mixtral 8x7b V15.2 GPTQ by TheBloke


Openbuddy Mixtral 8x7b V15.2 GPTQ is an open-source language model created by OpenBuddy and quantized by TheBloke. Features: 6.1b LLM, VRAM: 23.9 GB, Context: 32K, License: apache-2.0, MoE, Quantized, LLM Explorer Score: 0.11.

Tags: 4-bit, autotrain-compatible, GPTQ, Mixtral, MoE, quantized, region:us, safetensors. Base model: OpenBuddy/openbuddy-mixtral-8x7b-v15.2 (quantized).

Openbuddy Mixtral 8x7b V15.2 GPTQ Benchmarks

Benchmark scores (shown as nn.n%) indicate how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Openbuddy Mixtral 8x7b V15.2 GPTQ (TheBloke/openbuddy-mixtral-8x7b-v15.2-GPTQ)

Openbuddy Mixtral 8x7b V15.2 GPTQ Parameters and Internals

LLM Name: Openbuddy Mixtral 8x7b V15.2 GPTQ
Repository: https://huggingface.co/TheBloke/openbuddy-mixtral-8x7b-v15.2-GPTQ
Model Name: Openbuddy Mixtral 8X7B V15.2
Model Creator: OpenBuddy
Base Model(s): OpenBuddy/openbuddy-mixtral-8x7b-v15.2
Model Size: 6.1b
Required VRAM: 23.9 GB
Updated: 2025-11-01
Maintainer: TheBloke
Model Type: mixtral
Model Files: 23.9 GB
GPTQ Quantization: Yes
Quantization Type: gptq
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.37.0.dev0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 36608
Torch Data Type: bfloat16
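
As a rough illustration of how these settings come together, below is a minimal Python sketch of loading the repository above and generating text. It assumes a recent transformers release with GPTQ support (the optimum and auto-gptq packages installed) and a GPU with roughly 24 GB of VRAM, matching the requirement listed above; the prompt is a placeholder rather than OpenBuddy's own chat format.

# Minimal sketch, not an official example: load the GPTQ checkpoint listed
# above and generate text. Assumes transformers + optimum + auto-gptq are
# installed and a GPU with ~24 GB of VRAM is available.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "TheBloke/openbuddy-mixtral-8x7b-v15.2-GPTQ"

tokenizer = AutoTokenizer.from_pretrained(repo_id)  # LlamaTokenizer, per the table above
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    device_map="auto",  # place the 4-bit GPTQ weights on the available GPU(s)
)

prompt = "Explain what a mixture-of-experts model is in one paragraph."  # placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))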

Best Alternatives to Openbuddy Mixtral 8x7b V15.2 GPTQ

Best Alternatives                  Context / RAM    Downloads  Likes
Mixtral 8x7B V0.1 GPTQ             32K / 23.8 GB    1626       127
Dolphin 2.7 Mixtral 8x7b GPTQ      32K / 23.8 GB    78         19
Open Gpt4 8x7B V0.2 GPTQ           32K / 23.8 GB    1          6
Sensualize Mixtral GPTQ            32K / 23.8 GB    16         5
SauerkrautLM Mixtral 8x7B GPTQ     32K / 23.8 GB    10         2
Fennec Mixtral 8x7B GPTQ           32K / 23.8 GB    11         2
Mixtral SlimOrca 8x7B GPTQ         32K / 23.8 GB    14         11
Chupacabra 8x7B MoE GPTQ           32K / 23.8 GB    9          1
Mixtral 8x7B V0.1 Gptq 4bit        32K / 24.1 GB    9          0

Rank the Openbuddy Mixtral 8x7b V15.2 GPTQ Capabilities

Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs.

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.