PiVoT MoE AWQ by TheBloke


Tags: 4-bit, autotrain compatible, AWQ, base model: maywell/PiVoT-MoE, base model (quantized): maywell/p..., conversational, Mixtral, MoE, quantized, region: us, Safetensors, sharded, TensorFlow
Model Card on HF 🤗: https://huggingface.co/TheBloke/PiVoT-MoE-AWQ

PiVoT MoE AWQ Parameters and Internals

Model Type: mixtral

Input/Output
Input Format: {system_message} ### Instruction: {prompt} ### Response:
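
The card shows the template on a single line; a small helper makes the intended structure explicit. This is a minimal sketch: the newline placement around the markers is an assumption (Alpaca-style), since the card does not specify it, and `build_prompt` is a hypothetical name.

```python
# Minimal sketch: assemble the card's prompt template.
# ASSUMPTION: Alpaca-style newlines around the markers; the card only
# shows the template on one line. `build_prompt` is a hypothetical helper.
def build_prompt(system_message: str, prompt: str) -> str:
    return (
        f"{system_message}\n\n"
        f"### Instruction:\n{prompt}\n\n"
        f"### Response:\n"
    )

print(build_prompt("You are a helpful assistant.",
                   "Explain AWQ quantization in one paragraph."))
```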
LLM Name: PiVoT MoE AWQ
Repository 🤗: https://huggingface.co/TheBloke/PiVoT-MoE-AWQ
Model Name: PiVoT MoE
Model Creator: Jeonghwan Park
Base Model(s): PiVoT MoE (maywell/PiVoT-MoE)
Model Size: 5.1B
Required VRAM: 19.2 GB
Updated: 2025-09-23
Maintainer: TheBloke
Model Type: mixtral
Model Files: 10.0 GB (shard 1 of 2), 9.2 GB (shard 2 of 2)
AWQ Quantization: Yes
Quantization Type: awq
Model Architecture: MixtralForCausalLM
License: cc-by-nc-4.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.37.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Torch Data Type: float16
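
Given the specs above (AWQ 4-bit, MixtralForCausalLM, float16, ~19.2 GB VRAM), a load-and-generate sketch with Transformers might look like the following. It assumes a recent Transformers release with AWQ support and the autoawq package installed; the example prompt and generation settings are illustrative, not taken from the model card.

```python
# Minimal sketch: load the AWQ checkpoint and generate with the card's template.
# ASSUMPTIONS: recent transformers with AWQ support (pip install autoawq),
# and enough GPU memory (~19.2 GB VRAM per the card).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/PiVoT-MoE-AWQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the card's Torch Data Type
    device_map="auto",
)

prompt = (
    "You are a helpful assistant.\n\n"
    "### Instruction:\nSummarize mixture-of-experts in one sentence.\n\n"
    "### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```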

Best Alternatives to PiVoT MoE AWQ

Best Alternatives | Context / RAM | Downloads | Likes
SOLARC MoE 10.7Bx4 AWQ | 4K / 19.2 GB | 8 | 2
Minerva MoE 2x3B | 16K / 10.2 GB | 1960 | 0

Note: a green score (e.g. "73.2") means the model is better than TheBloke/PiVoT-MoE-AWQ.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124