SOLARC MoE 10.7Bx4 GPTQ by TheBloke


Tags: 4-bit · Autotrain compatible · Base model (quantized): DopeorNope/SOLARC-MOE-10.7Bx4 · Conversational · GPTQ · ko · Mixtral · MoE · Quantized · Region: US · Safetensors

SOLARC MoE 10.7Bx4 GPTQ Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
SOLARC MoE 10.7Bx4 GPTQ (TheBloke/SOLARC-MOE-10.7Bx4-GPTQ)

SOLARC MoE 10.7Bx4 GPTQ Parameters and Internals

Model Type: mixtral, text-generation
Additional Notes: This model is built with the Mixture of Experts (MoE) approach.
Supported Languages: ko (native or proficient)
Input/Output:
  Accepted Modalities: text
  Output Format: text
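
Because the model accepts and emits plain text, it can be driven through the standard transformers text-generation pipeline. The following is a minimal sketch, not official usage: it assumes transformers with the optimum and auto-gptq backends installed and a CUDA GPU with roughly 19 GB of free VRAM; the "### User: / ### Assistant:" template follows the upstream SOLAR convention, and the Korean prompt is purely illustrative.

```python
# Minimal generation sketch for the GPTQ build of this model.
# Assumes: pip install transformers optimum auto-gptq, plus a GPU with ~19 GB free VRAM.
from transformers import pipeline

generate = pipeline(
    "text-generation",
    model="TheBloke/SOLARC-MOE-10.7Bx4-GPTQ",
    device_map="auto",  # let accelerate place the 4-bit GPTQ weights on the GPU(s)
)

# Illustrative Korean prompt ("Hello, please introduce yourself"); the
# "### User:/### Assistant:" template is an assumption based on SOLAR-style models.
prompt = "### User:\n안녕하세요. 자기소개를 해 주세요.\n\n### Assistant:\n"
result = generate(prompt, max_new_tokens=128, do_sample=True, temperature=0.7)
print(result[0]["generated_text"])
```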
LLM Name: SOLARC MoE 10.7Bx4 GPTQ
Repository: https://huggingface.co/TheBloke/SOLARC-MOE-10.7Bx4-GPTQ
Model Name: Solarc MOE 10.7Bx4
Model Creator: Seungyoo Lee
Base Model(s): DopeorNope/SOLARC-MOE-10.7Bx4
Model Size: 4.8b
Required VRAM: 18.5 GB
Updated: 2025-09-23
Maintainer: TheBloke
Model Type: mixtral
Model Files: 18.5 GB
Supported Languages: ko
GPTQ Quantization: Yes
Quantization Type: gptq
Model Architecture: MixtralForCausalLM
License: cc-by-nc-sa-4.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.37.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Torch Data Type: float16
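
The architecture and tokenizer details above can be cross-checked against the repository's configuration files without downloading the 18.5 GB of weights. A small sketch, assuming Hugging Face Hub access and a recent transformers release (≥ 4.37, per the version listed above):

```python
# Verify the catalog entries against the repo's config.json and tokenizer files.
from transformers import AutoConfig, AutoTokenizer

repo = "TheBloke/SOLARC-MOE-10.7Bx4-GPTQ"

config = AutoConfig.from_pretrained(repo)
print(config.architectures)            # ['MixtralForCausalLM']
print(config.max_position_embeddings)  # 4096 -- the context length listed above
print(config.vocab_size)               # 32000
print(config.num_local_experts)        # 4 experts, matching "10.7Bx4"
print(config.quantization_config)      # GPTQ settings (4-bit, group size, etc.)

tokenizer = AutoTokenizer.from_pretrained(repo)
print(type(tokenizer).__name__)        # LlamaTokenizer (or its fast variant)
print(tokenizer.pad_token)             # '<s>'
```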

Best Alternatives to SOLARC MoE 10.7Bx4 GPTQ

Best Alternatives | Context / RAM | Downloads | Likes
PiVoT MoE GPTQ | 32K / 18.5 GB | 7 | 1
Lumosia MoE 4x10.7 GPTQ | 4K / 18.5 GB | 8 | 4

Note: a green score (e.g., "73.2") means the model performs better than TheBloke/SOLARC-MOE-10.7Bx4-GPTQ.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124