SOLARC MoE 10.7Bx4 GGUF by TheBloke


Tags: base model: DopeorNope/SOLARC-MOE-10.7Bx4 (quantized) · Conversational · GGUF · ko · Mixtral · MoE · Quantized · region: us

SOLARC MoE 10.7Bx4 GGUF Benchmarks

[Benchmark chart: SOLARC MoE 10.7Bx4 GGUF (TheBloke/SOLARC-MOE-10.7Bx4-GGUF), scored as a percentage relative to the reference models Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").]

SOLARC MoE 10.7Bx4 GGUF Parameters and Internals

Model Type: mixtral, text-generation
Additional Notes: The model uses a Mixture of Experts (MoE) approach built on the SOLAR architecture (a routing sketch follows this list).
Supported Languages: ko (full)
Input/Output:
  Input Format: text
  Accepted Modalities: text
  Output Format: text
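
To make the MoE note concrete, here is a minimal, illustrative PyTorch sketch of Mixtral-style top-2 expert routing, the general mechanism used to combine several SOLAR 10.7B experts in models like this one. This is a sketch of the technique, not the model's actual code; all names (ToyMoELayer, dim, hidden) are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoELayer(nn.Module):
    """Illustrative Mixtral-style sparse MoE feed-forward layer (top-2 routing)."""

    def __init__(self, dim: int, hidden: int, n_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(dim, n_experts, bias=False)  # router: scores each expert per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden), nn.SiLU(), nn.Linear(hidden, dim))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:   # x: (n_tokens, dim)
        scores = self.gate(x)                              # (n_tokens, n_experts)
        weights, chosen = scores.topk(self.top_k, dim=-1)  # keep the top-2 experts per token
        weights = F.softmax(weights, dim=-1)               # renormalize over the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = chosen[:, slot] == e                # tokens whose slot-th pick is expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

tokens = torch.randn(8, 64)                 # 8 tokens, model dim 64
print(ToyMoELayer(64, 256)(tokens).shape)   # torch.Size([8, 64])
```

Only the chosen top-k experts run per token, which is why a 4-expert MoE of 10.7B models has far lower inference cost than its total parameter count suggests.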
LLM Name: SOLARC MoE 10.7Bx4 GGUF
Repository 🤗: https://huggingface.co/TheBloke/SOLARC-MOE-10.7Bx4-GGUF
Model Name: SOLARC MoE 10.7Bx4
Model Creator: Seungyoo Lee
Base Model(s): DopeorNope/SOLARC-MOE-10.7Bx4
Required VRAM: 12 GB
Updated: 2025-09-23
Maintainer: TheBloke
Model Type: mixtral
Model Files: 12.0 GB, 15.7 GB, 20.3 GB, 20.4 GB, 24.8 GB, 24.9 GB, 29.6 GB, 38.4 GB
Supported Languages: ko
GGUF Quantization: Yes
Quantization Type: gguf
Model Architecture: AutoModel
License: cc-by-nc-sa-4.0
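
Because the files are GGUF, the model runs under llama.cpp and its bindings. Below is a minimal sketch using llama-cpp-python; the quant filename glob and the "### User: / ### Assistant:" prompt template are assumptions based on TheBloke's usual GGUF packaging, so verify both against the repository's file list and model card before running.

```python
# pip install llama-cpp-python huggingface_hub
from llama_cpp import Llama

# Download one quant from the repo and load it. The filename glob is an
# assumption; pick an actual .gguf file from the repository's file list.
llm = Llama.from_pretrained(
    repo_id="TheBloke/SOLARC-MOE-10.7Bx4-GGUF",
    filename="*Q4_K_M.gguf",   # mid-size variant; smaller quants fit in less RAM/VRAM
    n_ctx=4096,                # context window
    n_gpu_layers=-1,           # offload all layers to GPU if one is available
)

# Prompt template assumed from TheBloke's model card ("### User:/### Assistant:").
prompt = "### User:\n안녕하세요! 자기소개를 해주세요.\n\n### Assistant:\n"
out = llm(prompt, max_tokens=256, stop=["### User:"])
print(out["choices"][0]["text"])
```

As a rough rule, a GGUF needs about its file size in RAM/VRAM plus headroom for the KV cache, which presumably is why the smallest 12.0 GB file corresponds to the listed 12 GB VRAM requirement.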

Best Alternatives to SOLARC MoE 10.7Bx4 GGUF

| Best Alternatives                 | Context / RAM | Downloads | Likes |
|-----------------------------------|---------------|-----------|-------|
| ComicBot V.2 Gguf                 | 32K / 5 GB    | 39        | 0     |
| Qwen3 Medical GRPO GGUF           | 0K / 1.7 GB   | 1038      | 2     |
| Gemma2 WizardLM                   | 0K / 5.2 GB   | 10        | 0     |
| ...ixtral 8x7B Instruct V0.1 GGUF | 0K / 15.6 GB  | 25473     | 639   |
| Phi 2 GGUF                        | 0K / 1.2 GB   | 109438    | 228   |
| Marco O1 GGUF                     | 0K / 3 GB     | 228       | 6     |
| Dolphin 2.5 Mixtral 8x7b GGUF     | 0K / 15.6 GB  | 11283     | 303   |
| Mixtral 8x7B V0.1 GGUF            | 0K / 15.6 GB  | 5510      | 430   |
| Dolphin 2.7 Mixtral 8x7b GGUF     | 0K / 15.6 GB  | 9237      | 147   |
| GOAT Llama3.1 V0.1                | 0K / 0.2 GB   | 2         | 3     |
Note: a green score (e.g. "73.2") indicates that the model outperforms TheBloke/SOLARC-MOE-10.7Bx4-GGUF.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124