SOLARC MoE 10.7Bx4 by DopeorNope


Tags: Autotrain compatible, Conversational, Endpoints compatible, ko, Merge, Mixtral, MoE, region:us, Safetensors, Sharded, TensorFlow


SOLARC MoE 10.7Bx4 Parameters and Internals

Model Type: text generation
Supported Languages: Korean (fluent), English (proficiency unknown)
Training Details
Methodology: Mixture of Experts (MoE) built on the SOLAR architecture.
Model Architecture: auto-regressive language model based on the SOLAR architecture.
Input / Output
Input Format: text only
Accepted Modalities: text
Output Format: text only
LLM Name: SOLARC MoE 10.7Bx4
Repository: https://huggingface.co/DopeorNope/SOLARC-MOE-10.7Bx4
Model Size: 36.1B
Required VRAM: 144.7 GB
Updated: 2025-09-23
Maintainer: DopeorNope
Model Type: mixtral
Model Files (30 sharded safetensors): 4.8 GB: 1-of-30, 4.8 GB: 2-of-30, 4.9 GB: 3-of-30, 4.8 GB: 4-of-30, 4.9 GB: 5-of-30, 4.8 GB: 6-of-30, 4.8 GB: 7-of-30, 4.9 GB: 8-of-30, 4.8 GB: 9-of-30, 5.0 GB: 10-of-30, 4.9 GB: 11-of-30, 4.8 GB: 12-of-30, 4.9 GB: 13-of-30, 4.8 GB: 14-of-30, 4.8 GB: 15-of-30, 4.9 GB: 16-of-30, 4.8 GB: 17-of-30, 5.0 GB: 18-of-30, 4.9 GB: 19-of-30, 4.8 GB: 20-of-30, 4.9 GB: 21-of-30, 4.8 GB: 22-of-30, 4.8 GB: 23-of-30, 4.9 GB: 24-of-30, 4.8 GB: 25-of-30, 5.0 GB: 26-of-30, 4.9 GB: 27-of-30, 4.8 GB: 28-of-30, 4.9 GB: 29-of-30, 3.8 GB: 30-of-30
Supported Languages: ko
Model Architecture: MixtralForCausalLM
License: cc-by-nc-sa-4.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.36.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Torch Data Type: float32
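Given the fields above (MixtralForCausalLM architecture, LlamaTokenizer, 4096-token context, float32 weights totaling roughly 144.7 GB), a minimal Transformers loading sketch might look like the following. This is an illustrative example rather than an official snippet from the model card; the float16 downcast, the device_map="auto" setting (which requires the accelerate package), and the prompt text are assumptions chosen to keep the example small.

# Illustrative sketch for loading DopeorNope/SOLARC-MOE-10.7Bx4 with Hugging Face Transformers.
# Assumes transformers >= 4.36 and accelerate are installed and enough GPU/CPU memory is available.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DopeorNope/SOLARC-MOE-10.7Bx4"

tokenizer = AutoTokenizer.from_pretrained(model_id)          # LlamaTokenizer, 32000-token vocabulary
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # weights are stored in float32; halving precision roughly halves memory
    device_map="auto",           # spread the 30 shards across available GPUs, offloading to CPU if needed
)

prompt = "대한민국의 수도는 어디인가요?"   # "What is the capital of South Korea?" -- the model is tuned primarily for Korean
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)       # stay well under the 4096-token context limit
print(tokenizer.decode(outputs[0], skip_special_tokens=True))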

Quantized Models of SOLARC MoE 10.7Bx4

Model | Likes | Downloads | VRAM
SOLARC MoE 10.7Bx4 GGUF | 0 | 10 | 15 GB
SOLARC MoE 10.7Bx4 GGUF | 19 | 225 | 12 GB
SOLARC MoE 10.7Bx4 GPTQ | 4 | 7 | 18 GB
SOLARC MoE 10.7Bx4 AWQ | 2 | 8 | 19 GB
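The GGUF rows are quantizations intended for llama.cpp-style runtimes. As a rough usage sketch (not taken from any of the listed repositories), the llama-cpp-python bindings can load such a file; the local file name below is a placeholder, and the context size mirrors the model's stated 4096-token limit.

# Illustrative sketch for running a GGUF quantization with llama-cpp-python.
# The file name is a placeholder; download an actual GGUF file from a quantized repository first.
from llama_cpp import Llama

llm = Llama(
    model_path="./solarc-moe-10.7bx4.Q4_K_M.gguf",  # placeholder path, not a file name listed above
    n_ctx=4096,                                     # matches the model's 4096-token context length
)

result = llm("대한민국의 수도는 어디인가요?", max_tokens=128)  # "What is the capital of South Korea?"
print(result["choices"][0]["text"])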

Best Alternatives to SOLARC MoE 10.7Bx4

Best Alternatives | Context / RAM | Downloads | Likes
Umbra V3 MoE 4x11b 2ex | 32K / 72.3 GB | 286 | 4
PiVoT MoE | 32K / 72.3 GB | 1790 | 8
Umbra V3 MoE 4x11b 2ex | 32K / 72.3 GB | 5 | 4
Umbra V3 MoE 4x11b | 32K / 72.3 GB | 5 | 5
Umbra V2.1 MoE 4x10.7 | 32K / 72.3 GB | 6 | 6
Mixolar 4x7b | 4K / 72.3 GB | 9780 | 3
Smartsolmix 4x10.7B V1 | 4K / 72.3 GB | 1858 | 0
Orca SOLAR 4x10.7B | 4K / 72.3 GB | 1738 | 0
MetaModel MoE | 4K / 72.3 GB | 1914 | 0
Frankenstein MoE En 10.7Bx4 | 4K / 72.3 GB | 1915 | 0
Note: on the source page, a green score (e.g. "73.2") indicates that the listed alternative performs better than DopeorNope/SOLARC-MOE-10.7Bx4.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124