Field | Value
---|---
LLM Name | SOLARC MoE 10.7Bx4
Repository 🤗 | https://huggingface.co/DopeorNope/SOLARC-MOE-10.7Bx4
Model Size | 36.1B
Required VRAM | 144.7 GB
Updated | 2025-09-23
Maintainer | DopeorNope
Model Type | mixtral
Model Files |
Supported Languages | ko
Model Architecture | MixtralForCausalLM
License | cc-by-nc-sa-4.0
Context Length | 4096
Model Max Length | 4096
Transformers Version | 4.36.0.dev0
Tokenizer Class | LlamaTokenizer
Padding Token | <s>
Vocabulary Size | 32000
Torch Data Type | float32
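Given the fields above (MixtralForCausalLM architecture, LlamaTokenizer, 4096-token context, and float32 weights at roughly 36.1B parameters × 4 bytes ≈ 144 GB, matching the 144.7 GB requirement), a minimal loading sketch with Hugging Face transformers might look like the following. The float16 cast and `device_map="auto"` are assumptions to reduce memory use, not part of the model card.

```python
# Minimal sketch: load SOLARC MoE 10.7Bx4 with Hugging Face transformers.
# Assumes transformers >= 4.36 (Mixtral support) and enough GPU/CPU memory;
# torch_dtype=torch.float16 is an assumption to roughly halve the ~144 GB float32 footprint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "DopeorNope/SOLARC-MOE-10.7Bx4"

tokenizer = AutoTokenizer.from_pretrained(repo_id)   # LlamaTokenizer, vocab size 32000
model = AutoModelForCausalLM.from_pretrained(
    repo_id,                                         # MixtralForCausalLM architecture
    torch_dtype=torch.float16,                       # checkpoint itself is stored in float32
    device_map="auto",                               # shard across available GPUs/CPU
)

prompt = "한국어로 간단히 자기소개를 해 주세요."        # model card lists Korean (ko) support
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)   # context length is 4096 tokens
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```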
Model | Likes | Downloads | VRAM
---|---|---|---
SOLARC MoE 10.7Bx4 GGUF | 0 | 10 | 15 GB |
SOLARC MoE 10.7Bx4 GGUF | 19 | 225 | 12 GB |
SOLARC MoE 10.7Bx4 GPTQ | 4 | 7 | 18 GB |
SOLARC MoE 10.7Bx4 AWQ | 2 | 8 | 19 GB |
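The GGUF builds listed above (12-15 GB) fit on a single consumer GPU or in system RAM. A minimal sketch with llama-cpp-python, assuming one of the GGUF files has already been downloaded locally; the file name below is a placeholder, not an actual file from these repositories:

```python
# Minimal sketch: run a GGUF quantization of SOLARC MoE 10.7Bx4 with llama-cpp-python.
# The model_path is a placeholder; point it at whichever GGUF file you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="./solarc-moe-10.7bx4.Q4_K_M.gguf",  # placeholder file name
    n_ctx=4096,                                     # matches the model's context length
    n_gpu_layers=-1,                                # offload all layers to GPU if available
)

result = llm("질문: SOLAR 모델에 대해 설명해 주세요.\n답변:", max_tokens=256)
print(result["choices"][0]["text"])
```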
Best Alternatives | Context / RAM | Downloads | Likes
---|---|---|---
Umbra V3 MoE 4x11b 2ex | 32K / 72.3 GB | 286 | 4 |
PiVoT MoE | 32K / 72.3 GB | 1790 | 8 |
Umbra V3 MoE 4x11b 2ex | 32K / 72.3 GB | 5 | 4 |
Umbra V3 MoE 4x11b | 32K / 72.3 GB | 5 | 5 |
Umbra V2.1 MoE 4x10.7 | 32K / 72.3 GB | 6 | 6 |
Mixolar 4x7b | 4K / 72.3 GB | 9780 | 3 |
Smartsolmix 4x10.7B V1 | 4K / 72.3 GB | 1858 | 0 |
Orca SOLAR 4x10.7B | 4K / 72.3 GB | 1738 | 0 |
MetaModel MoE | 4K / 72.3 GB | 1914 | 0 |
Frankenstein MoE En 10.7Bx4 | 4K / 72.3 GB | 1915 | 0 |