| LLM Name | PiVoT MoE |
| --- | --- |
| Repository 🤗 | https://huggingface.co/maywell/PiVoT-MoE |
| Model Size | 36.1b |
| Required VRAM | 72.3 GB |
| Updated | 2025-09-23 |
| Maintainer | maywell |
| Model Type | mixtral |
| Model Files | |
| Model Architecture | MixtralForCausalLM |
| License | cc-by-nc-4.0 |
| Context Length | 32768 |
| Model Max Length | 32768 |
| Transformers Version | 4.36.1 |
| Tokenizer Class | LlamaTokenizer |
| Padding Token | `<s>` |
| Vocabulary Size | 32000 |
| Torch Data Type | bfloat16 |
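Based on the configuration listed above (MixtralForCausalLM architecture, LlamaTokenizer, bfloat16 weights, 32K context), the model can be loaded with the Hugging Face Transformers library. The snippet below is a minimal sketch, not an official recipe: device placement, memory settings, and the prompt format are assumptions, and the full-precision weights need roughly the 72 GB of VRAM listed in the table.

```python
# Minimal loading sketch for maywell/PiVoT-MoE with Hugging Face Transformers.
# Assumes transformers >= 4.36 (the version listed above) and enough GPU memory
# (~72 GB VRAM per the table); adjust device_map / dtype for your hardware.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "maywell/PiVoT-MoE"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # LlamaTokenizer per the model card
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the listed torch data type
    device_map="auto",           # shard across available GPUs / offload as needed
)

# The exact prompt template is not specified in this listing; a plain prompt is used here.
prompt = "Explain what a mixture-of-experts language model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```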
| Quantized Model | Likes | Downloads | VRAM |
| --- | --- | --- | --- |
| PiVoT MoE GGUF | 9 | 197 | 12 GB |
| PiVoT MoE AWQ | 2 | 10 | 19 GB |
| PiVoT MoE GPTQ | 1 | 7 | 18 GB |
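The GGUF variant listed above can be run locally without the full-size VRAM requirement, for example with llama-cpp-python. The sketch below assumes you have already downloaded one of the GGUF files; the file name is a placeholder, and the quantization level you pick determines whether it fits within the ~12 GB figure in the table.

```python
# Sketch of running a local GGUF quantization with llama-cpp-python.
# "pivot-moe.Q4_K_M.gguf" is a hypothetical file name; substitute the file you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="./pivot-moe.Q4_K_M.gguf",  # placeholder local path
    n_ctx=32768,                           # matches the listed context length
    n_gpu_layers=-1,                       # offload all layers to GPU if available
)

output = llm("Explain what a mixture-of-experts model is.", max_tokens=200)
print(output["choices"][0]["text"])
```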
| Best Alternatives | Context / VRAM | Downloads | Likes |
| --- | --- | --- | --- |
| Umbra V3 MoE 4x11b 2ex | 32K / 72.3 GB | 286 | 4 |
| Umbra V3 MoE 4x11b 2ex | 32K / 72.3 GB | 5 | 4 |
| Umbra V3 MoE 4x11b | 32K / 72.3 GB | 5 | 5 |
| Umbra V2.1 MoE 4x10.7 | 32K / 72.3 GB | 6 | 6 |
| Mixolar 4x7b | 4K / 72.3 GB | 9780 | 3 |
| Smartsolmix 4x10.7B V1 | 4K / 72.3 GB | 1858 | 0 |
| Orca SOLAR 4x10.7B | 4K / 72.3 GB | 1738 | 0 |
| MetaModel MoE | 4K / 72.3 GB | 1914 | 0 |
| SOLARC MoE 10.7Bx4 | 4K / 144.7 GB | 1917 | 9 |
| Frankenstein MoE En 10.7Bx4 | 4K / 72.3 GB | 1915 | 0 |