S1 Llama 3.2 3Bx4 MoE is an open-source mixture-of-experts (MoE) language model by cloudyu. Key specifications: 9.6B parameters, 38 GB required VRAM, 128K context length, llama3 license, LLM Explorer score 0.23.
| Attribute | Value |
| --- | --- |
| LLM Name | S1 Llama 3.2 3Bx4 MoE |
| Repository 🤗 | https://huggingface.co/cloudyu/S1-Llama-3.2-3Bx4-MoE |
| Model Size | 9.6B parameters |
| Required VRAM | 38 GB |
| Updated | 2026-03-05 |
| Maintainer | cloudyu |
| Model Type | mixtral |
| Model Architecture | MixtralForCausalLM |
| License | llama3 |
| Context Length | 131072 |
| Model Max Length | 131072 |
| Transformers Version | 4.48.2 |
| Tokenizer Class | PreTrainedTokenizerFast |
| Padding Token | <|begin_of_text|> |
| Vocabulary Size | 128256 |
| Torch Data Type | float32 |
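
The 38 GB VRAM figure follows directly from the float32 weights: 9.6B parameters × 4 bytes ≈ 38.4 GB. Below is a minimal, untested sketch of loading the checkpoint with Hugging Face Transformers, assuming transformers >= 4.48.2 (the version listed above); loading in bfloat16 is an assumption on our part, not part of this model card, and roughly halves the memory footprint.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repository listed in the table above.
model_id = "cloudyu/S1-Llama-3.2-3Bx4-MoE"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # checkpoint is stored in float32 (~38 GB); bf16 needs ~19 GB
    device_map="auto",           # shard across available GPUs/CPU via accelerate
)

# Quick smoke test; the 131072-token context is far larger than needed here.
inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```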