Orion MoE8x7B is an open-source mixture-of-experts language model by OrionStarAI. Key figures: required VRAM 96.2 GB, context length 8K, LLM Explorer Score 0.16.
| Property | Value |
| --- | --- |
| LLM Name | Orion MoE8x7B |
| Repository 🤗 | https://huggingface.co/OrionStarAI/Orion-MoE8x7B |
| Required VRAM | 96.2 GB |
| Updated | 2026-04-10 |
| Maintainer | OrionStarAI |
| Model Type | orion_moe |
| Model Files | |
| Supported Languages | en, zh, ja, ko |
| Model Architecture | OrionMOECausalLM |
| Context Length | 8192 |
| Model Max Length | 8192 |
| Transformers Version | 4.37.1 |
| Tokenizer Class | OrionTokenizer |
| Beginning of Sentence Token | `<s>` |
| End of Sentence Token | `</s>` |
| Unk Token | `<unk>` |
| Vocabulary Size | 113664 |
| Torch Data Type | bfloat16 |
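
The custom `orion_moe` model type and `OrionTokenizer` class imply the modeling code ships inside the Hugging Face repository, so loading requires `trust_remote_code=True`. Below is a minimal loading sketch under that assumption; it uses only the standard `transformers` API (version 4.37+ per the table), and the prompt text is hypothetical. With ~96 GB of bfloat16 weights, `device_map="auto"` (which needs the `accelerate` package) is used to shard the model across available GPUs.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "OrionStarAI/Orion-MoE8x7B"

# The OrionTokenizer and OrionMOECausalLM classes live in the repo itself,
# hence trust_remote_code=True.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)

# bfloat16 matches the published torch dtype; device_map="auto" shards the
# ~96 GB of weights across whatever devices are available.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
)

# Hypothetical prompt; keep total tokens within the 8192-token context window.
inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```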