| Property | Value |
|---|---|
| LLM Name | Phi 2 MoE Test |
| Repository 🤗 | https://huggingface.co/AbstractPerspective/Phi-2_MoE_test |
| Merged Model | Yes |
| Model Size | 4.5B |
| Required VRAM | 8.9 GB | 
| Updated | 2025-09-23 | 
| Maintainer | AbstractPerspective | 
| Model Type | phi-msft | 
| Model Architecture | PhiForCausalLM | 
| Model Max Length | 2048 | 
| Transformers Version | 4.37.0 | 
| Tokenizer Class | CodeGenTokenizer | 
| Padding Token | <|endoftext|> | 
| Vocabulary Size | 51200 | 
| Torch Data Type | float16 | 
| Activation Function | gelu_new | 
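
Because the card lists the custom `phi-msft` model type, loading the repository with `transformers` generally requires `trust_remote_code=True` so the architecture code shipped with the model can be executed. Below is a minimal loading sketch based on the card's values (float16 dtype, ~8.9 GB VRAM, 2048-token max length); the prompt and generation settings are illustrative assumptions, not taken from the card.

```python
# Minimal sketch: load AbstractPerspective/Phi-2_MoE_test with transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "AbstractPerspective/Phi-2_MoE_test"

tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,  # matches the card's float16 torch data type
    trust_remote_code=True,     # custom "phi-msft" model type (assumes the repo ships remote code)
    device_map="auto",          # card lists ~8.9 GB required VRAM
)

# Illustrative prompt; keep total input under the 2048-token max length.
inputs = tokenizer("def fibonacci(n):", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```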
| Best Alternatives | Context / RAM | Downloads | Likes | 
|---|---|---|---|
| Phixtral 2x2 8 | 0K / 8.9 GB | 17 | 149 | 
| Samantha Phixtral 2x2 8 | 0K / 8.9 GB | 13 | 1 | 
| Phixtral 4x2 8odd | 0K / 8.9 GB | 6 | 3 | 