MiniMax M2.7 JANG 2L is an open-source language model from JANGQ-AI. Key figures: 18.6B parameters, 27.4 GB required VRAM, 192K context length, license: other, LLM Explorer Score: 0.42.
| LLM Name | MiniMax M2.7 JANG 2L |
|---|---|
| Repository 🤗 | https://huggingface.co/JANGQ-AI/MiniMax-M2.7-JANG_2L |
| Model Size | 18.6B |
| Required VRAM | 27.4 GB |
| Updated | 2026-04-13 |
| Maintainer | JANGQ-AI |
| Model Type | minimax_m2 |
| Model Architecture | MiniMaxM2ForCausalLM |
| License | other |
| Context Length | 196608 |
| Model Max Length | 196608 |
| Transformers Version | 4.46.1 |
| Vocabulary Size | 200064 |
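The "192K" in the summary and the raw context length in the table are the same figure in different units (196608 tokens = 192 × 1024). A minimal loading sketch using the standard Transformers API follows; it assumes the installed `transformers` build (4.46.1 per the table) registers the `MiniMaxM2ForCausalLM` architecture, and that you have the ~27.4 GB of VRAM the table lists. The prompt string is illustrative.

```python
# Context length: 196608 tokens = 192 * 1024, i.e. "192K".
CONTEXT_LENGTH = 196608
assert CONTEXT_LENGTH // 1024 == 192


def main():
    # Loading sketch -- downloads the checkpoint from the Hugging Face Hub
    # and requires a transformers build that supports this architecture.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "JANGQ-AI/MiniMax-M2.7-JANG_2L"
    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForCausalLM.from_pretrained(
        repo,
        torch_dtype="auto",   # keep the checkpoint's native precision
        device_map="auto",    # spread layers across available devices
    )
    inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(out[0], skip_special_tokens=True))


if __name__ == "__main__":
    main()
```

The download and generation step is guarded under `__main__` so the arithmetic at the top can be checked without fetching weights.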
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| MiniMax M2.7 JANG 2L CRACK | 192K / 27.4 GB | 458 | 4 |