Training Details

| Detail | Value |
|---|---|
| LLM Name | Acolyte 22B |
| Repository 🤗 | https://huggingface.co/rAIfle/Acolyte-22B |
| Base Model(s) | |
| Merged Model | Yes |
| Model Size | 22b |
| Required VRAM | 44.7 GB |
| Updated | 2025-09-23 |
| Maintainer | rAIfle |
| Model Type | mistral |
| Instruction-Based | Yes |
| Model Files | |
| Model Architecture | MistralForCausalLM |
| Context Length | 131072 |
| Model Max Length | 131072 |
| Transformers Version | 4.44.2 |
| Tokenizer Class | LlamaTokenizer |
| Padding Token | [control_748] |
| Vocabulary Size | 32768 |
| LoRA Model | Yes |
| Torch Data Type | bfloat16 |
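
The rows above map directly onto a standard transformers loading call. A minimal sketch, assuming transformers >= 4.44.2 and roughly 45 GB of free GPU memory (22B parameters at 2 bytes each in bfloat16 is about 44 GB of weights, consistent with the 44.7 GB figure above); the prompt and generation settings are illustrative, and only the repository ID comes from this page.

```python
# Minimal loading sketch for Acolyte-22B; assumes transformers >= 4.44.2
# and enough GPU memory for the ~44.7 GB of bfloat16 weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "rAIfle/Acolyte-22B"  # repository from the table above

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the "Torch Data Type" row
    device_map="auto",           # shard across available GPUs if needed
)

# The model is instruction-based, so wrap the prompt in the chat template.
messages = [{"role": "user", "content": "Summarize the Mistral architecture."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```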
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| MS Schisandra 22B V0.2 | 128K / 44.7 GB | 3 | 9 |
| ...ntheon RP Pure 1.6.2 22B Small | 128K / 44.7 GB | 3 | 32 |
| MS Meadowlark 22B | 128K / 44.7 GB | 14 | 16 |
| Beeper King 22B | 128K / 44.7 GB | 6 | 7 |
| ... V4x1.6.2RP Cydonia VXXX 22B 8 | 128K / 44.7 GB | 5 | 5 |
| MS Inky 2409 22B | 128K / 44.7 GB | 7 | 0 |
| ... V4x1.6.2RP Cydonia VXXX 22B 6 | 128K / 44.7 GB | 5 | 3 |
| MS Fujin 2409 22B | 128K / 44.7 GB | 6 | 0 |
| MS Physician 2409 22B | 128K / 44.7 GB | 6 | 0 |
| MS Dampf 2409 22B | 128K / 44.7 GB | 5 | 0 |
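
Download and like counts like those above are a point-in-time snapshot; they can be re-checked against the live Hub with the huggingface_hub client. A small sketch, assuming the package is installed (only the rAIfle/Acolyte-22B repository ID comes from this page; the printed counts will differ from the snapshot in the table):

```python
# Query live download/like counts from the Hugging Face Hub.
from huggingface_hub import HfApi

api = HfApi()
info = api.model_info("rAIfle/Acolyte-22B")
print(f"{info.id}: {info.downloads} downloads, {info.likes} likes")
```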