1.7B MixtureVitae 300BT V1 Decontaminated 16K Merged is an open-source language model maintained by ali-elganzory. Key specifications: 1.7B parameters, ~3.4 GB required VRAM, 16K context length, merged model, LLM Explorer Score 0.33.
| Attribute | Value |
|---|---|
| LLM Name | 1.7B MixtureVitae 300BT V1 Decontaminated 16K Merged |
| Repository 🤗 | https://huggingface.co/ali-elganzory/1.7b-MixtureVitae-300BT-v1-decontaminated-16k-merged |
| Base Model(s) | |
| Merged Model | Yes |
| Model Size | 1.7b |
| Required VRAM | 3.4 GB |
| Updated | 2026-04-20 |
| Maintainer | ali-elganzory |
| Model Type | opensci |
| Model Files | |
| Model Architecture | OpensciForCausalLM |
| Context Length | 16384 |
| Model Max Length | 16384 |
| Transformers Version | 4.57.6 |
| Tokenizer Class | GPTNeoXTokenizer |
| Padding Token | <end_of_turn> |
| Vocabulary Size | 50304 |
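Since the card lists the repository, context length, and architecture, a minimal loading sketch can be given. This is not taken from the model card; it assumes the checkpoint loads through the standard Hugging Face `transformers` auto classes, and that the custom `OpensciForCausalLM` architecture may require `trust_remote_code=True` on older library versions.

```python
# Minimal sketch (assumption, not from the model card): load the merged
# 16K-context checkpoint with Hugging Face transformers and run a short
# generation. The ~3.4 GB VRAM figure suggests half-precision weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "ali-elganzory/1.7b-MixtureVitae-300BT-v1-decontaminated-16k-merged"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype="auto",      # keep the checkpoint's native (half) precision
    device_map="auto",       # place weights on GPU if one is available
    trust_remote_code=True,  # assumption: OpensciForCausalLM may ship custom code
)

prompt = "Summarize the idea of data-mixture pretraining in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The prompt string and generation settings above are placeholders; prompts longer than 16,384 tokens would exceed the model's stated context length.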
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| ...t 16k SFT Tulu3 Decontaminated | 16K / 3.4 GB | 312 | 0 |
| ....7B Comma0.1 300BT Longsft 16k | 16K / 3.4 GB | 193 | 0 |
| ... Web Curated 100BT Longsft 16k | 16K / 3.4 GB | 19 | 0 |
| ...ae 300BT V1 Decontaminated 16K | 16K / 3.4 GB | 24 | 0 |