HY MT1.5 7B is an open-source language model by Tencent. Key specs: 7B parameters, 16.1 GB required VRAM, 256K context length, LLM Explorer Score 0.35.
| Attribute | Value |
|---|---|
| LLM Name | HY MT1.5 7B |
| Repository 🤗 | https://huggingface.co/tencent/HY-MT1.5-7B |
| Model Size | 7b |
| Required VRAM | 16.1 GB |
| Updated | 2026-03-28 |
| Maintainer | tencent |
| Model Type | hunyuan_v1_dense |
| Model Files | |
| Supported Languages | zh en fr pt es ja tr ru ar ko th it de vi ms id tl hi pl cs nl km my fa gu ur te mr he bn ta uk bo kk mn ug |
| Model Architecture | HunYuanDenseV1ForCausalLM |
| Context Length | 262144 |
| Model Max Length | 262144 |
| Transformers Version | 4.57.1 |
| Tokenizer Class | PreTrainedTokenizerFast |
| Padding Token | <|pad|> |
| Vocabulary Size | 128167 |
| Torch Data Type | bfloat16 |
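
The specs above map directly onto a standard `transformers` loading call. Below is a minimal sketch, assuming the repo ID from the table loads through the auto classes (the `HunYuanDenseV1ForCausalLM` architecture may require `trust_remote_code=True` on older releases); the bfloat16 dtype and ~16.1 GB VRAM figure come from the table, and the translation prompt wording is a hypothetical example, not the model's documented template.

```python
# Minimal loading sketch for HY-MT1.5-7B (transformers >= 4.57.1 per the table).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tencent/HY-MT1.5-7B"  # repo ID from the table above

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the Torch Data Type listed above
    device_map="auto",           # ~16.1 GB VRAM required at bf16
)

# Hypothetical translation prompt -- wording is an assumption for this sketch.
prompt = "Translate the following segment into English:\n你好，世界"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)

# Decode only the newly generated tokens, skipping the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```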
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Hunyuan MT 7B | 32K / 16.1 GB | 6287 | 551 |
| Hunyuan MT Chimera 7B | 32K / 16.1 GB | 991 | 90 |
| Hunyuan 7B Instruct 0124 | 32K / 5 GB | 102 | 50 |
| ...ihui Hunyuan MT 7B Abliterated | 32K / 15 GB | 653 | 1 |
| ...yuan MT Chimera 7B Abliterated | 32K / 15 GB | 92 | 4 |