| Property | Value |
|---|---|
| LLM Name | Japanese Mpt 7B |
| Repository 🤗 | https://huggingface.co/lightblue/japanese-mpt-7b |
| Model Size | 7b |
| Required VRAM | 13.3 GB |
| Updated | 2025-09-23 |
| Maintainer | lightblue |
| Model Type | mpt |
| Model Files | |
| Model Architecture | MPTForCausalLM |
| License | apache-2.0 |
| Model Max Length | 2048 |
| Transformers Version | 4.30.2 |
| Tokenizer Class | GPTNeoXTokenizer |
| Vocabulary Size | 50432 |
| Torch Data Type | bfloat16 |
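
Given the architecture (MPTForCausalLM) and bfloat16 weights listed above, the model can be loaded through the standard `transformers` API. The sketch below is illustrative rather than taken from the model card: the Japanese prompt and generation settings are placeholders, and loading on a GPU assumes roughly the 13.3 GB of VRAM noted in the table. MPT models ship custom modeling code, so `trust_remote_code=True` is typically required.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "lightblue/japanese-mpt-7b"

# Tokenizer and model from the Hugging Face Hub; MPT uses custom modeling code.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,  # matches the Torch Data Type listed above
    trust_remote_code=True,
).to("cuda")  # assumes a GPU with ~13.3 GB of free VRAM

# Illustrative Japanese prompt and generation settings (not from the model card).
prompt = "日本の首都は"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```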
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
Mpt 7B Chat | 0K / 13.3 GB | 80920 | 518 |
Mpt 7B | 0K / 13.3 GB | 18460 | 1173 |
Mpt 7B Storywriter | 0K / 13.3 GB | 2261 | 839 |
Mpt 7B Instruct | 0K / 13.3 GB | 7946 | 470 |
Mpt 7B Int8 Ov | 0K / 0 GB | 13 | 0 |
Shears Mpt 7B 50 Base | 0K / 13.3 GB | 71 | 2 |
Mpt 7B | 0K / 26.5 GB | 5132 | 1 |
Mpt 7B 8K | 0K / 13.3 GB | 1915 | 26 |
Mpt 7B 8K Chat | 0K / 13.3 GB | 1942 | 40 |
Mpt 7B 8K Instruct | 0K / 13.3 GB | 2012 | 27 |
Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference!