| LLM Name | Mpt 1B Redpajama 200B Dolly |
|---|---|
| Repository 🤗 | https://huggingface.co/mosaicml/mpt-1b-redpajama-200b-dolly |
| Model Size | 1b |
| Required VRAM | 5.2 GB |
| Updated | 2025-09-15 |
| Maintainer | mosaicml |
| Model Type | mosaic_gpt |
| Model Architecture | MosaicGPT |
| License | cc-by-sa-3.0 |
| Model Max Length | 2048 |
| Transformers Version | 4.27.4 |
| Tokenizer Class | GPTNeoXTokenizer |
| Vocabulary Size | 50432 |
| Torch Data Type | float32 |
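Given the details above, a minimal loading sketch with Hugging Face Transformers might look like the following. The repository URL, tokenizer class, max length, and float32 dtype come from the table; the `trust_remote_code=True` flag and the example prompt are assumptions, since MosaicGPT (`mosaic_gpt`) is a custom architecture shipped inside the repository rather than a built-in Transformers class.

```python
# Minimal sketch (not the maintainer's official usage) of loading this checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "mosaicml/mpt-1b-redpajama-200b-dolly"

# GPTNeoXTokenizer with a 50432-token vocabulary, per the table above.
tokenizer = AutoTokenizer.from_pretrained(repo)

model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float32,   # matches the listed Torch data type
    trust_remote_code=True,      # assumed: custom MosaicGPT code lives in the repo
)

prompt = "Explain what the RedPajama dataset is."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Generation is bounded by the listed model max length of 2048 tokens, so prompt plus new tokens should stay within that window.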
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Mpt 1B Redpajama 200B | 0K / 5.2 GB | 2956 | 2 |
| Mpt 1B Redpajama 200B | 0K / 5.2 GB | 252 | 92 |