Gpt4all Mpt is an open-source language model maintained by nomic-ai. Key details: causal LLM, required VRAM 26.6 GB, apache-2.0 license, LLM Explorer score 0.07. The VRAM figure is consistent with a model of roughly 6.7B parameters stored in float32 (4 bytes per parameter).
| Field | Value |
|---|---|
| LLM Name | Gpt4all Mpt |
| Repository 🤗 | https://huggingface.co/nomic-ai/gpt4all-mpt |
| Required VRAM | 26.6 GB |
| Updated | 2025-12-03 |
| Maintainer | nomic-ai |
| Model Type | mpt |
| Supported Languages | en |
| Model Architecture | MPTForCausalLM |
| License | apache-2.0 |
| Model Max Length | 2048 |
| Transformers Version | 4.28.1 |
| Tokenizer Class | GPTNeoXTokenizer |
| Vocabulary Size | 50432 |
| Torch Data Type | float32 |
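
Going by the details above (`MPTForCausalLM` architecture, `GPTNeoXTokenizer`, 2048-token context, float32 weights), a minimal loading sketch with Hugging Face `transformers` might look like the following. The prompt and generation settings are illustrative assumptions, and the snippet is untested against this specific repo; MPT checkpoints of this vintage ship custom modeling code and therefore typically require `trust_remote_code=True`.

```python
# Minimal loading sketch; assumes transformers >= 4.28 and enough memory
# for the ~26.6 GB float32 checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nomic-ai/gpt4all-mpt"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # GPTNeoXTokenizer, vocab size 50432
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",      # checkpoint weights are stored in float32
    trust_remote_code=True,  # MPTForCausalLM is custom modeling code in this repo
)

prompt = "Explain what an MPT model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)  # keep within the 2048-token limit
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```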
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Tiny Mpt Random Remote Code | 0K / 0 GB | 283 | 0 |
| WangchanLion7B | 0K / 29.8 GB | 12 | 8 |
| Replit Code Instruct Glaive | 0K / 10.4 GB | 8 | 88 |
| Results Sharded Bf16 5GB | 0K / 13.4 GB | 5 | 0 |
| Replit Coder | 0K / 5.2 GB | 6 | 0 |
| Mpt Mini Shakespeare | 0K / 0 GB | 96 | 1 |
| PhoGPT 7B5 GGUF | 0K / 17 GB | 32 | 3 |