| Field | Value |
|---|---|
| LLM Name | Mistral Nemo Base 2407 |
| Repository 🤗 | https://huggingface.co/mistralai/Mistral-Nemo-Base-2407 |
| Model Size | 12.2B |
| Required VRAM | 24.5 GB |
| Updated | 2025-09-10 |
| Maintainer | mistralai |
| Supported Languages | en, fr, de, es, it, pt, ru, zh, ja |
| Gated Model | Yes |
| Model Architecture | MistralForCausalLM |
| License | Apache 2.0 |
| Tokenizer Class | PreTrainedTokenizerFast |
| Vocabulary Size | 131072 |
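The repository ID above is all that is needed to pull the weights through the standard Hugging Face `transformers` loading path. The sketch below is illustrative only: the bf16 dtype and `device_map="auto"` placement are assumptions chosen to roughly match the 24.5 GB VRAM figure listed above, not settings taken from the model card.

```python
# Minimal sketch: load Mistral Nemo Base 2407 and run a short completion.
# Assumes `transformers`, `accelerate`, and `torch` are installed and that the
# gated repository's terms have already been accepted on the Hugging Face Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-Nemo-Base-2407"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # bf16 roughly matches the 24.5 GB weight footprint above
    device_map="auto",           # lets accelerate place layers on the available devices
)

# This is the base (non-instruct) model, so prompt it as plain text completion.
inputs = tokenizer("The quick brown fox", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```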
| Model | Likes | Downloads | VRAM |
|---|---|---|---|
| Normistral 11B Warm | 9 | 197 | 22 GB |
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Mistral Nemo Instruct 2407 | 0K / 24.5 GB | 120543 | 1588 |
| MistralAI Nemo Instruct ChatML | 0K / 24.5 GB | 203 | 2 |
| ...Nemo Instruct 2407 Abliterated | 0K / 24.5 GB | 107 | 3 |