Normistral 11B Warm by norallm


Normistral 11B Warm is an open-source language model by norallm. Features: 11B-parameter LLM, required VRAM: 22.9 GB, context length: 1024K tokens, license: apache-2.0, GGUF quantizations available, LLM Explorer Score: 0.19.

Tags: Arxiv:2412.06484, Base model: mistralai/mistral-n..., Base model (quantized): mistralai..., Bokmaal, Da, En, Endpoints compatible, Fo, Gguf, Is, Mistral, Nb, Nn, No, Norwegian, Nynorsk, Pytorch, Quantized, Region:us, Safetensors, Sami, Se, Sharded, Sv, Tensorflow

Normistral 11B Warm Benchmarks

Benchmark percentages compare the model against the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

Normistral 11B Warm Parameters and Internals

LLM Name: Normistral 11B Warm
Repository 🤗: https://huggingface.co/norallm/normistral-11b-warm
Base Model(s): Mistral Nemo Base 2407 (mistralai/Mistral-Nemo-Base-2407)
Model Size: 11b
Required VRAM: 22.9 GB
Updated: 2026-04-10
Maintainer: norallm
Model Type: mistral
Model Files: sharded weights 4.6 GB (1-of-5), 4.7 GB (2-of-5), 4.6 GB (3-of-5), 4.7 GB (4-of-5), 4.3 GB (5-of-5); additional files 5.6 GB, 6.9 GB, 8.1 GB, 9.4 GB, 12.1 GB, 22.9 GB
Supported Languages: nb, nn, no, se, sv, da, en, is, fo
GGUF Quantization: Yes
Quantization Type: gguf
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 1024000
Model Max Length: 1024000
Transformers Version: 4.44.0.dev0
Tokenizer Class: PreTrainedTokenizerFast
Vocabulary Size: 51200
Torch Data Type: bfloat16
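The size, data type, and VRAM figures above are mutually consistent, and the same arithmetic generalizes to other models. A minimal sketch (assuming "GB" means decimal gigabytes, 10^9 bytes) of the parameter count implied by a raw weights footprint:

```python
# Sanity-check: does 22.9 GB of bfloat16 weights match an ~11B model?
# Assumes decimal gigabytes; binary GiB would give a slightly larger estimate.

BYTES_PER_PARAM = {"float32": 4, "bfloat16": 2, "float16": 2, "int8": 1}

def implied_param_count(weights_gb: float, dtype: str) -> float:
    """Parameter count (in billions) implied by a raw weights size."""
    return weights_gb * 1e9 / BYTES_PER_PARAM[dtype] / 1e9

print(implied_param_count(22.9, "bfloat16"))  # ~11.45B, consistent with the "11b" size class
```

The same function run in reverse explains the Required VRAM figure: an 11.45B-parameter model stored in bfloat16 (2 bytes per weight) needs roughly 22.9 GB just for the weights, before activations and KV cache.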

Best Alternatives to Normistral 11B Warm

Best Alternatives | Context / RAM | Downloads | Likes
Huginn V5.10.7B | 32K / 21.4 GB | 0 | 0
... 11B V2.2 Instruct EXL2 4.5bit | 32K / 6.5 GB | 292 | 3
...B V2.2 Instruct HQQ 4bit 128gs | 32K / 6.2 GB | 30 | 3
...B V2.2 Instruct HQQ 8bit 128gs | 32K / 11.7 GB | 14 | 3
...t 3.1 Frankenmerge 11B Gguf Q8 | 32K / 8.3 GB | 17 | 1
Bielik 11B V2.2 Instruct GPTQ | 32K / 6.2 GB | 223 | 3
Bielik 11B V2.2 Instruct AWQ | 32K / 6.2 GB | 176 | 3
Bielik 11B V2.2 Instruct FP8 | 32K / 11.4 GB | 101 | 3
... 11B V2.2 Instruct Quanto 8bit | 32K / 12 GB | 30 | 4
Bielik 11B V2.2 Instruct W8A8 | 32K / 11.5 GB | 19 | 3
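The smaller file sizes listed under Model Files (5.6 to 12.1 GB) correspond to quantized GGUF variants of the same weights. A hedged sketch of how each size maps to an approximate bits-per-weight figure, assuming the ~11.45B parameter count implied by the full bfloat16 checkpoint and decimal gigabytes (the quant-level guesses are mine, not from the page):

```python
# Approximate bits-per-weight for each quantized file size listed above.
# Assumes ~11.45B parameters (22.9 GB / 2 bytes per bfloat16 weight).

PARAMS = 11.45e9

def bits_per_weight(file_gb: float) -> float:
    """Average bits stored per weight, given a quantized file size in GB."""
    return file_gb * 1e9 * 8 / PARAMS

for gb in (5.6, 6.9, 8.1, 9.4, 12.1):
    print(f"{gb:5.1f} GB -> {bits_per_weight(gb):.1f} bits/weight")
# 5.6 GB is roughly 4 bits/weight and 12.1 GB roughly 8 bits/weight,
# matching the usual Q4- and Q8-class GGUF quantizations.
```

The same estimate helps when picking an alternative from the table above: a 6.2 GB file of an 11B model is necessarily in the ~4-bit range, whatever the quantization scheme (GPTQ, AWQ, HQQ) in its name.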



Original data from HuggingFace, OpenCompass and various public git repos.