Ministral 8B Instruct 2410 by mistralai


Tags: De, En, Es, Fr, Instruct, It, Ja, Ko, Mistral, Mistral-common, Pt, Region:us, Ru, Safetensors, Sharded, Tensorflow, Vllm, Zh

Ministral 8B Instruct 2410 Benchmarks

Ministral 8B Instruct 2410 (mistralai/Ministral-8B-Instruct-2410)

Ministral 8B Instruct 2410 Parameters and Internals

Model Type 
instruct fine-tuned model
Use Cases 
Areas:
local intelligence, on-device computing, at-the-edge use cases
Applications:
research
Primary Use Cases:
non-commercial research purposes
Limitations:
Only for non-commercial research purposes
Additional Notes 
Trained with a 128k context window utilizing interleaved sliding-window attention.
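For context, sliding-window attention limits each token to a fixed window of recent tokens, and an interleaved scheme mixes such local layers with full-attention layers. Below is a minimal sketch of the local mask in PyTorch; the window size is illustrative, not the value Ministral actually uses.

```python
# Minimal sketch of a sliding-window causal mask (illustrative window size,
# not the setting used by Ministral-8B-Instruct-2410).
import torch

def sliding_window_causal_mask(seq_len: int, window: int) -> torch.Tensor:
    """True where a query position may attend to a key position."""
    q = torch.arange(seq_len).unsqueeze(1)   # query indices (rows)
    k = torch.arange(seq_len).unsqueeze(0)   # key indices (columns)
    return (k <= q) & (q - k < window)       # causal AND within the local window

# An interleaved scheme would apply this local mask in some layers and a plain
# causal mask (window = seq_len) in others.
print(sliding_window_causal_mask(seq_len=6, window=3).int())
```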
Supported Languages 
en (English), fr (French), de (German), es (Spanish), it (Italian), pt (Portuguese), zh (Chinese), ja (Japanese), ru (Russian), ko (Korean)
Training Details 
Context Length:
128000
Model Architecture:
Dense Transformer
Input Output 
Input Format:
V3-Tekken tokenizer format
Accepted Modalities:
text
Output Format:
text
Performance Tips:
Use Mistral Inference or vLLM for optimized performance.
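A minimal vLLM sketch consistent with the tip above, assuming the offline chat API and an authenticated Hugging Face token for this gated repository; the sampling values are illustrative.

```python
# Minimal sketch using vLLM's offline chat API; assumes access to the gated
# repo (accept the Mistral Research License on Hugging Face and log in first).
from vllm import LLM, SamplingParams

llm = LLM(
    model="mistralai/Ministral-8B-Instruct-2410",
    tokenizer_mode="mistral",  # use the V3-Tekken tokenizer shipped with the repo
)

params = SamplingParams(max_tokens=128, temperature=0.3)  # illustrative values
messages = [{"role": "user", "content": "In one sentence, what is on-device inference?"}]

outputs = llm.chat(messages, sampling_params=params)
print(outputs[0].outputs[0].text)
```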
Release Notes 
Version:
8B-Instruct-2410
Date:
2024-10
Notes:
Release of Ministral-8B-Instruct-2410 under Mistral Research License
LLM Name: Ministral 8B Instruct 2410
Repository: https://huggingface.co/mistralai/Ministral-8B-Instruct-2410
Model Size: 8B
Required VRAM: 16.1 GB
Updated: 2025-07-26
Maintainer: mistralai
Instruction-Based: Yes
Model Files: ~16.1 GB total; safetensors shards 5.0 GB (1-of-4), 5.0 GB (2-of-4), 5.0 GB (3-of-4), 1.1 GB (4-of-4)
Supported Languages: en, fr, de, es, it, pt, zh, ja, ru, ko
Gated Model: Yes
Model Architecture: MistralForCausalLM (dense transformer)
License: Mistral Research License (MRL)
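Given the ~16.1 GB of sharded bfloat16 safetensors listed above, here is a minimal Hugging Face transformers loading sketch; it assumes the gated license has been accepted and enough GPU memory is available, and is not an official snippet.

```python
# Minimal sketch with Hugging Face transformers; assumes the gated repo has
# been accepted and ~16.1 GB of GPU memory is available for bfloat16 weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Ministral-8B-Instruct-2410"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # resolves the sharded safetensors files automatically
)

messages = [{"role": "user", "content": "Name one edge-computing use case."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
out = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```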

Quantized Models of the Ministral 8B Instruct 2410

Model: ...inistral 8B Instruct 2410 GGUF
Likes / Downloads: 269
VRAM: 3 GB
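For the GGUF quantization listed above, a minimal llama-cpp-python sketch; the local file name and context size are illustrative placeholders, not confirmed artifact names.

```python
# Minimal sketch with llama-cpp-python; the GGUF file name below is a
# hypothetical placeholder for whichever quantization you download.
from llama_cpp import Llama

llm = Llama(
    model_path="./Ministral-8B-Instruct-2410-Q4_K_M.gguf",  # placeholder path
    n_ctx=8192,  # a practical context; the full 128k window needs far more memory
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Say hello in French."}],
    max_tokens=32,
)
print(out["choices"][0]["message"]["content"])
```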

Best Alternatives to Ministral 8B Instruct 2410

Best Alternative: Ministral 8B Instruct 2410 HF
Context / RAM: 0K / 16.1 GB
Downloads / Likes: 80

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124