TinyMistral 6x248M by M4-ai


Tags: autotrain-compatible, endpoints-compatible, frankenmoe, instruct, lazymergekit, merge, mergekit, mixtral, moe, safetensors, region:us
Base models: Felladrin/TinyMistral-248M-SFT-v4, jtatman/tinymistral-v2-pycoder-instruct-248m, Locutusque/TinyMistral-248M-v2, Locutusque/TinyMistral-248M-v2-Instruct, Locutusque/TinyMistral-248M-v2.5, Locutusque/TinyMistral-248M-v2.5-Instruct
Dataset: nampdn-ai/mini-peS2o
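The lazymergekit/frankenmoe tags indicate the model was assembled with mergekit's MoE mode, which combines several small dense models as experts behind a learned router. Below is a minimal sketch of what such a merge config could look like; the choice of base model, gate_mode, and all positive_prompts values are illustrative assumptions, not the maintainer's actual recipe.

```python
# Sketch: build a mergekit-moe config for a 6-expert TinyMistral frankenmoe.
# The expert list matches the base models above; the base model, gate_mode,
# and positive_prompts are assumptions for illustration, not M4-ai's recipe.
import yaml

config = {
    "base_model": "Locutusque/TinyMistral-248M-v2.5-Instruct",  # assumed base
    "gate_mode": "hidden",   # route on hidden-state representations (assumed)
    "dtype": "float32",      # matches the listed torch dtype
    "experts": [
        {"source_model": "Locutusque/TinyMistral-248M-v2",
         "positive_prompts": ["general text"]},
        {"source_model": "Locutusque/TinyMistral-248M-v2-Instruct",
         "positive_prompts": ["follow the instruction"]},
        {"source_model": "Locutusque/TinyMistral-248M-v2.5",
         "positive_prompts": ["continue the story"]},
        {"source_model": "Locutusque/TinyMistral-248M-v2.5-Instruct",
         "positive_prompts": ["answer the question"]},
        {"source_model": "jtatman/tinymistral-v2-pycoder-instruct-248m",
         "positive_prompts": ["write python code"]},
        {"source_model": "Felladrin/TinyMistral-248M-SFT-v4",
         "positive_prompts": ["chat with the user"]},
    ],
}

with open("tinymistral-6x248m.yaml", "w") as f:
    yaml.safe_dump(config, f, sort_keys=False)

# Then, assuming mergekit is installed:
#   mergekit-moe tinymistral-6x248m.yaml ./TinyMistral-6x248M
```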

TinyMistral 6x248M Benchmarks

TinyMistral 6x248M (M4-ai/TinyMistral-6x248M)

TinyMistral 6x248M Parameters and Internals

Model Type: MoE, text generation
Training Details:
  Data Sources: nampdn-ai/mini-peS2o
  Data Volume: 600,000 examples
  Model Architecture: Mixture of Experts
Input/Output:
  Performance Tips: Do not use the hosted Inference API; run the model locally and follow the recommended generation parameters for best performance.
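Since the card advises against the hosted Inference API, a minimal local-loading sketch with transformers follows; the sampling values (temperature, top_p, repetition_penalty) are illustrative assumptions, as the card does not spell out the recommended parameters.

```python
# Minimal sketch: run TinyMistral-6x248M locally with transformers,
# since the card advises against the hosted Inference API.
# The sampling values below are assumptions, not the card's official settings.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "M4-ai/TinyMistral-6x248M"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # float32 weights, ~4 GB

inputs = tokenizer("The mixture-of-experts architecture", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,         # assumed; tune per the maintainer's guidance
    top_p=0.9,               # assumed
    repetition_penalty=1.1,  # assumed; small models often benefit from this
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```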
LLM Name: TinyMistral 6x248M
Repository: https://huggingface.co/M4-ai/TinyMistral-6x248M
Base Model(s): Felladrin/TinyMistral-248M-SFT-v4, jtatman/tinymistral-v2-pycoder-instruct-248m, Locutusque/TinyMistral-248M-v2, Locutusque/TinyMistral-248M-v2-Instruct, Locutusque/TinyMistral-248M-v2.5, Locutusque/TinyMistral-248M-v2.5-Instruct
Model Size: 1B
Required VRAM: 4 GB
Updated: 2025-09-23
Maintainer: M4-ai
Model Type: mixtral
Instruction-Based: Yes
Model Files: 4.0 GB
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.36.2
Tokenizer Class: LlamaTokenizer
Padding Token: <|bos|>
Vocabulary Size: 32005
Torch Data Type: float32
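The context length, tokenizer class, and vocabulary size listed above can be verified programmatically. The sketch below uses only standard transformers APIs; the back-of-envelope size check derives from the listed 4.0 GB of float32 files (4 bytes per parameter, so roughly 1e9 parameters, consistent with the listed 1B model size).

```python
# Sketch: verify the listed specs against the published config/tokenizer.
from transformers import AutoConfig, AutoTokenizer

model_id = "M4-ai/TinyMistral-6x248M"
config = AutoConfig.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

print(config.model_type)               # expected: "mixtral"
print(config.max_position_embeddings)  # expected: 32768
print(config.vocab_size)               # expected: 32005
print(type(tokenizer).__name__)        # expected: LlamaTokenizer (or its fast variant)

# Back-of-envelope size check: 4.0 GB of float32 weights at 4 bytes/param
# is ~1e9 parameters, matching the listed 1B model size.
approx_params = 4.0e9 / 4
print(f"~{approx_params / 1e9:.1f}B parameters")
```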

Best Alternatives to TinyMistral 6x248M

Best Alternatives | Context / RAM | Downloads | Likes
TinyMistral 6x248M Instruct | 32K / 4 GB | 646 | 10
Note: a green score (e.g. "73.2") means the model outperforms M4-ai/TinyMistral-6x248M.

Rank the TinyMistral 6x248M Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124