TinyMistral 248M Instruct by Locutusque


TinyMistral 248M Instruct is an open-source language model by Locutusque. Key specs: 248M parameters, VRAM: 1 GB, Context: 32K, License: apache-2.0, Instruction-Based. Scores: HF Score 28.2, LLM Explorer Score 0.16, ARC 24.3, HellaSwag 27.5, MMLU 25.2, TruthfulQA 41.9, WinoGrande 50.2.

Tags: Autotrain compatible, Base model:finetune:locutusque..., Base model:locutusque/tinymist..., Dataset:berkeley-nest/nectar, Dataset:locutusque/instructmix..., En, Endpoints compatible, Instruct, Mistral, Pytorch, Region:us, Safetensors

TinyMistral 248M Instruct Benchmarks

TinyMistral 248M Instruct (Locutusque/TinyMistral-248M-Instruct)

TinyMistral 248M Instruct Parameters and Internals

Model Type 
text-generation
Additional Notes 
During validation, this model achieved an average perplexity of 3.23 on the Locutusque/InstructMix dataset. It has so far been trained on approximately 608,000 examples; more epochs are planned for this model.
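The reported perplexity is just the exponential of the mean cross-entropy loss (in nats per token), so the two figures are interchangeable. A minimal sketch of the conversion in Python (the 3.23 value comes from the card above; everything else is generic):

```python
import math

def perplexity(mean_loss_nats: float) -> float:
    """Perplexity is exp(mean cross-entropy loss measured in nats/token)."""
    return math.exp(mean_loss_nats)

# A validation perplexity of 3.23 implies a mean loss of ln(3.23) nats/token.
mean_loss = math.log(3.23)
print(round(mean_loss, 2))               # ~1.17 nats/token
print(round(perplexity(mean_loss), 2))   # recovers 3.23
```

Lower perplexity means the model assigns higher average probability to the reference tokens; 3.23 corresponds to the model being, on average, about as uncertain as a uniform choice over ~3 tokens.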
Supported Languages 
en
Training Details 
Data Sources:
Locutusque/InstructMixCleaned, berkeley-nest/Nectar
Methodology:
Fully fine-tuned on Locutusque/InstructMix.
LLM Name: TinyMistral 248M Instruct
Repository: https://huggingface.co/Locutusque/TinyMistral-248M-Instruct
Base Model(s): TinyMistral 248M (Locutusque/TinyMistral-248M)
Model Size: 248M
Required VRAM: 1 GB
Updated: 2025-11-07
Maintainer: Locutusque
Model Type: mistral
Instruction-Based: Yes
Model Files: 1.0 GB
Supported Languages: en
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.35.1
Tokenizer Class: LlamaTokenizer
Padding Token: [PAD]
Vocabulary Size: 32005
Torch Data Type: float32
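The "Required VRAM: 1 GB" figure follows directly from the parameter count and the float32 storage type listed above: each parameter takes 4 bytes. A back-of-the-envelope check in Python (the 248M count and dtype are from the card; the estimate covers weights only, not activations or KV cache):

```python
# Weight-memory estimate: 248M parameters stored as float32 (4 bytes each).
params = 248_000_000
bytes_per_param = 4  # torch float32

weight_gb = params * bytes_per_param / 1024**3
print(round(weight_gb, 2))  # ~0.92 GB of weights alone
```

Halving the precision to float16/bfloat16 would halve this to roughly 0.46 GB, which is why the quantized variants below list smaller footprints.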

Quantized Models of the TinyMistral 248M Instruct

Model | Likes | Downloads | VRAM
Tinymistralredq | 0 | 38 | 0 GB
Tinymistralredq | 0 | 0 | 0 GB

Best Alternatives to TinyMistral 248M Instruct

Best Alternatives | Context / RAM | Downloads | Likes
...adrin TinyMistral248M Instruct | 32K / 0.5 GB | 235 | 27
TinyMistral 248M V2.5 Instruct | 32K / 1 GB | 21 | 1
Tinymistv1 | 32K / 0.5 GB | 17 | 0
...istral 248M V2.5 Instruct Orpo | 32K / 0.5 GB | 7 | 0
TinyMistral 248M V2 Instruct | 32K / 0.5 GB | 523 | 7
...stral V2 Pycoder Instruct 248M | 32K / 1 GB | 9 | 3
...mistral 248M Hypnosis Instruct | 32K / 0.5 GB | 4 | 1
...al V2 Pycoder Instruct 248M V1 | 32K / 0.5 GB | 5 | 1
...istral Magicoder Instruct 248M | 32K / 0.5 GB | 10 | 2
Tinymistv1 | 32K / 0.5 GB | 0 | 0
Note: green Score (e.g. "73.2") means that the model is better than Locutusque/TinyMistral-248M-Instruct.


Original data from HuggingFace, OpenCompass and various public git repos.