MagpieLM 4B Chat V0.1 by Magpie-Align


MagpieLM 4B Chat V0.1 is an open-source language model by Magpie-Align. Features: 4b LLM, VRAM: 9GB, Context: 128K, License: other, LLM Explorer Score: 0.16.

Tags: Arxiv:2406.08464, Arxiv:2411.07133, Alignment-handbook, Base model:finetune:magpie-ali..., Base model:magpie-align/magpie..., Conversational, Dataset:magpie-align/magpielm-..., Dataset:magpie-align/magpielm-..., Dpo, Endpoints compatible, Generated from trainer, Llama, Region:us, Safetensors, Sharded, Tensorflow, Trl

MagpieLM 4B Chat V0.1 Benchmarks

Benchmark scores (nn.n%) show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
MagpieLM 4B Chat V0.1 (Magpie-Align/MagpieLM-4B-Chat-v0.1)

MagpieLM 4B Chat V0.1 Parameters and Internals

Model Type 
text-generation
Use Cases 
Primary Use Cases:
friendly AI assistant
Limitations:
primarily understands and generates content in English; outputs may contain factual errors or logical inconsistencies; may reflect biases present in the training data; not specifically designed for complex reasoning tasks; may produce unsafe or inappropriate content
Additional Notes 
This model was developed using the transformers library.
Training Details 
Data Sources:
Magpie-Align/MagpieLM-SFT-Data-v0.1, Magpie-Align/MagpieLM-DPO-Data-v0.1
Methodology:
SFT and DPO
Context Length:
8192
Model Architecture:
Llama3.1
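
The DPO stage optimizes a preference loss over chosen/rejected response pairs. As a rough illustration (this is the standard DPO formulation, not code taken from this model's training recipe; the log-probability inputs are hypothetical):

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """Direct Preference Optimization loss for one preference pair.

    Inputs are summed log-probabilities of the chosen and rejected
    responses under the policy and the frozen reference (SFT) model.
    """
    chosen_ratio = policy_chosen_logp - ref_chosen_logp
    rejected_ratio = policy_rejected_logp - ref_rejected_logp
    margin = beta * (chosen_ratio - rejected_ratio)
    # -log(sigmoid(margin)): the loss shrinks as the policy prefers
    # the chosen response more strongly than the reference does.
    return -math.log(1.0 / (1.0 + math.exp(-margin)))
```

When policy and reference agree exactly, the margin is zero and the loss is log 2; widening the policy's preference for the chosen response drives the loss toward zero.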
Input Output 
Input Format:
Llama 3 chat template
Accepted Modalities:
text
Performance Tips:
Please use the Llama 3 chat template for the best performance.
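
A minimal pure-Python sketch of the Llama 3 chat template layout (special-token names follow the Llama 3 tokenizer; in practice `tokenizer.apply_chat_template` produces this for you):

```python
def build_llama3_prompt(messages):
    """Render [{'role': ..., 'content': ...}] messages in the
    Llama 3 chat template layout used by this model."""
    parts = ["<|begin_of_text|>"]
    for m in messages:
        parts.append(f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n"
                     f"{m['content']}<|eot_id|>")
    # Leave the prompt open for the assistant's reply.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)

prompt = build_llama3_prompt([
    {"role": "system", "content": "You are a friendly AI assistant."},
    {"role": "user", "content": "Hello!"},
])
```

Prefer the tokenizer's built-in template over hand-rolled strings; a single missing newline or token changes generation quality.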
Release Notes 
Version:
v0.1
Notes:
Initial release of Llama3.1-MagpieLM-4B-Chat-v0.1.
LLM Name: MagpieLM 4B Chat V0.1
Repository: https://huggingface.co/Magpie-Align/MagpieLM-4B-Chat-v0.1
Base Model(s): Magpie-Align/MagpieLM-4B-SFT-v0.1
Model Size: 4b
Required VRAM: 9 GB
Updated: 2026-04-11
Maintainer: Magpie-Align
Model Type: llama
Model Files: 5.0 GB (1-of-2), 4.0 GB (2-of-2), 0.0 GB
Model Architecture: LlamaForCausalLM
License: other
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.45.0.dev0
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|end_of_text|>
Vocabulary Size: 128256
Torch Data Type: bfloat16
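
The 9 GB VRAM figure matches what the bfloat16 shards alone imply (2 bytes per parameter, before KV-cache and activation overhead). A back-of-envelope check using the shard sizes listed above (the exact parameter count is not given on this page):

```python
# Shard sizes from the model-files listing above.
shard_gb = [5.0, 4.0]
total_bytes = sum(shard_gb) * 1e9
bytes_per_param = 2  # bfloat16, per the torch dtype listed above
approx_params = total_bytes / bytes_per_param
print(f"~{approx_params / 1e9:.1f}B parameters")  # ~4.5B
```

So the checkpoint is closer to 4.5B parameters than a flat 4B, which is why the weights alone fill the listed 9 GB.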

Best Alternatives to MagpieLM 4B Chat V0.1

Best Alternatives | Context / RAM | Downloads / Likes
4Bcpt | 256K / 8.8 GB | 50
HoldMy4BKTO | 256K / 8.8 GB | 50
Xgen Small 4B Instruct R | 256K / 17.7 GB | 183
Xgen Small 4B Base R | 256K / 17.7 GB | 142
SJT 4B | 146K / 7.6 GB | 50
...lama 3.1 Nemotron Nano 4B V1.1 | 128K / 9 GB | 20991113
Impish LLAMA 4B | 128K / 9 GB | 113342
Nemotron W 4b MagLight 0.1 | 128K / 9.2 GB | 133
Loxa 4B | 128K / 16 GB | 260
Nemotron W 4b Halo 0.1 | 128K / 9.2 GB | 33
Note: a green score (e.g. "73.2") means that the model is better than Magpie-Align/MagpieLM-4B-Chat-v0.1.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20260328a