Pygmalion 2 7B by PygmalionAI


Pygmalion 2 7B is an open-source language model by PygmalionAI. Features: 7B parameters, 13.5 GB VRAM required, 4K context, llama2 license, instruction-based. Benchmarks: HF Score 51.1, LLM Explorer Score 0.13, ARC 54, HellaSwag 78.2, MMLU 49.1, TruthfulQA 43.8, WinoGrande 75.1, GSM8K 6.4.


Pygmalion 2 7B Benchmarks

Pygmalion 2 7B Parameters and Internals

Model Type: text generation, instruct

Use Cases
Primary Use Cases: fictional writing, conversation, roleplaying, story writing
Limitations: Not fine-tuned to be safe and harmless. Output can be factually wrong, offensive, or socially unacceptable.
Considerations: Intended for entertainment purposes, not for factual or safe outputs.

Additional Notes: Freely available for both commercial and non-commercial use.

Supported Languages: en (advanced)

Training Details
Data Sources: PygmalionAI/PIPPA, Open-Orca/OpenOrca, Norquinal/claude_multiround_chat_30k, jondurbin/airoboros-gpt4-1.4.1, databricks/databricks-dolly-15k
Methodology: Supervised fine-tuning over regular instruction data and roleplay data.

Input Output
Input Format: <|system|>, <|user|>, <|model|> tokens for different roles.
Accepted Modalities: text
Output Format: Text output in response to input following the role scheme.
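Given this role scheme, a chat prompt is a plain concatenation of tagged segments, ending with an open <|model|> tag that cues the model to reply. A minimal Python sketch (the persona text, helper name, and turn contents are illustrative, not from the model card):

```python
def build_prompt(system: str, history: list[tuple[str, str]], user_msg: str) -> str:
    """Assemble a Pygmalion-2 style prompt from the <|system|>,
    <|user|>, and <|model|> role tokens. `history` holds completed
    (user, model) exchanges; the trailing <|model|> token leaves
    the prompt open for the model to continue."""
    parts = [f"<|system|>{system}"]
    for user_turn, model_turn in history:
        parts.append(f"<|user|>{user_turn}<|model|>{model_turn}")
    parts.append(f"<|user|>{user_msg}<|model|>")
    return "".join(parts)

# Illustrative usage: persona and messages are made up.
prompt = build_prompt(
    "Enter roleplay mode. You are a cheerful tavern keeper.",
    [("Hi there!", "Welcome, traveler! What can I get you?")],
    "Just water, thanks.",
)
print(prompt)
```

Whether segments are separated by newlines or packed back-to-back is a formatting choice; the sketch above packs them.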
LLM Name: Pygmalion 2 7B
Repository: https://huggingface.co/PygmalionAI/pygmalion-2-7b
Model Size: 7B
Required VRAM: 13.5 GB
Updated: 2026-04-07
Maintainer: PygmalionAI
Model Type: llama
Instruction-Based: Yes
Model Files: 10.0 GB (1-of-2), 3.5 GB (2-of-2)
Supported Languages: en
Model Architecture: LlamaForCausalLM
License: llama2
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.33.0.dev0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: bfloat16
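The 13.5 GB VRAM figure is consistent with back-of-envelope arithmetic for weights stored in bfloat16 (2 bytes per parameter), assuming the ~6.74B parameter count of Llama-2-7B, of which this model is a fine-tune. Note this covers weights only; KV cache and activations need additional memory on top:

```python
# Weight-only memory estimate; excludes KV cache and activations.
# 6.74e9 is the Llama-2-7B parameter count (an assumption here,
# since Pygmalion 2 7B is a Llama-2 fine-tune).
PARAMS = 6.74e9
BYTES_PER_PARAM = 2  # bfloat16, per the "Torch Data Type" entry

def weight_memory_gb(params: float = PARAMS,
                     bytes_per_param: int = BYTES_PER_PARAM) -> float:
    """Return the model's weight footprint in decimal gigabytes."""
    return params * bytes_per_param / 1e9

print(f"{weight_memory_gb():.1f} GB")  # 13.5 GB, matching the listed figure
```

The same arithmetic explains why the quantized variants below need far less memory: 4-bit formats store roughly 0.5 bytes per parameter.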

Quantized Models of the Pygmalion 2 7B

Model | Likes | Downloads | VRAM
Pygmalion 2 7B GGUF | 34 | 2663 | 2 GB
Pygmalion 2 7B AWQ | 5 | 9 | 3 GB
Pygmalion 2 7B GPTQ | 15 | 29 | 3 GB

Best Alternatives to Pygmalion 2 7B

Best Alternatives | Context / RAM | Downloads | Likes
124 | 1024K / 16.1 GB | 93 | 0
A5.4 | 1024K / 16.1 GB | 12 | 0
A3.4 | 1024K / 16.1 GB | 13 | 0
A2.4 | 1024K / 16.1 GB | 12 | 0
... Qwen2.5llamaify 7B V23.1 200K | 195K / 15.2 GB | 24 | 5
SuperNeuralDreadDevil 8B | 128K / 16.1 GB | 23 | 1
Falcon3 7B Instruct | 32K / 14.8 GB | 18150 | 78
Falcon3 Jessi V0.4 7B Slerp | 32K / 14.9 GB | 9 | 9
Jessi V0.4 Falcon3 7B Instruct | 32K / 14.8 GB | 22 | 0
Jessi V0.5 Falcon3 7B Instruct | 32K / 14.8 GB | 10 | 0



Original data from HuggingFace, OpenCompass and various public git repos.