Pygmalion 2 7B AWQ by TheBloke




Pygmalion 2 7B AWQ Parameters and Internals

Model Type 
text generation, instruct
Use Cases 
Primary Use Cases:
Fictional writing for entertainment purposes
Limitations:
Not fine-tuned to be safe and harmless; may produce socially unacceptable or undesirable text; outputs may often be factually wrong or misleading
Additional Notes 
The model is based on Llama-2 and fine-tuned with a focus on fictional writing and conversation; outputs may include lewd or offensive text.
Training Details 
Data Sources:
PygmalionAI/PIPPA, Open-Orca/OpenOrca, Norquinal/claude_multiround_chat_30k, jondurbin/airoboros-gpt4-1.4.1, databricks/databricks-dolly-15k
Methodology:
Supervised fine-tuning over a mixture of regular instruction data alongside roleplay, fictional stories and conversations with synthetically generated instructions.
Model Architecture:
Llama-2 based
Input Output 
Input Format:
Prompts use three role tokens: <|system|>, <|user|>, and <|model|> (see the sketch below)
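
The role tokens are plain strings concatenated into a single prompt, and generation continues after the trailing <|model|> token. A minimal Python sketch; the persona and user message are illustrative placeholders, not text from the model card:

    # Build a Pygmalion-2 style prompt. Only the <|system|>/<|user|>/<|model|>
    # role tokens come from the documented input format; the persona and
    # message text are made up for illustration.
    system = (
        "<|system|>Enter RP mode. Pretend to be a seasoned innkeeper. "
        "You shall reply to the user while staying in character."
    )
    prompt = system + "<|user|>What news from the road?<|model|>"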
LLM Name: Pygmalion 2 7B AWQ
Repository: https://huggingface.co/TheBloke/Pygmalion-2-7B-AWQ
Model Name: Pygmalion 2 7B
Model Creator: PygmalionAI
Base Model(s): Pygmalion 2 7B (PygmalionAI/pygmalion-2-7b)
Model Size: 7b
Required VRAM: 3.9 GB
Updated: 2025-09-14
Maintainer: TheBloke
Model Type: llama
Instruction-Based: Yes
Model Files: 3.9 GB
Supported Languages: en
AWQ Quantization: Yes
Quantization Type: awq
Model Architecture: LlamaForCausalLM
License: llama2
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.33.0.dev0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: bfloat16
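
Given the listing above (4-bit AWQ weights, ~3.9 GB, LlamaForCausalLM, 4096-token context), the checkpoint needs an AWQ-aware runtime. A minimal sketch using the AutoAWQ library together with transformers; the sampling settings are arbitrary choices, not values from this page:

    from awq import AutoAWQForCausalLM
    from transformers import AutoTokenizer

    model_id = "TheBloke/Pygmalion-2-7B-AWQ"

    # Load the 4-bit AWQ safetensors weights (roughly 3.9 GB of VRAM).
    model = AutoAWQForCausalLM.from_quantized(
        model_id, fuse_layers=True, safetensors=True
    )
    tokenizer = AutoTokenizer.from_pretrained(model_id)

    # Prompt built with the model's <|system|>/<|user|>/<|model|> role tokens.
    prompt = (
        "<|system|>Enter RP mode. You are a laconic ship's navigator."
        "<|user|>Chart us a course through the storm.<|model|>"
    )

    input_ids = tokenizer(prompt, return_tensors="pt").input_ids.cuda()
    output = model.generate(
        input_ids,
        do_sample=True,
        temperature=0.7,
        top_p=0.95,
        max_new_tokens=256,  # stays well inside the 4096-token context
    )
    print(tokenizer.decode(output[0], skip_special_tokens=True))

Recent transformers releases and vLLM can also load AWQ checkpoints directly, so AutoAWQ is one option rather than a requirement.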

Best Alternatives to Pygmalion 2 7B AWQ

Best Alternatives                    Context / RAM    Downloads / Likes
Llama 2 7B 32K Instruct AWQ          32K / 3.9 GB     132
CodeLlama 7B Instruct AWQ            16K / 3.9 GB     15744
...eechless Tora Code 7B V1.0 AWQ    16K / 3.9 GB     41
...ama 7B Instruct Hf W4 G128 AWQ    16K / 3.9 GB     50
CausalLM 7B AWQ                      8K / 5.8 GB      63
Leo Hessianai 7B Chat AWQ            8K / 3.9 GB      221
...essianai 7B Chat Bilingual AWQ    8K / 3.9 GB      42
...epseek Math 7B Instruct AWQ Q4    4K / 4.8 GB      130
Llama 2 7B Ft Instruct Es AWQ        4K / 3.9 GB      41
Swallow 7B Instruct AWQ              4K / 4.1 GB      41


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124