Pygmalion 2 13B SuperCOT GPTQ by TheBloke


Tags: 4-bit · autotrain compatible · base model (quantized): royallab/Pygmalion-2-13b-SuperCOT · en · gptq · llama · llama2 · quantized · region:us · safetensors


Pygmalion 2 13B SuperCOT GPTQ Parameters and Internals

Model Type: llama
Use Cases
  Areas: niche roleplaying
  Primary Use Cases: text adventure games
  Limitations: not suited to supplying factual information or advice in any form
Training Details
  Data Sources: wikitext
Input Output
  Input Format (see the sketch below):
  Below is an instruction that describes a task. Write a response that appropriately completes the request. ### Instruction: {prompt} ### Response:
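
A minimal sketch of applying that template. The helper name `build_prompt` is illustrative, not part of any library, and the newline placement follows common Alpaca-style usage; the card renders the template on a single line, so exact whitespace is an assumption.

```python
# Sketch: wrap a user instruction in the Alpaca-style template above.
# `build_prompt` is a hypothetical helper, not a library call; newline
# placement is assumed, since the card shows the template on one line.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{prompt}\n\n"
    "### Response:\n"
)

def build_prompt(instruction: str) -> str:
    return ALPACA_TEMPLATE.format(prompt=instruction)

print(build_prompt("Narrate the party's arrival at a rain-soaked tavern."))
```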
LLM Name: Pygmalion 2 13B SuperCOT GPTQ
Repository: https://huggingface.co/TheBloke/Pygmalion-2-13B-SuperCOT-GPTQ
Model Name: Pygmalion 2 13B SuperCOT
Model Creator: The Royal Lab
Base Model(s): royallab/Pygmalion-2-13b-SuperCOT (Pygmalion 2 13B SuperCOT)
Model Size: 13b
Required VRAM: 7.3 GB
Updated: 2025-08-18
Maintainer: TheBloke
Model Type: llama
Model Files: 7.3 GB
Supported Languages: en
GPTQ Quantization: Yes
Quantization Type: gptq
Model Architecture: LlamaForCausalLM
License: llama2
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.34.0.dev0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float16
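
A minimal loading sketch built from the specs above (repository ID, LlamaTokenizer, 4096-token context, ~7.3 GB of 4-bit weights). It assumes a recent transformers release with a GPTQ backend such as auto-gptq installed, which lets `from_pretrained` read the quantization config shipped in the repo; the card itself does not state exact package requirements.

```python
# Sketch: load the 4-bit GPTQ checkpoint and generate with the model's
# Alpaca-style prompt template. Assumes transformers >= 4.32 with a GPTQ
# backend (e.g. auto-gptq) installed and a GPU with roughly 7.3 GB of
# free VRAM for the quantized weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "TheBloke/Pygmalion-2-13B-SuperCOT-GPTQ"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)  # LlamaTokenizer, vocab size 32000
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nNarrate the party's arrival at a rain-soaked tavern.\n\n"
    "### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Prompt plus completion must fit inside the 4096-token context window.
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```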

Best Alternatives to Pygmalion 2 13B SuperCOT GPTQ

| Model | Context | RAM | Downloads | Likes |
|---|---|---|---|---|
| Yarn Llama 2 13B 128K GPTQ | 128K | 7.3 GB | 12 | 16 |
| LongAlign 13B 64K GPTQ | 64K | 7.3 GB | 4 | 1 |
| ...boros L2 13B 2 1 YaRN 64K GPTQ | 64K | 7.3 GB | 9 | 3 |
| Yarn Llama 2 13B 64K GPTQ | 64K | 7.3 GB | 12 | 1 |
| OrcaMaid V3 13B 32K GPTQ | 32K | 7.3 GB | 10 | 3 |
| OrcaMaid V2 FIX 13B 32K GPTQ | 32K | 7.3 GB | 7 | 4 |
| EverythingLM 13B 16K GPTQ | 16K | 7.3 GB | 16 | 13 |
| Tinybra 13B GPTQ 32g 4BIT | 16K | 8 GB | 10 | 1 |
| Tinybra 13B GPTQ 4BIT | 16K | 7 GB | 6 | 0 |
| LlongOrca 13B 16K GPT | 16K | 7.3 GB | 6 | 0 |


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124