Pygmalion 2 13B SuperCOT2 by royallab


Tags: Autotrain compatible, En, Endpoints compatible, Llama, Llama2, Region: us, Safetensors, Sharded, Tensorflow

Pygmalion 2 13B SuperCOT2 Benchmarks

Pygmalion 2 13B SuperCOT2 (royallab/Pygmalion-2-13b-SuperCOT2)

Pygmalion 2 13B SuperCOT2 Parameters and Internals

Model Type: text-generation
Use Cases
Primary Use Cases: text adventure games, roleplaying scenarios
Limitations: not intended for supplying factual information or advice
Additional Notes
This model exhibits biases similar to those found on niche roleplaying forums on the Internet.
Training Details
Methodology: This model is a merge of existing models, built with tools such as EzTrainer and zaraki-tools.
LLM Name: Pygmalion 2 13B SuperCOT2
Repository: https://huggingface.co/royallab/Pygmalion-2-13b-SuperCOT2
Model Size: 13b
Required VRAM: 26 GB
Updated: 2025-09-16
Maintainer: royallab
Model Type: llama
Model Files: 9.9 GB (1-of-3), 9.9 GB (2-of-3), 6.2 GB (3-of-3)
Supported Languages: en
Model Architecture: LlamaForCausalLM
License: llama2
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.34.0.dev0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float16
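The card above lists the Llama special tokens (`<s>`, `</s>`, `<unk>`); Pygmalion 2 models additionally use role markers (`<|system|>`, `<|user|>`, `<|model|>`) in their prompt format. A minimal sketch of assembling such a prompt (the helper name is illustrative, not part of any library):

```python
def build_prompt(system: str, user_message: str) -> str:
    """Assemble a single-turn Pygmalion 2 style prompt.

    Roles are tagged with <|system|>, <|user|> and <|model|>; the
    trailing <|model|> cues the model to generate its reply. The
    tokenizer is expected to prepend the <s> BOS token itself.
    """
    return f"<|system|>{system}<|user|>{user_message}<|model|>"


prompt = build_prompt("Enter roleplay mode. You are a narrator.", "Hello!")
```

For multi-turn chats, earlier exchanges are concatenated in the same tagged form before the final `<|model|>` marker.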

Quantized Models of the Pygmalion 2 13B SuperCOT2

Model | Likes | Downloads | VRAM
Pygmalion 2 13B SuperCOT2 GGUF | 22 | 2 | 15 GB
Pygmalion 2 13B SuperCOT2 AWQ | 1 | 6 | 7 GB
Pygmalion 2 13B SuperCOT2 GPTQ | 5 | 7 | 7 GB
Pygmalion 2 13B SuperCOT EXL2 | 2 | 5 | 6 GB
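The VRAM figures above roughly track a weight-only rule of thumb (not stated on the card itself): parameter count times bits per weight, divided by eight. A sketch of that estimate:

```python
def est_vram_gb(n_params: float, bits_per_weight: float) -> float:
    """Rough weight-only memory footprint in GB.

    Ignores KV cache, activations, and quantization overhead, so real
    usage is somewhat higher than this lower bound.
    """
    return n_params * bits_per_weight / 8 / 1e9


# 13B parameters at float16 (16 bits/weight) gives the card's 26 GB figure;
# 4-bit quantization (AWQ/GPTQ) lands near the ~7 GB shown above.
fp16 = est_vram_gb(13e9, 16)  # 26.0
int4 = est_vram_gb(13e9, 4)   # 6.5
```

This explains why the 4-bit quantized variants fit on a single 8-12 GB consumer GPU while the full float16 checkpoint needs roughly 26 GB.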

Best Alternatives to Pygmalion 2 13B SuperCOT2

Best Alternatives | Context / RAM | Downloads | Likes
Luminaura RP 13B | 128K / 26 GB | 5 | 0
Yarn Llama 2 13B 128K | 128K / 26 GB | 156 | 112
Agent Llama2 13B 80K | 80K / 26.4 GB | 8 | 0
Chat Llama2 13B 80K | 80K / 52.8 GB | 8 | 0
LongAlign 13B 64K | 64K / 26 GB | 117 | 13
LongAlign 13B 64K Base | 64K / 26 GB | 93 | 3
LongAlign 13B 64K | 64K / 26 GB | 11 | 13
LongAlign 13B 64K Base | 64K / 26 GB | 6 | 3
Openbuddy Llama2 13B V15p1 64K | 64K / 26.1 GB | 3 | 4
Openbuddy Llama2 13b64k V15 | 64K / 26.1 GB | 3 | 2


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124