Pygmalion 7B SuperHOT 8K Fp16 by TheBloke


Tags: Autotrain compatible, Conversational, Custom code, En, Ext 8k, Fp16, Llama, Pytorch, Quantized, Region:us, Sharded

Pygmalion 7B SuperHOT 8K Fp16 Benchmarks

Benchmark scores (nn.n%) show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Model ID: TheBloke/Pygmalion-7B-SuperHOT-8K-fp16

Pygmalion 7B SuperHOT 8K Fp16 Parameters and Internals

Model Type:
text generation, conversational

Use Cases:
Areas: entertainment, fictional conversation
Applications: chatbots, character-based interactions
Primary Use Cases: fictional conversation
Limitations: may produce socially unacceptable or undesirable text; outputs are often factually wrong or misleading
Considerations: the model was not fine-tuned to be safe and harmless

Additional Notes:
Created by merging TehVenom's Pygmalion with Kaio Ken's SuperHOT 8K to expand the context window.

Supported Languages:
English (fluent)

Training Details:
Data Sources: Pygmalion-6B-v8-pt4, superhot-7b-8k-no-rlhf-test
Data Volume: 1200 samples
Methodology: LoRA training
Context Length: 8192
Hardware Used: 4-bit base model
Model Architecture: provides conversational context using a sliding window of chat history (see the sketch after this section)

Input Output:
Input Format: character persona and chat history
Accepted Modalities: text
Output Format: text; generation ends with the end-of-text token

Release Notes:
Version: 1.0
Notes: merged model with 8K context capability
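
The input format above (character persona plus chat history) and the sliding-window behaviour can be illustrated with a short sketch. This is a minimal example, assuming the commonly documented Pygmalion prompt layout (persona block, <START> marker, alternating You:/Character: turns); the build_prompt helper and the reply token budget are illustrative assumptions, not part of this repository.

```python
# Minimal sketch, assuming the Pygmalion-style prompt layout; build_prompt is hypothetical.
from transformers import AutoTokenizer

MODEL_ID = "TheBloke/Pygmalion-7B-SuperHOT-8K-fp16"
MAX_CONTEXT = 8192   # Model Max Length from the table below
REPLY_BUDGET = 512   # tokens reserved for the model's reply (illustrative choice)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

def build_prompt(persona: str, history: list[tuple[str, str]], char: str) -> str:
    """Builds a persona + chat-history prompt, sliding the window by dropping
    the oldest turns until the prompt fits within the 8192-token context."""
    header = f"{char}'s Persona: {persona}\n<START>\n"
    while True:
        dialogue = "".join(f"{speaker}: {text}\n" for speaker, text in history)
        prompt = f"{header}{dialogue}{char}:"
        if len(tokenizer(prompt).input_ids) <= MAX_CONTEXT - REPLY_BUDGET or not history:
            return prompt
        history = history[1:]  # drop the oldest exchange

prompt = build_prompt(
    persona="A cheerful tavern keeper in a small fantasy town.",
    history=[("You", "Hello! What's on the menu tonight?")],
    char="Mira",
)
```
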
LLM Name: Pygmalion 7B SuperHOT 8K Fp16
Repository: 🤗 https://huggingface.co/TheBloke/Pygmalion-7B-SuperHOT-8K-fp16
Model Size: 7b
Required VRAM: 13.5 GB
Updated: 2025-08-18
Maintainer: TheBloke
Model Type: llama
Model Files: 10.0 GB (1-of-2), 3.5 GB (2-of-2)
Supported Languages: en
Context Length: 8k (8192 tokens)
Quantization Type: fp16
Model Architecture: LlamaForCausalLM
License: other
Model Max Length: 8192
Transformers Version: 4.30.0.dev0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float16
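
The 13.5 GB VRAM figure matches a back-of-the-envelope estimate for fp16 weights: LLaMA 7B has about 6.7B parameters, and 6.7B × 2 bytes per fp16 weight ≈ 13.5 GB, before activations and KV cache. Below is a minimal loading sketch using the standard transformers API; trust_remote_code=True is an assumption drawn from the "Custom code" tag above, so that any extended-context (SuperHOT) code shipped in the repository is picked up.

```python
# Minimal loading sketch based on the table above (fp16, LlamaForCausalLM, 8192 context).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "TheBloke/Pygmalion-7B-SuperHOT-8K-fp16"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,  # matches "Torch Data Type: float16"
    device_map="auto",          # fp16 weights need roughly 13.5 GB of VRAM
    trust_remote_code=True,     # assumption based on the "Custom code" tag
)

prompt = "Mira's Persona: A cheerful tavern keeper.\n<START>\nYou: Hello!\nMira:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0][inputs.input_ids.shape[-1]:], skip_special_tokens=True))
```
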

Best Alternatives to Pygmalion 7B SuperHOT 8K Fp16

Best Alternatives | Context / RAM | Downloads | Likes
Smaugv0.1 6.0bpw H6 EXL2 | 195K / 26.4 GB | 6 | 4
Smaugv0.1 5.0bpw H6 EXL2 | 195K / 22.3 GB | 5 | 3
Smaugv0.1 8.0bpw H8 EXL2 | 195K / 34.9 GB | 6 | 1
Smaugv0.1 3.0bpw H6 EXL2 | 195K / 13.9 GB | 5 | 1
Smaugv0.1 4.65bpw H6 EXL2 | 195K / 20.8 GB | 5 | 1
Smaugv0.1 4.0bpw H6 EXL2 | 195K / 18 GB | 4 | 1
DeepSeek Prover V2 7B 4bit | 64K / 3.9 GB | 964 | 4
Mistral 7B Openplatypus 1K | 32K / 29 GB | 1765 | 0
Mistral 7B OpenOrca 1K | 32K / 29 GB | 1767 | 3
...rnlm2 20B Llama 4.0bpw H6 EXL2 | 32K / 11 GB | 5 | 1
Note: a green score (e.g. "73.2") means that the model is better than TheBloke/Pygmalion-7B-SuperHOT-8K-fp16.



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124