Dolly Shygmalion 6B 4bit 128g by Ancestral


Dolly Shygmalion 6B 4bit 128g is an open-source language model by Ancestral. Key facts: 6B parameters, 4 GB VRAM required, apache-2.0 license, quantized. LLM Explorer Score: 0.07.

  4bit   Conversational   En   Gptj   Gptq   Quantized   Region:us

Dolly Shygmalion 6B 4bit 128g Benchmarks

nn.n%: how the model compares to the reference models Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Dolly Shygmalion 6B 4bit 128g (Ancestral/Dolly_Shygmalion-6b-4bit-128g)

Dolly Shygmalion 6B 4bit 128g Parameters and Internals

Model Type 
text generation, conversational
Additional Notes 
The model was processed using GPTQ quantization methodology for 4-bit inference with a group size of 128.
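The "4bit 128g" suffix means the weights are stored at 4-bit precision with one quantization scale and zero-point per group of 128 weights. The sketch below illustrates that group-wise scheme using simple round-to-nearest quantization; real GPTQ additionally uses second-order (Hessian) information to compensate rounding error, and the function names here are illustrative, not the quantizer's actual API.

```python
def quantize_group(weights, bits=4):
    """Asymmetric round-to-nearest quantization of one group of weights.

    Returns (qvals, scale, zero) such that w ~= (q - zero) * scale.
    Illustrative only: GPTQ proper also corrects accumulated rounding
    error using second-order information, which this sketch omits.
    """
    qmax = (1 << bits) - 1                      # 15 for 4-bit
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / qmax or 1.0             # guard against a constant group
    zero = round(-lo / scale)                   # integer zero-point
    q = [max(0, min(qmax, round(w / scale) + zero)) for w in weights]
    return q, scale, zero

def quantize_rows(matrix, group_size=128, bits=4):
    """Quantize each row in groups of `group_size` columns (the '128g'
    in the model name). Each group stores one scale and zero-point."""
    out = []
    for row in matrix:
        groups = []
        for i in range(0, len(row), group_size):
            groups.append(quantize_group(row[i:i + group_size], bits))
        out.append(groups)
    return out

def dequantize_rows(qrows):
    """Reconstruct approximate fp weights from the grouped int codes."""
    return [[(v - zero) * scale
             for q, scale, zero in groups
             for v in q]
            for groups in qrows]
```

Quantizing a 2×256 weight matrix this way produces, per row, two groups of (128 integer codes, scale, zero-point); the per-group metadata is why group size trades accuracy against file size.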
Supported Languages 
en (high)
Release Notes 
Version:
gptq-v2
Notes:
GPTQ quantization performed using the command: python3 gptj.py models/Dolly_Shygmalion-6b c4 --wbits 4 --groupsize 128 --save_safetensors models/Dolly_Shygmalion-6b-4bit-128g.safetensors
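As a back-of-the-envelope check that the 4.0 GB file size listed below is consistent with 4-bit weights plus per-group metadata, the sketch below uses the published GPT-J-6B shapes (28 layers, hidden size 4096, FFN width 16384, vocabulary 50400) and assumes the embedding and LM-head weights stay in fp16, as is typical for GPTQ exports:

```python
# Back-of-the-envelope size estimate for a 4-bit, group-size-128 GPT-J
# checkpoint. Shapes follow the published GPT-J-6B config; keeping the
# embeddings/LM head in fp16 is an assumption typical of GPTQ exports.
N_LAYERS, D_MODEL, D_FFN, VOCAB = 28, 4096, 16384, 50400
GROUP = 128

# Quantized linear weights per layer: q/k/v/out projections + fc_in/fc_out.
per_layer = 4 * D_MODEL * D_MODEL + 2 * D_MODEL * D_FFN
quant_params = N_LAYERS * per_layer

packed_bytes = quant_params * 4 / 8          # 4 bits per weight
# Roughly one fp16 scale plus one packed 4-bit zero-point per group of
# 128 weights (~4 bytes per group including packing overhead).
meta_bytes = quant_params / GROUP * 4
# Token embedding + LM head kept in fp16 (2 bytes per value).
fp16_bytes = 2 * VOCAB * D_MODEL * 2

total_gb = (packed_bytes + meta_bytes + fp16_bytes) / 1e9
print(f"estimated checkpoint size: {total_gb:.1f} GB")  # prints 3.8
```

The estimate lands near the 4.0 GB on disk, which supports the reading that only the transformer's linear layers were quantized.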
LLM Name: Dolly Shygmalion 6B 4bit 128g
Repository: https://huggingface.co/Ancestral/Dolly_Shygmalion-6b-4bit-128g
Model Size: 6b
Required VRAM: 4 GB
Updated: 2026-04-07
Maintainer: Ancestral
Model Type: gptj
Model Files: 4.0 GB
Supported Languages: en
Quantization Type: 4bit
Model Architecture: GPTJForCausalLM
License: apache-2.0
Model Max Length: 1024
Transformers Version: 4.28.0.dev0
Tokenizer Class: GPT2Tokenizer
Beginning of Sentence Token: <|endoftext|>
End of Sentence Token: <|endoftext|>
Unk Token: <|endoftext|>
Vocabulary Size: 50400
Torch Data Type: float16
Activation Function: gelu_new
Errors: replace

Best Alternatives to Dolly Shygmalion 6B 4bit 128g

Best Alternatives | Context / RAM | Downloads / Likes
Model | 0K / 6.2 GB | 60
...oduct NER GPT J 6B 4bit Merged | 0K / 2.5 GB | 120
...nese Novel GPT J 6B F16 Marisa | 0K / 12.2 GB | 33
Kakaobrain Kogpt 6B 8bit | 0K / 6.7 GB | 112
Pygmalion 6b Dev 4bit 128g | 0K / 4 GB | 25120
GPT J 6B Skein 4bit 128g | 0K / 4 GB | 171
GPT J 6B Alpaca Gpt4 | 0K / 24.3 GB | 2920
Pygmalion 6B 4bit 128g | 0K / 4 GB | 53
Pygmalion 6B 4bit 128g | 0K / 4 GB | 2640
...O Pygway V8p4 Dev 6B 4bit 128g | 0K / 4 GB | 82
Note: a green score (e.g. "73.2") means the model is better than Ancestral/Dolly_Shygmalion-6b-4bit-128g.



Original data from HuggingFace, OpenCompass and various public git repos.