Chinese Plus Pygmalion 7B GPTQ by coyude


Tags: 4bit · autotrain-compatible · en · endpoints-compatible · gptq · llama · pytorch · quantized · region:us · zh

Chinese Plus Pygmalion 7B GPTQ Benchmarks

Chinese Plus Pygmalion 7B GPTQ (coyude/Chinese-plus-Pygmalion-7b-GPTQ)

Chinese Plus Pygmalion 7B GPTQ Parameters and Internals

Model Type: text generation
Additional Notes: The model is intended to enhance the Chinese-language capability of Pygmalion-7B by incorporating LoRA adapters.
Supported Languages: Chinese (enhanced proficiency), English (native proficiency)
Input/Output Performance Tips: Compatible with AutoGPTQ and GPTQ-for-LLaMa. When using GPTQ-for-LLaMa, set Wbits=4, groupsize=128, and model_type=llama.
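Below is a minimal loading-and-generation sketch using AutoGPTQ with the transformers tokenizer. It is illustrative only: the device string, sampling parameters, and the Chinese prompt are assumptions, and this card does not document Pygmalion's persona prompt format.

    # Minimal sketch: load the 4-bit GPTQ weights with AutoGPTQ and generate.
    # Assumes auto-gptq and transformers are installed and a CUDA GPU with
    # ~4.2 GB of free VRAM is available (see "Required VRAM" below).
    from transformers import AutoTokenizer
    from auto_gptq import AutoGPTQForCausalLM

    repo = "coyude/Chinese-plus-Pygmalion-7b-GPTQ"

    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoGPTQForCausalLM.from_quantized(repo, device="cuda:0")

    # Hypothetical Chinese prompt, since the model targets zh/en chat.
    prompt = "你好，请用中文介绍一下你自己。"
    inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
    output = model.generate(**inputs, max_new_tokens=128,
                            do_sample=True, temperature=0.7)
    print(tokenizer.decode(output[0], skip_special_tokens=True))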
LLM Name: Chinese Plus Pygmalion 7B GPTQ
Repository: https://huggingface.co/coyude/Chinese-plus-Pygmalion-7b-GPTQ
Model Size: 7B
Required VRAM: 4.2 GB
Updated: 2025-09-23
Maintainer: coyude
Model Type: llama
Model Files: 4.2 GB
Supported Languages: zh, en
GPTQ Quantization: Yes
Quantization Type: gptq | 4bit
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.28.1
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 49954
Torch Data Type: float16
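The entries above can be checked programmatically. A short sketch, assuming Hugging Face Hub access, that reads the repo's config and tokenizer and prints the fields listed in this table:

    # Sanity-check the table above against the actual repo metadata.
    from transformers import AutoConfig, AutoTokenizer

    repo = "coyude/Chinese-plus-Pygmalion-7b-GPTQ"
    config = AutoConfig.from_pretrained(repo)
    tokenizer = AutoTokenizer.from_pretrained(repo)

    print(config.max_position_embeddings)  # context length: expect 2048
    print(config.torch_dtype)              # expect float16
    print(config.vocab_size)               # expect 49954 (expanded Chinese vocab)
    print(tokenizer.bos_token, tokenizer.eos_token, tokenizer.unk_token)  # <s> </s> <unk>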

Best Alternatives to Chinese Plus Pygmalion 7B GPTQ

Best Alternatives | Context / RAM | Downloads/Likes
Yarn Llama 2 7B 128K GPTQ | 128K / 3.9 GB | 117
Yarn Llama 2 7B 64K GPTQ | 64K / 3.9 GB | 111
... 7B 32K Instructions V4 Marlin | 32K / 4.1 GB | 80
Aixcoder 7B GPTQ | 32K / 4.5 GB | 51
Calm2 7B Chat GPTQ | 32K / 4.4 GB | 85
...Calm2 7B Chat GPTQ Calib Ja 1K | 32K / 4.4 GB | 85
Llama 2 7B 32K Instruct GPTQ | 32K / 3.9 GB | 1427
Codebear 7B 4bit | 16K / 3.9 GB | 61
CodeLlama 7B Instruct GPTQ | 16K / 3.9 GB | 133346
...a 7B Instruct GPTQ Calib Ja 1K | 16K / 3.9 GB | 60

Rank the Chinese Plus Pygmalion 7B GPTQ Capabilities

🆘 Have you tried this model? Rate its performance. This feedback will greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124