Bloom 7B Chunhua by wptoux


Art   Autotrain compatible   Bloom   Dataset: BelleGroup/train_0.5M_CN   Endpoints compatible   Pytorch   Region: us   Sharded   Zh
Model Card on HF 🤗: https://huggingface.co/wptoux/bloom-7b-chunhua

Bloom 7B Chunhua Benchmarks

nn.n%: how the model compares to the reference models Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Bloom 7B Chunhua (wptoux/bloom-7b-chunhua)

Bloom 7B Chunhua Parameters and Internals

Model Type: text-generation
Use Cases:
  Areas: research
  Limitations: output may be inaccurate or biased
Additional Notes: The model's extensive training corpus gives it strong comprehension of ancient Chinese.
Supported Languages: zh (ancient Chinese)
Training Details:
  Data Sources: scripta-sinica, BelleGroup/train_0.5M_CN
  Data Volume: ~13 billion characters (scripta-sinica)
  Methodology: fine-tuning
  Model Architecture: based on the Bloom architecture
Input Output:
  Input Format: formatted according to the ChatML structure
  Accepted Modalities: text
  Output Format: ancient Chinese text responses
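The card states that inputs follow the ChatML structure. A minimal sketch of assembling such a prompt, assuming the standard ChatML delimiters (`<|im_start|>` / `<|im_end|>`); whether this checkpoint was tuned with exactly these tokens should be verified against the repo's tokenizer configuration:

```python
def build_chatml_prompt(messages):
    """Format a list of {"role": ..., "content": ...} dicts using the
    conventional ChatML delimiters. The exact special tokens used by
    wptoux/bloom-7b-chunhua are an assumption here."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Open an assistant turn so the model continues with its response.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "user", "content": "请以文言文作答：何谓仁？"}
])
print(prompt)
```

The resulting string can then be passed to the tokenizer and `generate()` as with any causal LM.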
LLM Name: Bloom 7B Chunhua
Repository 🤗: https://huggingface.co/wptoux/bloom-7b-chunhua
Model Size: 7b
Required VRAM: 16.4 GB
Updated: 2025-09-23
Maintainer: wptoux
Model Type: bloom
Model Files: 0.0 GB (1-of-10), 2.1 GB (2-of-10), 1.9 GB (3-of-10), 1.9 GB (4-of-10), 2.0 GB (5-of-10), 1.9 GB (6-of-10), 1.9 GB (7-of-10), 1.9 GB (8-of-10), 0.7 GB (9-of-10), 2.1 GB (10-of-10)
Supported Languages: zh
Model Architecture: BloomForCausalLM
License: apache-2.0
Transformers Version: 4.27.1
Tokenizer Class: BloomTokenizer
Padding Token: <pad>
Vocabulary Size: 250880
Torch Data Type: float16
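The ten shard sizes listed above sum to the stated 16.4 GB VRAM requirement, which is consistent with roughly 7 billion parameters stored in float16 (2 bytes each) plus the large 250880-token embedding table. A quick sanity check:

```python
# Shard sizes in GB, as listed under "Model Files" above.
shards = [0.0, 2.1, 1.9, 1.9, 2.0, 1.9, 1.9, 1.9, 0.7, 2.1]
total_gb = round(sum(shards), 1)
print(total_gb)  # 16.4, matching the "Required VRAM" entry

# Rough weight-memory estimate: 7e9 parameters * 2 bytes (float16).
est_gb = 7e9 * 2 / 1e9
print(est_gb)  # 14.0 GB for the core weights; embeddings and other
               # tensors account for the remaining ~2.4 GB
```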

Best Alternatives to Bloom 7B Chunhua

Best Alternatives                  | Context / RAM | Downloads | Likes
Yayi 7B                            | 0K / 28.2 GB  | 1955      | 30
...Qa Llama 2 7B Chat Hf Uld Loss  | 0K / 2.2 GB   | 5         | 0
...lama 2 7B Chat Hf Text Teacher  | 0K / 2.2 GB   | 5         | 0
...tral 7B Instruct V0.2 Uld Loss  | 0K / 2.2 GB   | 5         | 0
... 7B Instruct V0.2 Text Teacher  | 0K / 2.2 GB   | 5         | 0
Gogpt 7B Bloom                     | 0K / 32.3 GB  | 774       | 3
Phoenix Inst Chat 7B               | 0K / 16.2 GB  | 1837      | 43
Vietcuna 7B 2k5                    | 0K / 14.2 GB  | 5         | 0
Bloom Xp3                          | 0K / 32.4 GB  | 5         | 0
Vietcuna 7B V3                     | 0K / 14.2 GB  | 1188      | 8
Note: a green score (e.g. "73.2") means the model is better than wptoux/bloom-7b-chunhua.



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124