Bloom 7b1 is an open-source language model by BigScience. Key figures: 7.1B parameters; required VRAM: 14.2 GB; license: bigscience-bloom-rail-1.0; HF score: 39.2; LLM Explorer score: 0.1. Benchmark results: ARC 41.1, HellaSwag 62.0, MMLU 26.3, TruthfulQA 38.9, WinoGrande 65.4, GSM8K 1.4.
| Attribute | Value |
|---|---|
| LLM Name | Bloom 7b1 |
| Repository 🤗 | https://huggingface.co/bigscience/bloom-7b1 |
| Model Size | 7.1B |
| Required VRAM | 14.2 GB |
| Updated | 2026-03-15 |
| Maintainer | bigscience |
| Model Type | bloom |
| Supported Languages | ak ar as bm bn ca code en es eu fr gu hi id ig ki kn lg ln ml mr ne ny or pa pt rn rw sn st sw ta te tn ts tw ur vi wo xh yo zh zu |
| Model Architecture | BloomForCausalLM |
| License | bigscience-bloom-rail-1.0 |
| Transformers Version | 4.22.2 |
| Tokenizer Class | BloomTokenizerFast |
| Padding Token | `<pad>` |
| Vocabulary Size | 250880 |
| Torch Data Type | float16 |
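The listed VRAM requirement follows directly from the parameter count and the float16 data type (2 bytes per weight). A minimal sanity check of the weights-only figure, ignoring activation and KV-cache overhead:

```python
def weight_memory_gb(num_params: float, bytes_per_param: int) -> float:
    """Approximate memory needed for model weights alone, in decimal GB."""
    return num_params * bytes_per_param / 1e9

# Bloom 7b1: 7.1B parameters stored as float16 (2 bytes each)
print(weight_memory_gb(7.1e9, 2))  # → 14.2
```

Actual peak usage at inference time will be somewhat higher than this weights-only estimate, since it excludes activations and the attention cache.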
| Model | Likes | Downloads | VRAM |
|---|---|---|---|
| Bloom 7b1 GPTQ 4bit G128 | 2 | 8 | 7 GB |
| Bloom 7b1 GPTQ 4bit | 2 | 27 | 7 GB |
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Bloomz 7b1 Mt Sft Chat | 0K / 14.2 GB | 772 | 16 |
| ...ft Fpft Multilingual Bloom 7b1 | 0K / 28.2 GB | 10 | 1 |
| Bloomz 7b1 | 0K / 14.1 GB | 2864 | 147 |
| Bloomz 7b1 Mt | 0K / 14.1 GB | 2399 | 143 |
| Bloomz 7b1 P3 | 0K / 14.1 GB | 135 | 6 |