Bloom 1b7 is an open-source language model by BigScience. Key specifications: 1.7b parameters, 3.4 GB required VRAM, bigscience-bloom-rail-1.0 license. Benchmark results: HF Score 34, LLM Explorer Score 0.12, ARC 30.6, HellaSwag 47.6, MMLU 27.5, TruthfulQA 41.3, WinoGrande 56, GSM8K 0.8.
| Attribute | Details |
|---|---|
| LLM Name | Bloom 1b7 |
| Repository 🤗 | https://huggingface.co/bigscience/bloom-1b7 |
| Model Size | 1.7b |
| Required VRAM | 3.4 GB |
| Updated | 2025-12-07 |
| Maintainer | bigscience |
| Model Type | bloom |
| Supported Languages | ak ar as bm bn ca code en es eu fr gu hi id ig ki kn lg ln ml mr ne ny or pa pt rn rw sn st sw ta te tn ts tw ur vi wo xh yo zh zu |
| Model Architecture | BloomForCausalLM |
| License | bigscience-bloom-rail-1.0 |
| Transformers Version | 4.20.0 |
| Tokenizer Class | BloomTokenizerFast |
| Padding Token | <pad> |
| Vocabulary Size | 250880 |
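Because the checkpoint uses the standard BloomForCausalLM architecture and BloomTokenizerFast tokenizer, it can be loaded with the Hugging Face transformers API. The snippet below is a minimal sketch using the repository ID from the table above; the generation settings are illustrative, not tuned recommendations.

```python
# Minimal sketch: load bigscience/bloom-1b7 and generate text.
# Assumes transformers, torch, and accelerate are installed;
# generation parameters are illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bigscience/bloom-1b7"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # fp16 keeps the footprint near the listed ~3.4 GB VRAM
    device_map="auto",          # places weights on GPU if available (requires accelerate)
)

prompt = "BLOOM is a multilingual language model that"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=50,
        do_sample=True,
        temperature=0.7,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```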
| Model | Likes | Downloads | VRAM |
|---|---|---|---|
| Bloom 1b7 GPTQ 4bit G128 | 0 | 9 | 2 GB |
| Bloom 1b7 Gptq 4bit | 0 | 12 | 2 GB |
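The 4-bit GPTQ conversions listed above reduce the memory footprint from roughly 3.4 GB to about 2 GB. A GPTQ checkpoint can be loaded through the same transformers API; the sketch below is illustrative only, the repository ID is a hypothetical placeholder (the table does not give exact repo paths), and loading GPTQ weights additionally requires the optimum and auto-gptq packages.

```python
# Hedged sketch: loading a 4-bit GPTQ conversion of BLOOM 1b7.
# "your-namespace/bloom-1b7-gptq-4bit" is a hypothetical placeholder repo ID,
# not one of the checkpoints listed above. Requires optimum + auto-gptq.
from transformers import AutoModelForCausalLM, AutoTokenizer

gptq_repo = "your-namespace/bloom-1b7-gptq-4bit"  # placeholder, replace with a real repo

tokenizer = AutoTokenizer.from_pretrained(gptq_repo)
model = AutoModelForCausalLM.from_pretrained(
    gptq_repo,
    device_map="auto",  # the quantization config stored in the repo is applied automatically
)

inputs = tokenizer("Hello, BLOOM!", return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```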
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Merged DPO Model | 0K / 6.8 GB | 10 | 0 |
| Bloom20 | 0K / 3.4 GB | 5 | 0 |
| Mnlp DPO Model Bloom | 0K / 6.8 GB | 5 | 0 |
| Bloom 1b7 Fp32 | 0K / 6.8 GB | 14 | 0 |
| Aira 2 Portuguese 1B7 | 0K / 0 GB | 153 | 2 |
| Bloomz 1b7 | 0K / 3.4 GB | 857 | 23 |
| Bloom 1b7 Intermediate | 0K / 3.4 GB | 88 | 0 |
| Bloom Lora 8bit | 0K / 2.3 GB | 40 | 0 |
| Bloom 1b7 8bit | 0K / 2.2 GB | 474 | 6 |