Bloom 560M by bigscience


Bloom 560M is an open-source language model from the BigScience project (Hugging Face maintainer: bigscience). It has 560M parameters, requires about 1.1 GB of VRAM, and is released under the bigscience-bloom-rail-1.0 license. HF Score: 30.1; LLM Explorer Score: 0.16. Per-task benchmark scores appear in the Benchmarks section below.
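A minimal usage sketch, assuming the transformers and torch packages are installed; the prompt and sampling settings are illustrative, not part of the model card:

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the 560M-parameter checkpoint from the Hugging Face Hub (~1.1 GB).
model_name = "bigscience/bloom-560m"
tokenizer = AutoTokenizer.from_pretrained(model_name)     # BloomTokenizerFast
model = AutoModelForCausalLM.from_pretrained(model_name)  # BloomForCausalLM

# BLOOM is multilingual; the French prompt here is an arbitrary example.
inputs = tokenizer("Le modèle BLOOM est", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))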


Bloom 560M Benchmarks

ARC: 24.7
HellaSwag: 37.2
MMLU: 24.2
TruthfulQA: 42.4
WinoGrande: 51.9
GSM8K: 0.3

The HF Score of 30.1 is the mean of these six task scores.
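These six tasks are the standard Open LLM Leaderboard suite. A sketch of reproducing a single score with EleutherAI's lm-evaluation-harness (pip install lm-eval); task names and the exact entry point vary across harness versions, so treat this as illustrative:

from lm_eval import simple_evaluate

# Score bigscience/bloom-560m on HellaSwag; the other tasks above have
# harness equivalents (e.g. arc_challenge, winogrande, gsm8k).
results = simple_evaluate(
    model="hf",
    model_args="pretrained=bigscience/bloom-560m",
    tasks=["hellaswag"],
)
print(results["results"]["hellaswag"])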

Bloom 560M Parameters and Internals

Model Type: text-generation
Supported Languages: ak (Akan), ar (Arabic), as (Assamese), bm (Bambara), bn (Bengali), ca (Catalan), code (Programming Languages), en (English), es (Spanish), eu (Basque), fon (Fon), fr (French), gu (Gujarati), hi (Hindi), id (Indonesian), ig (Igbo), ki (Kikuyu), kn (Kannada), lg (Luganda), ln (Lingala), ml (Malayalam), mr (Marathi), ne (Nepali), nso (Northern Sotho), ny (Nyanja), or (Oriya), pa (Punjabi), pt (Portuguese), rn (Rundi), rw (Kinyarwanda), sn (Shona), st (Southern Sotho), sw (Swahili), ta (Tamil), te (Telugu), tn (Tswana), ts (Tsonga), tum (Tumbuka), tw (Twi), ur (Urdu), vi (Vietnamese), wo (Wolof), xh (Xhosa), yo (Yoruba), zh (Chinese), zhs (Simplified Chinese), zht (Traditional Chinese), zu (Zulu)
Training Details
Data Sources: 45 natural languages, 12 programming languages
Data Volume: 1.5 TB of pre-processed text
Tokenization: byte-level Byte Pair Encoding (BPE)
Context Length: 2048 tokens
Training Period: March 11, 2022 to July 5, 2022
Hardware: 384 A100 80 GB GPUs
Model Architecture: modified from Megatron-LM GPT-2
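The shape of the network behind this summary can be read straight from the published hub config; a short sketch using transformers' BloomConfig attribute names:

from transformers import AutoConfig

# Fetch the config and print the key architecture hyperparameters.
cfg = AutoConfig.from_pretrained("bigscience/bloom-560m")
print(cfg.model_type)                             # "bloom"
print(cfg.n_layer, cfg.hidden_size, cfg.n_head)   # depth / width / attention heads
print(cfg.vocab_size)                             # 250880, as in the table below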
LLM Name: Bloom 560M
Repository 🤗: https://huggingface.co/bigscience/bloom-560m
Model Size: 560M
Required VRAM: 1.1 GB
Updated: 2026-04-22
Maintainer: bigscience
Model Type: bloom
Model Files: 1.1 GB
Supported Languages: ak ar as bm bn ca code en es eu fr gu hi id ig ki kn lg ln ml mr ne ny or pa pt rn rw sn st sw ta te tn ts tw ur vi wo xh yo zh zu
Model Architecture: BloomForCausalLM
License: bigscience-bloom-rail-1.0
Transformers Version: 4.20.0
Tokenizer Class: BloomTokenizerFast
Padding Token: <pad>
Vocabulary Size: 250880
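A quick sketch of the tokenizer details above, padding and truncating a batch to the 2048-token context window (the prompts are placeholders):

from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("bigscience/bloom-560m")
print(type(tok).__name__)  # BloomTokenizerFast
print(tok.pad_token)       # <pad>

# Pad the batch to a uniform length and truncate anything beyond the
# model's 2048-token context.
batch = tok(
    ["A short prompt.", "A somewhat longer prompt to pad against."],
    padding=True, truncation=True, max_length=2048, return_tensors="pt",
)
print(batch["input_ids"].shape)  # (2, longest_sequence_length)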

Best Alternatives to Bloom 560M

Best Alternatives          Context / RAM    Downloads    Likes
Bloomz 560M                0K / 1.1 GB      1,100,330    137
Train Test Bloom560        0K / 2.2 GB      5            0
Bloomz 560M Sft Chat       0K / 1.1 GB      945          10
Promt Generator            0K / 2.2 GB      14,094       2
Bloom 560M RLHF V2         0K / 1.1 GB      1,045        3
Bloom 560M RLHF            0K / 1.1 GB      1,057        1
Train Test                 0K / 2.2 GB      30           0
Guitester                  0K / 2.2 GB      5            0
Product Description Fr     0K / 2.2 GB      5            0
Gogpt 560M                 0K / 1 GB        1,242        1


Original data from Hugging Face, OpenCompass, and various public git repos.