Bloom 1b1 by bigscience

References: arXiv:1909.08053 (Megatron-LM), arXiv:2108.12409 (ALiBi), arXiv:2110.02861 (8-bit optimizers)
Model Card on HF 🤗: https://huggingface.co/bigscience/bloom-1b1

Bloom 1b1 Parameters and Internals

Model Type 
text-generation
Use Cases 
Areas:
Public research, Non-commercial entities
Applications:
Text generation, Language model research
Primary Use Cases:
Text generation
Limitations:
Not for use in high-stakes settings
Considerations:
Users should disclose when content is generated by the model.
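
As an illustration of the primary text-generation use case, here is a minimal sketch using the 🤗 Transformers pipeline; the prompt and sampling parameters are arbitrary placeholders, not recommendations.

```python
# Minimal text-generation sketch for bigscience/bloom-1b1.
# Prompt and sampling parameters below are illustrative only.
from transformers import pipeline

generator = pipeline("text-generation", model="bigscience/bloom-1b1")

out = generator(
    "The BigScience workshop was created to",
    max_new_tokens=50,   # cap the length of the continuation
    do_sample=True,      # sample instead of greedy decoding
    top_p=0.9,           # nucleus sampling
)
print(out[0]["generated_text"])
```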
Additional Notes 
The training supercomputer runs primarily on nuclear power, a low-carbon energy source, which reduces the environmental impact of training.
Supported Languages 
ak, ar, as, bm, bn, ca, code, en, es, eu, fon, fr, gu, hi, id, ig, ki, kn, lg, ln, ml, mr, ne, nso, ny, or, pa, pt, rn, rw, sn, st, sw, ta, te, tn, ts, tum, tw, ur, vi, wo, xh, yo, zh, zhs, zht, zu (proficiency levels not reported)
Training Details 
Data Sources:
45 natural languages, 12 programming languages
Data Volume:
1.5TB of pre-processed text
Methodology:
Modified from Megatron-LM GPT2
Context Length:
2048
Training Time:
11 March 2022 to 5 July 2022
Hardware Used:
384 A100 80GB GPUs (48 nodes)
Model Architecture:
Modified Megatron-LM GPT2 with ALiBi positional encodings (a short sketch of ALiBi follows below)
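
ALiBi (arXiv:2108.12409) replaces learned position embeddings with a per-head linear penalty on attention logits that grows with query-key distance. A minimal sketch of the bias computation, with the head count chosen purely for illustration:

```python
# Sketch of ALiBi (Attention with Linear Biases, arXiv:2108.12409):
# a head-specific linear penalty proportional to query-key distance
# is added to the attention logits instead of position embeddings.
import torch

def alibi_slopes(n_heads: int) -> torch.Tensor:
    # Geometric sequence 2^(-8/n), 2^(-16/n), ... as in the ALiBi paper
    # (exact for head counts that are powers of two).
    start = 2 ** (-8.0 / n_heads)
    return torch.tensor([start ** (i + 1) for i in range(n_heads)])

def alibi_bias(n_heads: int, seq_len: int) -> torch.Tensor:
    # bias[h, i, j] = -slope[h] * (i - j) for key position j <= query i;
    # it is added to the causal-masked attention scores before softmax.
    pos = torch.arange(seq_len)
    distance = pos[None, :] - pos[:, None]   # j - i (<= 0 under the causal mask)
    return alibi_slopes(n_heads)[:, None, None] * distance

bias = alibi_bias(n_heads=16, seq_len=8)   # 16 heads is an illustrative value
print(bias.shape)                          # torch.Size([16, 8, 8])
```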
Responsible AI Considerations 
Fairness:
The model may overrepresent some viewpoints and underrepresent others, and contain stereotypes.
Transparency:
Details about training data and processes are publicly available.
Accountability:
BigScience holds accountability for the model's outputs.
Mitigation Strategies:
Use restrictions are defined to mitigate risks of harmful use.
Input/Output 
Input Format:
Token sequences of up to 2048 tokens (see the tokenization sketch below).
Accepted Modalities:
text
Output Format:
Generated text.
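
Since inputs are token sequences capped at the 2048-token context window, here is a minimal tokenization sketch; the input string is a placeholder.

```python
# Sketch: preparing input within the 2048-token context window using the
# model's fast tokenizer (the input string here is arbitrary).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom-1b1")

enc = tokenizer(
    "Some very long document ...",
    truncation=True,       # drop tokens beyond the context window
    max_length=2048,       # matches the model's training context length
    return_tensors="pt",
)
print(enc["input_ids"].shape)  # (1, n_tokens), n_tokens <= 2048
```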
Release Notes 
Version:
1.0
Date:
26-May-2022
Notes:
Initial release of the BLOOM model with multilingual capabilities.
LLM Name: Bloom 1b1
Repository 🤗: https://huggingface.co/bigscience/bloom-1b1
Model Size: 1.1b
Required VRAM: 2.1 GB
Updated: 2025-06-09
Maintainer: bigscience
Model Type: bloom
Model Files: 2.1 GB
Supported Languages: ak ar as bm bn ca code en es eu fr gu hi id ig ki kn lg ln ml mr ne ny or pa pt rn rw sn st sw ta te tn ts tw ur vi wo xh yo zh zu
Model Architecture: BloomForCausalLM
License: bigscience-bloom-rail-1.0
Transformers Version: 4.20.0
Tokenizer Class: BloomTokenizerFast
Padding Token: <pad>
Vocabulary Size: 250880
Bloom 1b1 (bigscience/bloom-1b1)
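
The metadata above can be checked directly when loading the checkpoint. A minimal loading sketch; half precision is an assumption here, chosen to keep memory close to the listed 2.1 GB of weights:

```python
# Sketch: loading the checkpoint and confirming the metadata listed above
# (architecture, tokenizer class, pad token, vocabulary size).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom-1b1")
model = AutoModelForCausalLM.from_pretrained(
    "bigscience/bloom-1b1",
    torch_dtype=torch.float16,  # fp16 keeps memory near the ~2.1 GB of weights
)

print(type(model).__name__)      # BloomForCausalLM
print(type(tokenizer).__name__)  # BloomTokenizerFast
print(tokenizer.pad_token)       # <pad>
print(model.config.vocab_size)   # 250880
```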

Best Alternatives to Bloom 1b1

Best Alternatives                  | Context / RAM | Downloads | Likes
Viqgen Bloomz 1b1 Lorasft          | 0K / 4.3 GB   | 16        | 0
Bloom Chatml Id                    | 0K / 4.3 GB   | 18        | 1
...ai Bloom 1b1 Text2prompt Sd V2  | 0K / 2.1 GB   | 96        | 21
Pai Bloom 1b1 Text2prompt Sd       | 0K / 2.1 GB   | 70        | 36
Bloomz 1b1                         | 0K / 2.1 GB   | 3130      | 33
Bloom 1b1 Intermediate             | 0K / 2.1 GB   | 1273      | 0

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124