| LLM Name | Bloom 1b7 Fp32 |
|---|---|
| Repository 🤗 | https://huggingface.co/LazarusNLP/bloom-1b7-fp32 |
| Model Size | 1.7b |
| Required VRAM | 6.8 GB |
| Updated | 2025-06-09 |
| Maintainer | LazarusNLP |
| Model Type | bloom |
| Model Files | |
| Supported Languages | ak ar as bm bn ca code en es eu fr gu hi id ig ki kn lg ln ml mr ne ny or pa pt rn rw sn st sw ta te tn ts tw ur vi wo xh yo zh zu |
| Model Architecture | BloomForCausalLM |
| License | bigscience-bloom-rail-1.0 |
| Transformers Version | 4.35.2 |
| Tokenizer Class | BloomTokenizer |
| Padding Token | <pad> |
| Vocabulary Size | 250880 |
| Torch Data Type | float32 |
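The 6.8 GB VRAM figure follows directly from the model size and data type: roughly 1.7 billion parameters stored at 4 bytes each in float32. A minimal sketch of that arithmetic (assuming the nominal 1.7B parameter count from the card; the exact count differs slightly, and runtime overhead such as activations and KV cache is not included):

```python
# Estimate the memory needed to hold a model's weights.
# Assumes the nominal 1.7B parameter count; activations and
# KV cache at inference time would add to this figure.
def weight_vram_gb(num_params: float, bytes_per_param: int = 4) -> float:
    """Return weight memory in GB (decimal, 1 GB = 1e9 bytes)."""
    return num_params * bytes_per_param / 1e9

print(weight_vram_gb(1.7e9))     # float32 (4 bytes/param) -> 6.8
print(weight_vram_gb(1.7e9, 2))  # float16 would halve this -> 3.4
```

The same arithmetic explains the 3.4 GB figures in the alternatives table below: those checkpoints store the same architecture in half precision.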
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Merged DPO Model | 0K / 6.8 GB | 13 | 0 |
| Bloom20 | 0K / 3.4 GB | 11 | 0 |
| Mnlp DPO Model Bloom | 0K / 6.8 GB | 12 | 0 |
| Bloom 1b7 | 0K / 3.4 GB | 26582 | 122 |
| Aira 2 Portuguese 1B7 | 0K / 0 GB | 22 | 2 |
| Bloomz 1b7 | 0K / 3.4 GB | 1856 | 23 |
| Bloom 1b7 Intermediate | 0K / 3.4 GB | 20 | 0 |
| Bloom 1b7 8bit | 0K / 2.2 GB | 1016 | 6 |