BLOOMChat 176B V2 by sambanovasystems


Tags: Autotrain compatible, Bloom, Endpoints compatible, PyTorch, Region: us, Sharded

BLOOMChat 176B V2 Benchmarks

Benchmark scores are reported as percentages (nn.n%) showing how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
BLOOMChat 176B V2 (sambanovasystems/BLOOMChat-176B-v2)

BLOOMChat 176B V2 Parameters and Internals

Model Type 
Language Model
Use Cases 
Areas:
Commercial, Research
Limitations:
Mission-critical applications, Applications that involve safety, Making highly important decisions, Automated pipelines
Additional Notes 
BLOOMChat 176B V2 is still in early development and can be prone to mistakes and hallucinations. It supports multiple languages and has been tested on a variety of conversational and question-answering tasks.
Supported Languages 
Multilingual (Multiple languages)
Training Details 
Data Sources:
mC4 3.1.0, RefinedWeb, StarCoder, The Pile, Pile of Law, EDGAR, arXiv papers, YouTube transcripts, OIG dataset, Dolly 2.0, OASST1
Methodology:
Fine-tuning from BLOOM on long-sequence multilingual data and assistant-style conversation datasets (see the prompt sketch below).
Hardware Used:
SambaNova Reconfigurable Dataflow Unit (RDU)
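
Because the fine-tuning data is assistant-style conversation, prompts are expected to follow a chat turn format. The sketch below builds such a prompt with the `<human>:` / `<bot>:` tags documented for BLOOMChat v1; assuming v2 keeps the same convention is a guess here, so check the repository's model card before relying on it.

```python
# Minimal prompt-construction sketch. The "<human>:" / "<bot>:" turn tags follow
# the BLOOMChat v1 convention; treating them as valid for v2 is an assumption.
def build_bloomchat_prompt(turns):
    """turns: list of (speaker, text) pairs, where speaker is "human" or "bot"."""
    lines = [f"<{speaker}>: {text}" for speaker, text in turns]
    lines.append("<bot>:")  # leave the final assistant turn open for the model to complete
    return "\n".join(lines)

prompt = build_bloomchat_prompt([
    ("human", "Summarize the plot of 'Don Quixote' in two sentences."),
])
print(prompt)
# <human>: Summarize the plot of 'Don Quixote' in two sentences.
# <bot>:
```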
LLM Name: BLOOMChat 176B V2
Repository: https://huggingface.co/sambanovasystems/BLOOMChat-176B-v2
Model Size: 176b
Required VRAM: 155.7 GB
Updated: 2025-10-07
Maintainer: sambanovasystems
Model Type: bloom
Model Files  7.2 GB: 1-of-107   3.3 GB: 2-of-107   3.3 GB: 3-of-107   3.3 GB: 4-of-107   3.3 GB: 5-of-107   3.3 GB: 6-of-107   3.3 GB: 7-of-107   3.3 GB: 8-of-107   3.3 GB: 9-of-107   3.3 GB: 10-of-107   3.3 GB: 11-of-107   3.3 GB: 12-of-107   3.3 GB: 13-of-107   3.3 GB: 14-of-107   3.3 GB: 15-of-107   3.3 GB: 16-of-107   3.3 GB: 17-of-107   3.3 GB: 18-of-107   3.3 GB: 19-of-107   3.3 GB: 20-of-107   3.3 GB: 21-of-107   3.3 GB: 22-of-107   3.3 GB: 23-of-107   3.3 GB: 24-of-107   3.3 GB: 25-of-107   3.3 GB: 26-of-107   3.3 GB: 27-of-107   3.3 GB: 28-of-107   3.3 GB: 29-of-107   3.3 GB: 30-of-107   3.3 GB: 31-of-107   3.3 GB: 32-of-107   3.3 GB: 33-of-107   3.3 GB: 34-of-107   3.3 GB: 35-of-107   3.3 GB: 36-of-107   3.3 GB: 37-of-107   3.3 GB: 38-of-107   3.3 GB: 39-of-107   3.3 GB: 40-of-107   3.3 GB: 41-of-107   3.3 GB: 42-of-107   3.3 GB: 43-of-107   3.3 GB: 44-of-107   3.3 GB: 45-of-107   3.3 GB: 46-of-107
Model Architecture: BloomForCausalLM
License: other
Transformers Version: 4.25.0
Vocabulary Size: 250880
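
Given the BloomForCausalLM architecture and the standard sharded Hugging Face layout listed above, loading should follow the usual transformers pattern. Below is a minimal sketch, assuming a transformers release at or above the listed 4.25.0, an accelerate install for device placement, and enough aggregate GPU memory (or CPU/disk offload) for half-precision 176B weights; it is not taken from the official model card.

```python
# Minimal loading/generation sketch for sambanovasystems/BLOOMChat-176B-v2.
# Assumes transformers >= 4.25 plus accelerate, and enough aggregate memory for
# half-precision 176B weights (several hundred GB, spread across devices/offload).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sambanovasystems/BLOOMChat-176B-v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # load the sharded checkpoint in bf16
    device_map="auto",           # let accelerate place layers on available GPUs / offload the rest
)

# Assumed BLOOMChat-style turn tags (see the prompt sketch above).
prompt = "<human>: What is the capital of Finland?\n<bot>:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8, top_p=0.9)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```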

Best Alternatives to BLOOMChat 176B V2

Best Alternatives       | Context / RAM | Downloads | Likes
BLOOMChat 176B V1       | 0K / 360 GB   | 1912      | 365
Bloom Finnish 176B      | 0K / 4.9 GB   | 13        | 6
BLOOMChat 176B V1 8bit  | 0K / 187.7 GB | 15        | 2
BLOOMChat 176B V1 GPTQ  | 0K / 4.2 GB   | 10        | 31
Bloomz 176B GPTQ        | 0K / 4.2 GB   | 12        | 19



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124