Poro 34B by LumiOpen


Tags: arXiv:2404.01856, Autotrain compatible, Bloom, Dataset: allenai/dolma, Dataset: bigcode/starcoderdata, Dataset: cerebras/slimpajama-62..., Dataset: mc4, En, Endpoints compatible, Fi, Region: us, Safetensors, Sharded, Tensorflow
Model Card on HF 🤗: https://huggingface.co/LumiOpen/Poro-34B


Poro 34B Parameters and Internals

Model Type: text generation
Additional Notes: Poro is an advanced language model optimized primarily for English, Finnish, and code.
Supported Languages: fi (high), en (high)
Training Details:
  Data Sources: cerebras/SlimPajama-627B, bigcode/starcoderdata, mc4, allenai/dolma
  Data Volume: 1 trillion tokens
  Methodology: BLOOM architecture with ALiBi embeddings (see the sketch after this list)
  Context Length: 2048 tokens
  Hardware Used: LUMI supercomputer, 512 AMD MI250X GPUs
  Model Architecture: decoder-only transformer
Input/Output:
  Accepted Modalities: text
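The "ALiBi embeddings" above refer to Attention with Linear Biases: instead of adding positional embeddings to the inputs, each attention head adds a distance-proportional penalty to its attention scores, which is how the BLOOM architecture encodes position. The snippet below is a minimal illustrative sketch of that bias computation, not Poro's actual implementation; the head count and sequence length are arbitrary example values.

```python
# Illustrative sketch of the ALiBi bias used by BLOOM-style models (not Poro's
# actual code): each head adds slope * -(query_pos - key_pos) to its attention
# scores before softmax, so attention to distant tokens is linearly penalized.
import torch

def alibi_bias(num_heads: int, seq_len: int) -> torch.Tensor:
    # Head-specific slopes form a geometric sequence: 2^(-8/num_heads), 2^(-16/num_heads), ...
    slopes = torch.tensor([2.0 ** (-8.0 * (h + 1) / num_heads) for h in range(num_heads)])
    # relative[i, j] = j - i: negative for past keys, positive for future keys.
    positions = torch.arange(seq_len)
    relative = positions[None, :] - positions[:, None]
    # Keep only backward-looking distances; a separate causal mask still blocks
    # attention to future positions in the real attention layer.
    bias = slopes[:, None, None] * relative.clamp(max=0)
    return bias  # shape: (num_heads, seq_len, seq_len)

print(alibi_bias(num_heads=8, seq_len=5)[0])  # head 0: zeros on the diagonal, -slope*distance behind it
```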
LLM Name: Poro 34B
Repository 🤗: https://huggingface.co/LumiOpen/Poro-34B
Model Size: 34B
Required VRAM: 68 GB
Updated: 2025-06-17
Maintainer: LumiOpen
Model Type: bloom
Model Files: 14 sharded safetensors files (4.7 GB for shard 1-of-14, 4.9 GB each for shards 2-of-14 through 13-of-14, 4.5 GB for shard 14-of-14)
Supported Languages: fi, en
Model Architecture: BloomForCausalLM
License: apache-2.0
Transformers Version: 4.37.2
Tokenizer Class: BloomTokenizer
Padding Token: <pad>
Vocabulary Size: 128000
Torch Data Type: bfloat16
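The listing above is enough to reconstruct a basic loading recipe: the 14 shards total roughly 68 GB, which matches ~34B parameters at 2 bytes each in bfloat16. As a minimal sketch (not taken from the model card), the snippet below loads the checkpoint with Hugging Face transformers and generates a short completion; the prompt, generation settings, and the use of `device_map="auto"` are illustrative assumptions.

```python
# Minimal sketch: loading LumiOpen/Poro-34B with transformers, following the
# card's metadata (BloomForCausalLM, bfloat16 weights, ~68 GB, 2048-token
# training context). Requires enough GPU memory or `accelerate` offloading.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LumiOpen/Poro-34B"

tokenizer = AutoTokenizer.from_pretrained(model_id)   # BloomTokenizer, 128k vocab
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # matches "Torch Data Type: bfloat16"
    device_map="auto",            # shard/offload across available devices
)

prompt = "Suomen pääkaupunki on"  # Finnish: "The capital of Finland is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```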

Quantized Models of the Poro 34B

Model            Likes   Downloads   VRAM
Poro 34B GPTQ    3       42          20 GB
Poro 34B GGUF    4       330         14 GB
Poro 34B AWQ     2       14          21 GB
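The quantized builds above cut the memory requirement from ~68 GB (bfloat16) to roughly 14-21 GB at some cost in output quality. As one illustrative option, a GGUF quant can be run with llama-cpp-python on CPU or a single GPU; the file name below is a placeholder for whichever quant file you actually download, and the prompt is just an example.

```python
# Hypothetical sketch: running a GGUF quantization of Poro 34B via llama-cpp-python.
# The model_path is a placeholder (assumption), not a file published on the card.
from llama_cpp import Llama

llm = Llama(
    model_path="./poro-34b.Q4_K_M.gguf",  # placeholder file name
    n_ctx=2048,       # Poro was trained with a 2048-token context
    n_gpu_layers=-1,  # offload all layers to the GPU if one is available
)

out = llm("Kirjoita lyhyt tervehdys suomeksi:", max_tokens=64)  # "Write a short greeting in Finnish:"
print(out["choices"][0]["text"])
```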

Best Alternatives to Poro 34B

Best Alternatives    Context / RAM    Downloads   Likes
Poro 34B Chat        0K / 68 GB       1864        12
Poro 34B GPTQ        0K / 20.3 GB     42          3
Poro 34B AWQ         0K / 21 GB       14          2
Poro 34B AWQ         0K / 21 GB       69          1


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124