Poro 34B GPTQ by TheBloke


Tags: 4-bit · autotrain compatible · base model: LumiOpen/Poro-34B (quantized) · bloom · datasets: allenai/dolma, bigcode/starcoderdata, cerebras/SlimPajama-627B, mc4 · gptq · quantized · region: us · safetensors
Model Card on HF 🤗: https://huggingface.co/TheBloke/Poro-34B-GPTQ

Poro 34B GPTQ Parameters and Internals

Model Type: bloom, decoder-only transformer

Use Cases
Areas: research, testing
Limitations: no meaningful proficiency in languages other than English, Finnish, and code
Considerations: Poro is a release of a partially trained model, and special care should be taken when using any of its output.

Additional Notes: Poro is a research checkpoint; training has not been completed.

Supported Languages: English (fluent), Finnish (fluent)

Training Details
Data Sources: cerebras/SlimPajama-627B, bigcode/starcoderdata, mc4, allenai/dolma
Data Volume: 1 trillion tokens planned; 500 billion trained as of this release
Methodology: BLOOM architecture with ALiBi embeddings for context-length extrapolation (a sketch of the ALiBi bias follows below)
Context Length: 2048
Hardware Used: LUMI supercomputer, 512 AMD MI250X GPUs
Model Architecture: generative pretrained transformer using a BLOOM architecture
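
ALiBi replaces learned positional embeddings with a per-head penalty on attention scores that grows linearly with query-key distance, which is what lets the model extrapolate beyond its 2048-token training context. A minimal sketch of the standard ALiBi bias computation (illustrative Python, not Poro's actual training code; assumes a power-of-two head count):

import torch

def alibi_slopes(n_heads: int) -> torch.Tensor:
    # Standard ALiBi slopes: the geometric sequence 2^(-8/n), 2^(-16/n), ...
    # (the paper uses a slightly different recipe for non-power-of-two counts).
    start = 2.0 ** (-8.0 / n_heads)
    return torch.tensor([start ** (i + 1) for i in range(n_heads)])

def alibi_bias(n_heads: int, seq_len: int) -> torch.Tensor:
    # The penalty is zero for the current token and grows linearly with how
    # far back the attended key sits; one slope per attention head.
    pos = torch.arange(seq_len)
    distance = (pos[None, :] - pos[:, None]).clamp(max=0).float()  # <= 0
    return alibi_slopes(n_heads)[:, None, None] * distance  # (heads, q, k)

# The bias is added to the raw attention scores before the softmax:
#   scores = q @ k.transpose(-2, -1) / head_dim ** 0.5 + alibi_bias(h, n)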
LLM Name: Poro 34B GPTQ
Repository 🤗: https://huggingface.co/TheBloke/Poro-34B-GPTQ
Model Name: Poro 34B
Model Creator: LumiOpen
Base Model(s): Poro 34B (LumiOpen/Poro-34B)
Model Size: 34B
Required VRAM: 20.3 GB
Updated: 2025-06-18
Maintainer: TheBloke
Model Type: bloom
Model Files: 20.3 GB
GPTQ Quantization: Yes
Quantization Type: gptq
Model Architecture: BloomForCausalLM
License: apache-2.0
Transformers Version: 4.35.2
Tokenizer Class: BloomTokenizer
Padding Token: <pad>
Vocabulary Size: 128000
Torch Data Type: bfloat16
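
With the 20.3 GB safetensors file and Transformers 4.35.2 listed above, the quant loads like any other GPTQ repo on the Hub. A minimal sketch, assuming a recent transformers with the auto-gptq backend installed and a GPU with roughly 24 GB of VRAM; the Finnish prompt is purely illustrative:

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/Poro-34B-GPTQ"

# The GPTQ config ships inside the repo, so transformers selects the 4-bit
# kernels automatically; device_map="auto" places the ~20.3 GB of weights
# on the available GPU(s).
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Poro 34B is a base model with no chat template, so prompt it with plain
# continuation text (English or Finnish).
inputs = tokenizer("Suomen pääkaupunki on", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=50, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))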

Best Alternatives to Poro 34B GPTQ

Best Alternatives | Context / RAM | Downloads | Likes
Poro 34B Chat     | 0K / 68 GB    | 1864      | 12
Poro 34B          | 0K / 68 GB    | 19661     | 16
Poro 34B AWQ      | 0K / 21 GB    | 13        | 2
Poro 34B AWQ      | 0K / 21 GB    | 68        | 1
