Pythia 6.9B by EleutherAI


Pythia 6.9B is an open-source language model by EleutherAI. Features: 6.9B parameters, VRAM: 13.8 GB, Context: 2K, License: apache-2.0, LLM Explorer Score: 0.14.
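
The 13.8 GB VRAM figure follows from storing roughly 6.9 billion parameters in float16 at 2 bytes per parameter. A back-of-the-envelope check in Python (parameter count approximated; activation and KV-cache memory excluded):

    # Rough weight-storage estimate for Pythia 6.9B held in float16.
    # Ignores activations and the KV cache, which add memory at inference time.
    params = 6.9e9           # approximate total parameter count
    bytes_per_param = 2      # float16
    print(f"{params * bytes_per_param / 1e9:.1f} GB")  # -> 13.8 GB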

Tags: arxiv:2101.00027, arxiv:2201.07311, arxiv:2304.01373, dataset:eleutherai/pile, deploy:azure, en, endpoints-compatible, gpt-neox, pythia, pytorch, region:us, safetensors, sharded, tensorflow
Model Card on HF 🤗: https://huggingface.co/EleutherAI/pythia-6.9b

Pythia 6.9B Benchmarks

Scores show how the model compares, as a percentage, to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Pythia 6.9B (EleutherAI/pythia-6.9b)

Pythia 6.9B Parameters and Internals

Model Type: Transformer-based language model
Use Cases:
  Areas: Research, scientific experiments
  Primary Use Cases: Studying the behavior and functionality of large language models
  Limitations: Not intended for deployment; may produce harmful or offensive text; English-language only
  Considerations: Conduct a risk and bias assessment before use.
Additional Notes: Non-embedding parameters: 6,444,163,072.
Supported Languages: English (fluent)
Training Details:
  Data Sources: The Pile
  Data Volume: 299,892,736,000 tokens
  Methodology: Trained on both deduplicated and non-deduplicated versions of the data, with intermediate checkpoints published; the deduped and non-deduped models use the same hyperparameters (see the checkpoint-loading sketch after this list).
  Model Architecture: Transformer-based
Release Notes:
  Version: Current release
  Date: January 2023
  Notes: Models renamed, uniform batch size used for training, early checkpoints added, Flash Attention used.
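
Because intermediate training checkpoints are published as branches of the Hugging Face repository, a specific step can be loaded via the revision argument. A minimal sketch, assuming the branch naming shown on the repo (e.g. "step143000" for the final checkpoint) and the transformers package installed:

    # Load an intermediate Pythia 6.9B training checkpoint by branch name.
    # The revision string is assumed to match a checkpoint branch on the HF repo.
    from transformers import GPTNeoXForCausalLM, AutoTokenizer

    revision = "step143000"  # final step; earlier branches such as "step1000" are also published
    model = GPTNeoXForCausalLM.from_pretrained("EleutherAI/pythia-6.9b", revision=revision)
    tokenizer = AutoTokenizer.from_pretrained("EleutherAI/pythia-6.9b", revision=revision)
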
LLM Name: Pythia 6.9B
Repository 🤗: https://huggingface.co/EleutherAI/pythia-6.9b
Model Size: 6.9b
Required VRAM: 13.8 GB
Updated: 2026-04-04
Maintainer: EleutherAI
Model Type: gpt_neox
Model Files: 9.9 GB (1-of-2), 3.9 GB (2-of-2); 9.9 GB (1-of-2), 3.9 GB (2-of-2)
Supported Languages: en
Model Architecture: GPTNeoXForCausalLM (see the usage sketch after this list)
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.24.0
Tokenizer Class: GPTNeoXTokenizer
Vocabulary Size: 50432
Torch Data Type: float16
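
Given the GPTNeoXForCausalLM architecture, GPTNeoXTokenizer class, and float16 weights listed above, here is a minimal usage sketch with the transformers library (assumes torch, a GPU with roughly 14 GB or more of free VRAM, and the repository ID from the list above):

    # Minimal sketch: load Pythia 6.9B in float16 and generate a short completion.
    import torch
    from transformers import GPTNeoXForCausalLM, AutoTokenizer

    model_id = "EleutherAI/pythia-6.9b"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = GPTNeoXForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16).to("cuda")

    prompt = "The Pile is a large, diverse dataset used to"
    inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
    with torch.no_grad():
        output = model.generate(**inputs, max_new_tokens=40, do_sample=False)
    print(tokenizer.decode(output[0], skip_special_tokens=True))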

Best Alternatives to Pythia 6.9B

Best Alternatives | Context / RAM | Downloads / Likes
Pythia 6.9B Deduped 8K | 8K / 13.8 GB | 61
...I Pythia 6.9B Deduped Sft Tldr | 2K / 27.4 GB | 4740
Ppo Tldr 6.9B | 2K / 13.8 GB | 80
Pythia 6.9B Deduped | 2K / 13.8 GB | 68738
...I Pythia 6.9B Deduped Sft Tldr | 2K / 13.8 GB | 400
Oasst Pythia 6.9B 4000 Steps | 2K / 13.8 GB | 8790
Pythia 6.9B HC3 | 2K / 27.7 GB | 82
Open Instruct Pythia 6.9B Tulu | 2K / 27.6 GB | 7236
...an Large Pythia 6.9B Dev Phase | 2K / 27.6 GB | 33
....9B Deduped Synthetic Instruct | 2K / 27.5 GB | 26704


Original data from HuggingFace, OpenCompass and various public git repos.