Pythia 160M by EleutherAI


Tags: arXiv:2101.00027, arXiv:2201.07311, arXiv:2304.01373, AutoTrain compatible, Dataset: EleutherAI/pile, English, Endpoints compatible, GPT-NeoX, Pythia, PyTorch, Region: US, Safetensors
Model Card on HF: https://huggingface.co/EleutherAI/pythia-160m

Pythia 160M Benchmarks

Pythia 160M (EleutherAI/pythia-160m)

Pythia 160M Parameters and Internals

Model Type 
Transformer-based Language Model
Use Cases 
Areas:
Scientific Research, Interpretability Research
Applications:
Research on behavior, functionality, limitations of large language models
Primary Use Cases:
Controlled scientific experiments
Limitations:
Not intended for deployment in human-facing interactions; English-only, so unsuitable for generating text in other languages; not fine-tuned for downstream applications such as genre prose or commercial chatbots
Considerations:
Conduct risk and bias assessment if fine-tuning; evaluate risks before deployment
Additional Notes 
Pythia model suite renamed in January 2023 for clarity
Supported Languages 
English (Native)
Training Details 
Data Sources:
The Pile: 22 diverse sources, including arXiv, Common Crawl, Project Gutenberg, YouTube subtitles, and GitHub
Data Volume:
299,892,736,000 tokens (see the breakdown below)
Model Architecture:
GPT-NeoX
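
The token count above is consistent with the training setup described in the Pythia paper, assuming batches of 1,024 sequences of 2,048 tokens per step, run for 143,000 steps:

2,048 tokens/sequence × 1,024 sequences/step = 2,097,152 tokens/step
2,097,152 tokens/step × 143,000 steps = 299,892,736,000 tokens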
Responsible AI Considerations 
Fairness:
Documented biases with regard to gender, religion, and race (per the Pile paper).
Input Output 
Input Format:
String of text for next token prediction.
Accepted Modalities:
text
Output Format:
A string of generated text (produced one token at a time)
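
As a sketch of the input/output contract above, the snippet below feeds a plain text string to the model and decodes the generated continuation with the Hugging Face transformers library. The prompt and generation settings (max_new_tokens, greedy decoding) are illustrative choices, not part of the model card.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/pythia-160m"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

# Input: a plain string of text; the model predicts the continuation one token at a time.
prompt = "The Pile is a dataset of"  # illustrative prompt, not from the card
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=20, do_sample=False)

# Output: the prompt plus the generated continuation as a single string.
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))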
Release Notes 
Version:
Current Release
Date:
January 2023
Notes:
Pythia-160M retrained to address hyperparameter discrepancies
LLM Name: Pythia 160M
Repository: https://huggingface.co/EleutherAI/pythia-160m
Model Size: 160m
Required VRAM: 0.4 GB
Updated: 2025-07-26
Maintainer: EleutherAI
Model Type: gpt_neox
Model Files: 0.4 GB, 0.4 GB
Supported Languages: en
Model Architecture: GPTNeoXForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.24.0
Tokenizer Class: GPTNeoXTokenizer
Vocabulary Size: 50304
Torch Data Type: float16
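
The figures above can be cross-checked against the published checkpoint. The sketch below is an illustration, not part of the card: it reads the model config and loads the float16 weights with transformers, printing the fields that correspond to the context length, vocabulary size, and approximate weight footprint listed above.

import torch
from transformers import AutoConfig, AutoModelForCausalLM

config = AutoConfig.from_pretrained("EleutherAI/pythia-160m")
print(config.max_position_embeddings)  # context length / model max length: 2048
print(config.vocab_size)               # vocabulary size: 50304

# float16 stores roughly 2 bytes per parameter, so ~160M parameters gives a
# weight footprint in the same ballpark as the 0.4 GB "Required VRAM" above
# (weights only; activations and the KV cache add more at inference time).
model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/pythia-160m", torch_dtype=torch.float16
)
num_params = sum(p.numel() for p in model.parameters())
print(f"{num_params / 1e6:.0f}M parameters, ~{num_params * 2 / 1e9:.2f} GB in fp16")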

Best Alternatives to Pythia 160M

Best Alternatives                 Context / RAM    Downloads / Likes
Pythia 160M C2s                   8K / 0.6 GB      396
Pythia 160M Xsum Roya             2K / 0.6 GB      50
Pythia 160m Sft                   2K / 0 GB        100
Sheared Pythia 160M               2K / 0.7 GB      94
Pythia 160M Dolphin Extended      2K / 0.3 GB      190
Pythia 160M Deduped               2K / 0.4 GB      383613
Pythia 160M Storytelling          2K / 0.3 GB      100
Pythia160m Sft Tldr               2K / 0.6 GB      100
Pythia 160m Ft CookingRecipes     2K / 0.6 GB      60
Ppo                               2K / 0.3 GB      70

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124