Pythia 1B Deduped by EleutherAI


Pythia 1B Deduped is an open-source language model by EleutherAI. Features: 1B LLM, VRAM: 2.1 GB, Context: 2K, License: apache-2.0, HF Score: 32.8, LLM Explorer Score: 0.14, ARC: 29.1, HellaSwag: 49.7, MMLU: 24.3, TruthfulQA: 38.9, WinoGrande: 53.6, GSM8K: 1.1.
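The scores above follow the usual Open-LLM-Leaderboard task set (ARC, HellaSwag, MMLU, TruthfulQA, WinoGrande, GSM8K). A minimal sketch of how such figures are typically reproduced with EleutherAI's lm-evaluation-harness is shown below; the task names, few-shot settings, and harness version are assumptions and may not match the exact configuration behind the numbers listed here.

    # Sketch only: leaderboard-style evaluation with lm-evaluation-harness
    # (pip install lm-eval). Task list and batch size are assumptions.
    import lm_eval

    results = lm_eval.simple_evaluate(
        model="hf",
        model_args="pretrained=EleutherAI/pythia-1b-deduped,dtype=float16",
        tasks=["arc_challenge", "hellaswag", "mmlu", "truthfulqa_mc2",
               "winogrande", "gsm8k"],
        batch_size=8,
    )
    # Per-task metrics are returned under results["results"].
    print(results["results"])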

Tags: Arxiv:2101.00027, Arxiv:2201.07311, Arxiv:2304.01373, Dataset:eleutherai/the pile de..., Deploy:azure, En, Endpoints compatible, Gpt neox, Pythia, Pytorch, Region:us, Safetensors

Pythia 1B Deduped Benchmarks

Pythia 1B Deduped (EleutherAI/pythia-1b-deduped)

Pythia 1B Deduped Parameters and Internals

Model Type: Transformer-based Language Model
Use Cases:
  Areas: Research
  Applications: Interpretability research
  Primary Use Cases: Scientific experiments on language models; promoting interpretability research
  Limitations: Not suitable for deployment; not suitable for translation or for generating text in languages other than English
  Considerations: Please conduct your own risk and bias assessment when applying the model.
Additional Notes: Please note that all models in the *Pythia* suite were renamed in January 2023.
Training Details:
  Data Sources: EleutherAI/the_pile_deduplicated
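As a pointer for the training data source above, the sketch below streams the deduplicated Pile from the Hugging Face Hub; the split and field names are assumptions based on the public dataset card, not details taken from this listing.

    # Sketch only: streaming the deduplicated Pile this model was trained on.
    # Assumes the dataset exposes a "train" split with a "text" field.
    from datasets import load_dataset

    pile = load_dataset("EleutherAI/the_pile_deduplicated", split="train", streaming=True)
    first_doc = next(iter(pile))
    print(first_doc["text"][:200])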
LLM Name: Pythia 1B Deduped
Repository 🤗: https://huggingface.co/EleutherAI/pythia-1b-deduped
Model Size: 1b
Required VRAM: 2.1 GB
Updated: 2026-03-11
Maintainer: EleutherAI
Model Type: gpt_neox
Model Files: 2.1 GB, 2.1 GB
Supported Languages: en
Model Architecture: GPTNeoXForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.24.0
Tokenizer Class: GPTNeoXTokenizer
Vocabulary Size: 50304
Torch Data Type: float16
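Given the architecture and precision listed above (GPTNeoXForCausalLM, float16, 2048-token context), a minimal loading sketch with Hugging Face Transformers might look like the following; the prompt and generation settings are illustrative only.

    # Sketch only: loading EleutherAI/pythia-1b-deduped in fp16 with Transformers.
    # device_map="auto" requires the accelerate package; adjust for your hardware.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "EleutherAI/pythia-1b-deduped"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.float16, device_map="auto"
    )

    inputs = tokenizer("The Pile is a dataset that", return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=40)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))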

Best Alternatives to Pythia 1B Deduped

Best Alternatives                      Context / RAM    Downloads   Likes
C2S Scale Pythia 1B Pt                 8K / 0 GB        1480        7
Pythia 2.8B Deduped Rp 710M 4K         4K / 11.7 GB     2           1
Pythia 1.4B Deduped Rp 280M 4K         4K / 6.1 GB      5           1
Pythia 1.4B Deduped Rp 420M 4K         4K / 6.1 GB      3           1
Chatml Test                            2K / 2.1 GB      16          0
Pythia 1B Kto Iter0                    2K / 2 GB        6           0
Pythia 1B Self Kto Iter0               2K / 2 GB        6           0
...eduped Tldr Preference Sft Trl      2K / 2 GB        14          0
Pythia 1B Deduped Tldr Sft             2K / 2 GB        153         0
...rAI Pythia 1B Deduped Sft Tldr      2K / 4 GB        1565        0



Original data from HuggingFace, OpenCompass and various public git repos.