Pythia 1B V0 is an open-source language model by EleutherAI. Features: 1B parameters, VRAM: 2.1 GB, Context: 2K tokens, License: Apache-2.0, LLM Explorer Score: 0.05.
Intended for research into the behavior, functionality, and limitations of large language models.
Limitations:
Not suitable for deployment; English-only; may generate undesired outputs.
Considerations:
The model is intended for interpretability research, not for real-world deployment.
Additional Notes
All Pythia models were trained for the equivalent of 143,000 steps. The model is not fine-tuned for downstream tasks such as writing prose or commercial chatbot interaction.
Supported Languages
en (proficient)
Training Details
Data Sources:
The Pile
Data Volume:
299,892,736,000 tokens
Model Architecture:
Transformer
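The data volume above is consistent with the 143,000 training steps noted earlier. A quick arithmetic check (the batch size of 1,024 sequences and the 2,048-token sequence length are taken from the Pythia paper, not from this card):

```python
# Sanity check: total training tokens for the Pythia run.
# Batch size (1,024 sequences) and sequence length (2,048 tokens)
# are assumptions sourced from the Pythia paper, not this card.
steps = 143_000
batch_sequences = 1_024
seq_len = 2_048

total_tokens = steps * batch_sequences * seq_len
print(total_tokens)  # 299892736000 — matches the Data Volume above
```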
Input Output
Input Format:
String input
Accepted Modalities:
text
Output Format:
Generated text tokens
Performance Tips:
Curate generated outputs before use in applications.
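A minimal sketch of the string-in, text-out interface and the curation tip above. Model loading uses the standard Hugging Face `transformers` API; the `curate` helper, its thresholds, and its banned-substring list are illustrative assumptions, not part of the card:

```python
def curate(outputs, min_chars=20, banned=("lorem ipsum",)):
    """Illustrative filter: drop very short or obviously degenerate
    generations before using them in an application. The threshold
    and banned substrings are assumptions, not from the model card."""
    kept = []
    for text in outputs:
        t = text.strip()
        if len(t) < min_chars:
            continue
        if any(b in t.lower() for b in banned):
            continue
        kept.append(t)
    return kept


if __name__ == "__main__":
    # Guarded: loading pythia-1b-v0 downloads ~2 GB of weights,
    # so the curate() helper stays usable without the model.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained("EleutherAI/pythia-1b-v0")
    model = AutoModelForCausalLM.from_pretrained("EleutherAI/pythia-1b-v0")
    ids = tok("The Pile is", return_tensors="pt")
    out = model.generate(**ids, max_new_tokens=40)
    print(curate([tok.decode(out[0], skip_special_tokens=True)]))
```

Curating here means filtering raw generations before they reach users, since the model is not fine-tuned for downstream use and may emit undesired text.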