Writing assistance, Creative writing and art, Entertainment
Primary Use Cases:
Text generation
Limitations:
Bias related to race and gender; not suitable for applications where factual correctness is required
Considerations:
Careful consideration of bias and the context of usage is required. Not recommended for direct human interaction without a thorough bias assessment.
Additional Notes
DistilGPT2 is designed to enable faster, more resource-efficient text generation applications, as shown in the sketch below.
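As a quick illustration, the model can be loaded for generation with the Hugging Face transformers library. This is a minimal sketch: the prompt and generation parameters below are illustrative choices, not recommended settings.

```python
# Minimal text-generation sketch using the published distilgpt2 checkpoint.
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")

# Prompt and generation parameters here are illustrative only.
outputs = generator("Once upon a time,", max_new_tokens=30, num_return_sequences=1)
print(outputs[0]["generated_text"])
```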
Supported Languages
English (native)
Training Details
Data Sources:
OpenWebTextCorpus (an open-source reproduction of OpenAI's WebText dataset)
Methodology:
Knowledge distillation (a sketch of the objective follows below)
Hardware Used:
8 16GB V100 GPUs
Model Architecture:
Transformer
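For context on the methodology above, the following is a minimal sketch of a knowledge-distillation objective of the kind used to train DistilGPT2: the student is trained against both the ground-truth tokens and the teacher's softened output distribution. The temperature and loss weights are illustrative assumptions, not the actual training configuration.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    # Soft-target loss: KL divergence between the temperature-softened
    # teacher and student distributions over the vocabulary.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)

    # Hard-target loss: standard next-token cross-entropy on the data.
    hard_loss = F.cross_entropy(
        student_logits.view(-1, student_logits.size(-1)),
        labels.view(-1),
    )

    # Weighted combination of the two objectives; alpha is an assumption.
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Example with random tensors (batch=2, seq=4, vocab=100):
s = torch.randn(2, 4, 100)
t = torch.randn(2, 4, 100)
y = torch.randint(0, 100, (2, 4))
print(distillation_loss(s, t, y))
```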
Responsible AI Considerations
Fairness:
DistilGPT2 suffers from persistent bias issues similar to those described for GPT-2. The distilled versions have shown reductions in toxicity and bias compared to their teacher models, yet still present statistically significant bias.
Mitigation Strategies:
Ongoing research into additional bias-mitigation techniques for distilled models.