EleutherAI GPT-NeoX-20B 4-bit is an open-source language model, quantized and republished by RichardErkhov. Features: 20B-parameter LLM, VRAM: 12.5 GB, Context: 2K tokens, License: apache-2.0, LLM Explorer Score: 0.13.
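A quick sanity check of the quoted VRAM figure (an illustrative back-of-the-envelope estimate, not taken from the card): 20 billion parameters stored at 4 bits each occupy roughly 10 GB, and the remaining slack in the 12.5 GB figure plausibly covers layers kept at higher precision, quantization metadata, and activations.

```python
def quantized_weight_gb(n_params: float, bits_per_param: float) -> float:
    """Approximate memory in GB needed just for the quantized weights."""
    # bits -> bytes (/8), bytes -> gigabytes (/1e9)
    return n_params * bits_per_param / 8 / 1e9

# 20B parameters at 4 bits per parameter
print(quantized_weight_gb(20e9, 4))  # -> 10.0 GB for the weights alone
```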
EleutherAI GPT-NeoX-20B 4-bit Parameters and Internals
Model Type
Transformer-based Language Model, causal-lm
Use Cases
Areas:
Research, Scientific uses
Primary Use Cases:
Extract features useful for downstream tasks
Limitations:
Not intended for deployment as-is; not fine-tuned for downstream tasks such as writing genre prose or powering commercial chatbots; English-language only
Considerations:
Use in accordance with the Apache 2.0 license. Conduct your own risk and bias assessment when fine-tuning for specific tasks.
Additional Notes
The model was trained on the Pile, which was not deduplicated before use. The corpus contains texts with biases regarding gender, religion, and race. Curating outputs before presenting them to a human reader is recommended.
Supported Languages
en (Full)
Training Details
Data Sources:
EleutherAI/pile
Methodology:
Autoregressive training using the GPT-NeoX library
Context Length:
2048
Model Architecture:
The architecture resembles GPT-3 and is nearly identical to that of GPT-J-6B.
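As a concrete illustration, here is a minimal loading sketch using the Hugging Face transformers library. The repository id is the one this card describes; the helper name is our own, `device_map="auto"` additionally requires the accelerate package, and a GPU with roughly 12.5 GB of free VRAM is assumed.

```python
def load_gpt_neox_20b_4bit(
    model_id: str = "RichardErkhov/EleutherAI_-_gpt-neox-20b-4bits",
):
    """Load the 4-bit GPT-NeoX-20B checkpoint (downloads ~12.5 GB).

    Requires: transformers, accelerate, and a GPU with enough free VRAM.
    """
    # Imported lazily so defining this helper costs nothing.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    return tokenizer, model
```

Since the model is not fine-tuned for downstream tasks, outputs from plain generation should be treated as raw feature-extraction or research material, per the use cases above.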