| Model Type | Transformer-based autoregressive language model |
|
| Use Cases |
| Areas: | Research and experimentation with open LLMs |
| Applications: | Conversational agents, interactive storytelling, educational tools |
| Limitations: | May reflect biases present in its training data |
| Considerations: | Users should be aware of these potential biases and limitations. |

| Additional Notes | Fine-tuned specifically for interactive chat applications |
|
| Supported Languages | |

| Training Details |
| Data Sources: | |
| Data Volume: | 52K instruction-following demonstrations (a sketch of one possible record layout follows this table) |
| Methodology: | |
| Context Length: | |
| Model Architecture: | |
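
The data sources and record schema are not specified in this card. As an illustration only, the sketch below assumes an Alpaca-style layout with `instruction`, `input`, and `output` fields; the actual format of the 52K demonstrations may differ.

```python
# Hypothetical layout for one instruction-following demonstration.
# The field names (instruction / input / output) are an assumed
# Alpaca-style schema, not the documented format of this model's data.
import json

example_record = {
    "instruction": "Summarize the following paragraph in one sentence.",
    "input": (
        "Transformer-based autoregressive language models generate text by "
        "predicting one token at a time, conditioned on all previous tokens."
    ),
    "output": (
        "Autoregressive transformers produce text token by token, each step "
        "conditioned on what came before."
    ),
}

# A dataset of ~52K such records would commonly be stored as a JSONL file.
with open("instruction_data.jsonl", "w", encoding="utf-8") as f:
    f.write(json.dumps(example_record, ensure_ascii=False) + "\n")
```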
|
| Input Output |
| Input Format: | |
| Accepted Modalities: | |
| Output Format: | |
| Performance Tips: | The model is fine-tuned for chat; make sure inputs follow the prompt structure it expects (see the sketch after this table) |
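
The expected prompt structure is not documented in this card. As a rough sketch only, the following shows one common way to query an instruction-tuned causal LM with the Hugging Face `transformers` library. The model identifier `your-org/chat-model` is a placeholder and the `### Instruction:` / `### Response:` template is an assumption; substitute the checkpoint name and prompt template this model was actually trained with.

```python
# Minimal inference sketch, assuming a standard Hugging Face causal-LM checkpoint.
# "your-org/chat-model" is a placeholder identifier, and the prompt template
# below is an assumed instruction-tuning format, not the documented one.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/chat-model"  # placeholder, not the real checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = (
    "### Instruction:\n"
    "Suggest a title for a short story about a lighthouse keeper.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```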
|
|