| Model Overview | |
| --- | --- |
| Model Type | |
| Use Case Areas | Research, chat applications |
| Additional Notes | This is the 9th in a series of models designed to replicate the prose quality of the Claude 3 models. |
| Supported Languages | English (High proficiency) |

| Training Details | |
| --- | --- |
| Data Sources | anthracite-org/stheno-filtered-v1.1, anthracite-org/kalo-opus-instruct-22k-no-refusal, lodrick-the-lafted/NopmWritingStruct, Epiculous/Synthstruct-Gens-v1.1-Filtered-n-Cleaned, Epiculous/SynthRP-Gens-v1.1-Filtered-n-Cleaned (see the inspection sketch below) |
| Data Volume | |
| Methodology | Instruct-tuned with ChatML formatting (see the prompt sketch below). |
| Training Time | |
| Hardware Used | |
| Model Architecture | AutoModelForCausalLM with transformers (see the loading and sampling sketch at the end of this card). |
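
The data sources above are Hugging Face Hub dataset repositories. As a quick way to inspect them, the sketch below pulls each one with the `datasets` library; it assumes every repository exposes a default `train` split, and the column layout will differ from dataset to dataset.

```python
# Minimal sketch: inspect the training data sources listed above.
# Assumes the `datasets` library is installed and each repository
# exposes a default "train" split (column layouts differ per dataset).
from datasets import load_dataset

DATA_SOURCES = [
    "anthracite-org/stheno-filtered-v1.1",
    "anthracite-org/kalo-opus-instruct-22k-no-refusal",
    "lodrick-the-lafted/NopmWritingStruct",
    "Epiculous/Synthstruct-Gens-v1.1-Filtered-n-Cleaned",
    "Epiculous/SynthRP-Gens-v1.1-Filtered-n-Cleaned",
]

for repo_id in DATA_SOURCES:
    ds = load_dataset(repo_id, split="train")
    print(f"{repo_id}: {len(ds)} rows, columns = {ds.column_names}")
```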
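Because the model is instruct-tuned on ChatML-formatted conversations, prompts should follow the standard ChatML turn structure. The sketch below builds a ChatML string by hand; the system prompt is only an illustrative placeholder, and if the released tokenizer ships a chat template, `tokenizer.apply_chat_template` should produce equivalent output.

```python
# Minimal sketch of the ChatML turn structure the card's methodology refers to.
# The system prompt below is a placeholder, not a prescribed prompt.
def build_chatml_prompt(system: str, user: str) -> str:
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"  # the model's reply is generated from here
    )

prompt = build_chatml_prompt(
    system="You are a helpful assistant.",
    user="Write a short scene set in a rainy harbor town.",
)
print(prompt)
```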
| Input / Output | |
| --- | --- |
| Input Format | |
| Accepted Modalities | |
| Output Format | |
| Performance Tips | A minimum p (min_p) sampling value of 0.2 is recommended for optimal performance (see the sketch below). |
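
Tying the architecture and performance notes together, the sketch below loads the weights with `AutoModelForCausalLM` and samples with `min_p=0.2`, reading the "minimum p" recommendation as the min_p sampler. The repository id is a placeholder (this card does not name one), `device_map="auto"` assumes `accelerate` is installed, and `min_p` requires a recent `transformers` release; other backends expose an equivalent min-p setting.

```python
# Minimal sketch: load the model with AutoModelForCausalLM and sample with min_p.
# "your-org/this-model" is a placeholder repository id, not the real one.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-org/this-model"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype="auto",
    device_map="auto",  # assumes `accelerate` is installed
)

# ChatML-formatted prompt, matching the training methodology above.
prompt = (
    "<|im_start|>user\n"
    "Write a short scene set in a rainy harbor town.<|im_end|>\n"
    "<|im_start|>assistant\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

output = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    min_p=0.2,  # floor recommended in the performance tips above
)
reply = output[0][inputs["input_ids"].shape[-1]:]
print(tokenizer.decode(reply, skip_special_tokens=True))
```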