AshhLimaRP Mistral 7B GPTQ is an open-source language model quantized and published by TheBloke. Features: 7B parameters, VRAM: 4.2 GB, Context: 32K, License: apache-2.0, Quantized (GPTQ), LLM Explorer Score: 0.1.
AshhLimaRP Mistral 7B GPTQ Parameters and Internals
Model Type
mistral
Additional Notes
LimaRP is a longform-oriented, novel-style roleplaying chat model.
Training Details
Data Volume:
2,000 training samples of up to 9k tokens in length
Methodology:
fine-tuned on top of Ashhwriter-Mistral-7B
Context Length:
8750
Hardware Used:
2x NVidia A40 GPUs
Input Output
Input Format:
### Instruction:
Character's Persona: bot character description
User's Persona: user character description
Scenario: what happens in the story
Play the role of Character. You must engage in a roleplaying chat with User below this line. Do not write dialogues and narration for User. Character should respond with messages of medium length.
### Input:
User: {prompt}
### Response:
Character:
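The template above can be assembled programmatically. The sketch below is illustrative only: the build_limarp_prompt helper and the example persona strings are hypothetical and not part of the model card; the field labels and role-play instruction follow the documented format.

```python
# Hypothetical helper (not from the model card): assembles a single-turn
# prompt in the Alpaca-style LimaRP format documented above.
def build_limarp_prompt(char_persona, user_persona, scenario, user_message,
                        char_name="Character", user_name="User",
                        length="medium"):
    """Return the full prompt string for one user message."""
    return (
        "### Instruction:\n"
        f"{char_name}'s Persona: {char_persona}\n"
        f"{user_name}'s Persona: {user_persona}\n"
        f"Scenario: {scenario}\n"
        f"Play the role of {char_name}. You must engage in a roleplaying chat "
        f"with {user_name} below this line. Do not write dialogues and "
        f"narration for {user_name}. {char_name} should respond with "
        f"messages of {length} length.\n\n"
        "### Input:\n"
        f"{user_name}: {user_message}\n\n"
        "### Response:\n"
        f"{char_name}:"
    )

# Example usage with made-up persona text:
prompt = build_limarp_prompt(
    char_persona="A stoic knight guarding the northern gate.",
    user_persona="A traveling merchant seeking passage.",
    scenario="A merchant tries to enter the city after curfew.",
    user_message="Good evening, sir. Might I pass through?",
)
print(prompt)
```

The generated string is what you would pass as the prompt to the quantized model; the trailing "Character:" cue leaves the model to continue with the character's reply.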