Sensualize Mixtral AWQ is an open-source language model published by TheBloke. Features: 46.7B parameters, VRAM: 24.7GB, Context: 32K, License: cc-by-nc-4.0, AWQ-quantized, LLM Explorer Score: 0.11.
An experimental model trained in the Alpaca format. It is a roleplay-oriented model, specifically of the ERP type, whose performance varies with the prompt. Recommended settings are the Universal-Light or Universal-Creative presets in SillyTavern.
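Because the model was trained on the Alpaca format, prompts should follow that instruction/response layout. Below is a minimal sketch of loading the AWQ checkpoint with Hugging Face transformers (which can load AWQ models when the autoawq package is installed) and sending one Alpaca-style prompt; the sampling values are illustrative stand-ins, not the SillyTavern presets themselves.

```python
# Minimal sketch: load the AWQ quant via Hugging Face transformers
# (requires the `autoawq` package) and run one Alpaca-format prompt.
# Sampling values are illustrative, not the Universal-Light preset.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/Sensualize-Mixtral-AWQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # AWQ weights run in fp16
    device_map="auto",          # expects roughly 24.7GB of VRAM
)

# Alpaca instruction layout, per the training notes.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nDescribe the tavern the party has just entered.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.9,
    top_p=0.9,
)
# Print only the newly generated tokens.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:],
                       skip_special_tokens=True))
```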
Training Details
Data Sources: a randomised 60K-sample subset of Full120k, plus the author's own NSFW instruct and de-alignment data.
Data Volume: 80M tokens.
Methodology: fine-tuned in the Alpaca format using Charles Goddard's ZLoss and Megablocks-based fork of transformers.
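Since the methodology names the Alpaca format, the sketch below spells out the standard Alpaca template as a small helper. The preamble wording and the optional "### Input:" field follow the original Alpaca release; it is an assumption that this model's training data used them verbatim.

```python
# Hypothetical helper illustrating the standard Alpaca template the
# methodology refers to. Assumption: this model's training data used
# the preamble and the optional "### Input:" field verbatim.
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    preamble = (
        "Below is an instruction that describes a task"
        + (", paired with an input that provides further context"
           if input_text else "")
        + ". Write a response that appropriately completes the request.\n\n"
    )
    prompt = preamble + f"### Instruction:\n{instruction}\n\n"
    if input_text:
        prompt += f"### Input:\n{input_text}\n\n"
    return prompt + "### Response:\n"

# Example usage:
print(build_alpaca_prompt("Summarise the scene so far.",
                          "The party rests at camp."))
```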