Sensualize Mixtral AWQ by TheBloke


Sensualize Mixtral AWQ is an open-source language model quantized and published by TheBloke. Key figures: 46.7B parameters, 24.7 GB of required VRAM, 32K context length, cc-by-nc-4.0 license, 4-bit AWQ quantization, LLM Explorer Score: 0.11.

Tags: 4-bit, AWQ, Base model:quantized:sao10k/se..., Base model:sao10k/sensualize-m..., Dataset:nobodyexistsontheinter..., Mixtral, Quantized, Region:us, Safetensors, Sharded, Tensorflow
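
AWQ is a 4-bit, weight-only quantization scheme, which is what brings this 46.7B-parameter Mixtral down to roughly 24.7 GB of VRAM. Below is a minimal loading sketch using Hugging Face Transformers, which loads AWQ checkpoints natively when the autoawq package is installed (transformers >= 4.35); the prompt and generation settings are illustrative assumptions, not part of the model card.

```python
# Minimal sketch: load the AWQ checkpoint with Transformers.
# Assumes `pip install transformers autoawq` and a GPU with ~25 GB of free VRAM.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/Sensualize-Mixtral-AWQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",       # spreads the three safetensors shards across available GPUs
    low_cpu_mem_usage=True,
)

inputs = tokenizer("Hello, my name is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```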

Sensualize Mixtral AWQ Benchmarks

Score (%) — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Sensualize Mixtral AWQ (TheBloke/Sensualize-Mixtral-AWQ)

Sensualize Mixtral AWQ Parameters and Internals

Model Type: mixtral

Additional Notes: Experimental model trained in the Alpaca format. It is a roleplay-oriented (specifically ERP) model whose performance varies with the prompt. Recommended settings: the Universal-Light or Universal-Creative preset in SillyTavern.

Training Details:
Data Sources: randomised subset of Full120k (60K samples), plus the author's own NSFW instruct and de-alignment data
Data Volume: 80M tokens
Methodology: Alpaca format, using Charles Goddard's ZLoss and Megablocks-based fork of transformers
Training Time: 12 hours
Hardware Used: 2x A100 at batch size 5, gradient accumulation 5
Input/Output:
Input Format (Alpaca-style; see the sketch below):
### Instruction: {system_message} ### Input: {prompt} ### Response:
Output Format: free-form response text
Performance Tips: With the right settings this model can shine; the Universal-Light or Universal-Creative preset in SillyTavern is recommended.
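
The card shows the template inline; here is a minimal helper that fills it in, assuming the standard Alpaca convention of separating the sections with blank lines (the exact whitespace is an assumption, not specified by the card):

```python
# Minimal sketch of the Alpaca-style prompt template documented above.
# The newline placement follows common Alpaca usage and is an assumption.
def build_prompt(system_message: str, prompt: str) -> str:
    return (
        "### Instruction:\n"
        f"{system_message}\n\n"
        "### Input:\n"
        f"{prompt}\n\n"
        "### Response:\n"
    )

print(build_prompt("You are a roleplay assistant.", "Describe the tavern as the party enters."))
```
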
LLM Name: Sensualize Mixtral AWQ
Repository: https://huggingface.co/TheBloke/Sensualize-Mixtral-AWQ
Model Name: Sensualize Mixtral
Model Creator: Saofiq
Base Model(s): Sensualize Mixtral Bf16 (Sao10K/Sensualize-Mixtral-bf16)
Model Size: 46.7b
Required VRAM: 24.7 GB
Updated: 2026-04-03
Maintainer: TheBloke
Model Type: mixtral
Model Files: 10.0 GB (1-of-3), 10.0 GB (2-of-3), 4.7 GB (3-of-3)
AWQ Quantization: Yes
Quantization Type: awq
Model Architecture: MixtralForCausalLM
License: cc-by-nc-4.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.37.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
Vocabulary Size: 32000
Torch Data Type: float16
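
For throughput-oriented serving, vLLM also loads AWQ checkpoints directly. A minimal sketch, with the model ID, quantization type, dtype, and context length taken from the table above and the sampling settings as illustrative assumptions:

```python
# Minimal sketch: serve the AWQ checkpoint with vLLM (pip install vllm).
from vllm import LLM, SamplingParams

llm = LLM(
    model="TheBloke/Sensualize-Mixtral-AWQ",
    quantization="awq",       # Quantization Type from the table above
    dtype="float16",          # Torch Data Type from the table above
    max_model_len=32768,      # Context Length from the table above
)

params = SamplingParams(temperature=0.8, max_tokens=128)
outputs = llm.generate(["### Instruction:\nSay hello.\n\n### Response:\n"], params)
print(outputs[0].outputs[0].text)
```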

Best Alternatives to Sensualize Mixtral AWQ

Best Alternatives | Context / RAM | Downloads | Likes
Dolphin 2.7 Mixtral 8x7b AWQ | 32K / 24.7 GB | 3671 | 23
Open Gpt4 8x7B AWQ | 32K / 24.7 GB | 384 | 32
Mixtral 8x7b V0.1 AWQ | 32K / 24.7 GB | 2718 | 11
Mixtral Instruct AWQ | 32K / 24.7 GB | 620 | 43
H2ogpt Mixtral 8x7b 32K AWQ | 32K / 24.7 GB | 15 | 0
... Hermes 2 Mixtral 8x7B DPO AWQ | 32K / 24.7 GB | 111 | 22
Synatra Mixtral 8x7B AWQ | 32K / 24.7 GB | 2 | 3
SauerkrautLM Mixtral 8x7B AWQ | 32K / 27.4 GB | 6 | 1
Functionary Medium V2.2 AWQ | 32K / 24.7 GB | 3 | 1
... Hermes 2 Mixtral 8x7B SFT AWQ | 32K / 24.7 GB | 3 | 3



Original data from HuggingFace, OpenCompass and various public git repos.