Mistral 11B OmniMix GPTQ is an open-source language model quantized by TheBloke. Features: 11B parameters, 6 GB VRAM, 32K context, cc-by-nc-4.0 license, quantized, LLM Explorer Score: 0.1.
This model appears to be primarily a test of merge and layer-rotation techniques and is not explicitly optimized for any particular practical application at the moment.
Additional Notes
This model is published in multiple GPTQ (post-training quantization) configurations, each tailored to a different trade-off between VRAM usage and accuracy.
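A minimal loading sketch, assuming transformers with the auto-gptq (or optimum) backend installed; the non-default branch name shown in the comment is illustrative only, so check the repository's branch list for the actual quantization variants.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name_or_path = "TheBloke/Mistral-11B-OmniMix-GPTQ"

# "main" holds the default 4-bit quantization; other GPTQ variants live on
# separate branches (e.g. something like "gptq-8bit-128g-actorder_True" --
# hypothetical name, verify against the repo).
model = AutoModelForCausalLM.from_pretrained(
    model_name_or_path,
    device_map="auto",      # place layers across available GPUs/CPU
    revision="main",
)
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
```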
Training Details
Methodology:
Merging and layer experimentation involving multiple base models, aimed at achieving higher benchmark scores.
Model Architecture:
Combination of multiple Mistral 7B models using layer slicing and slerp merge methods with specific filter parameters for each component.
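For illustration, the sketch below shows what a slerp (spherical linear interpolation) merge does to a single pair of weight tensors. It is a generic example of the technique, not the actual merge recipe or filter parameters used for this model.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    Treats each tensor as a flat vector and interpolates along the arc
    between them; falls back to plain linear interpolation when the
    vectors are nearly colinear.
    """
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_dir = a_flat / (a_flat.norm() + eps)
    b_dir = b_flat / (b_flat.norm() + eps)
    dot = torch.clamp(torch.dot(a_dir, b_dir), -1.0, 1.0)
    omega = torch.arccos(dot)                   # angle between the two directions
    if omega.abs() < eps:                       # nearly parallel -> plain lerp
        merged = (1.0 - t) * a_flat + t * b_flat
    else:
        sin_omega = torch.sin(omega)
        merged = (torch.sin((1.0 - t) * omega) / sin_omega) * a_flat \
               + (torch.sin(t * omega) / sin_omega) * b_flat
    return merged.reshape(a.shape).to(a.dtype)
```

In a real merge this interpolation would be applied per layer, with the interpolation factor t varied according to the filter parameters mentioned above.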
Input Output
Input Format:
<|system|> Below is an instruction that describes a task. Write a response that appropriately completes the request. <|user|> {prompt} <|assistant|>
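A short usage sketch that fills the template above and generates a reply, reusing the model and tokenizer from the loading example. The exact whitespace between the special tags and the sampling parameters are assumptions, not values taken from the model card.

```python
prompt = "Explain the difference between 4-bit and 8-bit GPTQ quantization."

# Prompt template from the model card; newline placement is an assumption.
template = (
    "<|system|>\nBelow is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n"
    "<|user|>\n{prompt}\n<|assistant|>\n"
)

input_ids = tokenizer(template.format(prompt=prompt), return_tensors="pt").input_ids.to(model.device)
output_ids = model.generate(
    input_ids,
    max_new_tokens=256,   # placeholder generation settings
    temperature=0.7,
    do_sample=True,
)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True))
```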