Minotaur 15B GPTQ by TheBloke


Arxiv:1911.02150, Arxiv:2205.14135, Arxiv:2207.14255, Arxiv:2305.06161, 4-bit, Autotrain compatible, Code, Codegen, Dataset:bigcode/the-stack-dedup, Dataset:camel-ai/biology, Dataset:camel-ai/chemistry, Dataset:camel-ai/math, Dataset:camel-ai/physics, Dataset:ehartford/wizardlm_alpaca_evol_instruct_70k_unfiltered, Dataset:gsm8k, Dataset:hellaswag, Dataset:metaeval/scienceqa_text_only, Dataset:openai/summarize_from_feedback, Dataset:qingyisi/alpaca-cot, Dataset:riddle_sense, Dataset:teknium/gpteacher-general-instruct, Dataset:tiiuae/falcon-refinedweb, Dataset:winglian/evals, Gpt bigcode, Gptq, Instruct, Quantized, Region:us, Safetensors


Minotaur 15B GPTQ Parameters and Internals

Model Type 
text-generation
Use Cases 
Areas:
research, commercial applications
Applications:
text generation, code generation
Primary Use Cases:
instruct fine-tuning, prose generation, code generation
Limitations:
Not suitable for non-English text; may produce biased or stereotyped outputs.
Considerations:
Use caution with generated code as it may contain errors or inefficiencies.
Additional Notes 
The model was trained with a Fill-in-the-Middle (FIM) objective, so it can infill text between a given prefix and suffix in addition to standard left-to-right generation (see the sketch below).
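Below is a minimal illustration of how a FIM prompt is typically assembled for StarCoder-family models. The `<fim_prefix>`/`<fim_suffix>`/`<fim_middle>` special tokens come from the StarCoder base model; whether the Minotaur fine-tune preserves usable infilling behavior is an assumption.

```python
# Sketch only: assumes the StarCoder FIM special tokens survive fine-tuning.
prefix = 'def reverse_words(s):\n    """Reverse the words in a string."""\n    '
suffix = "\n    return result\n"

# The model sees the prefix and the suffix, then generates the missing middle.
fim_prompt = f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

# Feed fim_prompt to model.generate() like any other prompt; generation of the
# middle span ends when the model emits its end-of-text token.
```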
Supported Languages 
English (high)
Training Details 
Data Sources:
ehartford/WizardLM_alpaca_evol_instruct_70k_unfiltered, QingyiSi/Alpaca-CoT, teknium/GPTeacher-General-Instruct, metaeval/ScienceQA_text_only, openai/summarize_from_feedback, camel-ai/math, camel-ai/physics, camel-ai/chemistry, camel-ai/biology, winglian/evals, hellaswag, riddle_sense, gsm8k
Methodology:
Fine-tuned from StarCoder on open datasets focusing on instruct fine-tuning. Utilizes QLoRA techniques.
Context Length:
8192
Training Time:
Approx. 30 hours per epoch
Hardware Used:
4× A100 80 GB
Model Architecture:
GPTBigCode model with multi-query attention, trained with a Fill-in-the-Middle objective.
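Multi-query attention shares a single key/value head across all query heads, which shrinks the KV cache by roughly a factor of the head count and speeds up batched decoding at the 8192-token context length. A small sketch to confirm these properties from the checkpoint's config (attribute names follow the `transformers` GPTBigCode implementation):

```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained("TheBloke/minotaur-15B-GPTQ")
print(config.model_type)   # "gpt_bigcode"
print(config.multi_query)  # True -> one shared K/V head instead of one per query head
print(config.n_positions)  # 8192, the context length listed above
```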
Responsible AI Considerations 
Fairness:
Not aligned to human preferences; can produce problematic outputs.
Transparency:
Users should apply proper attribution to generated code.
Accountability:
Users are responsible for complying with licensing requirements.
Input Output 
Input Format:
Prompts following the 'USER:' and 'ASSISTANT:' turn format (see the generation sketch after this section).
Accepted Modalities:
text
Output Format:
Text
Performance Tips:
Use the latest versions of the required software for optimal performance.
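A minimal generation sketch assuming the 'USER:'/'ASSISTANT:' format above. The exact whitespace around the turns is an assumption, and loading this GPTQ checkpoint through `transformers` requires a GPTQ backend (e.g. optimum with auto-gptq, or gptqmodel) plus accelerate for `device_map="auto"`:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "TheBloke/minotaur-15B-GPTQ"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" places the ~9.2 GB of 4-bit weights on the available GPU(s).
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "USER: Write a Python function that checks whether a string is a palindrome.\nASSISTANT:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```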
Release Notes 
Version:
unknown
Date:
unknown
Notes:
Minotaur 15B was trained with Axolotl on 4× A100 80 GB GPUs, taking approximately 30 hours per epoch.
LLM Name: Minotaur 15B GPTQ
Repository: 🤗 https://huggingface.co/TheBloke/minotaur-15B-GPTQ
Model Size: 15b
Required VRAM: 9.2 GB
Updated: 2025-06-09
Maintainer: TheBloke
Model Type: gpt_bigcode
Instruction-Based: Yes
Model Files: 9.2 GB
GPTQ Quantization: Yes
Quantization Type: gptq
Generates Code: Yes
Model Architecture: GPTBigCodeForCausalLM
Transformers Version: 4.28.1
Tokenizer Class: GPT2Tokenizer
Vocabulary Size: 49152
Torch Data Type: float32
Activation Function: gelu
Minotaur 15B GPTQ (TheBloke/minotaur-15B-GPTQ)
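The 9.2 GB figure is consistent with 4-bit weights: 15e9 parameters × 0.5 bytes ≈ 7.5 GB, plus per-group quantization scales and zero-points. At the time of this release, TheBloke's GPTQ checkpoints were commonly loaded with the AutoGPTQ library directly; a hedged sketch follows (older AutoGPTQ versions may additionally need an explicit `model_basename`):

```python
from auto_gptq import AutoGPTQForCausalLM
from transformers import AutoTokenizer

model_id = "TheBloke/minotaur-15B-GPTQ"
tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=True)

# Loads the 4-bit safetensors weights (~9.2 GB); a single 16 GB GPU is
# typically enough once activations and the KV cache are accounted for.
model = AutoGPTQForCausalLM.from_quantized(
    model_id,
    device="cuda:0",
    use_safetensors=True,
    use_triton=False,  # CUDA kernel path; set True only with a Triton setup
)

prompt = "USER: Summarize what GPTQ quantization does.\nASSISTANT:"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=128)[0]))
```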

Best Alternatives to Minotaur 15B GPTQ

Best Alternatives                   Context / RAM   Downloads   Likes
WizardCoderSQL 15B V1.0             0K / 31.2 GB    18          1
...r Skeleton Wizard Coder Merged   0K / 31.2 GB    26          13
...der Natsql Wizard Coder Merged   0K / 32.7 GB    24          1
Minotaur 15B                        0K / 31.2 GB    19          15


Original data from Hugging Face, OpenCompass, and various public Git repos.
Release v20241124