OpenOrca Zephyr 7B GPTQ by TheBloke


OpenOrca Zephyr 7B GPTQ is an open-source language model by TheBloke. Features: 7b LLM, VRAM: 4.2GB, Context: 32K, License: cc-by-nc-4.0, Quantized, LLM Explorer Score: 0.11.

Tags: 4-bit · Autotrain compatible · Base model: quantized: weyaxi/op... · Base model: weyaxi/openorca-zep... · Gptq · Mistral · Quantized · Region: us · Safetensors


OpenOrca Zephyr 7B GPTQ Parameters and Internals

Model Type 
mistral
Additional Notes 
Quantization details: Multiple GPTQ parameter options are available; the specific bit rates, group sizes, and other settings are listed in the repository documentation. The variants cover both GPU and CPU inference, with 4-bit and 8-bit forms offering distinct VRAM trade-offs.
Input Output 
Input Format:
<|system|>
{system_message}
<|user|>
{prompt}
<|assistant|>
Output Format:
Varies based on inference execution
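The chat template above can be applied with a small helper; `build_prompt` is an illustrative name for this sketch, not part of the model's API:

```python
def build_prompt(system_message: str, prompt: str) -> str:
    # Zephyr-style chat template from the card above:
    # <|system|> / <|user|> / <|assistant|> markers, each followed by a newline.
    return (
        f"<|system|>\n{system_message}\n"
        f"<|user|>\n{prompt}\n"
        f"<|assistant|>\n"
    )
```

The trailing `<|assistant|>` marker is left open so the model continues generation from there.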
LLM Name: OpenOrca Zephyr 7B GPTQ
Repository 🤗: https://huggingface.co/TheBloke/OpenOrca-Zephyr-7B-GPTQ
Model Name: OpenOrca Zephyr 7B
Model Creator: Yağız Çalık
Base Model(s): OpenOrca Zephyr 7B (Weyaxi/OpenOrca-Zephyr-7B)
Model Size: 7b
Required VRAM: 4.2 GB
Updated: 2025-09-23
Maintainer: TheBloke
Model Type: mistral
Model Files: 4.2 GB
GPTQ Quantization: Yes
Quantization Type: gptq
Model Architecture: MistralForCausalLM
License: cc-by-nc-4.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.35.2
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32000
Torch Data Type: bfloat16

Best Alternatives to OpenOrca Zephyr 7B GPTQ

Best Alternatives | Context / RAM | Downloads | Likes
...enHermes 2.5 Mistral 7B Marlin | 32K / 4.1 GB | 58 | 42
Zephyr 7B Beta Marlin | 32K / 4.1 GB | 31 | 0
Mistral 7B Instruct V0.3 GPTQ | 32K / 4.2 GB | 6282 | 41
Mistral 7B Instruct V0.2 GPTQ | 32K / 4.2 GB | 47360 | 54
...ral 7B Instruct V0.3 GPTQ 4bit | 32K / 4.2 GB | 2946 | 18
...ral 7B Instruct V0.3 GPTQ 4bit | 32K / 4.2 GB | 2095 | 23
...istral 7B Pruned50 GPTQ Marlin | 32K / 4 GB | 6 | 0
Mistral 7B Unsloth Gptq 8bit | 32K / 7.7 GB | 7 | 0
Cosmosage V2 | 32K / 4.2 GB | 7 | 4
...phyr 7B Beta Assistant V1 Gptq | 32K / 4.2 GB | 1 | 1


Original data from HuggingFace, OpenCompass and various public git repos.