Rezephyr Merged 4bit by BarraHome


Rezephyr Merged 4bit is an open-source language model by BarraHome. Features: 7.5B parameters, 4.1 GB VRAM, 32K context, apache-2.0 license, quantized, merged model. LLM Explorer Score: 0.12.
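The 4.1 GB VRAM figure is consistent with back-of-the-envelope 4-bit arithmetic. A rough sketch (the overhead attribution is an illustrative assumption, not a published breakdown from the card):

```python
# Rough VRAM estimate for a 7.5B-parameter model quantized to 4 bits.
params = 7.5e9            # parameter count from the card
bytes_per_param = 0.5     # 4 bits = 0.5 bytes per parameter
weights_gb = params * bytes_per_param / 1e9
print(f"quantized weights: {weights_gb:.2f} GB")  # 3.75 GB
# Quantization scales/zero-points plus non-quantized layers (embeddings,
# norms) add a few hundred MB, which lines up with the listed 4.1 GB.
```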

Tags: Merged Model, 4-bit, Base model (quantized): unsloth/z..., Base model: unsloth/zephyr-sft, Bitsandbytes, Dataset: yahma/alpaca-cleaned, En, Endpoints compatible, Mistral, Quantized, Region: us, Safetensors, Text-embeddings-inference, Trl, Unsloth

Rezephyr Merged 4bit Benchmarks

Benchmark scores (shown as percentages) compare the model to the reference models: Anthropic's Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").

Rezephyr Merged 4bit Parameters and Internals

Model Type
text-generation-inference, transformers, unsloth, mistral, trl

Additional Notes
This Mistral model was trained 2x faster with Unsloth and Hugging Face's TRL library.

Training Details
Data Sources: yahma/alpaca-cleaned
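Records in yahma/alpaca-cleaned carry `instruction`, `input`, and `output` fields, and a common way to feed them to an SFT trainer is the Alpaca prompt template. A minimal formatting helper is sketched below; the template wording follows the widely used Alpaca convention, and whether this exact template was used for this model is an assumption:

```python
def format_alpaca(example: dict) -> str:
    """Render one alpaca-cleaned record into a single training string.

    The template is the standard Alpaca convention; the card does not
    state the exact template used for this model.
    """
    if example.get("input"):
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{example['instruction']}\n\n"
            f"### Input:\n{example['input']}\n\n"
            f"### Response:\n{example['output']}"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{example['instruction']}\n\n"
        f"### Response:\n{example['output']}"
    )

example = {"instruction": "Name three primary colors.", "input": "",
           "output": "Red, blue, and yellow."}
print(format_alpaca(example))
```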
LLM Name: Rezephyr Merged 4bit
Repository: https://huggingface.co/BarraHome/rezephyr_merged_4bit
Base Model(s): unsloth/zephyr-sft (Zephyr Sft)
Merged Model: Yes
Model Size: 7.5b
Required VRAM: 4.1 GB
Updated: 2026-04-20
Maintainer: BarraHome
Model Type: mistral
Model Files: 4.1 GB
Supported Languages: en
Quantization Type: 4bit
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.37.1
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
Vocabulary Size: 32000
Torch Data Type: bfloat16

Quantized Models of the Rezephyr Merged 4bit

Model               Likes   Downloads   VRAM
Rezephyr DPO        0       5           14 GB
Rezephyr DPO GGUF   0       12          5 GB

Best Alternatives to Rezephyr Merged 4bit

Best Alternatives                    Context / RAM    Downloads   Likes
Mixtral AI SwahiliTron 4BIT          32K / 4.1 GB     11          0
Zephyr DPO 4bit                      32K / 4.1 GB     85          0
ClimateChat                          32K / 15 GB      49          1
...iceday Finetune Articles V4 V3    32K / 4.7 GB     7           0
Technical Analysis Unsloth V3        32K / 4.1 GB     96          3
0dAI 7.5B V2 4bpw                    32K / 3.8 GB     5           3
0dAI 7.5B V2 8bpw                    32K / 7.3 GB     9           0
Mistral MoE Lora                     32K / 4.1 GB     5           1
Legislinho                           32K / 4.1 GB     28          2
Mymodel V2                           32K / 30.1 GB    5           0
Note: a score shown in green (e.g. "73.2") indicates that the model outperforms BarraHome/rezephyr_merged_4bit.



Original data from HuggingFace, OpenCompass and various public git repos.