Zarablend MX L2 7B GGML by TheBloke


Base model (finetune): zarakiquemparte/zarablend-mx-l2-7b
Tags: ggml, llama, llama2, quantized, region:us

Zarablend MX L2 7B GGML Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Zarablend MX L2 7B GGML (TheBloke/Zarablend-MX-L2-7B-GGML)

Zarablend MX L2 7B GGML Parameters and Internals

Model Type: llama

Use Cases
Areas: Research, commercial applications
Primary Use Cases: Text generation
Limitations: Not intended for supplying factual information or advice

Additional Notes
The model was created by merging Nous Hermes Llama2 7B and Airoboros L2 7B GPT4 m2.0 using the creator's merge scripts. The merged result was then combined with the LimaRP Llama2 7B LoRA.
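This page does not reproduce the merge recipe itself; the following is a minimal sketch of what a blend-then-LoRA workflow like the one described can look like with transformers and peft. The repository IDs, the 50/50 averaging ratio, and the output path are illustrative assumptions, not the creator's actual scripts.

```python
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Load the two source fine-tunes (repo IDs assumed for illustration).
base_a = AutoModelForCausalLM.from_pretrained(
    "NousResearch/Nous-Hermes-llama-2-7b", torch_dtype=torch.float16
)
base_b = AutoModelForCausalLM.from_pretrained(
    "jondurbin/airoboros-l2-7b-gpt4-m2.0", torch_dtype=torch.float16
)

# Average the two models' weights 50/50 (ratio assumed, not the actual recipe).
state_b = base_b.state_dict()
merged = {k: v * 0.5 + state_b[k] * 0.5 for k, v in base_a.state_dict().items()}
base_a.load_state_dict(merged)

# Apply the LimaRP LoRA on top of the blend and fold it into the weights.
model = PeftModel.from_pretrained(base_a, "lemonilia/limarp-llama2")  # repo ID assumed
model = model.merge_and_unload()
model.save_pretrained("zarablend-mx-l2-7b")
```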
Training Details
Methodology: Model merging using scripts

Input Output
Input Format: Alpaca 2 format
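The page only names the prompt format. As a rough sketch, an Alpaca-style prompt builder could look like the snippet below; the exact headers and spacing are an assumption, so check the original Zarablend MX L2 7B model card for the verbatim template.

```python
# Assumed Alpaca-2-style template; verify against the upstream model card.
def build_prompt(instruction: str) -> str:
    return (
        "### Instruction:\n\n"
        f"{instruction}\n\n"
        "### Response:\n\n"
    )

print(build_prompt("Write a short greeting."))
```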
LLM Name: Zarablend MX L2 7B GGML
Repository: https://huggingface.co/TheBloke/Zarablend-MX-L2-7B-GGML
Model Name: Zarablend MX L2 7B
Model Creator: Zaraki Quem Parte
Base Model(s): Zarablend MX L2 7B (zarakiquemparte/zarablend-mx-l2-7b)
Model Size: 7b
Required VRAM: 2.9 GB
Updated: 2025-09-23
Maintainer: TheBloke
Model Type: llama
Model Files: 2.9 GB, 3.6 GB, 3.3 GB, 3.0 GB, 3.8 GB, 4.2 GB, 4.1 GB, 3.8 GB, 4.7 GB, 5.1 GB, 4.8 GB, 4.7 GB, 5.5 GB, 7.1 GB
GGML Quantization: Yes
Quantization Type: ggml
Model Architecture: AutoModel
License: llama2
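GGML is a legacy quantization format that was superseded by GGUF, so current llama.cpp builds no longer read these files directly; they are typically loaded with an older llama.cpp release or with the ctransformers library. A minimal sketch with ctransformers follows; the quant filename is an assumption, so check the repository's file list for the exact name.

```python
# Minimal sketch: loading a GGML quant of this model with ctransformers.
from ctransformers import AutoModelForCausalLM

llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/Zarablend-MX-L2-7B-GGML",
    model_file="zarablend-mx-l2-7b.ggmlv3.q4_K_M.bin",  # assumed filename
    model_type="llama",
    gpu_layers=0,  # raise to offload layers to a GPU if one is available
)

prompt = "### Instruction:\n\nWrite a short greeting.\n\n### Response:\n\n"
print(llm(prompt, max_new_tokens=64))
```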

Best Alternatives to Zarablend MX L2 7B GGML

Best Alternatives | Context / RAM | Downloads / Likes
Llama 2 7B Chat GGML | 0K / 2.9 GB | 868871
Llama 2 GGML Medical Chatbot | 0K / GB | 1040
Llama 2 7B GGML | 0K / 2.9 GB | 97220
CodeLlama 7B GGML | 0K / 3 GB | 2027
Yarn Llama 2 7B 128K GGML | 0K / 2.9 GB | 66
Yarn Llama 2 7B 64K GGML | 0K / 2.9 GB | 73
CodeLlama 7B Python GGML | 0K / 2.9 GB | 724
CodeLlama 7B Instruct GGML | 0K / 3 GB | 1620
Airoboros L2 7B 2.1 GGML | 0K / 2.9 GB | 71
Zarafusionex 1.1 L2 7B GGML | 0K / 2.9 GB | 52
Note: a green score (e.g. "73.2") means that model is better than TheBloke/Zarablend-MX-L2-7B-GGML.

Rank the Zarablend MX L2 7B GGML Capabilities

Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference!

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from Hugging Face, OpenCompass, and various public Git repositories.
Release v20241124