Laser Dolphin Mixtral 2x7b DPO AWQ by macadeliccc


Laser Dolphin Mixtral 2x7b DPO AWQ is an open-source language model by macadeliccc. Features: 2b LLM, VRAM: 7.1 GB, Context: 32K, License: cc, MoE, AWQ-quantized (4-bit), LLM Explorer Score: 0.12.

Tags: 4-bit, AutoTrain compatible, AWQ, Endpoints compatible, Mixtral, MoE, Quantized, Region: us, Safetensors
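
Because the checkpoint ships as 4-bit AWQ safetensors, it can be loaded directly with Hugging Face Transformers. Below is a minimal sketch, assuming the autoawq and accelerate packages are installed and a CUDA GPU with roughly the listed 7.1 GB of free VRAM is available; the prompt text is illustrative only.

```python
# Minimal sketch: load the AWQ-quantized checkpoint and generate text.
# Assumes: pip install transformers autoawq accelerate, plus a CUDA GPU.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "macadeliccc/laser-dolphin-mixtral-2x7b-dpo-AWQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # place the 4-bit weights on the available GPU
)

prompt = "Explain mixture-of-experts routing in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Transformers detects the AWQ quantization from the checkpoint's config, so no explicit quantization arguments are needed at load time.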

Laser Dolphin Mixtral 2x7b DPO AWQ Benchmarks

Benchmark scores compare the model (macadeliccc/laser-dolphin-mixtral-2x7b-dpo-AWQ) against the reference models Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").

Laser Dolphin Mixtral 2x7b DPO AWQ Parameters and Internals

Additional Notes: Quantization: 4-bit
LLM Name: Laser Dolphin Mixtral 2x7b DPO AWQ
Repository: https://huggingface.co/macadeliccc/laser-dolphin-mixtral-2x7b-dpo-AWQ
Model Size: 2b
Required VRAM: 7.1 GB
Updated: 2025-09-23
Maintainer: macadeliccc
Model Type: mixtral
Model Files: 7.1 GB
AWQ Quantization: Yes
Quantization Type: awq
Model Architecture: MixtralForCausalLM
License: cc
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.37.2
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Torch Data Type: float16
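
The internals listed above can be cross-checked without downloading the full weights by reading the model's config and tokenizer metadata from the Hub. A small sketch, assuming network access to the Hugging Face Hub; the values in the comments are the expected ones from the table above.

```python
# Sketch: verify the listed internals from the hosted config/tokenizer.
from transformers import AutoConfig, AutoTokenizer

model_id = "macadeliccc/laser-dolphin-mixtral-2x7b-dpo-AWQ"

config = AutoConfig.from_pretrained(model_id)
print(config.model_type)               # expected: "mixtral"
print(config.max_position_embeddings)  # expected: 32768 (the 32K context)
print(config.vocab_size)               # expected: 32000

tokenizer = AutoTokenizer.from_pretrained(model_id)
print(type(tokenizer).__name__)        # expected: LlamaTokenizer(Fast)
```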

Best Alternatives to Laser Dolphin Mixtral 2x7b DPO AWQ

Best Alternatives          Context / RAM    Downloads  Likes
Mixtral 7Bx2 MoE AWQ       32K / 7.1 GB     8          2
Manbasya 2x7b MoE          32K / 7.1 GB     60         0

Original data from HuggingFace, OpenCompass, and various public Git repositories.