WizardLM 2 4x7B MoE by Skylaude

Tags: Autotrain compatible, Endpoints compatible, Merge, Mergekit, Microsoft/wizardlm-2-7b, Mistral, Mixtral, MoE, Region:us, Safetensors, Sharded, Tensorflow

WizardLM 2 4x7B MoE Benchmarks

WizardLM 2 4x7B MoE (Skylaude/WizardLM-2-4x7B-MoE)

WizardLM 2 4x7B MoE Parameters and Internals

Model Type: MoE, experimental
Additional Notes: Quantized versions are available for GPU-only, mixed GPU+CPU, or CPU-only inference.
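The listing does not point to a specific quantized build, so the snippet below is only a minimal sketch of running such a quantization with llama-cpp-python; the GGUF file name is a placeholder, and `n_gpu_layers` selects GPU-only, mixed, or CPU-only execution.

```python
from llama_cpp import Llama  # pip install llama-cpp-python

llm = Llama(
    model_path="WizardLM-2-4x7B-MoE.Q4_K_M.gguf",  # placeholder: any downloaded GGUF quantization
    n_ctx=8192,        # context window for this session
    n_gpu_layers=-1,   # -1 = offload all layers (GPU-only); 0 = CPU-only; anything between = mixed
)

result = llm("Summarize what a mixture-of-experts model is.", max_tokens=128)
print(result["choices"][0]["text"])
```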
Training Details
Methodology: Merges four WizardLM-2-7B models using the random gate mode
Context Length: 8000
Model Architecture: Mixture of Experts model
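The merge recipe itself is not reproduced here, so the following is a hypothetical sketch of a mergekit-moe configuration matching the description above (four WizardLM-2-7B experts, random gate mode); the source repository id, config file name, and output directory are assumptions.

```python
import pathlib
import subprocess
import textwrap

# Assumed mergekit-moe configuration: four identical WizardLM-2-7B experts with
# randomly initialised router weights (gate_mode: random). The repo id is an assumption.
config = textwrap.dedent("""\
    base_model: microsoft/WizardLM-2-7B
    gate_mode: random          # router weights are initialised randomly
    dtype: float16
    experts:
      - source_model: microsoft/WizardLM-2-7B
        positive_prompts: []   # routing prompts are unused with the random gate mode
      - source_model: microsoft/WizardLM-2-7B
        positive_prompts: []
      - source_model: microsoft/WizardLM-2-7B
        positive_prompts: []
      - source_model: microsoft/WizardLM-2-7B
        positive_prompts: []
""")

pathlib.Path("wizardlm-2-4x7b-moe.yml").write_text(config)

# mergekit's MoE entry point assembles a MixtralForCausalLM checkpoint from the config.
subprocess.run(["mergekit-moe", "wizardlm-2-4x7b-moe.yml", "./WizardLM-2-4x7B-MoE"], check=True)
```

Because the gate is randomly initialised, routing carries no learned signal, which is presumably why the listing recommends activating all four experts per token (see Input/Output below).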
Input Output
Accepted Modalities: text
Performance Tips: Set experts per token to 4 for the best results
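A minimal sketch of how that recommendation could be applied with Hugging Face transformers: Mixtral-style models default to two active experts per token, so the override is set on the config before the model is instantiated. The prompt and generation settings are illustrative.

```python
import torch
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_id = "Skylaude/WizardLM-2-4x7B-MoE"

config = AutoConfig.from_pretrained(model_id)
config.num_experts_per_tok = 4          # route every token through all four experts (default is 2)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    config=config,
    torch_dtype=torch.float16,          # weights are stored in float16 (see the table below)
    device_map="auto",                  # ~48 GB of weights; spread across available devices
)

prompt = "Explain the trade-offs of activating every expert per token."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```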
LLM Name: WizardLM 2 4x7B MoE
Repository: https://huggingface.co/Skylaude/WizardLM-2-4x7B-MoE
Model Size: 24.2b
Required VRAM: 48.4 GB
Updated: 2025-09-23
Maintainer: Skylaude
Model Type: mixtral
Model Files: 9.9 GB (1-of-5), 10.0 GB (2-of-5), 9.9 GB (3-of-5), 10.0 GB (4-of-5), 8.6 GB (5-of-5)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.39.3
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Torch Data Type: float16
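As a quick sanity check, most of these values can be read from the published config with transformers; the expected values in the comments are taken from the table above.

```python
from transformers import AutoConfig, AutoTokenizer

model_id = "Skylaude/WizardLM-2-4x7B-MoE"

config = AutoConfig.from_pretrained(model_id)
print(config.architectures)            # expected: ['MixtralForCausalLM']
print(config.max_position_embeddings)  # expected: 32768
print(config.vocab_size)               # expected: 32000
print(config.num_local_experts)        # expected: 4 (a 4x7B merge)

tokenizer = AutoTokenizer.from_pretrained(model_id)
print(tokenizer.pad_token)             # expected: <s>
```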

Best Alternatives to WizardLM 2 4x7B MoE

Best Alternatives | Context / RAM | Downloads | Likes
Dzakwan MoE 4x7b Beta | 32K / 48.4 GB | 9072 | 0
Beyonder 4x7B V3 | 32K / 48.3 GB | 9065 | 59
Mera Mix 4x7B | 32K / 48.3 GB | 9725 | 19
Calme 4x7B MoE V0.2 | 32K / 48.3 GB | 9763 | 2
Calme 4x7B MoE V0.1 | 32K / 48.3 GB | 9067 | 2
MixtureofMerges MoE 4x7b V5 | 32K / 48.3 GB | 9057 | 1
MixtureofMerges MoE 4x7b V4 | 32K / 48.3 GB | 9068 | 4
Proto Athena 4x7B | 32K / 48.4 GB | 5 | 0
CognitiveFusion2 4x7B BF16 | 32K / 48.3 GB | 9725 | 3
Proto Athena V0.2 4x7B | 32K / 48.4 GB | 6 | 0
Note: green Score (e.g. "73.2") means that the model is better than Skylaude/WizardLM-2-4x7B-MoE.
