Llama3.1 MoE 4X8B Gated IQ Multi Tier Deep Reasoning 32B by DavidAU


Tags: All genres, All use cases, Ar, Autotrain compatible, Backyard, Bfloat16, Bn, Context 128k, Conversational, Creative, Creative writing, De, Deep reasoning, Deep thinking, En, Endpoints compatible, Es, Fa, Fiction, Fr, Hi, Id, It, Ja, Ko, Llama 3.1, Llama-3, Llama-3.1, Llama3, Merge, Mergekit, Mixtral, Mixture of experts, Moe, Ms, Ne, Pl, Problem solving, Pt, Reasoning, Region:us, Ro, Role play, Roleplaying, Ru, Safetensors, Sharded, Sillytavern, Sr, Story, Sv, Tensorflow, Tool calls, Tool use, Tr, Uk, Vi, Writing, Zh

Llama3.1 MoE 4X8B Gated IQ Multi Tier Deep Reasoning 32B Benchmarks

Scores (nn.n%) indicate how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Llama3.1 MoE 4X8B Gated IQ Multi Tier Deep Reasoning 32B (DavidAU/Llama3.1-MOE-4X8B-Gated-IQ-Multi-Tier-Deep-Reasoning-32B)

Llama3.1 MoE 4X8B Gated IQ Multi Tier Deep Reasoning 32B Parameters and Internals

LLM Name: Llama3.1 MoE 4X8B Gated IQ Multi Tier Deep Reasoning 32B
Repository 🤗: https://huggingface.co/DavidAU/Llama3.1-MOE-4X8B-Gated-IQ-Multi-Tier-Deep-Reasoning-32B
Model Size: 24.9b
Required VRAM: 50.1 GB
Updated: 2025-07-29
Maintainer: DavidAU
Model Type: mixtral
Model Files: 4.9 GB (1-of-11), 5.0 GB (2-of-11), 4.9 GB (3-of-11), 5.0 GB (4-of-11), 5.0 GB (5-of-11), 4.9 GB (6-of-11), 5.0 GB (7-of-11), 5.0 GB (8-of-11), 4.9 GB (9-of-11), 4.4 GB (10-of-11), 1.1 GB (11-of-11)
Supported Languages: en, fr, de, es, pt, it, ja, ko, ru, zh, ar, fa, id, ms, ne, pl, ro, sr, sv, tr, uk, vi, hi, bn
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.50.0.dev0
Tokenizer Class: PreTrainedTokenizer
Padding Token: <|begin_of_text|>
Vocabulary Size: 128256
Torch Data Type: bfloat16
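The per-shard sizes in the file list account for the stated "Required VRAM" figure, and the 131072-token context length is the familiar "128K". A minimal Python sanity check (shard sizes copied verbatim from the Model Files row above):

```python
# Shard sizes in GB, as listed in the "Model Files" row (1-of-11 .. 11-of-11).
shard_sizes_gb = [4.9, 5.0, 4.9, 5.0, 5.0, 4.9, 5.0, 5.0, 4.9, 4.4, 1.1]

# Total on-disk weight size; rounding absorbs float accumulation error.
total_gb = round(sum(shard_sizes_gb), 1)
print(total_gb)  # → 50.1, matching the "Required VRAM" row

# Context length 131072 tokens is exactly 128K.
assert 131072 == 128 * 1024
```

Note that actual VRAM use at inference time will exceed the raw weight size once the KV cache for long contexts is allocated.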

Best Alternatives to Llama3.1 MoE 4X8B Gated IQ Multi Tier Deep Reasoning 32B

| Best Alternatives | Context / RAM | Downloads / Likes |
|---|---|---|
| ...ll Llama 3.1 Mad Scientist 24B | 128K / 50.1 GB | 50 |
| L3.1 ClaudeMaid 4x8B | 128K / 50.1 GB | 57 |
| L3.1 MoE 4x8B V0.1 | 128K / 50.1 GB | 63 |
| L3.1 MoE 4x8B V0.2 | 128K / 50.1 GB | 52 |
| Llama Salad 4x8B V3 | 8K / 50.1 GB | 36 |
| ...x8B Dark Planet Rebel FURY 25B | 8K / 50.1 GB | 71 |
| L3 MoE 4X8B Grand Horror 25B | 8K / 50.1 GB | 50 |
| ...oE 4x8B Dark Planet Rising 25B | 8K / 50.1 GB | 50 |
| OpenCrystal V4 L3 4x8B | 8K / 50 GB | 52 |
| L3 SnowStorm V1.15 4x8B B | 8K / 49.9 GB | 511 |
Note: a green score (e.g. "73.2") means that the model outperforms DavidAU/Llama3.1-MOE-4X8B-Gated-IQ-Multi-Tier-Deep-Reasoning-32B.



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124