L3 MoE 8X8B Dark Planet 8D Mirrored Chaos 47B by DavidAU


Tags: 8x8b, All genres, Autotrain compatible, Bfloat16, Conversational, Creative, Creative writing, Endpoints compatible, Fiction, Fiction story, Fiction writing, Horror, Llama, Llama-3, Llama3, Llama3 moe, Merge, Mergekit, Mixtral, Mixture of experts, Moe, Plot generation, Region:us, Roleplaying, Romance, Rp, Safetensors, Scene continue, Science fiction, Sharded, Story, Story generation, Storytelling, Sub-plot generation, Swearing, Tensorflow, Vivid prosing, Vivid writing, Writing

L3 MoE 8X8B Dark Planet 8D Mirrored Chaos 47B Benchmarks

Benchmark scores (shown as "nn.n%") indicate how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
L3 MoE 8X8B Dark Planet 8D Mirrored Chaos 47B (DavidAU/L3-MOE-8X8B-Dark-Planet-8D-Mirrored-Chaos-47B)

L3 MoE 8X8B Dark Planet 8D Mirrored Chaos 47B Parameters and Internals

LLM Name: L3 MoE 8X8B Dark Planet 8D Mirrored Chaos 47B
Repository 🤗: https://huggingface.co/DavidAU/L3-MOE-8X8B-Dark-Planet-8D-Mirrored-Chaos-47B
Model Size: 47.5B
Required VRAM: 94.9 GB
Updated: 2025-09-23
Maintainer: DavidAU
Model Type: mixtral
Model Files: 33 sharded safetensors files (2.9 GB each for shards 1-of-33 through 32-of-33, plus 2.1 GB for shard 33-of-33; 94.9 GB total)
Model Architecture: MixtralForCausalLM
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.46.2
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|begin_of_text|>
Vocabulary Size: 128256
Torch Data Type: bfloat16
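
The listed figures are internally consistent: 47.5B parameters at two bytes per bfloat16 weight comes to roughly 95 GB, which matches the required VRAM. As a minimal sketch (assuming `huggingface_hub` is installed), the shard count and total size can be checked from the checkpoint's standard `model.safetensors.index.json` without downloading the weights:

```python
# Sketch: inspect the sharded-checkpoint index without pulling ~95 GB of weights.
# Assumes `pip install huggingface_hub` and that the repo carries the standard
# index file transformers writes for sharded safetensors checkpoints.
import json
from huggingface_hub import hf_hub_download

repo = "DavidAU/L3-MOE-8X8B-Dark-Planet-8D-Mirrored-Chaos-47B"
index_path = hf_hub_download(repo_id=repo, filename="model.safetensors.index.json")

with open(index_path) as f:
    index = json.load(f)

shards = sorted(set(index["weight_map"].values()))
total_gb = index["metadata"]["total_size"] / 1e9  # assumes the usual total_size field
print(f"{len(shards)} shards, {total_gb:.1f} GB")  # expected: 33 shards, ~94.9 GB
```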

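A companion loading sketch with transformers follows; this is an illustrative example rather than the maintainer's documented usage, and it assumes enough combined GPU/CPU memory for the bfloat16 weights (`device_map="auto"` requires the `accelerate` package):

```python
# Sketch: load and sample from the model with transformers >= 4.46.
# Assumes ~95 GB of combined device memory; device_map="auto" (via the
# `accelerate` package) spreads the layers across available devices.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "DavidAU/L3-MOE-8X8B-Dark-Planet-8D-Mirrored-Chaos-47B"

tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,  # matches the listed Torch Data Type
    device_map="auto",
)

prompt = "Write the opening scene of a dark science-fiction story."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=256)  # context window is 8192 tokens
print(tokenizer.decode(out[0], skip_special_tokens=True))
```
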
Best Alternatives to L3 MoE 8X8B Dark Planet 8D Mirrored Chaos 47B

Best Alternatives                     Context / RAM     Downloads   Likes
Spring Chicken 8x8b                   8K / 95.1 GB      2           2
Llama 3 8B Instruct MoE 4             8K / 95.2 GB      5           0
Mixllama3 8x8b Instruct V0.1          8K / 95.3 GB      0           4
... Mirrored Chaos Uncensored 47B     128K / 94.9 GB    20          21
Note: a green score (e.g. "73.2") means that the listed alternative scores higher than DavidAU/L3-MOE-8X8B-Dark-Planet-8D-Mirrored-Chaos-47B.


Original data from HuggingFace, OpenCompass and various public git repos.