DeepSeek V3.0324 MoE Pruner E192 Bf16 by tflsxyy


DeepSeek V3.0324 MoE Pruner E192 Bf16 is an open-source large language model published by tflsxyy. Key specifications: 521B parameters, VRAM requirement: 285.4 GB, context length: 160K tokens, license: MIT, architecture: Mixture-of-Experts (MoE), LLM Explorer Score: 0.19.
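The headline VRAM figure is worth a sanity check. Below is a rough, weights-only estimate in Python (a sketch: it ignores KV cache, activations, and framework overhead, and the precision assumptions are ours, not the listing's). It suggests the listed 285.4 GB is closest to ~4-bit storage of the 521B weights, since the bf16 checkpoint alone would occupy roughly 970 GiB:

```python
# Back-of-the-envelope weight memory for a 521B-parameter model.
# Weights only: no KV cache, activations, or framework overhead.
PARAMS = 521e9  # parameter count from the listing

for precision, bytes_per_param in [("bf16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    gib = PARAMS * bytes_per_param / 1024**3
    print(f"{precision}: ~{gib:,.0f} GiB")
# Prints roughly: bf16 ~970 GiB, int8 ~485 GiB, int4 ~243 GiB
```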

Tags: Arxiv:2412.19437 · Conversational · Custom code · Deepseek v3 · Endpoints compatible · MoE · Region:us · Safetensors · Sharded · Tensorflow

DeepSeek V3.0324 MoE Pruner E192 Bf16 Benchmarks

Benchmark scores (shown as nn.n%) indicate how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
DeepSeek V3.0324 MoE Pruner E192 Bf16 (tflsxyy/DeepSeek-V3-0324-MoE-Pruner-E192-bf16)
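For reference, here is a minimal loading sketch using Hugging Face transformers. It is built on assumptions, not on instructions from this listing: trust_remote_code=True is inferred from the "Custom code" tag, device_map="auto" assumes accelerate is installed and enough GPU memory is available to shard the bf16 weights, and the prompt is purely illustrative.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tflsxyy/DeepSeek-V3-0324-MoE-Pruner-E192-bf16"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",      # checkpoint is stored in bf16
    device_map="auto",       # shard across available GPUs (requires accelerate)
    trust_remote_code=True,  # repo ships custom DeepSeek-V3 modeling code
)

prompt = "Explain mixture-of-experts pruning in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```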

Rank the DeepSeek V3.0324 MoE Pruner E192 Bf16 Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass, and various public Git repositories.