Medorca 2x7b by Technoculture

Merged Model   AutoTrain compatible   Endpoints compatible   epfl-llm/meditron-7b   microsoft/Orca-2-7b   Mixtral   MoE   Region: US   Safetensors   Sharded   TensorFlow

Medorca 2x7b Benchmarks

Medorca 2x7b (Technoculture/Medorca-2x7b)

Medorca 2x7b Parameters and Internals

Model Type              MoE (Mixture of Experts)
LLM Name                Medorca 2x7b
Repository 🤗           https://huggingface.co/Technoculture/Medorca-2x7b
Merged Model            Yes
Model Size              11.1b
Required VRAM           35.6 GB
Updated                 2025-09-20
Maintainer              Technoculture
Model Type              mixtral
Model Files             10.0 GB: 1-of-2, 3.5 GB: 2-of-2, 10.0 GB: 1-of-3, 10.0 GB: 2-of-3, 2.1 GB: 3-of-3
Model Architecture      MixtralForCausalLM
License                 apache-2.0
Context Length          4096
Model Max Length        4096
Transformers Version    4.36.2
Tokenizer Class         LlamaTokenizer
Padding Token           <s>
Vocabulary Size         32003
Torch Data Type         float32
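
For reference, a minimal loading sketch using the Hugging Face transformers library (the card lists version 4.36.2, which includes MixtralForCausalLM support). The model id and 4096-token context limit come from the parameters above; the half-precision dtype, the device_map setting (which requires the accelerate package), and the example prompt are assumptions added for illustration, not part of the card.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Technoculture/Medorca-2x7b"   # repository listed above

tokenizer = AutoTokenizer.from_pretrained(model_id)        # LlamaTokenizer, vocabulary size 32003
model = AutoModelForCausalLM.from_pretrained(
    model_id,                      # downloads and loads the sharded safetensors files automatically
    torch_dtype=torch.float16,     # assumption: half precision to roughly halve the 35.6 GB float32 footprint
    device_map="auto",             # assumption: accelerate installed; spreads layers across available devices
)

prompt = "List common symptoms of iron-deficiency anemia."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)     # prompt plus output must fit the 4096-token context
print(tokenizer.decode(outputs[0], skip_special_tokens=True))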

Best Alternatives to Medorca 2x7b

Best Alternatives          Context / RAM    Downloads   Likes
DevPearl 2x7B              16K / 22.3 GB    5           2
Phi3 4x4b Uninitialized    4K / 22.2 GB     5           0
Medchator 2x7b             4K / 22.1 GB     6           1
Youri 2x7b Dev             4K / 22.1 GB     5           4
Youri 2x7b V0.2            4K / 22.1 GB     5           1
Medtulu 2x7b               2K / 22.1 GB     1637        2
Note: a green score (e.g. "73.2") in the listing indicates that the alternative performs better than Technoculture/Medorca-2x7b.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124