Medtulu 2x7b by Technoculture


Tags: Allenai/tulu-2-dpo-7b, Autotrain compatible, Endpoints compatible, Merge, Mixtral, MoE, Region:us, Safetensors, Sharded, Technoculture/mt7bi-dpo, Tensorflow
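
The tags above describe a Mixtral-style mixture-of-experts merge whose two 7B experts are Technoculture/mt7bi-dpo and Allenai/tulu-2-dpo-7b. A quick way to check the MoE layout from the published configuration is sketched below; the values in the comments are expectations based on this card (model_type "mixtral", two experts, 2048-token context), not verified output.

```python
from transformers import AutoConfig

# Repository listed on this card.
cfg = AutoConfig.from_pretrained("Technoculture/Medtulu-2x7b")

print(cfg.model_type)               # expected: "mixtral" (per the card)
print(cfg.num_local_experts)        # expected: 2 -- the "2x" in "2x7b"
print(cfg.max_position_embeddings)  # expected: 2048, the card's context length
```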

Medtulu 2x7b Benchmarks

Medtulu 2x7b (Technoculture/Medtulu-2x7b)

Medtulu 2x7b Parameters and Internals

Model Type: Mixture of Experts (MoE)
LLM Name: Medtulu 2x7b
Repository: 🤗 https://huggingface.co/Technoculture/Medtulu-2x7b
Model Size: 11.1b
Required VRAM: 22.1 GB
Updated: 2025-09-20
Maintainer: Technoculture
Model Type: mixtral
Model Files: 10.0 GB (1-of-3), 10.0 GB (2-of-3), 2.1 GB (3-of-3)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.37.2
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32017
Torch Data Type: bfloat16
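
Given the details above (MixtralForCausalLM, LlamaTokenizer, bfloat16, 2048-token context, roughly 22 GB of VRAM), a minimal loading and generation sketch with Hugging Face transformers might look like the following. The prompt and decoding settings are illustrative assumptions; the card does not specify a chat template or recommended generation parameters.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repository listed on this card; weights are sharded across three files (10.0 + 10.0 + 2.1 GB).
repo_id = "Technoculture/Medtulu-2x7b"

tokenizer = AutoTokenizer.from_pretrained(repo_id)   # LlamaTokenizer, vocabulary size 32017
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the card's Torch Data Type
    device_map="auto",           # requires accelerate; ~22.1 GB VRAM at bf16
)

# Illustrative prompt only -- not taken from the model card.
prompt = "List three common causes of iron-deficiency anemia."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)  # stay well under the 2048-token limit
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```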

Best Alternatives to Medtulu 2x7b

Model | Context / RAM | Downloads | Likes
DevPearl 2x7B | 16K / 22.3 GB | 5 | 2
Medorca 2x7b | 4K / 35.6 GB | 1650 | 2
Phi3 4x4b Uninitialized | 4K / 22.2 GB | 5 | 0
Medchator 2x7b | 4K / 22.1 GB | 6 | 1
Youri 2x7b Dev | 4K / 22.1 GB | 5 | 4
Youri 2x7b V0.2 | 4K / 22.1 GB | 5 | 1

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124