Medtulu 4x7B by Technoculture


Tags: allenai/tulu-2-dpo-7b · autotrain compatible · chaoyi-wu/PMC_LLAMA_7B_10_epoch · endpoints compatible · epfl-llm/meditron-7b · medalpaca/medalpaca-7b · merge · mixtral · moe · region:us · safetensors · sharded · tensorflow

Medtulu 4x7B Benchmarks

Medtulu 4x7B (Technoculture/Medtulu-4x7B)

Medtulu 4x7B Parameters and Internals

Model Type: text-generation
Additional Notes: Mediquad-tulu-20B is an MoE model combining epfl-llm/meditron-7b, medalpaca/medalpaca-7b, chaoyi-wu/PMC_LLAMA_7B_10_epoch, and allenai/tulu-2-dpo-7b.
LLM Name: Medtulu 4x7B
Repository: https://huggingface.co/Technoculture/Medtulu-4x7B
Model Size: 19.7B
Required VRAM: 39.4 GB
Updated: 2025-09-21
Maintainer: Technoculture
Model Type: mixtral
Model Files: 10.0 GB (1 of 4), 9.9 GB (2 of 4), 9.9 GB (3 of 4), 9.6 GB (4 of 4)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.37.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Torch Data Type: bfloat16
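
Given the metadata above (MixtralForCausalLM architecture, LlamaTokenizer, bfloat16 weights, 8192-token context), the model should load with a standard transformers workflow. The following is a minimal sketch based on those fields, not an official example from the maintainer; the prompt text is illustrative, and `device_map="auto"` assumes the accelerate package is installed.

```python
# Minimal loading sketch assuming transformers >= 4.37 and accelerate are installed.
# Not an official example from the maintainer; values mirror the card above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Technoculture/Medtulu-4x7B"  # repository listed on the card

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the card's "Torch Data Type"
    device_map="auto",           # ~39.4 GB of weights, sharded across 4 files
)

# Illustrative medical prompt (not from the model card).
prompt = "A 45-year-old patient presents with chest pain. List the key differential diagnoses."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the checkpoint is sharded into four files and needs roughly 39.4 GB in bfloat16, `device_map="auto"` lets accelerate spread the weights across available GPUs or offload to CPU when a single device is too small.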

Best Alternatives to Medtulu 4x7B

Best Alternatives | Context / RAM | Downloads | Likes
CodeLlama 4x7B Experts Base | 16K / 39.5 GB | 5 | 0
Magician MoE 4x7B | 16K / 78.8 GB | 1650 | 1
Mediquad 4x7b | 8K / 39.4 GB | 1669 | 0
Medorca 4x7b | 4K / 39.4 GB | 1668 | 0
GOAT Adapt MoE 4x7B | 2K / 39.4 GB | 5 | 3
Note: a green score (e.g. "73.2") means that the model is better than Technoculture/Medtulu-4x7B.

Rank the Medtulu 4x7B Capabilities

Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs.

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124