MN Sappho E 12B by mergekit-community


Merged Model · Arxiv:2403.19522 · AutoTrain compatible · Base model: LatitudeGames/Wayfarer-12B · Base model: mistralai/Mistral-Nemo-Base-2407 · Base model: mistralai/Mistral-Nemo-Instruct-2407 · Base model: PygmalionAI/Eleusis-12B · Conversational · Endpoints compatible · Instruct · Mistral · Region: us · Safetensors · Sharded · TensorFlow
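The tags above point to the mergekit ecosystem: the Arxiv reference (2403.19522) is the Model Stock merging paper, and the base-model tags name the checkpoints that went into the merge. The recipe itself is not published on this page, so the following is only a hedged sketch of what a comparable mergekit run could look like; the merge method (model_stock), the dtype, and the choice of Mistral-Nemo-Base-2407 as the base are assumptions inferred from those tags, and the two intermediate Sappho merges are taken from the base-model list further down the page.

import pathlib
import subprocess

# Hypothetical config; the actual recipe for MN-Sappho-e-12B is not shown on this page.
config = """\
merge_method: model_stock            # assumption, suggested by the Arxiv 2403.19522 (Model Stock) tag
base_model: mistralai/Mistral-Nemo-Base-2407
models:
  - model: LatitudeGames/Wayfarer-12B
  - model: mergekit-community/MN-Sappho-b-12B
  - model: mergekit-community/MN-Sappho-d-12B
  - model: mistralai/Mistral-Nemo-Instruct-2407
  - model: PygmalionAI/Eleusis-12B
dtype: bfloat16
"""

pathlib.Path("sappho-merge.yml").write_text(config)

# mergekit-yaml <config> <output-dir> is mergekit's standard CLI entry point.
subprocess.run(["mergekit-yaml", "sappho-merge.yml", "./MN-Sappho-e-12B-merge"], check=True)

The Python wrapper only writes the config and shells out to the CLI; running mergekit-yaml directly on a hand-written YAML file works just as well.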

MN Sappho E 12B Benchmarks

Scores are shown as nn.n%, indicating how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
MN Sappho E 12B (mergekit-community/MN-Sappho-e-12B)

MN Sappho E 12B Parameters and Internals

LLM Name: MN Sappho E 12B
Repository: https://huggingface.co/mergekit-community/MN-Sappho-e-12B
Base Model(s): mistralai/Mistral-Nemo-Base-2407, LatitudeGames/Wayfarer-12B, mergekit-community/MN-Sappho-b-12B, mergekit-community/MN-Sappho-d-12B, mistralai/Mistral-Nemo-Instruct-2407, PygmalionAI/Eleusis-12B
Merged Model: Yes
Model Size: 12B
Required VRAM: 24.5 GB
Updated: 2025-06-17
Maintainer: mergekit-community
Model Type: mistral
Instruction-Based: Yes
Model Files: 5 safetensors shards, 4.9 GB each (1-of-5 through 5-of-5)
Model Architecture: MistralForCausalLM
Context Length: 1024000
Model Max Length: 1024000
Transformers Version: 4.48.2
Tokenizer Class: GPT2Tokenizer
Padding Token: <pad>
Vocabulary Size: 131072
Torch Data Type: bfloat16
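Given the listed architecture (MistralForCausalLM) and bfloat16 weights, the 24.5 GB figure is simply the five ~4.9 GB shards, i.e. roughly 12B parameters × 2 bytes. The snippet below is a minimal loading sketch, not an official usage example: it assumes transformers >= 4.48.2 (the version listed above) with accelerate installed, enough GPU memory or CPU offload via device_map="auto", and a Mistral-style [INST] prompt borrowed from the instruct base model.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "mergekit-community/MN-Sappho-e-12B"

# bfloat16 matches the Torch Data Type above; device_map="auto" lets accelerate
# place the five shards across available GPUs or offload to CPU if needed.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Illustrative prompt only; the [INST] ... [/INST] wrapper is an assumption based on
# the Mistral-Nemo-Instruct-2407 base and may differ from the merge's preferred format.
prompt = "[INST] Write a two-sentence scene set in a lighthouse. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0], skip_special_tokens=True))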

Best Alternatives to MN Sappho E 12B

Best Alternatives                      Context / RAM      Downloads   Likes
...r Nemo 12B Instruct R 21 09 24      1000K / 24.5 GB    717896      121
SauerkrautLM Nemo 12B Instruct         1000K / 24.5 GB    3479        22
MN Slush                               1000K / 24.5 GB    297         30
Mistral Nemo Wissenschaft 12B          1000K / 24.5 GB    1656        8
Francois PE V2 Huali 12B               1000K / 24.5 GB    189         15
ChatWaifu V1.4                         1000K / 24.5 GB    140         19
Magnum V4 12B                          1000K / 24.5 GB    107         42
ChatWaifu 12B V2.0                     1000K / 24.5 GB    55          21
Mistral Nemo Bophades 12B              1000K / 24.5 GB    61          9
...tral Nemo Gutenberg Doppel 12B      1000K / 24.5 GB    71          7
Note: a green score (e.g. "73.2") means the model performs better than mergekit-community/MN-Sappho-e-12B.
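The Downloads and Likes figures above are per-repository statistics of the kind the Hugging Face Hub exposes (see the data-source note at the end of the page). If you want current numbers rather than this snapshot, a small sketch with the huggingface_hub client follows; the SauerkrautLM repo ID is an assumption, since the table only shows display names, and the counts will have drifted from the values shown here.

from huggingface_hub import HfApi

api = HfApi()
repos = [
    "mergekit-community/MN-Sappho-e-12B",
    # Assumed full repo ID for the "SauerkrautLM Nemo 12B Instruct" row above.
    "VAGOsolutions/SauerkrautLM-Nemo-12b-Instruct",
]
for repo_id in repos:
    info = api.model_info(repo_id)  # public repos need no token
    print(f"{repo_id}: downloads={info.downloads}, likes={info.likes}")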

Rank the MN Sappho E 12B Capabilities

Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference!

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124