DansXPantheon RP Engine V1.2 24B Small Instruct Ties Merge by h34v7


Tags: Merged Model · Arxiv:2306.01708 · Autotrain compatible · Base model: gryphe/pantheon-rp-... · Base model: pocketdoc/dans-pers... · Conversational · Endpoints compatible · Instruct · Mistral · Region: us · Roleplay · Safetensors · Sharded · Storywriting · Tensorflow
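The "Ties Merge" in the model name refers to TIES-Merging (the Arxiv:2306.01708 paper tagged above), which resolves sign conflicts between fine-tuned weight deltas before averaging them. The card does not publish the actual merge recipe; merges of this kind are commonly produced with mergekit, and the config below is only a hypothetical sketch. The density/weight values and the base_model are illustrative assumptions, not the author's settings.

```yaml
# Hypothetical mergekit TIES config -- illustrative only, not h34v7's actual recipe.
models:
  - model: Gryphe/Pantheon-RP-1.8-24b-Small-3.1
    parameters:
      density: 0.5   # assumed: fraction of delta parameters kept after trimming
      weight: 0.5    # assumed: equal blend of the two fine-tunes
  - model: PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: mistralai/Mistral-Small-3.1-24B-Instruct-2503  # assumed shared ancestor
dtype: float16  # matches the card's Torch Data Type
```

Such a config would be run with mergekit's `mergekit-yaml` entry point; the actual density, weight, and base model used for this merge are not stated on the card.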

DansXPantheon RP Engine V1.2 24B Small Instruct Ties Merge Benchmarks

nn.n% — the model's score relative to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

DansXPantheon RP Engine V1.2 24B Small Instruct Ties Merge Parameters and Internals

LLM Name: DansXPantheon RP Engine V1.2 24B Small Instruct Ties Merge
Repository: https://huggingface.co/h34v7/DansXPantheon-RP-Engine-V1.2-24b-Small-Instruct-Ties-Merge
Base Model(s): Gryphe/Pantheon-RP-1.8-24b-Small-3.1, PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
Merged Model: Yes
Model Size: 24b
Required VRAM: 47.3 GB
Updated: 2025-06-09
Maintainer: h34v7
Model Type: mistral
Instruction-Based: Yes
Model Files: 4.9 GB (1-of-10), 4.8 GB (2-of-10), 4.8 GB (3-of-10), 4.9 GB (4-of-10), 4.8 GB (5-of-10), 4.8 GB (6-of-10), 4.9 GB (7-of-10), 4.8 GB (8-of-10), 4.8 GB (9-of-10), 3.8 GB (10-of-10)
Model Architecture: MistralForCausalLM
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.51.3
Tokenizer Class: LlamaTokenizerFast
Padding Token: <pad>
Vocabulary Size: 131074
Torch Data Type: float16
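The figures in the table are mutually consistent and easy to sanity-check: the ten sharded safetensors files sum to the stated 47.3 GB, which is roughly what a 24B-class model stored in float16 (2 bytes per parameter) predicts. A quick sketch (the exact parameter count of 23.6B is an assumption; the card only gives the rounded "24b" size class):

```python
# Sanity-check the model-card numbers: shard sizes vs. float16 parameter storage.

# Sizes of the ten sharded safetensors files, in GB (from the model card).
shards = [4.9, 4.8, 4.8, 4.9, 4.8, 4.8, 4.9, 4.8, 4.8, 3.8]
total_gb = round(sum(shards), 1)
print(total_gb)  # 47.3 -- matches the "Required VRAM" entry

# A model in float16 needs ~2 bytes per parameter.
params = 23.6e9  # assumed exact count; "24b" is the rounded size class
est_gb = round(params * 2 / 1e9, 1)
print(est_gb)  # ~47.2 GB, consistent with the shard total
```

This is why the "Required VRAM" entry equals the on-disk size: loading the full float16 checkpoint without quantization needs roughly the same memory as the weights occupy on disk, before any KV-cache overhead for the 131072-token context.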

Best Alternatives to DansXPantheon RP Engine V1.2 24B Small Instruct Ties Merge

Best Alternatives                    Context / RAM      Downloads/Likes
...ral Nemo Instruct 24B Merge V1    1000K / 24.6 GB    90
Cydonia 24B V3                       128K / 47.3 GB     41823
Harbinger 24B                        128K / 47.3 GB     35755
Pantheon RP 1.8 24B Small 3.1        128K / 47.3 GB     141763
...Small 3.1 24B Instruct 2503 HF    128K / 47.3 GB     178012
...Small 3.1 24B Instruct 2503 Hf    128K / 47.3 GB     10872
Zero Mistral 24B                     128K / 47.3 GB     4375
Eurydice 24B V3                      128K / 47.3 GB     1836
Mistral Small 3.1 24B RP             128K / 47.3 GB     120
Eurydice 24B V2                      128K / 47.3 GB     15014
Note: a green score (e.g. "73.2") indicates that the model outperforms h34v7/DansXPantheon-RP-Engine-V1.2-24b-Small-Instruct-Ties-Merge.



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124