Yi 70B 200K RPMerge Franken by DisOOM


Yi 70B 200K RPMerge Franken is an open-source language model by DisOOM. Key figures: 70B parameters, 142.4 GB VRAM, 195K context window, license: other, LLM Explorer Score: 0.12.

Tags: Chat, Chi, Conversational, En, Endpoints compatible, Llama, Merge, Mergekit, Region: us, Safetensors, Sharded, Tensorflow, Yi


Yi 70B 200K RPMerge Franken Parameters and Internals

Model Type: text generation, conversational
Supported Languages: en (English), chi (Chinese)
Additional Notes: This is a frankenmerge created by interleaving layers of Yi-34B-200K-RPMerge with itself to increase the parameter count and enhance performance.
Training Methodology: interleaving layers of Yi-34B-200K-RPMerge with itself using mergekit.
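Interleaved self-merges of this kind are normally expressed as a mergekit "passthrough" slice configuration. The sketch below is hypothetical: the listing does not publish the actual layer ranges or the exact base-model repository, so both are assumptions chosen only to illustrate how overlapping slices of a ~60-layer Yi-34B model can be stacked to roughly double its depth.

```yaml
# Hypothetical mergekit passthrough config (NOT the published recipe).
# Base model path and layer ranges are illustrative assumptions.
slices:
  - sources:
      - model: brucethemoose/Yi-34B-200K-RPMerge
        layer_range: [0, 20]
  - sources:
      - model: brucethemoose/Yi-34B-200K-RPMerge
        layer_range: [10, 30]
  - sources:
      - model: brucethemoose/Yi-34B-200K-RPMerge
        layer_range: [20, 40]
  - sources:
      - model: brucethemoose/Yi-34B-200K-RPMerge
        layer_range: [30, 50]
  - sources:
      - model: brucethemoose/Yi-34B-200K-RPMerge
        layer_range: [40, 60]
merge_method: passthrough
dtype: float16
```

With overlapping ranges like these, most layers appear twice in the output model, which is how a 34B base can yield a ~70B merge without any training.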
LLM Name: Yi 70B 200K RPMerge Franken
Repository: https://huggingface.co/DisOOM/Yi-70B-200k-RPMerge-Franken
Model Size: 70b
Required VRAM: 142.4 GB
Updated: 2026-04-18
Maintainer: DisOOM
Model Type: llama
Model Files: 15 safetensors shards of 9.6, 9.7, 9.9, 9.9, 9.9, 9.9, 9.8, 9.9, 9.8, 9.8, 10.0, 9.7, 10.0, 10.0, and 4.5 GB
Supported Languages: en
Model Architecture: LlamaForCausalLM
License: other
Context Length: 200000
Model Max Length: 200000
Transformers Version: 4.38.2
Tokenizer Class: LlamaTokenizer
Padding Token: <unk>
Vocabulary Size: 64002
Torch Data Type: float16
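The shard sizes listed above are consistent with the stated VRAM requirement: at float16 (2 bytes per parameter), ~70B parameters need roughly 140 GB for weights alone, and the 15 shards sum to exactly 142.4 GB. A quick arithmetic check:

```python
# Shard sizes (GB) exactly as listed in the model files above.
shards = [9.6, 9.7, 9.9, 9.9, 9.9, 9.9, 9.8, 9.9, 9.8, 9.8,
          10.0, 9.7, 10.0, 10.0, 4.5]
total_gb = round(sum(shards), 1)
print(total_gb)  # 142.4 -> matches the "Required VRAM" figure

# Sanity check: float16 uses 2 bytes per parameter, so a 70B-parameter
# model occupies about 70e9 * 2 / 1e9 = 140 GB of weights.
approx_gb = 70e9 * 2 / 1e9
print(approx_gb)  # 140.0
```

The small gap between the two figures is expected, since the shard total also includes embeddings and the LM head.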

Best Alternatives to Yi 70B 200K RPMerge Franken

Best Alternatives                  | Context / RAM    | Downloads / Likes
... Chat 1048K Chinese Llama3 70B  | 1024K / 141.9 GB | 90695
... Chat 1048K Chinese Llama3 70B  | 1024K / 141.9 GB | 79404
... 3 70B Instruct Gradient 1048K  | 1024K / 141.9 GB | 42122
Llama3 Function Calling 1048K      | 1024K / 141.9 GB | 81
...a 3 70B Instruct Gradient 524K  | 512K / 141.9 GB  | 29323
...a 3 70B Instruct Gradient 262K  | 256K / 141.9 GB  | 2256
...ama 3 70B Arimas Story RP V2.0  | 256K / 141.1 GB  | 303
...ama 3 70B Arimas Story RP V1.6  | 256K / 141.2 GB  | 130
...ama 3 70B Arimas Story RP V1.5  | 256K / 141.2 GB  | 213
Llama 3.1 70B Instruct             | 128K / 141.9 GB  | 1008029904
Note: a green score (e.g. "73.2") means the model outperforms DisOOM/Yi-70B-200k-RPMerge-Franken.



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20260328a