Yi 34Bx2 MoE 60B by cloudyu


Tags: Autotrain compatible, Conversational, Endpoints compatible, Mixtral, MoE, Region: us, Safetensors, Sharded, Tensorflow, Yi

Yi 34Bx2 MoE 60B Benchmarks


Yi 34Bx2 MoE 60B Parameters and Internals

Model Type: MoE, Causal LM
Additional Notes: This model is a variant of cloudyu/Mixtral_34Bx2_MoE_60B, focused on English and Chinese language capabilities.
Supported Languages: English (Proficient), Chinese (Proficient)
Input Format: text
Accepted Modalities: text
Output Format: text
Performance Tips: Use GPUs for better performance; configurations are available for both GPU and CPU (a minimal loading sketch follows).
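
As a rough illustration of the GPU/CPU configurations mentioned above, here is a minimal loading sketch using Hugging Face Transformers. The prompt and generation settings are arbitrary placeholders, and the full bfloat16 checkpoint needs roughly 122 GB of GPU memory.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "cloudyu/Yi-34Bx2-MoE-60B"
tokenizer = AutoTokenizer.from_pretrained(repo_id)

# GPU configuration: load the ~121.9 GB bfloat16 weights and shard them
# across all visible GPUs (device_map="auto").
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
# CPU-only configuration (much slower): pass device_map="cpu" instead.

prompt = "Write a short introduction to mixture-of-experts language models."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```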
LLM Name: Yi 34Bx2 MoE 60B
Repository: https://huggingface.co/cloudyu/Yi-34Bx2-MoE-60B
Model Size: 60.8B
Required VRAM: 121.9 GB
Updated: 2025-06-09
Maintainer: cloudyu
Model Type: mixtral
Model Files: 9.8 GB (1-of-13), 10.0 GB (2-of-13), 10.0 GB (3-of-13), 10.0 GB (4-of-13), 10.0 GB (5-of-13), 10.0 GB (6-of-13), 10.0 GB (7-of-13), 10.0 GB (8-of-13), 10.0 GB (9-of-13), 10.0 GB (10-of-13), 10.0 GB (11-of-13), 10.0 GB (12-of-13), 2.1 GB (13-of-13)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 200000
Model Max Length: 200000
Transformers Version: 4.36.2
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 64000
Torch Data Type: bfloat16
Yi 34Bx2 MoE 60B (cloudyu/Yi-34Bx2-MoE-60B)
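
The context length, vocabulary size, tokenizer class, and data type listed above come from the repository's configuration files, so they can be checked without downloading the full weights. A small sketch, assuming the listing matches the current config.json and tokenizer config:

```python
from transformers import AutoConfig, AutoTokenizer

repo_id = "cloudyu/Yi-34Bx2-MoE-60B"
config = AutoConfig.from_pretrained(repo_id)
tokenizer = AutoTokenizer.from_pretrained(repo_id)

print(config.model_type)               # expected per the listing: mixtral
print(config.max_position_embeddings)  # expected: 200000
print(config.vocab_size)               # expected: 64000
print(config.torch_dtype)              # expected: torch.bfloat16
print(type(tokenizer).__name__)        # LlamaTokenizer (or its fast variant)
print(tokenizer.pad_token)             # expected: <s>
```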

Quantized Models of the Yi 34Bx2 MoE 60B

Model | Likes | Downloads | VRAM
Yi 34Bx2 MoE 60B GGUF | 2 | 14 | 50 GB
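
A GGUF quantization can be run with llama.cpp on a single large GPU, or on CPU only. A minimal sketch using llama-cpp-python, assuming a locally downloaded GGUF file (the file name and quantization level below are hypothetical placeholders):

```python
from llama_cpp import Llama

llm = Llama(
    model_path="./Yi-34Bx2-MoE-60B.Q4_K_M.gguf",  # hypothetical local file; pick a quant that fits your hardware
    n_ctx=8192,       # the full 200K context will not fit in memory on most machines
    n_gpu_layers=-1,  # offload every layer to the GPU; set to 0 for CPU-only inference
)

result = llm("Explain what a mixture-of-experts model is.", max_tokens=200)
print(result["choices"][0]["text"])
```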

Best Alternatives to Yi 34Bx2 MoE 60B

Best Alternatives | Context / RAM | Downloads | Likes
Mixtral 34Bx2 MoE 60B | 195K / 121.9 GB | 6261 | 112
Yi 34Bx2 MoE 60B DPO | 195K / 121.8 GB | 6236 | 3
Bagel Hermes 2x34B | 195K / 121.9 GB | 77 | 16
Yi 34Bx2 MoE 200K | 195K / 121.9 GB | 621 | 12
...34Bx2 MoE V0.1 Full Linear DPO | 195K / 121.8 GB | 17 | 2
FusionNet 34Bx2 MoE V0.1 | 195K / 121.2 GB | 16 | 8
... Cloudyu Mixtral 34Bx2 MoE 60B | 195K / 121.8 GB | 24 | 0
FusionNet 34Bx2 MoE | 32K / 121.2 GB | 135 | 8
...DPO TomGrc FusionNet 34Bx2 MoE | 32K / 121.8 GB | 38 | 4
Nous Hermes 2 MoE 2x34B | 4K / 121.9 GB | 140 | 0

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124