TomGrc FusionNet 34Bx2 MoE V0.1 Full Linear DPO by cloudyu


TomGrc FusionNet 34Bx2 MoE V0.1 Full Linear DPO is an open-source language model by cloudyu. Features: 60.8B parameters, 121.8 GB VRAM required, 195K context, apache-2.0 license, Mixture of Experts (MoE). Scores: HF Score 77.5, LLM Explorer Score 0.12, ARC 74.1, HellaSwag 86.7, MMLU 76.7, TruthfulQA 71.3, WinoGrande 83.4, GSM8K 72.9.
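The HF Score appears to be the simple arithmetic mean of the six benchmark scores listed above; a quick check:

```python
# The HF Score matches the arithmetic mean of the six benchmark results above.
scores = {
    "ARC": 74.1,
    "HellaSwag": 86.7,
    "MMLU": 76.7,
    "TruthfulQA": 71.3,
    "WinoGrande": 83.4,
    "GSM8K": 72.9,
}
print(f"{sum(scores.values()) / len(scores):.1f}")  # -> 77.5
```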

Tags: Conversational, Endpoints compatible, Mixtral, MoE, Region: us, Safetensors, Sharded, Tensorflow, Yi

TomGrc FusionNet 34Bx2 MoE V0.1 Full Linear DPO Benchmarks

TomGrc FusionNet 34Bx2 MoE V0.1 Full Linear DPO (cloudyu/TomGrc_FusionNet_34Bx2_MoE_v0.1_full_linear_DPO)

TomGrc FusionNet 34Bx2 MoE V0.1 Full Linear DPO Parameters and Internals

Model Type: Mixture of Experts (MoE)
Additional Notes: Metrics not tested
Training Details:
  Methodology: Fine-tuned using Direct Preference Optimization (DPO); a sketch of this kind of run follows below
  Training Time: 1 hour
  Hardware Used: H100 GPU
  Model Architecture: All-linear-parameter fine-tuned MoE
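The card gives only the outline above (DPO over all linear parameters, about an hour on an H100) and publishes no training script. Below is a minimal, hypothetical sketch of such a run using the TRL library; the base model path, dataset, and hyperparameters are illustrative assumptions, not cloudyu's actual setup:

```python
# Hypothetical DPO fine-tuning sketch with TRL. The base model path, dataset,
# and hyperparameters are illustrative assumptions, not cloudyu's actual setup.
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

# Assumed pre-DPO base (the merged FusionNet checkpoint); placeholder path.
base = "TomGrc/FusionNet_34Bx2_MoE_v0.1"
model = AutoModelForCausalLM.from_pretrained(base, torch_dtype=torch.bfloat16)
tokenizer = AutoTokenizer.from_pretrained(base)

# DPO trains on preference pairs: "prompt", "chosen", "rejected" columns.
dataset = load_dataset("trl-lib/ultrafeedback_binarized", split="train")

config = DPOConfig(
    output_dir="fusionnet-34bx2-dpo",
    beta=0.1,                       # strength of the KL penalty toward the frozen reference model
    per_device_train_batch_size=1,
    num_train_epochs=1,
    bf16=True,
)
# Older TRL versions take tokenizer= instead of processing_class=.
trainer = DPOTrainer(model=model, args=config, train_dataset=dataset, processing_class=tokenizer)
trainer.train()
trainer.save_model()
```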
LLM Name: TomGrc FusionNet 34Bx2 MoE V0.1 Full Linear DPO
Repository: https://huggingface.co/cloudyu/TomGrc_FusionNet_34Bx2_MoE_v0.1_full_linear_DPO
Model Size: 60.8B
Required VRAM: 121.8 GB
Updated: 2026-03-29
Maintainer: cloudyu
Model Type: mixtral
Model Files: 25 safetensors shards, ~121.8 GB total: 4.9 GB (1-of-25), 4.8 GB (2-of-25), 4.9 GB (3-of-25), 4.8 GB (4-of-25), 4.9 GB (5-of-25), 4.8 GB (6-of-25), 4.9 GB (7-of-25), 5.0 GB each (8-of-25 through 24-of-25), 2.8 GB (25-of-25)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 200000
Model Max Length: 200000
Transformers Version: 4.37.2
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 64000
Torch Data Type: bfloat16
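Given the metadata above (MixtralForCausalLM, bfloat16 weights, LlamaTokenizer, 25 sharded safetensors files), the model loads with the standard transformers pattern. A minimal sketch, assuming enough GPU memory or accelerate offloading for the roughly 121.8 GB of weights:

```python
# Minimal loading sketch based on the metadata above. transformers resolves the
# 25 safetensors shards automatically; device_map="auto" (requires accelerate)
# spreads the ~121.8 GB of bfloat16 weights across the available devices.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "cloudyu/TomGrc_FusionNet_34Bx2_MoE_v0.1_full_linear_DPO"
tokenizer = AutoTokenizer.from_pretrained(repo)  # LlamaTokenizer, 64000-token vocab
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,                  # matches the stored dtype
    device_map="auto",
)

prompt = "Explain what a mixture-of-experts language model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```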

Best Alternatives to TomGrc FusionNet 34Bx2 MoE V0.1 Full Linear DPO

Best Alternatives                       | Context / RAM   | Downloads | Likes
Mixtral 34Bx2 MoE 60B                   | 195K / 121.9 GB | 8683      | 111
Yi 34Bx2 MoE 60B DPO                    | 195K / 121.8 GB | 8225      | 3
Bagel Hermes 2x34B                      | 195K / 121.9 GB | 105       | 16
Yi 34Bx2 MoE 200K                       | 195K / 121.9 GB | 8234      | 2
Yi 34Bx2 MoE 60B                        | 195K / 121.9 GB | 8155      | 64
FusionNet 34Bx2 MoE V0.1                | 195K / 121.2 GB | 59        | 8
... Cloudyu Mixtral 34Bx2 MoE 60B       | 195K / 121.8 GB | 84        | 0
FusionNet 34Bx2 MoE                     | 32K / 121.2 GB  | 673       | 9
...DPO TomGrc FusionNet 34Bx2 MoE       | 32K / 121.8 GB  | 114       | 4
Nous Hermes 2 MoE 2x34B                 | 4K / 121.9 GB   | 735       | 0

Original data from HuggingFace, OpenCompass and various public git repos.