FusionNet 34Bx2 MoE V0.1 by TomGrc


FusionNet 34Bx2 MoE V0.1 is an open-source language model by TomGrc that combines two 34B experts in a Mixtral-style mixture-of-experts (MoE) architecture. Key specifications: 60.8B parameters, 121.2 GB required VRAM, 195K context, MIT license. It averages 77.4 on the Hugging Face Open LLM Leaderboard (HF Score) with an LLM Explorer Score of 0.12; per-benchmark results are tabulated below.

Tags: Conversational · En · Endpoints compatible · Mixtral · Model-index · MoE · Region: us · Safetensors · Sharded · Tensorflow
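
The required-VRAM figure follows from the parameter count and weight precision: in bfloat16, each of the 60.8B parameters occupies two bytes, and because MoE routing saves compute rather than memory, every expert must stay resident. A quick sanity check of that arithmetic in plain Python (just the calculation, not any official sizing tool):

```python
# Back-of-the-envelope weight-memory estimate for FusionNet 34Bx2 MoE V0.1.
# MoE saves compute per token, not memory: both 34B experts stay loaded.
params = 60.8e9        # total parameter count from the listing
bytes_per_param = 2    # bfloat16 stores each parameter in 2 bytes

weights_gb = params * bytes_per_param / 1e9
print(f"weights alone: ~{weights_gb:.1f} GB")  # ~121.6 GB, close to the 121.2 GB shard total
```

Actual serving needs some headroom on top of the weights for the KV cache and activations, especially when exploiting the long context window.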

FusionNet 34Bx2 MoE V0.1 Benchmarks

FusionNet 34Bx2 MoE V0.1 (TomGrc/FusionNet_34Bx2_MoE_v0.1)

Benchmark     Score
ARC           73.7
HellaSwag     86.5
MMLU          76.7
TruthfulQA    71.0
WinoGrande    83.4
GSM8K         73.0
Average       77.4

FusionNet 34Bx2 MoE V0.1 Parameters and Internals

LLM Name: FusionNet 34Bx2 MoE V0.1
Repository: https://huggingface.co/TomGrc/FusionNet_34Bx2_MoE_v0.1
Model Size: 60.8B parameters
Required VRAM: 121.2 GB
Updated: 2025-12-07
Maintainer: TomGrc
Model Type: mixtral
Model Files: 32 safetensors shards, 121.2 GB total (shard 1: 3.9 GB; shards 2–28: 3.8 GB each; shards 29–30: 4.0 GB each; shard 31: 3.9 GB; shard 32: 2.8 GB)
Supported Languages: en
Model Architecture: MixtralForCausalLM
License: mit
Context Length: 200,000 tokens
Model Max Length: 200,000 tokens
Transformers Version: 4.37.2
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 64,000
Torch Data Type: bfloat16
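
These internals map directly onto a standard transformers loading call: the Auto classes resolve MixtralForCausalLM and the LlamaTokenizer from the repo config, and the sharded bfloat16 safetensors are fetched automatically. A minimal sketch, assuming transformers 4.37.2 or newer (per the listing), the accelerate package for device mapping, and enough combined GPU/CPU memory for the 121.2 GB of weights; the prompt is illustrative:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "TomGrc/FusionNet_34Bx2_MoE_v0.1"

# LlamaTokenizer with a 64,000-token vocabulary, per the listing above.
tokenizer = AutoTokenizer.from_pretrained(repo)

# Resolves to MixtralForCausalLM; bfloat16 matches the shipped shards, and
# device_map="auto" spreads the 32 shards across available GPUs (and CPU RAM).
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "Briefly explain what a mixture-of-experts language model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```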

Best Alternatives to FusionNet 34Bx2 MoE V0.1

Best Alternatives                    Context / RAM      Downloads  Likes
Mixtral 34Bx2 MoE 60B                195K / 121.9 GB         8683    111
Yi 34Bx2 MoE 60B DPO                 195K / 121.8 GB         8225      3
Bagel Hermes 2x34B                   195K / 121.9 GB          105     16
Yi 34Bx2 MoE 200K                    195K / 121.9 GB         8234      2
Yi 34Bx2 MoE 60B                     195K / 121.9 GB         8155     64
...34Bx2 MoE V0.1 Full Linear DPO    195K / 121.8 GB          106      2
... Cloudyu Mixtral 34Bx2 MoE 60B    195K / 121.8 GB           84      0
FusionNet 34Bx2 MoE                  32K / 121.2 GB           673      9
...DPO TomGrc FusionNet 34Bx2 MoE    32K / 121.8 GB           114      4
Nous Hermes 2 MoE 2x34B              4K / 121.9 GB            735      0
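
Download and like counts above are snapshots from the time of scraping; live numbers can be fetched from the Hugging Face Hub API. A short sketch using the huggingface_hub client; the second repo id is an assumption inferred from the table's model name:

```python
from huggingface_hub import HfApi

api = HfApi()

# Repos to query: the first is this listing's model; the second is assumed
# from the "Mixtral 34Bx2 MoE 60B" row in the alternatives table.
repos = [
    "TomGrc/FusionNet_34Bx2_MoE_v0.1",
    "cloudyu/Mixtral_34Bx2_MoE_60B",
]

for repo in repos:
    info = api.model_info(repo)  # ModelInfo exposes current hub statistics
    print(f"{repo}: {info.downloads} downloads, {info.likes} likes")
```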



Original data from HuggingFace, OpenCompass and various public git repos.