FusionNet 34Bx2 MoE AWQ is an open-source language model by TheBloke. Features: 60.8B-parameter LLM, VRAM: 32.8 GB, Context: 32K, License: MIT, MoE, Quantized, LLM Explorer Score: 0.12.
This model is fine-tuned with the MoE (Mixture of Experts) method, which significantly boosts performance. For inference with AutoAWQ, install AutoAWQ 0.1.8 or later for compatibility.
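A minimal inference sketch follows, assuming a CUDA-capable GPU with enough VRAM for the quantized weights and AutoAWQ 0.1.8+ installed; the prompt and generation settings are illustrative, not recommended values.

```python
# Minimal AutoAWQ inference sketch (assumes AutoAWQ >= 0.1.8, transformers, and a CUDA GPU).
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

model_name = "TheBloke/FusionNet_34Bx2_MoE-AWQ"

# Load the AWQ-quantized weights and the matching tokenizer.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoAWQForCausalLM.from_quantized(model_name, fuse_layers=True)

prompt = "Explain the Mixture of Experts architecture in one paragraph."
input_ids = tokenizer(prompt, return_tensors="pt").input_ids.cuda()

# Generate a short completion; max_new_tokens is an illustrative choice.
output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```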
Supported Languages
en (fine-tuned)
Training Details
Data Sources:
VMware Open Instruct
Methodology:
Fine-tuned using the MoE method
Context Length:
8192
Model Architecture:
FusionNet with 60.8B parameters, utilizing the MoE (Mixture of Experts) method
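For readers unfamiliar with the architecture, the sketch below illustrates the core idea behind MoE: a gating network routes each token to its top-k experts and mixes their outputs by the gate weights. This is a generic, hypothetical PyTorch example; the layer sizes, expert count, and class names are illustrative and do not reflect FusionNet's actual configuration.

```python
# Illustrative top-k Mixture of Experts routing (hypothetical sizes, not FusionNet's).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, hidden_size=512, num_experts=2, top_k=2):
        super().__init__()
        # Each expert is a small feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(hidden_size, 4 * hidden_size),
                nn.GELU(),
                nn.Linear(4 * hidden_size, hidden_size),
            )
            for _ in range(num_experts)
        )
        self.router = nn.Linear(hidden_size, num_experts)  # gating network
        self.top_k = top_k

    def forward(self, x):  # x: (num_tokens, hidden_size)
        gate_logits = self.router(x)
        weights, chosen = torch.topk(gate_logits, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        # Each token is processed only by its top-k experts; the expert
        # outputs are combined using the normalized gate weights.
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = chosen[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

layer = MoELayer()
tokens = torch.randn(10, 512)
print(layer(tokens).shape)  # torch.Size([10, 512])
```

Because only the selected experts run per token, a MoE model's active compute per token is much lower than its total parameter count would suggest.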
FusionNet 34Bx2 MoE AWQ Capabilities
Instruction Following and Task Automation
Factuality and Completeness of Knowledge
Censorship and Alignment
Data Analysis and Insight Generation
Text Generation
Text Summarization and Feature Extraction
Code Generation
Multi-Language Support and Translation