Yi 34Bx2 MoE 60B 5.0bpw H6 EXL2 is an open-source language model published by LoneStriker. Features: 60B LLM, VRAM: 38.8 GB, Context: 195K, License: cc-by-nc-4.0, MoE, Quantized (5.0 bpw EXL2, H6), LLM Explorer Score: 0.11.
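Because this is a 5.0 bpw EXL2 quantization, it is meant to be run through the ExLlamaV2 backend rather than plain Transformers. Below is a minimal loading sketch following exllamav2's published example API; the model directory path (and the implied Hugging Face repo id) is a placeholder, so treat this as an illustration rather than official instructions from the repository.

```python
# Minimal sketch: loading an EXL2-quantized model with the exllamav2 library.
# The local path is a placeholder for wherever the quantized weights were
# downloaded (e.g. from the LoneStriker repository for this model).
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator

config = ExLlamaV2Config()
config.model_dir = "/path/to/Yi-34Bx2-MoE-60B-5.0bpw-h6-exl2"  # placeholder path
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # cache tensors allocated as layers load
model.load_autosplit(cache)               # split the ~38.8 GB across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
```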
Yi 34Bx2 MoE 60B 5.0bpw H6 EXL2 Parameters and Internals
Model Type
Mixture of Experts (MoE), Causal Language Model
Use Cases
Areas:
research, non-commercial applications
Additional Notes
This Yi-based MoE 2x34B model is a slightly modified version of cloudyu/Mixtral_34Bx2_MoE_60B and integrates elements from the jondurbin/bagel-dpo-34b-v0.2 and SUSTech/SUS-Chat-34B models (see the gating sketch below).
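The "34Bx2" naming reflects a Mixtral-style architecture in which each feed-forward layer holds two expert MLPs plus a learned router. The sketch below is not this model's source code; it is a generic illustration of top-k gating with two experts (note that with top_k=2 over two experts, both experts fire on every token, weighted by the router), and the expert MLP shape is simplified.

```python
# Illustrative top-k gating over two experts (generic Mixtral-style MoE layer).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoExpertMoE(nn.Module):
    def __init__(self, hidden: int, ffn: int, top_k: int = 2):
        super().__init__()
        self.gate = nn.Linear(hidden, 2, bias=False)  # router over 2 experts
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(hidden, ffn), nn.SiLU(), nn.Linear(ffn, hidden))
            for _ in range(2)
        ])
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: [tokens, hidden]
        logits = self.gate(x)                             # [tokens, 2]
        weights, idx = torch.topk(logits, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)              # normalize selected experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                     # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, k:k + 1] * expert(x[mask])
        return out
```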
Supported Languages
English (proficient), Chinese (proficient)
Input Output
Input Format:
Textual prompts
Accepted Modalities:
Text
Output Format:
Text
Performance Tips:
Ensure that the input prompt is in English or Chinese for best performance.
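Continuing the loading sketch above, a minimal generation call might look as follows. The plain instruction-style prompt and the sampling values are assumptions; the merged base models (bagel-dpo-34b-v0.2, SUS-Chat-34B) use differing chat templates, so check their model cards for the preferred format.

```python
# Minimal generation sketch, reusing `generator` from the loading example above.
from exllamav2.generator import ExLlamaV2Sampler

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7  # illustrative sampling values, not recommendations
settings.top_p = 0.9

# Plain English prompt, per the performance tip above; the format is an assumption.
prompt = "Summarize the key trade-offs of mixture-of-experts language models."
output = generator.generate_simple(prompt, settings, 256)  # up to 256 new tokens
print(output)
```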