MoE Finetuned 4B is an open-source language model by Klevin.
Parameters: 7B
VRAM required: 14.5 GB
Context length: 32K
License: apache-2.0
Tags: MoE, Quantized, Fine-Tuned, Instruction-Based
LLM Explorer Score: 0.15
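Given the specs above (an instruction-tuned, quantized MoE model with a ~14.5 GB VRAM budget), a model like this would typically be loaded through the Hugging Face transformers API. The sketch below is a minimal example under stated assumptions: the repo id Klevin/MOE-Finetuned-4B comes from this page, but the 4-bit quantization settings and the presence of a chat template are assumptions, not confirmed metadata for this model.

```python
# Minimal sketch: loading an instruction-tuned, quantized MoE model with
# Hugging Face transformers. The repo id comes from this page; the 4-bit
# NF4 settings are an assumption chosen to fit the listed ~14.5 GB VRAM
# budget (the page only says "Quantized").
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

MODEL_ID = "Klevin/MOE-Finetuned-4B"  # repo id as shown on this page

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    quantization_config=bnb_config,
    device_map="auto",  # place layers across available GPUs/CPU automatically
)

# The model is tagged Instruction-Based, so a chat-style prompt is a
# reasonable default; this assumes the repo ships a chat template.
messages = [{"role": "user", "content": "Summarize what a Mixture-of-Experts layer does."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```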
MoE Finetuned 4B Capabilities
Community ratings for this model cover the following categories:
Instruction Following and Task Automation
Factuality and Completeness of Knowledge
Censorship and Alignment
Data Analysis and Insight Generation
Text Generation
Text Summarization and Feature Extraction
Code Generation
Multi-Language Support and Translation