Aya 23 8B Quantized is an open-source, instruction fine-tuned, multilingual large language model published by jadechoghari. Key specifications: 8B parameters, approximately 8.1 GB of VRAM required, 8K context window, CC-BY-NC-4.0 license, LLM Explorer Score 0.15.
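The listed 8.1 GB VRAM figure is consistent with a back-of-envelope estimate for an 8B-parameter model quantized to roughly 8 bits per weight. The sketch below is illustrative arithmetic only (weights-only footprint, ignoring activations and KV cache, which add more at runtime); the byte-per-weight assumption is ours, not a published detail of this checkpoint.

```python
# Weights-only VRAM estimate for an 8B-parameter model at 8-bit quantization.
# Assumption: ~1 byte per weight; runtime overhead (activations, KV cache)
# is not included and grows with batch size and context length.
params = 8e9                 # 8 billion parameters
bytes_per_weight = 1         # 8-bit quantization
weights_gb = params * bytes_per_weight / 1e9
print(f"~{weights_gb:.1f} GB")  # close to the listed 8.1 GB footprint
```

By comparison, the same model in 16-bit (2 bytes per weight) would need roughly 16 GB for weights alone, which is the main practical benefit of the quantized release.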
Additional Notes
Output quality may differ from the unquantized base model, and the reported efficiency figures were measured on specific hardware and configurations.
Supported Languages
en (English), fr (French), de (German), es (Spanish), it (Italian), pt (Portuguese), ja (Japanese), ko (Korean), zh (Chinese), ar (Arabic), el (Greek), fa (Persian), pl (Polish), id (Indonesian), cs (Czech), he (Hebrew), hi (Hindi), nl (Dutch), ro (Romanian), ru (Russian), tr (Turkish), uk (Ukrainian), vi (Vietnamese)