Phi 3 Medium 128K Instruct 3.0bpw H6 EXL2 is an open-source language model quantized by LoneStriker. Features: LLM, VRAM: 5.6 GB, Context: 128K, License: MIT, Quantized, Instruction-Based, LLM Explorer Score: 0.14.
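The listed VRAM figure follows roughly from the quantization level. As a back-of-envelope sketch (assuming Phi-3-medium has about 14 billion parameters, which the card itself does not state), the weight storage at 3.0 bits per weight is:

```python
def quant_weight_gb(n_params: float, bits_per_weight: float) -> float:
    """Approximate weight-storage size in GB for a quantized model."""
    return n_params * bits_per_weight / 8 / 1e9

# Assumed parameter count: ~14B (not stated on this card).
weights_gb = quant_weight_gb(14e9, 3.0)
print(f"{weights_gb:.2f} GB")  # ~5.25 GB of weights alone
```

The listed 5.6 GB is slightly higher because it also covers KV cache and runtime overhead on top of the raw weights.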
The model has not been evaluated for all downstream purposes; consider AI limitations. Accurate, safe, and fair use in high-risk scenarios requires additional evaluation.
Considerations:
Adhere to laws and regulations; implement debiasing techniques in applications.
Supported Languages
Multilingual: English is the primary language; other languages have weaker performance.
Training Details
Data Sources:
Publicly available documents, Filtered documents, High-quality educational data, Code, Synthetic data, Textbook-like data
Data Volume:
4.8 trillion tokens
Methodology:
Supervised fine-tuning (SFT) and Direct Preference Optimization (DPO)
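The DPO stage optimizes the policy directly on preference pairs, without a separate reward model. A minimal sketch of the per-pair DPO loss (the function name, inputs, and `beta=0.1` default are illustrative, not taken from this card):

```python
import math

def dpo_loss(logp_chosen: float, logp_rejected: float,
             ref_chosen: float, ref_rejected: float,
             beta: float = 0.1) -> float:
    """Direct Preference Optimization loss for one preference pair.

    Inputs are summed log-probabilities of the chosen and rejected
    responses under the policy and under the frozen reference model.
    Loss = -log sigmoid(beta * ((pi_c - ref_c) - (pi_r - ref_r))).
    """
    margin = beta * ((logp_chosen - ref_chosen)
                     - (logp_rejected - ref_rejected))
    return -math.log(1.0 / (1.0 + math.exp(-margin)))

# When the policy favors the chosen response more than the reference
# does, the margin is positive and the loss is small.
low = dpo_loss(-10.0, -20.0, -12.0, -18.0)   # policy agrees with labels
high = dpo_loss(-20.0, -10.0, -12.0, -18.0)  # policy disagrees
```

With equal log-probabilities everywhere the margin is zero and the loss reduces to log 2, the usual sanity check for a sigmoid-based pairwise loss.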
Context Length:
128,000 tokens
Training Time:
42 days
Hardware Used:
512 H100-80G GPUs
Model Architecture:
Dense decoder-only Transformer model
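The training figures above (4.8 trillion tokens, 42 days, 512 H100-80G GPUs) imply a rough per-GPU throughput. A back-of-envelope estimate, assuming the full token count was processed over the full wall-clock time:

```python
total_tokens = 4.8e12   # training data volume, from the card
days = 42               # training time, from the card
gpus = 512              # H100-80G count, from the card

seconds = days * 24 * 3600
tokens_per_sec_total = total_tokens / seconds
tokens_per_sec_per_gpu = tokens_per_sec_total / gpus
print(f"~{tokens_per_sec_per_gpu:.0f} tokens/s per GPU")
```

This works out to roughly 2,600 tokens per second per GPU, a plausible figure for dense ~14B-scale pretraining; the real number depends on overlap of data loading, checkpointing, and restarts that the card does not describe.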
Responsible AI Considerations
Fairness:
These models can over- or under-represent groups or reinforce demeaning stereotypes.
Transparency:
Phi-series models can produce unreliable or offensive output.
Mitigation Strategies:
Developers should apply debiasing techniques and evaluate for fairness, safety, and accuracy.