Phi 3.5 Mini Instruct Onnx is an open-source language model by Microsoft. Features: LLM, Context: 128K, License: MIT, Instruction-Based, LLM Explorer Score: 0.16.
Phi 3.5 Mini Instruct Onnx Parameters and Internals
Model Type
ONNX
Use Cases
Areas:
commercial use, research
Applications:
general purpose AI systems
Primary Use Cases:
memory/compute constrained environments, latency-bound scenarios, strong reasoning (especially code, math, and logic)
Considerations:
Developers should consider common limitations of language models as they select use cases and evaluate and mitigate for accuracy, safety, and fairness.
Additional Notes
Activation-aware Weight Quantization (AWQ) works by identifying the top 1% of weights that are most salient for maintaining accuracy, keeping them in higher precision, and quantizing the remaining 99% of weights.
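The idea above can be sketched in a few lines of NumPy. This is an illustrative simplification, not Microsoft's actual AWQ implementation: it ranks input channels by mean absolute activation magnitude, leaves the top `salient_frac` of channels untouched, and round-to-nearest quantizes the rest to `n_bits` signed integers.

```python
import numpy as np

def awq_style_quantize(weights, activations, salient_frac=0.01, n_bits=4):
    """Illustrative AWQ-style quantization sketch (hypothetical helper).

    weights:     (out_features, in_features) weight matrix
    activations: (n_samples, in_features) sample activations feeding the layer
    """
    # Rank input channels by how strongly they are activated on average.
    saliency = np.mean(np.abs(activations), axis=0)
    n_keep = max(1, int(round(salient_frac * weights.shape[1])))
    salient_idx = np.argsort(saliency)[-n_keep:]  # most salient ~1% of channels

    qmax = 2 ** (n_bits - 1) - 1
    quantized = weights.astype(float).copy()
    for col in range(weights.shape[1]):
        if col in salient_idx:
            continue  # salient channels stay in full precision
        scale = np.max(np.abs(weights[:, col])) / qmax
        if scale == 0.0:
            scale = 1.0
        # Symmetric round-to-nearest quantization, then dequantize in place.
        quantized[:, col] = np.round(weights[:, col] / scale) * scale
    return quantized, salient_idx
```

In practice AWQ also rescales salient channels rather than simply skipping them, but the core trade-off is the same: a tiny fraction of weights carries a disproportionate share of the accuracy.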
Training Details
Data Sources:
synthetic data, filtered publicly available websites
Methodology:
supervised fine-tuning, proximal policy optimization, direct preference optimization
Context Length:
128000
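The methodology listed above ends with direct preference optimization (DPO). As a rough illustration of the DPO objective — a self-contained sketch with hypothetical log-probability inputs, not the model's actual training code — the per-pair loss is the negative log-sigmoid of the scaled reward margin between the chosen and rejected responses:

```python
import math

def dpo_loss(logp_chosen, logp_rejected,
             ref_logp_chosen, ref_logp_rejected, beta=0.1):
    """DPO loss for one preference pair (illustrative sketch).

    Inputs are summed token log-probabilities of each response under the
    policy being trained and under the frozen reference model.
    """
    # Implicit reward margin: how much more the policy prefers the chosen
    # response over the rejected one, relative to the reference model.
    margin = (logp_chosen - ref_logp_chosen) - (logp_rejected - ref_logp_rejected)
    # Negative log-sigmoid: loss shrinks as the policy's preference for
    # the chosen response grows beyond the reference model's.
    return -math.log(1.0 / (1.0 + math.exp(-beta * margin)))
```

When the policy matches the reference exactly, the margin is zero and the loss is log 2; increasing the policy's log-probability of the chosen response lowers it.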
Release Notes
Version:
Phi-3.5-Mini-Instruct ONNX
Notes:
Update over the instruction-tuned Phi-3 Mini ONNX model release.