Oo Phi 1 5 by Open-Orca


Tags: Arxiv:2301.13688, Arxiv:2306.02707, Arxiv:2309.05463, Autotrain compatible, Custom code, Dataset:open-orca/openorca, En, Mixformer-sequential, Pytorch, Region:us
Model Card on HF 🤗: https://huggingface.co/Open-Orca/oo-phi-1_5

Oo Phi 1 5 Benchmarks

Scores (nn.n%) show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Oo Phi 1 5 (Open-Orca/oo-phi-1_5)

Oo Phi 1 5 Parameters and Internals

Model Type: text generation
Use Cases
Areas: research, commercial applications
Considerations: usage guidelines are not explicitly detailed; the intended use covers research and commercial applications.
Additional Notes: the model is an unreleased, untested, unfinished beta at this point; it was trained without MultiPack packing.
Supported Languages: en (proficient)
Training Details
Data Sources: Open-Orca/OpenOrca (see the dataset sketch after this list)
Methodology: instruction tuning
Training Time: 62 hours
Hardware Used: 8x RTX A6000 48 GB (Ampere)
Model Architecture: based on Microsoft Research's phi-1.5
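As a rough illustration of the training data (not part of the original card), a record from the public Open-Orca/OpenOrca dataset can be inspected with the Hugging Face datasets library; the field names below are assumed from the public dataset card and may change:

```python
# Sketch: peek at one OpenOrca record without downloading the full dataset.
# Field names ('system_prompt', 'question', 'response') are assumed from the
# public Open-Orca/OpenOrca dataset card, not from this model card.
from datasets import load_dataset

orca = load_dataset("Open-Orca/OpenOrca", split="train", streaming=True)

for row in orca.take(1):
    print(row["system_prompt"])   # system message attached to the sample
    print(row["question"])        # user instruction
    print(row["response"])        # augmented completion
```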
Input Output
Input Format: OpenAI's Chat Markup Language (ChatML) format (see the usage sketch after this list)
Accepted Modalities: text
Output Format: textual responses
Performance Tips: run on appropriate GPU hardware for efficient execution.
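A minimal inference sketch, assuming the standard ChatML turn markers and an illustrative system prompt; the exact prompt template, generation settings, and any added special tokens are assumptions, not documented above:

```python
# Minimal sketch: load the checkpoint and prompt it in ChatML format.
# The system prompt and generation settings here are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "Open-Orca/oo-phi-1_5"
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,  # matches the bfloat16 weights listed below (~2.8 GB)
    trust_remote_code=True,      # MixFormerSequentialForCausalLM ships as custom code
    device_map="auto",
)

# ChatML-style prompt: system and user turns, then an open assistant turn.
prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nExplain instruction tuning in one sentence.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

Because the repository ships custom modeling code (the Custom code tag above), trust_remote_code=True is required when loading with transformers.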
LLM Name: Oo Phi 1 5
Repository 🤗: https://huggingface.co/Open-Orca/oo-phi-1_5
Required VRAM: 2.8 GB (see the estimate after this list)
Updated: 2025-08-21
Maintainer: Open-Orca
Model Type: mixformer-sequential
Model Files: 2.8 GB, 0.0 GB
Supported Languages: en
Model Architecture: MixFormerSequentialForCausalLM
Model Max Length: 2048
Transformers Version: 4.34.0.dev0
Tokenizer Class: CodeGenTokenizer
Vocabulary Size: 50304
Torch Data Type: bfloat16
Activation Function: gelu_new
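The 2.8 GB figure is roughly the weight-only footprint of a phi-1.5-sized model stored in bfloat16; a back-of-the-envelope check (the ~1.4B parameter count is an assumption, and activations, KV cache, and CUDA overhead are not included):

```python
# Rough weight-only VRAM estimate for a phi-1.5-class model in bfloat16.
# Parameter count is assumed; runtime overhead is not included.
params = 1.4e9        # ~1.4B parameters (assumption)
bytes_per_param = 2   # bfloat16
print(f"{params * bytes_per_param / 1e9:.1f} GB")  # -> ~2.8 GB
```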

Best Alternatives to Oo Phi 1 5

Best Alternatives | Context / RAM | Downloads / Likes
Phi 2 | 0K / 11.1 GB | 322
The OverThinker Phi 1 5 | 0K / 2.8 GB | 47
Phi Ko1ep | 0K / 2.8 GB | 50
Phasmid 1 5 V0.5 | 0K / 2.8 GB | 50
Scarlett Phi | 0K / 2.8 GB | 28
IF PromptMKR Phi | 0K / 2.8 GB | 12
Puffin Phi V2 | 0K / 2.8 GB | 439
Phi Gath | 0K / 2.8 GB | 50
Samantha Phi | 0K / 2.8 GB | 1427
Samantha Phi | 0K / 2.8 GB | 1127

Rank the Oo Phi 1 5 Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124