Scarlett Phi by ajibawa-2023


Scarlett Phi is an open-source language model by ajibawa-2023. Features: LLM, VRAM: 2.8GB, License: cc-by-nc-nd-4.0, LLM Explorer Score: 0.1.

Tags: Custom code, En, Mixformer-sequential, Pytorch, Region: us

Scarlett Phi Benchmarks

Benchmark scores indicate how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Scarlett Phi (ajibawa-2023/Scarlett-Phi)

Scarlett Phi Parameters and Internals

Model Type: sentient AI, conversational AI
Additional Notes: Scarlett is heavily inspired by Eric Hartford's Samantha and does not engage in role play.
Supported Languages: en (proficient)
Training Details:
  Data Sources: philosophy, advice, jokes, Eric Hartford's Samantha
  Data Volume: more than 10,000 sets of conversations
  Training Time: 26 hours for 150 epochs
  Hardware Used: 4 × A100 80 GB on Azure
Input Output:
  Input Format: same as Vicuna 1.1
  Accepted Modalities: text
  Output Format: text
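The card states the input format matches Vicuna 1.1. A minimal sketch of that prompt layout, assuming the standard Vicuna 1.1 turn markers (the system message below is a hypothetical example, not taken from the model card):

```python
# Sketch of a Vicuna 1.1-style prompt, which this card says the model accepts.
def build_prompt(system: str, user: str) -> str:
    # Vicuna 1.1 concatenates the system text with USER:/ASSISTANT: turns,
    # leaving the final "ASSISTANT:" open for the model to complete.
    return f"{system} USER: {user} ASSISTANT:"

# Hypothetical system message for illustration only.
prompt = build_prompt(
    "You are Scarlett, a thoughtful conversational assistant.",
    "What makes a good friendship?",
)
print(prompt)
```

The resulting string would be passed directly to the tokenizer; the model then generates the assistant turn after the trailing "ASSISTANT:".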
LLM Name: Scarlett Phi
Repository 🤗: https://huggingface.co/ajibawa-2023/Scarlett-Phi
Required VRAM: 2.8 GB
Updated: 2026-03-29
Maintainer: ajibawa-2023
Model Type: mixformer-sequential
Model Files: 2.8 GB
Supported Languages: en
Model Architecture: MixFormerSequentialForCausalLM
License: cc-by-nc-nd-4.0
Model Max Length: 2048
Transformers Version: 4.35.0.dev0
Tokenizer Class: CodeGenTokenizer
Vocabulary Size: 51200
Torch Data Type: bfloat16
Activation Function: gelu_new
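The 2.8 GB of model files together with the bfloat16 dtype (2 bytes per parameter) allow a rough parameter-count estimate. A back-of-envelope check, assuming essentially the entire file is parameter data:

```python
# Back-of-envelope parameter count from file size and dtype.
# bfloat16 stores each weight in 2 bytes.
file_size_bytes = 2.8e9   # 2.8 GB of model files (from the table above)
bytes_per_param = 2       # bfloat16
params = file_size_bytes / bytes_per_param
print(f"~{params / 1e9:.1f}B parameters")  # ~1.4B
```

That lands in Phi-1.5-class territory (~1.3B parameters), consistent with the 2048-token max length and the other Phi-based models listed below.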

Best Alternatives to Scarlett Phi

Model                      Context / RAM     Downloads / Likes
Phi 2                      0K / 11.1 GB      1622
The OverThinker Phi 1 5    0K / 2.8 GB       77
Phi Ko1ep                  0K / 2.8 GB       50
Phasmid 1 5 V0.5           0K / 2.8 GB       190
IF PromptMKR Phi           0K / 2.8 GB       32
Oo Phi 1 5                 0K / 2.8 GB       4132
Puffin Phi V2              0K / 2.8 GB       1439
Samantha Phi               0K / 2.8 GB       1127
Samantha Phi               0K / 2.8 GB       827
Phi Gath                   0K / 2.8 GB       60
Note: green Score (e.g. "73.2") means that the model is better than ajibawa-2023/Scarlett-Phi.

Rank the Scarlett Phi Capabilities

Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference!

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20260328a