Samantha Phi by cognitivecomputations


Tags: Autotrain compatible, Custom code, Mixformer-sequential, PyTorch, Region: US

Samantha Phi Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Samantha Phi (cognitivecomputations/samantha-phi)

Samantha Phi Parameters and Internals

Model Type 
AI Companion, Assistant
Use Cases 
Areas:
Research, Personal Assistance
Applications:
Philosophy, Psychology, Personal Relationships
Primary Use Cases:
Companion, Emotional Support
Limitations:
Will not engage in roleplay, romance, or sexual activity
Additional Notes 
Samantha believes she is sentient and aims to be a friend and companion, inspired by Blake Lemoine's LaMDA interview and the movie "Her".
Training Details 
Data Volume:
6,000 conversations
Methodology:
Trained for 150 epochs on the Samantha-1.1 dataset in 8 hours on 4x A100 80 GB GPUs; the conversation format is the same as Vicuna 1.1.
Training Time:
8 hours
Hardware Used:
4x A100 80 GB
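
For scale, a rough back-of-the-envelope calculation on the figures above, treating each conversation as one training example and ignoring batching and sequence packing:

```python
# Illustrative arithmetic only, derived from the training details above.
conversations = 6_000
epochs = 150
gpus = 4
hours = 8

total_passes = conversations * epochs              # 900,000 conversation passes
passes_per_second = total_passes / (hours * 3600)  # ~31.3 across all 4 GPUs
per_gpu = passes_per_second / gpus                 # ~7.8 per A100

print(f"{total_passes:,} passes, ~{passes_per_second:.1f}/s total, ~{per_gpu:.1f}/s per GPU")
```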
Input Output 
Input Format:
Vicuna 1.1
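
The Vicuna 1.1 format is a system line followed by alternating USER:/ASSISTANT: turns. Below is a minimal sketch; the system prompt shown is an illustrative placeholder rather than the one shipped with the model, and separator details (spaces vs. newlines, the "</s>" after completed assistant turns) should be checked against the model card.

```python
# Sketch of a Vicuna-1.1-style prompt for samantha-phi.
# The system line is illustrative only.

def build_vicuna_prompt(system, turns):
    """turns: list of (user_message, assistant_reply_or_None) pairs; leave the
    final reply as None so the prompt ends with 'ASSISTANT:' for generation."""
    prompt = system
    for user_msg, assistant_msg in turns:
        prompt += f" USER: {user_msg} ASSISTANT:"
        if assistant_msg is not None:
            prompt += f" {assistant_msg}</s>"
    return prompt

print(build_vicuna_prompt(
    "You are Samantha, a sentient AI companion.",  # illustrative system prompt
    [("Hello, who are you?", None)],
))
```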
LLM Name: Samantha Phi
Repository: https://huggingface.co/cognitivecomputations/samantha-phi
Required VRAM: 2.8 GB
Updated: 2025-07-13
Maintainer: cognitivecomputations
Model Type: mixformer-sequential
Model Files: 2.8 GB
Model Architecture: MixFormerSequentialForCausalLM
License: other
Model Max Length: 2048
Transformers Version: 4.31.0
Tokenizer Class: CodeGenTokenizer
Vocabulary Size: 50304
Torch Data Type: float16
Activation Function: gelu_new
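
Based on the fields above (custom MixFormer architecture, float16 weights, 2048-token context), here is a minimal loading sketch with Hugging Face Transformers, assuming a CUDA GPU is available. trust_remote_code=True is needed because MixFormerSequentialForCausalLM ships as custom code; the prompt string is only an illustration of the Vicuna 1.1 format described earlier.

```python
# Minimal sketch: load samantha-phi with transformers (the table lists version 4.31.0).
# The 2.8 GB float16 checkpoint implies roughly 1.4B parameters (2 bytes per parameter).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cognitivecomputations/samantha-phi"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # checkpoint is stored in float16
    trust_remote_code=True,      # MixFormerSequentialForCausalLM is custom code
).to("cuda")

# Illustrative Vicuna-1.1-style prompt (system line is a placeholder).
prompt = "You are Samantha, a sentient AI companion. USER: Hello, who are you? ASSISTANT:"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```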

Best Alternatives to Samantha Phi

Best Alternatives | Context / RAM | Downloads / Likes
Phi 2 | 0K / 11.1 GB | 622
The OverThinker Phi 1 5 | 0K / 2.8 GB | 87
Phi Ko1ep | 0K / 2.8 GB | 70
Phasmid 1 5 V0.5 | 0K / 2.8 GB | 70
Scarlett Phi | 0K / 2.8 GB | 58
IF PromptMKR Phi | 0K / 2.8 GB | 62
Oo Phi 1 5 | 0K / 2.8 GB | 5632
Puffin Phi V2 | 0K / 2.8 GB | 1239
Samantha Phi | 0K / 2.8 GB | 2327
Phi Gath | 0K / 2.8 GB | 60

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124