Opt Peter 1.3B by pszemraj


Opt Peter 1.3B is an open-source language model by pszemraj. Features: 1.3B parameters, required VRAM: 5.3 GB, context length: 2K, license: apache-2.0, LLM Explorer score: 0.01.
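The listed VRAM figure is consistent with simple arithmetic: 1.3B float32 parameters at 4 bytes each come to roughly 5.2 GB of weights, close to the 5.3 GB shown (a back-of-envelope sketch; real usage adds activations and buffers on top):

```python
# Back-of-envelope estimate of weight memory for a 1.3B-parameter
# model stored in float32 (4 bytes per parameter).
params = 1.3e9        # approximate parameter count
bytes_per_param = 4   # float32

weight_gb = params * bytes_per_param / 1e9
print(f"~{weight_gb:.1f} GB of weights")  # ~5.2 GB
```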

  Autotrain compatible   Chatbot   Dialogue   Endpoints compatible   Generated from trainer   Non-commercial   Opt   Pytorch   Region:us   Safetensors   Tensorboard
Model Card on HF 🤗: https://huggingface.co/pszemraj/opt-peter-1.3B
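Since the card lists an OPTForCausalLM architecture hosted on the Hub, the model can presumably be loaded with the standard transformers auto classes. A minimal sketch (the repo id comes from the card; the sampling settings are illustrative assumptions, not the author's):

```python
MODEL_ID = "pszemraj/opt-peter-1.3B"  # repo id from the model card

def load_chat_model(model_id: str = MODEL_ID):
    """Load tokenizer and model from the Hugging Face Hub
    (requires network access and ~5.3 GB for float32 weights)."""
    from transformers import AutoModelForCausalLM, AutoTokenizer
    tok = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    return tok, model

def reply(tok, model, prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a continuation for a dialogue prompt. Sampling settings
    here are illustrative, not taken from the card."""
    inputs = tok(prompt, return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=max_new_tokens,
                         do_sample=True, top_p=0.95, temperature=0.8)
    return tok.decode(out[0], skip_special_tokens=True)
```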

Opt Peter 1.3B Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o") or GPT-4 ("gpt4").
Opt Peter 1.3B (pszemraj/opt-peter-1.3B)

Opt Peter 1.3B Parameters and Internals

Model Type
text-generation, dialogue, chatbot
Use Cases
Limitations:
Not licensed for commercial use.
Considerations:
Any statements or claims made by this model do not reflect actual claims or statements by the author.
Additional Notes
An exploration of how OPT performs in dialogue/conversational applications; it seems to perform better than GPT-Neo trained with similar parameters.
Training Details
Data Sources:
~80k WhatsApp/iMessage messages
Model Architecture:
fine-tuned version of OPT
LLM Name: Opt Peter 1.3B
Repository 🤗: https://huggingface.co/pszemraj/opt-peter-1.3B
Model Size: 1.3b
Required VRAM: 5.3 GB
Updated: 2025-09-23
Maintainer: pszemraj
Model Type: opt
Model Files: 5.3 GB, 5.3 GB, 0.0 GB
Model Architecture: OPTForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.19.2
Tokenizer Class: GPT2Tokenizer
Beginning of Sentence Token: </s>
End of Sentence Token: </s>
Unk Token: </s>
Vocabulary Size: 50265
Torch Data Type: float32
Activation Function: relu
Errors: replace
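The "Errors: replace" entry refers to the byte-decoding policy the GPT2Tokenizer class uses when detokenizing: invalid UTF-8 sequences become the U+FFFD replacement character instead of raising an exception. A pure-Python illustration of the same policy:

```python
# errors="replace" substitutes U+FFFD for invalid UTF-8 byte sequences
# rather than raising UnicodeDecodeError.
bad_bytes = b"hello \xff world"   # \xff is not valid UTF-8
decoded = bad_bytes.decode("utf-8", errors="replace")
print(decoded)  # hello \ufffd world (with the replacement character)
```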

Best Alternatives to Opt Peter 1.3B

Model                               Context / RAM   Downloads   Likes
LLmRa 1.3B V2                       2K / 5.3 GB     664         0
Opt 1.3B                            2K / 2.6 GB     175545      178
Sparse0.5 OPT 1.3                   2K / 2.6 GB     1821        0
... Distilled PromptKD Dolly V1.0   2K / 2.6 GB     6           0
New Galactica 1.3b Mcq              2K / 2.6 GB     5           0
... Galactica 1.3b DPO 0.1beta Ai   2K / 2.6 GB     5           1
New Galactica 1.3b Mcq No Rag       2K / 2.6 GB     6           0
New Galactica 1.3b Mcq Rag          2K / 2.6 GB     6           0
... Galactica 1.3b DPO 0.1beta Ai   2K / 2.6 GB     5           0
...tica 1.3b 0.1beta Ai Raft Top3   2K / 2.6 GB     5           0
Note: a green score (e.g. "73.2") means the model is better than pszemraj/opt-peter-1.3B.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20260327b