Chat Gpt2 DPO is an open-source language model by Sharathhebbar24. Key facts: 124.4M parameters, 0.5 GB VRAM required, Apache-2.0 license, HF Score 28.6, LLM Explorer Score 0.18. Benchmarks: ARC 24, HellaSwag 31.2, MMLU 25, TruthfulQA 41.3, WinoGrande 50.
| Attribute | Value |
|---|---|
| LLM Name | Chat Gpt2 DPO |
| Repository 🤗 | https://huggingface.co/Sharathhebbar24/chat_gpt2_dpo |
| Model Size | 124.4M |
| Required VRAM | 0.5 GB |
| Updated | 2026-03-29 |
| Maintainer | Sharathhebbar24 |
| Model Type | gpt2 |
| Supported Languages | en |
| Model Architecture | GPT2LMHeadModel |
| License | apache-2.0 |
| Model Max Length | 1024 |
| Transformers Version | 4.40.1 |
| Tokenizer Class | GPT2Tokenizer |
| Padding Token | <\|endoftext\|> |
| Vocabulary Size | 50257 |
| Torch Data Type | float32 |
| Activation Function | gelu_new |
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Gpt2 Scratch | 0K / 0.5 GB | 6 | 0 |
| Phrase To Story Generator | 0K / 0.5 GB | 5 | 0 |
| Gpt2 Hoodie Final | 0K / 0.5 GB | 7 | 0 |
| Autotrain Be6vh G5hv9 | 0K / 0.5 GB | 5 | 0 |
| ArshGpt | 0K / 0.5 GB | 53 | 12 |
| Gpt2 Sft | 0K / 0.5 GB | 10 | 0 |
| Gpt2 Coconut Gsm From Cot7 | 0K / 0.5 GB | 8 | 0 |
| MindMate | 0K / 0.5 GB | 5 | 1 |
| MindMate V2 | 0K / 0.5 GB | 5 | 0 |
| MindMate V1 | 0K / 0.5 GB | 1 | 1 |