Pythia 12B SFT V8 7K Steps is an open-source language model by OpenAssistant. Key figures: 12B parameters, 23.8 GB required VRAM, 2K context, Apache-2.0 license, HF Score 42.2, LLM Explorer Score 0.12. Benchmarks: ARC 44, HellaSwag 70.3, MMLU 26.6, TruthfulQA 36.5, WinoGrande 65.3, GSM8K 10.6.
| Attribute | Value |
|---|---|
| LLM Name | Pythia 12B Sft V8 7K Steps |
| Repository 🤗 | https://huggingface.co/OpenAssistant/pythia-12b-sft-v8-7k-steps |
| Model Size | 12b |
| Required VRAM | 23.8 GB |
| Updated | 2026-04-10 |
| Maintainer | OpenAssistant |
| Model Type | gpt_neox |
| Model Files | |
| Supported Languages | en |
| Model Architecture | GPTNeoXForCausalLM |
| License | apache-2.0 |
| Context Length | 2048 |
| Model Max Length | 2048 |
| Transformers Version | 4.28.0.dev0 |
| Tokenizer Class | GPTNeoXTokenizer |
| Vocabulary Size | 50288 |
| Torch Data Type | float16 |
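
A minimal loading sketch follows, based only on the specs listed above (repository ID, `GPTNeoXForCausalLM`, float16 weights, 2048-token context). The `<|prompter|>`/`<|assistant|>` prompt format is an assumption carried over from other OpenAssistant SFT releases; verify it on the model card before relying on it.

```python
# Minimal usage sketch, assuming the repo ID and specs from the table above.
# Requires transformers and torch; device_map="auto" also needs accelerate.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "OpenAssistant/pythia-12b-sft-v8-7k-steps"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the Torch Data Type above (~23.8 GB of weights)
    device_map="auto",          # spreads layers across available GPUs/CPU
)

# Assumed OASST prompt format (not confirmed by this page; check the model card).
prompt = "<|prompter|>What is the GPT-NeoX architecture?<|endoftext|><|assistant|>"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

At float16 the weights alone account for roughly the 23.8 GB listed above, so a 24 GB GPU is the practical minimum without quantization or offloading.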
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Dolly V2 12B | 2K / 23.8 GB | 3283 | 1955 |
| ...sst Sft 4 Pythia 12B Epoch 3.5 | 2K / 23.8 GB | 1587 | 368 |
| Oasst Rl 1 Pythia 12B | 2K / 23.8 GB | 3 | 4 |
| Pythia 12B | 2K / 23.8 GB | 36490 | 144 |
| Oasst Sft 1 Pythia 12B | 2K / 23.8 GB | 1396 | 278 |
| Pythia 12B Deduped | 2K / 23.8 GB | 4764 | 52 |
| Pythia 12B Sft V8.2.5K Steps | 2K / 23.8 GB | 1651 | 0 |
| H2ogpt Gm Oasst1 En 1024 12B | 2K / 23.8 GB | 1433 | 5 |
| H2ogpt Oasst1 512 12B | 2K / 23.9 GB | 1487 | 29 |
| ...asst Pythia 12B Pretrained Sft | 2K / 23.8 GB | 2202 | 0 |