Stablelm 7B Sft V7 Epoch 3 is an open-source language model by OpenAssistant. Key figures: 7B parameters, 32.4 GB required VRAM, 4K context, HF Score 34.9, LLM Explorer Score 0.12, ARC 36, HellaSwag 55.8, MMLU 25, TruthfulQA 37, WinoGrande 54.9, GSM8K 0.4.
| Model Details | |
|---|---|
| LLM Name | Stablelm 7B Sft V7 Epoch 3 |
| Repository 🤗 | https://huggingface.co/OpenAssistant/stablelm-7b-sft-v7-epoch-3 |
| Model Size | 7b |
| Required VRAM | 32.4 GB |
| Updated | 2026-02-18 |
| Maintainer | OpenAssistant |
| Model Type | gpt_neox |
| Model Files | |
| Supported Languages | en |
| Model Architecture | GPTNeoXForCausalLM |
| Context Length | 4096 |
| Model Max Length | 4096 |
| Transformers Version | 4.28.0.dev0 |
| Tokenizer Class | GPTNeoXTokenizer |
| Vocabulary Size | 50288 |
| Torch Data Type | float16 |
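
Given the gpt_neox architecture and float16 weights listed above, the model can typically be loaded with the Hugging Face transformers library. The snippet below is a minimal sketch, assuming the repository ID from the table and a GPU with enough VRAM for the 32.4 GB of float16 weights; the example prompt is only an illustration, not the model's required chat format.

```python
# Minimal sketch: loading Stablelm 7B Sft V7 Epoch 3 with transformers.
# Assumes a recent transformers install and a GPU large enough for the
# float16 weights listed in the table above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "OpenAssistant/stablelm-7b-sft-v7-epoch-3"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # matches the "Torch Data Type" row above
).to("cuda")

# Illustrative prompt; adapt to the model's expected prompt format as needed.
prompt = "Explain what a context length of 4096 tokens means."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```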
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Literature 7B 16384 | 16K / 36 GB | 9 | 15 |
| RedPajama 7B 16384 | 16K / 36 GB | 6 | 4 |
| Stablelm Tuned Alpha 7B | 4K / 31.9 GB | 3348 | 360 |
| Stablelm Base Alpha 7B | 4K / 31.9 GB | 5069 | 208 |
| StableLManticore 7B | 4K / 16 GB | 6 | 1 |
| Pythia 6.9B Deduped 4K | 4K / 27.2 GB | 5 | 10 |
| Stablelm 7B | 4K / 31.9 GB | 5 | 2 |
| RedPajama INCITE 7B Instruct | 2K / 13.8 GB | 2453 | 108 |
| RedPajama INCITE 7B Base | 2K / 13.8 GB | 2441 | 92 |
| Open Calm 7B | 2K / 13.9 GB | 935 | 205 |