| Attribute | Value |
|---|---|
| LLM Name | Sarashina1 7B |
| Repository 🤗 | https://huggingface.co/sbintuitions/sarashina1-7b |
| Model Size | 7b |
| Required VRAM | 13.9 GB |
| Updated | 2025-10-03 |
| Maintainer | sbintuitions |
| Model Type | gpt_neox |
| Model Files | |
| Supported Languages | ja |
| Model Architecture | GPTNeoXForCausalLM |
| License | mit |
| Context Length | 2048 |
| Model Max Length | 2048 |
| Transformers Version | 4.30.2 |
| Tokenizer Class | PreTrainedTokenizerFast |
| Padding Token | <pad> |
| Vocabulary Size | 51200 |
| Torch Data Type | float16 |
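
Based on the configuration above (a `GPTNeoXForCausalLM` with a fast tokenizer, float16 weights, and a 2048-token context), a minimal loading sketch with the standard `transformers` API might look as follows. The prompt text is illustrative; nothing beyond the repository ID and the table values is taken from the model itself.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "sbintuitions/sarashina1-7b"

# Load tokenizer and weights; float16 matches the listed Torch Data Type
# and keeps weight memory near the ~13.9 GB Required VRAM figure.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Japanese prompt (the supported language is "ja"); note the context window
# is capped at 2048 tokens per the Context Length / Model Max Length rows.
inputs = tokenizer("日本語で自己紹介してください。", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

`device_map="auto"` assumes the `accelerate` package is installed; on a single GPU you can instead call `model.to("cuda")` after loading.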
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Literature 7B 16384 | 16K / 36 GB | 9 | 15 |
| RedPajama 7B 16384 | 16K / 36 GB | 7 | 4 |
| Stablelm Tuned Alpha 7B | 4K / 31.9 GB | 764 | 360 |
| Stablelm Base Alpha 7B | 4K / 31.9 GB | 2125 | 209 |
| Stablelm 7B Sft V7 Epoch 3 | 4K / 32.4 GB | 1820 | 67 |
| StableLManticore 7B | 4K / 16 GB | 6 | 1 |
| Pythia 6.9B Deduped 4K | 4K / 27.2 GB | 8 | 10 |
| Stablelm 7B | 4K / 31.9 GB | 6 | 2 |
| Open Calm 7B | 2K / 13.9 GB | 7000 | 205 |
| Dolly V2 7B | 2K / 13.8 GB | 2660 | 150 |
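
The RAM figures in the table above track weight storage closely: parameter count times bytes per parameter (2 bytes for float16), before activations and KV cache are added. A quick back-of-the-envelope check (the helper function is ours, not from any listing):

```python
def weight_vram_gb(params_billion: float, bytes_per_param: float = 2.0) -> float:
    """Weight-only memory estimate: parameters x bytes per parameter (float16 = 2 bytes)."""
    return params_billion * 1e9 * bytes_per_param / 1e9

# 7B parameters in float16 -> ~14 GB, in line with the 13.9 GB listed for
# Sarashina1 7B; 8-bit quantization would roughly halve this figure.
print(f"{weight_vram_gb(7.0):.1f} GB")
```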