Japanese GPT-2 Medium is an open-source Japanese language model released by rinna. Key figures: 361.3M parameters, 1.4 GB required VRAM, MIT license, LLM Explorer Score 0.02.
| Attribute | Value |
|---|---|
| LLM Name | Japanese GPT-2 Medium |
| Repository 🤗 | https://huggingface.co/rinna/japanese-gpt2-medium |
| Model Size | 361.3M parameters |
| Required VRAM | 1.4 GB |
| Updated | 2026-03-31 |
| Maintainer | rinna |
| Model Type | gpt2 |
| Model Files | |
| Supported Languages | ja |
| Model Architecture | GPT2LMHeadModel |
| License | MIT |
| Transformers Version | 4.8.2 |
| Tokenizer Class | T5Tokenizer |
| Padding Token | [PAD] |
| Vocabulary Size | 32000 |
| Activation Function | gelu_new |
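Since the table lists `GPT2LMHeadModel` as the architecture and `T5Tokenizer` as the tokenizer class, a minimal loading-and-generation sketch with Hugging Face `transformers` might look like the following. The prompt string and sampling settings are illustrative assumptions, not part of the model card; `sentencepiece` is needed for the tokenizer.

```python
# Minimal sketch: load rinna/japanese-gpt2-medium using the classes the
# table lists (T5Tokenizer + GPT2LMHeadModel). Requires `transformers`,
# `sentencepiece`, and `torch`.
from transformers import T5Tokenizer, GPT2LMHeadModel

tokenizer = T5Tokenizer.from_pretrained("rinna/japanese-gpt2-medium")
model = GPT2LMHeadModel.from_pretrained("rinna/japanese-gpt2-medium")

# Encode an example Japanese prompt ("Once upon a time,") and sample a
# continuation; prompt and generation parameters are assumptions.
input_ids = tokenizer.encode("昔々あるところに、", return_tensors="pt")
output_ids = model.generate(
    input_ids,
    max_new_tokens=50,
    do_sample=True,
    top_p=0.95,
    pad_token_id=tokenizer.pad_token_id,  # table lists [PAD] as the padding token
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```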