| Attribute | Value |
|---|---|
| LLM Name | Polyglot Ko Empathy Chat 5.8B |
| Repository | https://huggingface.co/j5ng/polyglot-ko-empathy-chat-5.8b |
| Model Size | 5.8b |
| Required VRAM | 11.9 GB |
| Updated | 2025-10-05 |
| Maintainer | j5ng |
| Model Type | gpt_neox |
| Supported Languages | ko |
| Model Architecture | GPTNeoXForCausalLM |
| License | apache-2.0 |
| Context Length | 2048 |
| Model Max Length | 2048 |
| Transformers Version | 4.36.0.dev0 |
| Tokenizer Class | PreTrainedTokenizerFast |
| Padding Token | <|endoftext|> |
| Vocabulary Size | 30080 |
| Torch Data Type | float16 |
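
The card lists a GPTNeoXForCausalLM architecture, a PreTrainedTokenizerFast tokenizer, float16 weights, and a 2048-token context window, so the model should load with the standard `transformers` auto classes. Below is a minimal loading sketch; the Korean prompt and the generation settings are illustrative assumptions, not taken from the card.

```python
# Minimal loading sketch based on the card's details (gpt_neox architecture,
# float16 weights, 2048-token context). Prompt and generation settings are
# illustrative assumptions, not from the model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "j5ng/polyglot-ko-empathy-chat-5.8b"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # resolves to PreTrainedTokenizerFast
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # matches the card's float16 weights (~11.9 GB VRAM)
    device_map="auto",           # requires the accelerate package
)

prompt = "요즘 너무 지치고 힘들어."  # example prompt: "I'm so exhausted and worn out these days."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=128,                   # stay well under the 2048-token context limit
    do_sample=True,
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,  # card lists <|endoftext|> as the padding token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```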
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Ai Script2 | 2K / 3.8 GB | 6 | 0 |
| Ai Script | 2K / 3.8 GB | 5 | 0 |
| KoAlpaca Polyglot 5.8B | 2K / 11.7 GB | 3084 | 66 |
| KIT 5.8B | 2K / 23.4 GB | 4 | 3 |
| Kullm Polyglot 5.8B V2 | 2K / 23.6 GB | 12569 | 24 |
| KIT 5.8B | 2K / 23.5 GB | 611 | 0 |
| Model Resize Test | 2K / 23.4 GB | 21 | 0 |
| Polyglot Ko 5.8B Inst All | 2K / 23.6 GB | 6 | 2 |
| Polyglot 5.8B CoT E1 | 2K / 23.6 GB | 4890 | 0 |
| ChatSKKU5.8B | 2K / 11.3 GB | 226 | 0 |