| LLM Name | KoRnDAlpaca RAG Polyglot 12.8B |
|---|---|
| Repository 🤗 | https://huggingface.co/gsjang/KoRnDAlpaca-RAG-Polyglot-12.8B |
| Model Size | 12.8B |
| Required VRAM | 51.4 GB | 
| Updated | 2025-10-20 | 
| Maintainer | gsjang | 
| Model Type | gpt_neox | 
| Model Files | |
| Model Architecture | GPTNeoXForCausalLM | 
| License | apache-2.0 | 
| Context Length | 2048 | 
| Model Max Length | 2048 | 
| Transformers Version | 4.36.2 | 
| Tokenizer Class | PreTrainedTokenizerFast | 
| Padding Token | <|endoftext|> | 
| Vocabulary Size | 30003 | 
| Torch Data Type | float32 | 
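The specifications above (GPTNeoXForCausalLM architecture, PreTrainedTokenizerFast tokenizer, 2048-token context, float32 weights) map directly onto the standard Hugging Face `transformers` loading pattern. Note that 12.8B parameters × 4 bytes (float32) ≈ 51.2 GB, which is consistent with the 51.4 GB VRAM figure; loading in float16 roughly halves that. The sketch below is illustrative only: it assumes `transformers`, `torch`, and `accelerate` are installed, and the prompt content is a placeholder, not a format taken from the model card.

```python
# Minimal loading sketch (assumed setup, not an official recipe from the model card).
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

repo_id = "gsjang/KoRnDAlpaca-RAG-Polyglot-12.8B"

tokenizer = AutoTokenizer.from_pretrained(repo_id)  # PreTrainedTokenizerFast per the spec table
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # stored weights are float32 (~51.4 GB); fp16 roughly halves memory
    device_map="auto",          # requires the accelerate package
)

# Placeholder RAG-style prompt: retrieved context followed by a question.
prompt = "..."  # fill in retrieved passages + user question
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Keep prompt + generated tokens within the 2048-token context window.
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```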
| Best Alternatives | Context / RAM | Downloads | Likes | 
|---|---|---|---|
| ...therAI Polyglot Ko 12.8B 4bits | 2K / 7.7 GB | 8 | 1 | 
| ...pen Platypus Polyglot Ko 12.8B | 2K / 51.4 GB | 5 | 0 | 
| Polyglot Ko 12.8B Instruct | 2K / 25.9 GB | 3187 | 3 | 
| KoRnDAlpaca RAG Polyglot 12.8B | 2K / 51.4 GB | 5 | 0 | 
| Kullm Polyglot 12.8B V3 | 2K / 25.9 GB | 7 | 5 | 
| Polyglot Ko 12.8B Inst All | 2K / 51.4 GB | 7 | 1 | 
| Koquality Polyglot 12.8B | 2K / 51.4 GB | 5 | 0 | 
| Kyujin Poly Platypus Ko 12.8B | 2K / 25.9 GB | 562 | 2 | 
| Gollm 12.8B Instruct V2.3 | 2K / 25.9 GB | 5 | 0 | 
| Polyglot Ko 12.8B Inst | 2K / 51.4 GB | 6 | 1 | 