Polyglot Ko 12.8B Instruct by etri-xainlp


Polyglot Ko 12.8B Instruct is an open-source language model by etri-xainlp. Features: 12.8B-parameter LLM, required VRAM ~25.9 GB (float16), context: 2K, license: apache-2.0, instruction-based, LLM Explorer Score: 0.09.

Tags: Endpoints compatible · GPT-NeoX · Instruct · Korean (ko) · PyTorch · Region: US

Polyglot Ko 12.8B Instruct Benchmarks

Benchmark scores are reported as percentages relative to reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

Polyglot Ko 12.8B Instruct Parameters and Internals

Model Type: Instruction-based, text generation

Additional Notes: Trained with the Adam optimizer and a linear learning-rate scheduler.

Supported Languages: Korean (high)

Training Details:
  Data sources: instruction-following dataset (260k examples)
  Methodology: fine-tuning
  Hardware used: multi-GPU (A100 80GB)

Input/Output:
  Input format: text prompts
  Accepted modalities: text
  Output format: generated text
  Performance tip: use a distributed multi-GPU setup for optimal throughput.
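The multi-GPU tip above can be sketched with the Hugging Face transformers library. This is a minimal loading helper under stated assumptions: it uses `device_map="auto"` (which requires the `accelerate` package) to shard layers across available GPUs, and the generation settings are illustrative, not the authors' configuration.

```python
def load_polyglot(model_id: str = "etri-xainlp/polyglot-ko-12.8b-instruct"):
    """Load tokenizer and model, sharding layers across available GPUs.

    A sketch, not the authors' reference code. Imports are deferred so the
    helper can be defined even where transformers is not installed.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,  # the checkpoint ships in float16
        device_map="auto",          # requires `accelerate`; shards across GPUs
    )
    return tokenizer, model


def generate(tokenizer, model, prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a completion for a plain-text prompt (context limit: 2048 tokens)."""
    import torch

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    with torch.no_grad():
        output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Typical use: `tokenizer, model = load_polyglot()` followed by `print(generate(tokenizer, model, "한국의 수도는 어디인가요?"))` ("What is the capital of Korea?").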
LLM Name: Polyglot Ko 12.8B Instruct
Repository: 🤗 https://huggingface.co/etri-xainlp/polyglot-ko-12.8b-instruct
Model Size: 12.8b
Required VRAM: ~25.9 GB (float16)
Updated: 2026-04-30
Maintainer: etri-xainlp
Model Type: gpt_neox
Instruction-Based: Yes
Model Files: sharded checkpoints, ~26.0 GB total
Supported Languages: ko
Model Architecture: GPTNeoXForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.30.2
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|endoftext|>
Vocabulary Size: 30003
Torch Data Type: float16

Best Alternatives to Polyglot Ko 12.8B Instruct

Best Alternatives                  | Context / RAM | Downloads | Likes
Polyglot Ko 12.8B Instruct         | 2K / 25.9 GB  | 3187      | 3
Gollm 12.8B Instruct V2.3          | 2K / 25.9 GB  | 14        | 0
Gollm 12.8B Instruct V2.1          | 2K / 25.9 GB  | 8         | 0
Gollm 12.8B Instruct V2.0          | 2K / 25.9 GB  | 7         | 0
...lm 12.8B Instruct Tendency T45  | 2K / 25.9 GB  | 6         | 0
...t Ko 12.8B Chang Instruct Chat  | 2K / 25.9 GB  | 161       | 4

Original data from HuggingFace, OpenCompass and various public git repos.