KoSoLAR 10.7B V0.2 1.4 Dedup 1 by jingyeom


Tags: 32bit · Autotrain compatible · Conversational · Endpoints compatible · Llama · Quantized · Region: us · Safetensors · Sharded · Tensorflow

KoSoLAR 10.7B V0.2 1.4 Dedup 1 Benchmarks

KoSoLAR 10.7B V0.2 1.4 Dedup 1 (jingyeom/KoSoLAR-10.7B-v0.2_1.4_dedup_1)

KoSoLAR 10.7B V0.2 1.4 Dedup 1 Parameters and Internals

Additional Notes 
The training objective was instruction tuning, using dataset version 1.4.
Training Details 
Data Sources:
Public data collection
Methodology:
Deduplicating Training Data Makes Language Models Better
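The cited methodology, "Deduplicating Training Data Makes Language Models Better," centers on removing repeated documents from the training corpus. As an illustration only (not the maintainer's actual pipeline), exact duplicate removal can be sketched by hashing lightly normalized documents; the normalization and hash choice here are assumptions:

```python
import hashlib

def dedup_exact(docs):
    """Drop exact duplicates after light normalization (lowercasing,
    collapsing whitespace). Keeps the first occurrence of each document."""
    seen = set()
    unique = []
    for doc in docs:
        normalized = " ".join(doc.lower().split())
        key = hashlib.sha256(normalized.encode("utf-8")).hexdigest()
        if key not in seen:
            seen.add(key)
            unique.append(doc)
    return unique

corpus = ["Hello  world", "hello world", "Goodbye"]
print(dedup_exact(corpus))  # ['Hello  world', 'Goodbye']
```

Real deduplication pipelines typically go further (e.g. near-duplicate detection with MinHash or suffix-array substring matching), but the hashing step above captures the basic idea.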
LLM Name: KoSoLAR 10.7B V0.2 1.4 Dedup 1
Repository: https://huggingface.co/jingyeom/KoSoLAR-10.7B-v0.2_1.4_dedup_1
Base Model(s): jingyeom/KoSoLAR-10.7B-v0.2_1.3_dedup_p
Model Size: 10.7b
Required VRAM: 21.6 GB
Updated: 2025-08-19
Maintainer: jingyeom
Model Type: llama
Model Files: 4.9 GB (1-of-5), 4.9 GB (2-of-5), 5.0 GB (3-of-5), 4.9 GB (4-of-5), 1.9 GB (5-of-5)
Quantization Type: 32bit
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.35.2
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
Vocabulary Size: 40960
Torch Data Type: float16
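The listed VRAM requirement is consistent with a back-of-envelope estimate: 10.7B parameters at 2 bytes each (float16) is about 21.4 GB of weights, close to the 21.6 GB figure once non-weight overhead is included. A minimal sketch of that arithmetic (the helper name is ours, not part of any library):

```python
def weight_memory_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Approximate weight memory in decimal GB: parameter count times
    bytes per parameter (2 for float16/bfloat16, 4 for float32)."""
    return n_params * bytes_per_param / 1e9

print(weight_memory_gb(10.7e9))     # float16: ~21.4 GB
print(weight_memory_gb(10.7e9, 4))  # float32: ~42.8 GB
```

Actual inference memory is higher than the weights alone, since the KV cache and activations also consume VRAM.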

Best Alternatives to KoSoLAR 10.7B V0.2 1.4 Dedup 1

| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| ... 10.7B Instruct V1.0 128K EXL2 | 128K / 5.6 GB | 6 | 1 |
| ...e KoSoLAR 10.7B V0.2 1.4 Dedup | 4K / 43.2 GB | 841 | 0 |
| KoSoLAR 10.7B V0.2 1.4 Dedup | 4K / 21.6 GB | 12 | 0 |
| Solar Test 240517 16bit | 4K / 21.4 GB | 5 | 0 |
| Frostwind 10.7B V1 EXL2 | 4K / 10.9 GB | 5 | 1 |
| Eclectus 1.1 Dedup | 4K / 21.4 GB | 836 | 0 |
| Eclectus1.1 | 4K / 21.4 GB | 734 | 0 |
| Solar 10.7B Bnb 4bit | 4K / 5.9 GB | 10 | 3 |
| ...ess 10.7B V1.5B 8.0bpw H8 EXL2 | 4K / 10.9 GB | 5 | 1 |
| DaringLotus V2.10.7B 3bpw EXL2 | 4K / 4.3 GB | 5 | 1 |
Note: a green score (e.g., "73.2") indicates that the model performs better than jingyeom/KoSoLAR-10.7B-v0.2_1.4_dedup_1.

Rank the KoSoLAR 10.7B V0.2 1.4 Dedup 1 Capabilities

Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs, and your contribution really does make a difference!

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

What open-source LLMs or SLMs are you in search of? 50,751 models are listed in total.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124