Llm Jp 13B Instruct Full Dolly Ichikara 004 001 Single Oasst Oasst2 V2.0 by llm-jp


Tags: Autotrain compatible, Conversational, Dataset:databricks/databricks-..., Dataset:llm-jp/databricks-doll..., Dataset:llm-jp/oasst1-21k-en, Dataset:llm-jp/oasst1-21k-ja, Dataset:llm-jp/oasst2-33k-en, Dataset:llm-jp/oasst2-33k-ja, En, Instruct, Ja, Llama, Region:us, Safetensors, Sharded, Tensorflow

Llm Jp 13B Instruct Full Dolly Ichikara 004 001 Single Oasst Oasst2 V2.0 Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Llm Jp 13B Instruct Full Dolly Ichikara 004 001 Single Oasst Oasst2 V2.0 (llm-jp/llm-jp-13b-instruct-full-dolly-ichikara_004_001_single-oasst-oasst2-v2.0)

Llm Jp 13B Instruct Full Dolly Ichikara 004 001 Single Oasst Oasst2 V2.0 Parameters and Internals

Model Type 
Transformer-based Language Model, text-generation
Additional Notes 
The models released here are still in the early stages of our research and development and have not been tuned to ensure outputs align with human intent and safety considerations.
Supported Languages 
English (en), Japanese (ja)
Training Details 
Data Sources:
Wikipedia, Common Crawl, The Pile, The Stack
Data Volume:
256 billion tokens
Context Length:
4096
Hardware Used:
128 A100 40GB GPUs
Model Architecture:
Transformer-based Language Model
LLM Name: Llm Jp 13B Instruct Full Dolly Ichikara 004 001 Single Oasst Oasst2 V2.0
Repository 🤗: https://huggingface.co/llm-jp/llm-jp-13b-instruct-full-dolly-ichikara_004_001_single-oasst-oasst2-v2.0
Model Size: 13b
Required VRAM: 27.4 GB
Updated: 2025-09-23
Maintainer: llm-jp
Model Type: llama
Instruction-Based: Yes
Model Files: 5.0 GB (1-of-6), 5.0 GB (2-of-6), 4.9 GB (3-of-6), 4.9 GB (4-of-6), 4.9 GB (5-of-6), 2.7 GB (6-of-6)
Supported Languages: en, ja
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.38.2
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <pad|LLM-jp>
Vocabulary Size: 97024
Torch Data Type: bfloat16
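
The spec table above maps directly onto a standard Hugging Face Transformers loading call. Below is a minimal inference sketch, not taken from the official model card: the repository id, bfloat16 dtype, VRAM figure, and 4096-token context come from the listing above, while the example prompt and sampling settings are illustrative assumptions; check the model card for the exact instruction template.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "llm-jp/llm-jp-13b-instruct-full-dolly-ichikara_004_001_single-oasst-oasst2-v2.0"

# PreTrainedTokenizerFast with a 97,024-entry vocabulary and the <pad|LLM-jp> padding token
tokenizer = AutoTokenizer.from_pretrained(model_id)

# The six sharded safetensors files (~27.4 GB total) are downloaded and assembled automatically
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the listed Torch Data Type
    device_map="auto",           # plan for roughly 27.4 GB of VRAM at bf16
)

prompt = "自然言語処理とは何ですか？"  # illustrative prompt ("What is natural language processing?"); en and ja are both supported
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=128,  # prompt plus output must fit within the 4096-token context
        do_sample=True,
        top_p=0.95,          # assumed sampling settings, not from the model card
        temperature=0.7,
        pad_token_id=tokenizer.pad_token_id,
    )
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))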

Best Alternatives to Llm Jp 13B Instruct Full Dolly Ichikara 004 001 Single Oasst Oasst2 V2.0

Best Alternatives | Context / RAM | Downloads | Likes
NexusRaven V2 13B | 16K / 26 GB | 1096 | 469
CodeLlama 13B Instruct Hf | 16K / 26 GB | 21962 | 154
CodeLlama 13B MORepair | 16K / 26 GB | 3 | 2
CodeLlama 13B Instruct Hf | 16K / 26 GB | 757 | 26
TableLLM 13B | 16K / 26 GB | 1307 | 29
NexusRaven 13B | 16K / 26 GB | 14 | 104
Panda Coder 13B | 16K / 26 GB | 6 | 13
... Llama 2 13B Instruct Text2sql | 16K / 26 GB | 27 | 27
Gen Sim | 16K / 0.3 GB | 7 | 2
Llama 3 13B Instruct Ft | 8K / 26.1 GB | 9 | 2
Note: a green score (e.g., "73.2") means the model is better than llm-jp/llm-jp-13b-instruct-full-dolly-ichikara_004_001_single-oasst-oasst2-v2.0.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124