Rwkv 5 World 1b5 by RWKV


Tags: Autotrain compatible · Custom code · Pytorch · Region: us · Rwkv5
Model Card on HF 🤗: https://huggingface.co/RWKV/rwkv-5-world-1b5

Rwkv 5 World 1b5 Benchmarks

The nn.n% figures show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Rwkv 5 World 1b5 (RWKV/rwkv-5-world-1b5)

Rwkv 5 World 1b5 Parameters and Internals

Model Type: causal language model
Supported Languages: Chinese (native), English (secondary)
Input / Output:
    Input Format: Instruction: [instruction] Input: [input] Response:
    Accepted Modalities: text
    Output Format: Assistant: [response]
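A minimal sketch of assembling a prompt in the Instruction/Input/Response format listed above. The helper name and the example strings are illustrative, not from the model card, and the exact spacing between fields is an assumption.

```python
def build_prompt(instruction: str, user_input: str = "") -> str:
    """Assemble a prompt in the Instruction/Input/Response format shown above.

    The newlines between fields are an assumption; adjust them to match the
    template your RWKV runtime expects.
    """
    prompt = f"Instruction: {instruction}\n"
    if user_input:
        prompt += f"Input: {user_input}\n"
    prompt += "Response:"
    return prompt

# Illustrative call; the strings are examples only.
print(build_prompt("Translate the following text to English.", "你好，世界"))
```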
LLM Name: Rwkv 5 World 1b5
Repository 🤗: https://huggingface.co/RWKV/rwkv-5-world-1b5
Required VRAM: 3.2 GB
Updated: 2025-09-23
Maintainer: RWKV
Model Type: rwkv5
Model Files: 3.2 GB
Model Architecture: Rwkv5ForCausalLM
Transformers Version: 4.34.0
Tokenizer Class: Rwkv5Tokenizer
Vocabulary Size: 65536
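A minimal loading sketch using the Hugging Face transformers API (the table above pins transformers 4.34.0). Passing trust_remote_code=True is assumed to be required because the repo ships the custom Rwkv5ForCausalLM / Rwkv5Tokenizer classes; the prompt text and generation settings are illustrative.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RWKV/rwkv-5-world-1b5"

# Custom architecture and tokenizer classes live in the model repo,
# so remote code execution must be allowed when loading.
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)

# Prompt follows the Instruction/Input/Response template; content is an example.
prompt = "Instruction: Write a short poem about autumn.\nResponse:"
inputs = tokenizer(prompt, return_tensors="pt")

# max_new_tokens is an arbitrary example value.
output = model.generate(inputs["input_ids"], max_new_tokens=128)
print(tokenizer.decode(output[0].tolist(), skip_special_tokens=True))
```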

Rank the Rwkv 5 World 1b5 Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124