Musenet 1B4 L94 D1024 Rwkv Converted by breadlicker45


Musenet 1B4 L94 D1024 Rwkv Converted is an open-source language model by breadlicker45. Required VRAM: 5.3 GB; LLM Explorer Score: 0.08.
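For context, the VRAM figure is roughly what the full-precision weights alone would occupy: a ~1.4B-parameter model stored in float32 uses 4 bytes per parameter. A quick illustrative check (the function below is my own sketch, not part of any listed tooling):

```python
def estimate_weight_memory_gb(n_params: float, bytes_per_param: int = 4) -> float:
    """Rough weight-only memory estimate: parameter count x bytes per element, in GiB."""
    return n_params * bytes_per_param / 1024**3

# 1.4B parameters in float32 (4 bytes each) comes out to roughly 5.2 GiB,
# in line with the ~5.3 GB total of the sharded weight files listed on this page.
print(round(estimate_weight_memory_gb(1.4e9), 1))
```

This ignores activation memory and tokenizer overhead, so actual usage during inference will be somewhat higher.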

Tags: Autotrain compatible · Endpoints compatible · Pytorch · Region: us · Rwkv · Sharded

Musenet 1B4 L94 D1024 Rwkv Converted Benchmarks

nn.n% — how the model scores relative to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Musenet 1B4 L94 D1024 Rwkv Converted (breadlicker45/Musenet-1B4-L94-D1024-rwkv-converted)

Musenet 1B4 L94 D1024 Rwkv Converted Parameters and Internals

Additional Notes
The model is considered suboptimal because of how it was trained and is not recommended for use.
Training Details
Methodology: the model has known issues stemming from problematic training methods.
LLM Name: Musenet 1B4 L94 D1024 Rwkv Converted
Repository: https://huggingface.co/breadlicker45/Musenet-1B4-L94-D1024-rwkv-converted
Required VRAM: 5.3 GB
Updated: 2025-09-23
Maintainer: breadlicker45
Model Type: rwkv
Model Files: 2.0 GB (1-of-3), 2.0 GB (2-of-3), 1.3 GB (3-of-3)
Model Architecture: RwkvForCausalLM
Transformers Version: 4.30.2
Tokenizer Class: PreTrainedTokenizerFast
Vocabulary Size: 20257
Torch Data Type: float32
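Given the RwkvForCausalLM architecture and Transformers version noted above, the checkpoint should load through the stock transformers API. A minimal, untested sketch (assumes transformers >= 4.30 is installed and you are willing to download the ~5.3 GB of weights):

```python
REPO_ID = "breadlicker45/Musenet-1B4-L94-D1024-rwkv-converted"

def load_musenet(repo_id: str = REPO_ID):
    """Fetch the tokenizer and float32 weights (~5.3 GB) from the Hugging Face Hub."""
    # Imported inside the function so the sketch can be read/inspected
    # without transformers installed.
    from transformers import AutoTokenizer, RwkvForCausalLM
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = RwkvForCausalLM.from_pretrained(repo_id)
    return tokenizer, model
```

Generation then follows the usual pattern, e.g. `model.generate(tokenizer("...", return_tensors="pt").input_ids, max_new_tokens=32)` — though, per the maintainer's note above, the model is not recommended for use.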

Best Alternatives to Musenet 1B4 L94 D1024 Rwkv Converted

Best Alternatives | Context / RAM | Downloads | Likes
Rwkv Raven 1b5 | 0K / 6.1 GB | 1412 | 12
Rwkv 4 1b5 Pile | 0K / 6.1 GB | 899 | 6
EagleX 1 7T HF | 0K / 15 GB | 141 | 0
... Pileplus 1B5 Evol Instruct V2 | 0K / 6.1 GB | 1040 | 0
Music 4 Rwkv Converted | 0K / 2.3 GB | 5 | 0
Rwkv Musenet Test Untrained4 | 0K / 0.9 GB | 6 | 0
Rwkv Musenet Test Untrained3 | 0K / 1.8 GB | 6 | 0
Rwkv Musenet Test Untrained | 0K / 0.1 GB | 6 | 0
Rwkv Music3 | 0K / 1.7 GB | 0 | 1
MuseRift | 0K / 1.7 GB | 6 | 0
Note: a green score (e.g. "73.2") means the model is better than breadlicker45/Musenet-1B4-L94-D1024-rwkv-converted.


Original data from Hugging Face, OpenCompass, and various public Git repositories.