Apertus 3B DPO Wnorm2both is an open-source language model from daslab-testing: a 3B-parameter LLM requiring 7.7 GB of VRAM, with a 4K-token context window.
| Attribute | Value |
|---|---|
| LLM Name | Apertus 3B DPO Wnorm2both |
| Repository 🤗 | https://huggingface.co/daslab-testing/Apertus-3B-DPO-wnorm2both |
| Model Size | 3b |
| Required VRAM | 7.7 GB |
| Updated | 2026-05-12 |
| Maintainer | daslab-testing |
| Model Type | apertus |
| Model Files | |
| Model Architecture | ApertusForCausalLM |
| Context Length | 4096 |
| Model Max Length | 4096 |
| Transformers Version | 5.7.0 |
| Tokenizer Class | TokenizersBackend |
| Padding Token | <pad> |
| Vocabulary Size | 131072 |
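The 7.7 GB VRAM figure in the table is plausibly the raw weight footprint plus runtime overhead (KV cache, activations). As a rough sanity check, a minimal sketch (the function name is my own, not from the source) estimates the memory needed just to hold the weights:

```python
def estimate_weight_vram_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Rough VRAM (decimal GB) to hold model weights alone.

    bytes_per_param: 2 for fp16/bf16, 4 for fp32.
    This ignores KV cache, activations, and framework overhead,
    so real requirements (e.g. the 7.7 GB listed above) run higher.
    """
    return n_params * bytes_per_param / 1e9

# ~3B parameters in bf16/fp16 (2 bytes each)
weights_gb = estimate_weight_vram_gb(3e9, bytes_per_param=2)
print(f"weights alone: {weights_gb:.1f} GB")  # ~6.0 GB; overhead accounts for the rest
```

The gap between the ~6 GB weight footprint and the listed 7.7 GB is consistent with typical inference overhead at the model's 4096-token context length.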