Yi 34B 200K DARE Merge V5 is an open-source language model by brucethemoose, produced by merging several Yi-34B-200K fine-tunes with the DARE method. Key figures: 34B parameters, 68.9 GB required VRAM, 195K usable context, license: other, merged model, HF Score 72, LLM Explorer Score 0.14. Benchmarks: ARC 66.5, HellaSwag 85.5, MMLU 77.2, TruthfulQA 57.5, WinoGrande 82.2, GSM8K 62.9.
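The VRAM figure is consistent with the weights being stored in bfloat16 (2 bytes per parameter); a quick back-of-the-envelope check, assuming roughly 34.4B parameters (the approximate count for Yi-34B is an assumption here):

```python
# Rough weight-memory estimate for a ~34B model in bfloat16.
# The exact parameter count (~34.4B) is an assumption; the KV cache
# and activations require additional memory on top of this.
params = 34.4e9          # approximate parameter count
bytes_per_param = 2      # bfloat16 = 2 bytes per parameter
print(f"{params * bytes_per_param / 1e9:.1f} GB")  # ~68.8 GB
```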
| Field | Value |
|---|---|
| LLM Name | Yi 34B 200K DARE Merge V5 |
| Repository 🤗 | https://huggingface.co/brucethemoose/Yi-34B-200K-DARE-merge-v5 |
| Merged Model | Yes |
| Model Size | 34b |
| Required VRAM | 68.9 GB |
| Updated | 2025-09-23 |
| Maintainer | brucethemoose |
| Model Type | llama |
| Supported Languages | en |
| Model Architecture | LlamaForCausalLM |
| License | other |
| Context Length | 200000 |
| Model Max Length | 200000 |
| Transformers Version | 4.36.1 |
| Tokenizer Class | LlamaTokenizer |
| Padding Token | `<unk>` |
| Vocabulary Size | 64000 |
| Torch Data Type | bfloat16 |
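Since the architecture is a standard LlamaForCausalLM, the checkpoint should load with stock transformers; a minimal sketch (the model ID comes from the repository above, the prompt and generation settings are illustrative):

```python
# Minimal loading sketch for a standard LlamaForCausalLM checkpoint.
# Full bfloat16 weights need ~69 GB of GPU memory (see table above);
# device_map="auto" lets accelerate shard the model across GPUs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "brucethemoose/Yi-34B-200K-DARE-merge-v5"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the checkpoint's stored dtype
    device_map="auto",
)

inputs = tokenizer("The Yi series of models", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```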
Quantized versions of this model:

| Model | Likes | Downloads | VRAM |
|---|---|---|---|
| Yi 34B 200K DARE Merge V5 GGUF | 6 | 124 | 14 GB |
| Yi 34B 200K DARE Merge V5 GPTQ | 0 | 5 | 18 GB |
| Yi 34B 200K DARE Merge V5 AWQ | 0 | 5 | 19 GB |
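For the GGUF build, llama-cpp-python is one common way to run the model in ~14 GB; a sketch assuming a hypothetical quant filename (check the GGUF repository's file list for the actual names):

```python
# Sketch: running a GGUF quant with llama-cpp-python.
# The filename below is hypothetical; substitute an actual quant file
# (e.g. a Q4_K_M variant) downloaded from the GGUF repository.
from llama_cpp import Llama

llm = Llama(
    model_path="yi-34b-200k-dare-merge-v5.Q4_K_M.gguf",  # hypothetical filename
    n_ctx=8192,        # raise toward 200K only if you have RAM for the KV cache
    n_gpu_layers=-1,   # offload all layers to GPU if VRAM allows
)

out = llm("Q: What is the capital of France? A:", max_tokens=32)
print(out["choices"][0]["text"])
```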
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Bagel Hermes 34B Slerp | 195K / 68.9 GB | 8524 | 1 |
| 34B Beta | 195K / 69.2 GB | 7868 | 64 |
| Smaug 34B V0.1 | 195K / 69.2 GB | 8463 | 64 |
| Yi 34B 200K | 195K / 68.9 GB | 9763 | 320 |
| Casual Magnum 34B | 195K / 68.8 GB | 5 | 1 |
| Bagel 34B V0.2 | 195K / 68.7 GB | 2752 | 41 |
| Yi 34B 200K AEZAKMI V2 | 195K / 69.2 GB | 1872 | 12 |
| Faro Yi 34B | 195K / 69.2 GB | 8480 | 6 |
| Bagel DPO 34B V0.5 | 195K / 68.7 GB | 8055 | 17 |
| Smaug 34B V0.1 ExPO | 195K / 69.2 GB | 7108 | 0 |