Yi 34B 200K DARE Merge V5 GPTQ by TheBloke


Yi 34B 200K DARE Merge V5 GPTQ is an open-source language model by TheBloke. Features: 34b LLM, VRAM: 18.6GB, Context: 195K, License: other, Quantized, Merged, LLM Explorer Score: 0.11.

Tags: Merged Model · 4-bit · Base model: brucethemoose/yi-34... · En · Gptq · Llama · Quantized · Region: us · Safetensors

Yi 34B 200K DARE Merge V5 GPTQ Benchmarks

Benchmark scores compare this model (TheBloke/Yi-34B-200K-DARE-merge-v5-GPTQ) against reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").

Yi 34B 200K DARE Merge V5 GPTQ Parameters and Internals

Model Type: text generation
Input Format: SYSTEM: {system_message} USER: {prompt} ASSISTANT:
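The single-turn template above can be assembled with a small helper. This is an illustrative sketch; the default system message is an assumption, not part of the model card:

```python
def format_prompt(prompt: str,
                  system_message: str = "You are a helpful assistant.") -> str:
    # Assemble the SYSTEM/USER/ASSISTANT template shown above.
    # The default system_message is a placeholder, not a documented default.
    return f"SYSTEM: {system_message} USER: {prompt} ASSISTANT:"

print(format_prompt("What does the DARE merge method do?"))
```

The trailing "ASSISTANT:" (with no completion) cues the model to generate the assistant turn.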
LLM Name: Yi 34B 200K DARE Merge V5 GPTQ
Repository: https://huggingface.co/TheBloke/Yi-34B-200K-DARE-merge-v5-GPTQ
Model Name: Yi 34B 200K DARE Merge v5
Model Creator: brucethemoose
Base Model(s): brucethemoose/Yi-34B-200K-DARE-merge-v5
Merged Model: Yes
Model Size: 34b
Required VRAM: 18.6 GB
Updated: 2026-03-29
Maintainer: TheBloke
Model Type: llama
Model Files: 18.6 GB
Supported Languages: en
GPTQ Quantization: Yes
Quantization Type: gptq
Model Architecture: LlamaForCausalLM
License: other
Context Length: 200000
Model Max Length: 200000
Transformers Version: 4.35.2
Tokenizer Class: LlamaTokenizer
Padding Token: <unk>
Vocabulary Size: 64000
Torch Data Type: bfloat16
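The 18.6 GB VRAM figure is consistent with back-of-envelope arithmetic for 4-bit GPTQ weights. The parameter count below is an approximation for a Yi-34B model, and the overhead estimate (group-wise scales/zero-points plus non-quantized embeddings and norms) is an assumption:

```python
params = 34.4e9          # approximate parameter count of a Yi-34B model (assumption)
bits_per_weight = 4      # GPTQ 4-bit quantization
weights_gb = params * bits_per_weight / 8 / 1e9  # bytes -> GB for weights alone

# Quantization metadata and unquantized layers typically add a few percent,
# which brings the total near the listed 18.6 GB.
print(f"~{weights_gb:.1f} GB for 4-bit weights alone")
```

KV cache for long contexts (up to 200K tokens here) comes on top of this and can dominate memory use at full context length.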

Best Alternatives to Yi 34B 200K DARE Merge V5 GPTQ

Best Alternatives | Context / RAM | Downloads / Likes
Yi 34B 200K RPMerge GPTQ | 195K / 21.2 GB | 103
Smaug 34B V0.1 GPTQ | 195K / 21.2 GB | 81
Tess 34B V1.5B GPTQ | 195K / 18.6 GB | 68
...4B 200K DARE Megamerge V8 GPTQ | 195K / 18.6 GB | 93
...y 34B 200K Chat Evaluator GPTQ | 195K / 18.6 GB | 183
Deepmoney 34B 200K Base GPTQ | 195K / 18.6 GB | 93
...ous Capybara Limarpv3 34B GPTQ | 195K / 18.6 GB | 214
Bagel DPO 34B V0.2 GPTQ | 195K / 18.6 GB | 102
Bagel 34B V0.2 GPTQ | 195K / 18.6 GB | 12
Nontoxic Bagel 34B V0.2 GPTQ | 195K / 18.6 GB | 71


Original data from HuggingFace, OpenCompass and various public git repos.