Openbuddy Deepseek 67B V15 Base AWQ by TheBloke


Tags: 4-bit, AWQ, Autotrain compatible, Quantized, Safetensors, Sharded, Llama, Tensorflow, Region: us. Base model: openbuddy/openbuddy...; Base model (quantized): openbuddy.... Languages: de, en, fi, fr, it, ja, ko, ru, zh.


Openbuddy Deepseek 67B V15 Base AWQ Parameters and Internals

Model Type: deepseek
Additional Notes: This model is part of OpenBuddy's Base series, trained on approximately 50% of the conversational data. It has cognitive and dialogue capabilities comparable to the fully trained OpenBuddy models, but it has not been extensively fine-tuned for general conversational tasks.
Supported Languages: zh (native), en (native), fr (native), de (native), ja (native), ko (native), it (native), ru (native), fi (native)
LLM Name: Openbuddy Deepseek 67B V15 Base AWQ
Repository: 🤗 https://huggingface.co/TheBloke/openbuddy-deepseek-67b-v15-base-AWQ
Model Name: Openbuddy Deepseek 67B V15 Base
Model Creator: OpenBuddy
Base Model(s): OpenBuddy/openbuddy-deepseek-67b-v15-base
Model Size: 67b
Required VRAM: 37.5 GB
Updated: 2025-08-16
Maintainer: TheBloke
Model Type: deepseek
Model Files: 10.0 GB (1-of-4), 9.9 GB (2-of-4), 10.0 GB (3-of-4), 7.6 GB (4-of-4)
Supported Languages: zh, en, fr, de, ja, ko, it, ru, fi
AWQ Quantization: Yes
Quantization Type: awq
Model Architecture: LlamaForCausalLM
License: other
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.35.2
Tokenizer Class: LlamaTokenizerFast
Beginning of Sentence Token: <|begin▁of▁sentence|>
End of Sentence Token: <|end▁of▁sentence|>
Vocabulary Size: 102400
Torch Data Type: float16
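
The details above (repo ID, 4-bit AWQ weights, LlamaForCausalLM architecture, 4096-token context, float16 activations) map directly onto a standard Transformers load. Below is a minimal sketch, not taken from the model card, assuming transformers >= 4.35.2 together with accelerate and autoawq are installed and roughly 37.5 GB of GPU memory is available; the prompt and generation settings are illustrative only.

```python
# Minimal sketch: loading TheBloke/openbuddy-deepseek-67b-v15-base-AWQ with
# Hugging Face Transformers. Assumes `transformers>=4.35.2`, `accelerate`, and
# `autoawq` are installed; the prompt below is purely illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/openbuddy-deepseek-67b-v15-base-AWQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)   # LlamaTokenizerFast, 102,400-token vocab
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",    # shards the four safetensors files across available GPUs
    torch_dtype="auto",   # activations in float16; weights stay 4-bit AWQ
)

# This is a Base-series model, so use plain text completion rather than a chat template.
prompt = "OpenBuddy is a multilingual AI assistant that"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)   # context window is 4096 tokens
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same checkpoint should also work with AWQ-aware inference servers such as vLLM (loaded with quantization="awq"), although that is not stated on this card.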

Best Alternatives to Openbuddy Deepseek 67B V15 Base AWQ

Best Alternatives | Context / RAM | Downloads | Likes
Deepseek Llm 67B Chat AWQ | 4K / 37.5 GB | 101 | 14
Deepseek Llm 67B Base AWQ | 4K / 37.5 GB | 36 | 2
...67B Spicy 3.1 1 5.0bpw H6 EXL2 | 4K / 43.7 GB | 6 | 1
...ek Llm 67B Chat 2.4bpw H6 EXL2 | 4K / 22.2 GB | 5 | 2
...k Llm 67B Chat 2.65bpw H6 EXL2 | 4K / 24.2 GB | 5 | 1
...ek Llm 67B Chat 3.0bpw H6 EXL2 | 4K / 27.1 GB | 5 | 1
Deepseek Llm 67B Chat | 4K / 135 GB | 3051 | 203
...penbuddy Deepseek 67B V18.1 4K | 4K / 135 GB | 9217 | 1
Deepseek Llm 67B Base | 4K / 135 GB | 10891 | 125
...penbuddy Deepseek 67B V15 Base | 4K / 135 GB | 1944 | 0

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124