Med Orca Instruct 33B GPTQ by yhyhy3




Med Orca Instruct 33B GPTQ Parameters and Internals

Model Type: LlamaForCausalLM
Supported Languages: en (proficient)
Training Details:
- Data sources: ehartford/dolphin, LinhDuong/chatdoctor-200k, sahil2801/code_instructions_120k, c-s-ale/dolly-15k-instruction-alpaca-format
- Training time: 23 hours
- Hardware: 8x A6000 on Community Cloud
LLM Name: Med Orca Instruct 33B GPTQ
Repository: https://huggingface.co/yhyhy3/med-orca-instruct-33b-GPTQ
Model Size: 33b
Required VRAM: 17.6 GB
Updated: 2025-03-02
Maintainer: yhyhy3
Model Type: llama
Instruction-Based: Yes
Model Files: 17.6 GB
Supported Languages: en
GPTQ Quantization: Yes
Quantization Type: gptq|4bit
Model Architecture: LlamaForCausalLM
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.31.0.dev0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float16
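The details above (GPTQ 4-bit weights, LlamaTokenizer, 2048-token context, ~17.6 GB of model files) can be put together in a short loading sketch. This is an illustration only: it assumes a GPTQ-capable backend (auto-gptq or optimum) is installed alongside transformers, and since the card does not document a prompt template, the Alpaca-style layout below is an assumption based on the c-s-ale/dolly-15k-instruction-alpaca-format training set.

```python
# Sketch: loading the 4-bit GPTQ checkpoint and building a prompt.
# ASSUMPTIONS: auto-gptq/optimum backend installed; Alpaca-style
# prompt format (not documented on the model card).

def build_prompt(instruction: str, input_text: str = "") -> str:
    """Assemble a hypothetical Alpaca-style instruction prompt."""
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )


def load_model(repo_id: str = "yhyhy3/med-orca-instruct-33b-GPTQ"):
    """Load tokenizer + GPTQ-quantized weights (needs ~17.6 GB VRAM)."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo_id)  # LlamaTokenizer
    model = AutoModelForCausalLM.from_pretrained(
        repo_id,
        device_map="auto",   # spread quantized layers across available GPUs
    )
    return tokenizer, model


# Usage (requires a GPU with enough VRAM; keep generations well inside
# the 2048-token context window):
#   tok, model = load_model()
#   inputs = tok(build_prompt("List three common causes of chest pain."),
#                return_tensors="pt").to(model.device)
#   out = model.generate(**inputs, max_new_tokens=256)
#   print(tok.decode(out[0], skip_special_tokens=True))
```

Because the template is unverified, it is worth comparing outputs against a plain instruction-only prompt before settling on one format.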

Best Alternatives to Med Orca Instruct 33B GPTQ

| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| ...epseek Coder 33B Instruct GPTQ | 16K / 17.4 GB | 1087 | 25 |
| Sorceroboros 33B S2a4 Gptq | 8K / 17.6 GB | 13 | 3 |
| ...er 33B Instruct 4.0bpw H6 EXL2 | 16K / 17.1 GB | 7 | 5 |
| ...er 33B Instruct 8.0bpw H8 EXL2 | 16K / 33.6 GB | 5 | 3 |
| ...r 33B Instruct 4.65bpw H6 EXL2 | 16K / 19.8 GB | 7 | 1 |
| ...er 33B Instruct 3.0bpw H6 EXL2 | 16K / 13 GB | 5 | 1 |
| ...er 33B Instruct 5.0bpw H6 EXL2 | 16K / 21.2 GB | 5 | 1 |
| ...ardLM 33B V1.0 Uncensored GPTQ | 2K / 16.9 GB | 33 | 42 |
| Deepseek Wizard 33B Slerp | 16K / 35.3 GB | 7 | 0 |
| Deepseek Coder 33B Instruct | 16K / 66.5 GB | 16363 | 538 |
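The file sizes in the table follow from bits-per-weight arithmetic: a 4-bit GPTQ build of a ~33-billion-parameter model stores roughly half a byte per weight (plus quantization scales and embeddings), while the full-precision fp16 Deepseek Coder 33B Instruct row stores two bytes per weight. A rough sanity check, assuming an approximate 33e9 parameter count:

```python
# Back-of-the-envelope checkpoint sizing (weights only, ignoring
# quantization scales/zero-points; 33e9 parameters is an assumption).

def approx_checkpoint_gb(n_params: float, bits_per_weight: float) -> float:
    """Approximate on-disk size in GB (1 GB = 1e9 bytes)."""
    return n_params * bits_per_weight / 8 / 1e9

gptq_4bit = approx_checkpoint_gb(33e9, 4)   # ~16.5 GB, near the 17.6 GB GPTQ file
fp16_full = approx_checkpoint_gb(33e9, 16)  # ~66 GB, near the 66.5 GB fp16 row
```

The small gap between the 16.5 GB estimate and the actual 17.6 GB file is consistent with per-group scales/zero-points and unquantized embedding tables.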

Rank the Med Orca Instruct 33B GPTQ Capabilities

Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs.

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124