WizardCoder 15B V1.0 Sharded 8gb by alxxtexxr


Tags: Arxiv:2304.12244, Arxiv:2306.08568, Autotrain compatible, Codegen, Endpoints compatible, Gpt bigcode, Pytorch, Region:us, Sharded

WizardCoder 15B V1.0 Sharded 8gb Parameters and Internals

Model Type: text generation, code generation

Use Cases
Areas: academic research
Applications: code generation, language model research
Primary Use Cases: code generation from natural-language instructions for programming tasks
Limitations: outputs are influenced by uncontrollable variables such as randomness, so their accuracy cannot be guaranteed

Additional Notes
Disclaimer: the project accepts no legal liability for the content of model outputs.

Supported Languages: Python (high proficiency), Java (moderate proficiency)

Training Details
Data Volume: 78k evolved code instructions
Methodology: Evol-Instruct; fine-tuning on code-related instructions
Context Length: 2048
Model Architecture: WizardCoder architecture, adapted from StarCoder

Input Output
Input Format: instruction-style prompts structured according to the Evol-Instruct methodology for coding tasks
Accepted Modalities: text
Output Format: code solutions following the prompt
Performance Tips: inference may require tuning batch size, temperature, and maximum generation length for best results; see the sketch below.
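To make the input/output contract concrete, here is a minimal inference sketch using the Hugging Face transformers library. It assumes the Alpaca-style instruction template commonly used with WizardCoder models; the example instruction and the generation parameters (temperature, max_new_tokens) are illustrative choices, not values taken from this card:

    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "alxxtexxr/WizardCoder-15B-V1.0-sharded-8gb"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")  # needs accelerate

    # Alpaca-style template commonly used for WizardCoder models (assumed here)
    prompt = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        "### Instruction:\nWrite a Python function that reverses a string.\n\n"
        "### Response:"
    )

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=256, temperature=0.2, do_sample=True)
    # Decode only the newly generated tokens, not the echoed prompt
    print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))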
Release Notes
Version: 15B-V1.0
Date: unknown
Notes: initial release of WizardCoder, scoring 57.3 pass@1 on HumanEval (see the estimator sketch below)
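For context on the HumanEval figure: pass@1 is the expected fraction of problems for which a single sampled completion passes the unit tests. Below is a short sketch of the standard unbiased pass@k estimator from Chen et al. (2021), assuming n samples per problem of which c pass; this is background on the metric, not code from this repository:

    import numpy as np

    def pass_at_k(n: int, c: int, k: int) -> float:
        # Unbiased estimate of the probability that at least one of k
        # randomly drawn samples (out of n, with c correct) passes the tests.
        if n - c < k:
            return 1.0
        return 1.0 - np.prod(1.0 - k / np.arange(n - c + 1, n + 1))

    # For k=1 this reduces to c / n, the per-problem success rate.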
LLM Name: WizardCoder 15B V1.0 Sharded 8gb
Repository: https://huggingface.co/alxxtexxr/WizardCoder-15B-V1.0-sharded-8gb
Model Size: 15B
Required VRAM: 62.2 GB
Updated: 2024-09-28
Maintainer: alxxtexxr
Model Type: gpt_bigcode
Model Files: 7.8 GB (1-of-9), 7.6 GB (2-of-9), 7.6 GB (3-of-9), 7.6 GB (4-of-9), 7.6 GB (5-of-9), 7.6 GB (6-of-9), 7.6 GB (7-of-9), 7.6 GB (8-of-9), 1.2 GB (9-of-9)
Generates Code: Yes
Model Architecture: GPTBigCodeForCausalLM
License: bigscience-openrail-m
Model Max Length: 2048
Transformers Version: 4.31.0
Tokenizer Class: GPT2Tokenizer
Vocabulary Size: 49153
Torch Data Type: float32
Activation Function: gelu
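The point of the ~8 GB shards is that transformers loads a sharded checkpoint one file at a time, so peak memory during loading stays near the largest shard rather than the full 62.2 GB float32 checkpoint. A minimal loading sketch under assumed settings (half precision to roughly halve weight memory; actual usage depends on hardware, and device_map requires the accelerate library):

    import torch
    from transformers import AutoModelForCausalLM

    model = AutoModelForCausalLM.from_pretrained(
        "alxxtexxr/WizardCoder-15B-V1.0-sharded-8gb",
        torch_dtype=torch.float16,   # ~31 GB of weights instead of ~62 GB in float32
        device_map="auto",           # spread/offload shards across GPU(s) and CPU
        low_cpu_mem_usage=True,      # materialize weights shard-by-shard while loading
    )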

Best Alternatives to WizardCoder 15B V1.0 Sharded 8gb

Best Alternatives | Context / RAM | Downloads | Likes
WizardCoder Guanaco 15B V1.1 | 0K / 31 GB | 1903 | 12
WizardCoder Guanaco 15B V1.0 | 0K / 31 GB | 1908 | 5
WizardCoder 15B V1.0 | 0K / 31 GB | 1469 | 762
MoTCoder 15B V1.0 | 0K / 30.9 GB | 8 | 2
Interplay AppCoder | 0K / 31.2 GB | 11 | 2
Codes 15B Spider | 0K / 62.1 GB | 10 | 0
Codes 15B Bird With Evidence | 0K / 62.1 GB | 12 | 0
Ziya Coding 15B V1 | 0K / 31.1 GB | 8 | 4
WizardCoderSQL 15B V1.0 | 0K / 31.2 GB | 15 | 1
Codes 15B | 0K / 62.1 GB | 21 | 9

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124