Fine Tuned Codegen 16B Verilog by shailja


Tags: arXiv:2212.11140 · autotrain-compatible · code · codegen · dataset: shailja/Verilog_GitHub · endpoints-compatible · model-index · PyTorch · region: us


Fine Tuned Codegen 16B Verilog Parameters and Internals

Model Type: text-generation

Use Cases:
- Primary Use Cases: teaching assistant for Verilog HDL; generating Verilog code snippets
- Limitations: not an instruction-tuned model; generated code is not guaranteed to work as intended (see the compile-check sketch after this list)
- Considerations: the pretraining dataset was not filtered for permissive licenses
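Because generated code is not guaranteed to work as intended, it is worth sanity-checking each completion before use. Below is a minimal sketch that compiles a generated snippet with Icarus Verilog (iverilog, which must be installed separately); the helper name and the sample module are illustrative, not part of the model's tooling.

```python
# Hypothetical sketch: sanity-check generated Verilog by compiling it with
# Icarus Verilog. Requires iverilog to be installed on the system.
import os
import subprocess
import tempfile

def compiles_ok(verilog_source: str) -> bool:
    """Return True if iverilog can parse/elaborate the source (no output emitted)."""
    with tempfile.NamedTemporaryFile("w", suffix=".v", delete=False) as f:
        f.write(verilog_source)
        path = f.name
    try:
        # The "null" target checks the design without generating an output file.
        result = subprocess.run(["iverilog", "-t", "null", path],
                                capture_output=True, text=True)
        return result.returncode == 0
    finally:
        os.remove(path)

print(compiles_ok(
    "module mux(input a, b, sel, output y); assign y = sel ? b : a; endmodule"
))
```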
Additional Notes: the model generates Verilog code snippets; generated content may require attribution due to the licenses of the original datasets.

Supported Languages: Verilog (high proficiency)
Training Details:
- Data Sources: shailja/Verilog_GitHub
- Data Volume: ~72B
- Methodology: fine-tuning
- Training Time: 15 days
- Hardware Used: 4 Tesla A100 GPUs
- Model Architecture: GPT-2-style model with multi-query attention (see the sketch below)
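For context, multi-query attention shares a single key/value head across all query heads, which shrinks the KV cache during generation. The PyTorch sketch below illustrates the idea only; the dimensions, weight matrices, and function name are assumptions for illustration, not the model's actual implementation.

```python
# Illustrative multi-query attention (MQA): one shared K/V head, many Q heads.
import torch
import torch.nn.functional as F

def multi_query_attention(x, wq, wk, wv, n_heads):
    B, T, D = x.shape
    head_dim = D // n_heads
    q = (x @ wq).view(B, T, n_heads, head_dim).transpose(1, 2)  # (B, H, T, d)
    k = (x @ wk).view(B, T, 1, head_dim).transpose(1, 2)        # (B, 1, T, d), shared
    v = (x @ wv).view(B, T, 1, head_dim).transpose(1, 2)        # (B, 1, T, d), shared
    scores = q @ k.transpose(-2, -1) / head_dim ** 0.5          # broadcasts over heads
    out = F.softmax(scores, dim=-1) @ v                         # (B, H, T, d)
    return out.transpose(1, 2).reshape(B, T, D)

# Toy dimensions: D=256, 8 query heads sharing one 32-dim K/V head.
x = torch.randn(2, 16, 256)
wq = torch.randn(256, 256)
wk = torch.randn(256, 32)
wv = torch.randn(256, 32)
print(multi_query_attention(x, wq, wk, wv, n_heads=8).shape)  # torch.Size([2, 16, 256])
```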
Input Output:
- Input Format: Verilog code prompt
- Accepted Modalities: text
- Output Format: generated Verilog code
- Performance Tips: starting the prompt with a partial module header, e.g. "module mux", improves the output (see the usage sketch below)
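A minimal usage sketch, assuming the standard Hugging Face transformers text-generation API; the prompt text and sampling parameters are illustrative, and device_map="auto" additionally requires the accelerate package.

```python
# Minimal sketch: prompt the model with a partial module header.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "shailja/fine-tuned-codegen-16B-Verilog"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"  # fp16 needs ~32.2 GB VRAM
)

# Per the performance tip above, a partial module header steers the completion.
prompt = "//Design a 2-to-1 multiplexer\nmodule mux"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.2)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```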
LLM Name: Fine Tuned Codegen 16B Verilog
Repository: 🤗 https://huggingface.co/shailja/fine-tuned-codegen-16B-Verilog
Model Size: 16B
Required VRAM: 32.2 GB
Updated: 2025-06-09
Maintainer: shailja
Model Type: codegen
Model Files: 32.2 GB
Generates Code: Yes
Model Architecture: CodeGenForCausalLM
License: bigcode-openrail-m
Transformers Version: 4.22.0.dev0
Tokenizer Class: GPT2Tokenizer
Vocabulary Size: 50295
Torch Data Type: float16
Activation Function: gelu_new

Best Alternatives to Fine Tuned Codegen 16B Verilog

Best Alternatives               Context / RAM    Downloads   Likes
Instruct Codegen 16B            0K / 32.2 GB     312         1
Codegen 16B Mono Toolbench      0K / 128.4 GB    37          5
Codegen2 16B P                  0K / 64.3 GB     149         45
Codegen 16B Multi 6 Parts       0K / 32.2 GB     25          0
Codegen 16B Nl Sharded          0K / 32.1 GB     25          7
Codegen 16B Nl                  0K / 32.2 GB     193         18
Codegen 16B Multi               0K / 32.2 GB     371         120
Codegen 16B Mono                0K / 32.2 GB     295         126

