StarCoder2 7B GPTQ is an open-source code language model quantized and published by TechxGenus. Features: 7B-parameter LLM, VRAM: 4.5 GB, Context: 16K, License: bigcode-openrail-m, GPTQ-quantized, LLM Explorer Score: 0.13.
Capabilities:
Source code generation in 17 programming languages; completion of code snippets in various contexts
Limitations:
Inconsistent code quality, possible introduction of bugs, potentially unsafe or inefficient code
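Given these limitations, it is prudent to sanity-check generated code before using it. A minimal sketch (plain Python, independent of the model) that verifies a generated snippet at least parses:

```python
import ast

def parses_ok(code: str) -> bool:
    """Return True if the snippet is syntactically valid Python."""
    try:
        ast.parse(code)
        return True
    except SyntaxError:
        return False

good = "def add(a, b):\n    return a + b\n"
bad = "def add(a, b)\n    return a + b\n"  # missing colon

print(parses_ok(good))  # True
print(parses_ok(bad))   # False
```

Parsing is only a first filter; it catches syntax errors but not logic bugs or unsafe behavior, so tests and review are still needed.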
Considerations:
The model works best when the prompt provides proper surrounding code context and a consistent code style
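For example, a completion prompt can supply a function signature and docstring as context, leaving the body for the model to fill in. A hedged sketch (`build_prompt` is a hypothetical helper, not part of the model's API):

```python
def build_prompt(signature: str, docstring: str) -> str:
    """Assemble a code-completion prompt from a signature and docstring.

    The model is expected to continue this text with the function body.
    """
    return f'{signature}\n    """{docstring}"""\n'

prompt = build_prompt(
    "def fibonacci(n: int) -> int:",
    "Return the n-th Fibonacci number.",
)
print(prompt)
```

Supplying this kind of well-formed, idiomatically styled context tends to steer the completion toward matching conventions.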
Additional Notes
The model provides high accuracy on code generation tasks when given a well-formed code context. It uses a transformer architecture optimized for handling programming languages.
Supported Languages
Over 200 languages, including major programming languages like Python, JavaScript, etc.
Training Details
Data Sources:
The Stack v2, Arxiv, Wikipedia
Data Volume:
3.5+ trillion tokens
Methodology:
Fill-in-the-Middle objective
Context Length:
16384
Hardware Used:
432 H100 GPUs
Model Architecture:
Transformer decoder with grouped-query and sliding window attention
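The Fill-in-the-Middle training objective lets the model complete a gap between a known prefix and suffix rather than only continuing left-to-right. A sketch of building such a prompt, assuming the StarCoder-family sentinel tokens `<fim_prefix>`, `<fim_suffix>`, `<fim_middle>` (check the tokenizer's special tokens before relying on these exact names):

```python
def fim_prompt(prefix: str, suffix: str) -> str:
    # The model is expected to generate the missing middle
    # after the <fim_middle> sentinel.
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

prompt = fim_prompt(
    "def area(radius):\n    return ",
    "\n\nprint(area(2.0))",
)
print(prompt)
```

At inference time this string is tokenized and passed to the model, which emits the infilled span until its end-of-text token.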