Stablecode Completion Alpha 3B 4K GPTQ is an open-source language model published by TheBloke. Features: 3B parameters, VRAM: 1.8 GB, context: 4K, license: apache-2.0, quantized, code generation, LLM Explorer Score: 0.09.
Stablecode Completion Alpha 3B 4K GPTQ Parameters and Internals
Model Type
causal-lm, text-generation
Use Cases
Areas:
Research, Commercial applications
Applications:
Code completion
Primary Use Cases:
Single- or multi-line code completion from a context window of up to 4K tokens.
Limitations:
Not intended for unlawful content or activities with high risk of harm.
Considerations:
Use in conjunction with tools like Hugging Face's VSCode extension for responsible usage.
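The primary use case above can be sketched as a short completion call. A minimal sketch, assuming the standard Hugging Face Transformers API with a GPTQ backend installed; the generation settings and the truncation helper are illustrative, not part of the model card:

```python
CONTEXT_LEN = 4096  # the model's maximum context window

def truncate_to_context(token_ids, max_new_tokens, context_len=CONTEXT_LEN):
    """Keep only the most recent tokens so prompt + completion fit the window."""
    budget = context_len - max_new_tokens
    if len(token_ids) <= budget:
        return token_ids
    return token_ids[len(token_ids) - budget:]

def complete(prompt, max_new_tokens=48):
    # Heavy imports kept inside the function: loading the GPTQ checkpoint
    # requires a CUDA GPU and the auto-gptq/optimum backend installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "TheBloke/stablecode-completion-alpha-3b-4k-GPTQ"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    ids = tokenizer(prompt, return_tensors="pt").input_ids[0].tolist()
    ids = truncate_to_context(ids, max_new_tokens)
    inputs = torch.tensor([ids], device=model.device)
    out = model.generate(inputs, max_new_tokens=max_new_tokens, do_sample=False)
    return tokenizer.decode(out[0][len(ids):], skip_special_tokens=True)
```

Keeping the tail of the prompt (rather than the head) matches the completion use case, where the most recent code is the most relevant context.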
Additional Notes
TheBloke's LLM work is supported by a grant from Andreessen Horowitz (a16z). Multiple quantisation parameter sets are provided, allowing compatibility with a range of hardware.
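The 1.8 GB VRAM figure is consistent with a back-of-the-envelope estimate for 4-bit GPTQ weights. A rough sketch; the group size of 128 and fp16 scale/zero storage are common GPTQ settings assumed here for illustration, not values stated on this page:

```python
def gptq_weight_bytes(n_params, bits=4, group_size=128, scale_bytes=2):
    """Estimate the memory footprint of GPTQ-quantized weights.

    Packed low-bit integers plus one scale and one zero-point per
    quantisation group (assumed group size and fp16 storage).
    """
    packed = n_params * bits / 8          # packed integer weights
    groups = n_params / group_size        # one scale + one zero per group
    overhead = groups * scale_bytes * 2
    return packed + overhead

# ~1.5 GiB of weights for 3B parameters at 4 bits; the listed 1.8 GB
# VRAM figure also has to cover activations, KV cache, and framework overhead.
weights_gib = gptq_weight_bytes(3e9) / 2**30
```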
Supported Languages
code (programming languages that topped the Stack Overflow developer survey)
Training Details
Data Sources:
bigcode/starcoder-data
Data Volume:
300 billion tokens
Methodology:
Pre-trained at a context length of 4096 for 300 billion tokens
Context Length:
4096
Model Architecture:
Auto-regressive language model based on the transformer decoder architecture, trained under 2D parallelism with ZeRO-1 and using rotary embedding kernels.
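The rotary embeddings mentioned above can be sketched in a few lines. A minimal NumPy illustration of the general RoPE idea, assuming the half-split layout used by GPT-NeoX-style models; the actual model uses fused CUDA kernels, and this function name and signature are purely illustrative:

```python
import numpy as np

def rope(x, pos, base=10000.0):
    """Apply a rotary position embedding to vector x at integer position pos.

    Dimension pairs (i, i + d/2) are rotated by angle pos * base**(-2i/d),
    so attention dot products between rotated queries and keys depend only
    on the relative distance between their positions.
    """
    d = x.shape[-1]
    half = d // 2
    freqs = base ** (-np.arange(half) * 2.0 / d)
    theta = pos * freqs
    x1, x2 = x[..., :half], x[..., half:]
    return np.concatenate([x1 * np.cos(theta) - x2 * np.sin(theta),
                           x1 * np.sin(theta) + x2 * np.cos(theta)], axis=-1)
```

Because each pair is a pure 2D rotation, the transform preserves vector norms, and `rope(q, m) @ rope(k, n)` depends only on `m - n`, which is what makes the scheme usable across the 4096-token window.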