Azma Deepseek Coder 1.3B Instruct Structured Output Peft Merge is an open-source language model by AswanthCManoj. Key features: 1.3B-parameter LLM, 2.7 GB required VRAM, 16K context length, instruction-based, code-generating. LLM Explorer Score: 0.11.
| Property | Value |
|---|---|
| LLM Name | Azma Deepseek Coder 1.3B Instruct Structured Output Peft Merge |
| Repository 🤗 | https://huggingface.co/AswanthCManoj/azma-deepseek-coder-1.3b-instruct-structured-output-peft-merge |
| Model Size | 1.3B |
| Required VRAM | 2.7 GB |
| Updated | 2025-09-23 |
| Maintainer | AswanthCManoj |
| Model Type | llama |
| Instruction-Based | Yes |
| Model Files | |
| Generates Code | Yes |
| Model Architecture | LlamaForCausalLM |
| Context Length | 16384 |
| Model Max Length | 16384 |
| Transformers Version | 4.37.0.dev0 |
| Tokenizer Class | LlamaTokenizer |
| Padding Token | stic |
| Vocabulary Size | 32256 |
| Torch Data Type | float16 |
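Because the checkpoint is a standard LlamaForCausalLM model on the Hugging Face Hub, it can be loaded with the usual transformers API. Below is a minimal sketch; the prompt and generation settings are illustrative assumptions, not taken from the model card, and `device_map="auto"` additionally requires the `accelerate` package.

```python
# Minimal loading/generation sketch for this checkpoint.
# Prompt text and generation parameters are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "AswanthCManoj/azma-deepseek-coder-1.3b-instruct-structured-output-peft-merge"

tokenizer = AutoTokenizer.from_pretrained(repo)  # LlamaTokenizer, vocab size 32256
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,  # matches the card's torch dtype (~2.7 GB VRAM)
    device_map="auto",          # requires `accelerate`; or call .to("cuda") instead
)

# Instruction-tuned models usually expect their chat template; if the tokenizer
# ships one, prefer tokenizer.apply_chat_template(...) over a raw prompt.
prompt = "Write a Python function that parses a JSON string and returns a dict."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```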
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Deepseek Coder 1.3B Instruct | 16K / 2.7 GB | 59571 | 160 |
| Speechless Coder Ds 1.3B | 16K / 2.7 GB | 926 | 0 |
| ...c Deepseek Coder 1.3B Instruct | 16K / 5.4 GB | 5 | 0 |
| Hpc Coder V2.1.3B | 16K / 2.7 GB | 43 | 5 |
| ... 1.3B Instruct Trt Int4 G64 Hf | 16K / 0.9 GB | 5 | 0 |
| Datascience Coder 1.3B | 16K / 2.7 GB | 16 | 1 |
| ...pseek Coder 1.3B Instruct GPTQ | 16K / 0.9 GB | 123 | 7 |
| ...epseek Coder 1.3B Instruct AWQ | 8K / 0.9 GB | 138 | 5 |
| ...Coder 1.3B Function Calling V1 | 16K / 2.7 GB | 946 | 2 |