Tinyllama PY CODER 4bit Lora 4K V4 by Ramikan-BR


Tags: 4bit · Autotrain compatible · Base model:quantized:ramikan-b... · Base model:ramikan-br/tinyllam... · Codegen · En · Endpoints compatible · Gguf · Llama · Lora · Pytorch · Q4 · Quantized · Region:us · Safetensors · Sft · Trl · Unsloth

Tinyllama PY CODER 4bit Lora 4k V4 Benchmarks

Scores (nn.n%) show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Tinyllama PY CODER 4bit Lora 4K V4 (Ramikan-BR/tinyllama_PY-CODER-4bit-lora_4k-v4)

Tinyllama PY CODER 4bit Lora 4K V4 Parameters and Internals

Model Type: text-generation-inference, transformers, unsloth
Additional Notes: This model was fine-tuned from Ramikan-BR/tinyllama_PY-CODER-bnb-4bit-lora_4k-q4_k_m-v3.
Training Details: Methodology: trained 2x faster with Unsloth and Hugging Face's TRL library.
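The stated recipe (Unsloth plus TRL) can be sketched roughly as follows. This is a hypothetical reconstruction of the usual Unsloth SFT workflow, not the author's published script: the prompt template, dataset file, LoRA rank, and training arguments are all assumptions.

```python
# Hypothetical sketch of the stated methodology (Unsloth + TRL SFT).
# The prompt template, dataset, and hyperparameters are assumptions,
# not the author's actual configuration.

PROMPT_TEMPLATE = (  # assumed instruction-style template; the card documents none
    "### Instruction:\n{instruction}\n\n### Response:\n{output}"
)

def to_text(example: dict) -> dict:
    """Render one instruction/output pair into a single training string."""
    return {"text": PROMPT_TEMPLATE.format(**example)}

RUN_TRAINING = False  # flip on a machine with unsloth, trl, and datasets installed

if RUN_TRAINING:
    from unsloth import FastLanguageModel
    from trl import SFTTrainer
    from transformers import TrainingArguments
    from datasets import load_dataset

    # Base checkpoint named on the card, loaded in 4-bit at the 4k context length.
    model, tokenizer = FastLanguageModel.from_pretrained(
        "Ramikan-BR/tinyllama_PY-CODER-bnb-4bit-lora_4k-q4_k_m-v3",
        max_seq_length=4096,
        load_in_4bit=True,
    )
    model = FastLanguageModel.get_peft_model(model, r=16)  # attach LoRA adapters

    dataset = load_dataset("json", data_files="python_tasks.json")["train"].map(to_text)
    SFTTrainer(
        model=model,
        tokenizer=tokenizer,
        train_dataset=dataset,
        dataset_text_field="text",
        max_seq_length=4096,
        args=TrainingArguments(output_dir="outputs", per_device_train_batch_size=2),
    ).train()
```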
LLM Name: Tinyllama PY CODER 4bit Lora 4k V4
Repository 🤗: https://huggingface.co/Ramikan-BR/tinyllama_PY-CODER-4bit-lora_4k-v4
Base Model(s): Ramikan-BR/tinyllama_PY-CODER-bnb-4bit-lora_4k-q4_k_m-v3
Model Size: 1.1b
Required VRAM: 0.7 GB
Updated: 2025-10-01
Maintainer: Ramikan-BR
Model Type: llama
Model Files: 2.2 GB, 2.2 GB, 0.7 GB
Supported Languages: en
GGUF Quantization: Yes
Quantization Type: gguf|q4|4bit
Generates Code: Yes
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.40.1
Tokenizer Class: LlamaTokenizer
Padding Token: <unk>
Vocabulary Size: 32000
LoRA Model: Yes
Torch Data Type: float16
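Given the specs above (standard LlamaForCausalLM architecture, 4096-token context), the model should load through the usual transformers API. A minimal usage sketch, assuming an instruction-style prompt template the card does not actually document, and illustrative generation settings:

```python
# Hypothetical usage sketch: generating Python code with the card's model via
# transformers. The prompt template is an assumption (the card documents none)
# and the generation settings are illustrative.

def format_prompt(task: str) -> str:
    """Wrap a task description in an assumed instruction-style template."""
    return (
        "### Instruction:\nWrite Python code for the task below.\n\n"
        f"### Input:\n{task}\n\n### Response:\n"
    )

RUN_INFERENCE = False  # requires transformers + torch and roughly 0.7 GB of VRAM

if RUN_INFERENCE:
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "Ramikan-BR/tinyllama_PY-CODER-4bit-lora_4k-v4"
    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForCausalLM.from_pretrained(repo, device_map="auto")

    inputs = tokenizer(format_prompt("reverse a linked list"), return_tensors="pt")
    inputs = inputs.to(model.device)
    out = model.generate(**inputs, max_new_tokens=256)  # well under the 4096 context
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```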

Quantized Models of the Tinyllama PY CODER 4bit Lora 4K V4

Model | Likes | Downloads | VRAM
...llama PY CODER 4bit Lora 4k V5 | 0 | 11 | 0 GB
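The 4-bit memory figures on this page can be sanity-checked with simple arithmetic. The estimate below is my own, not from the card:

```python
# Back-of-the-envelope check of the 4-bit memory figures listed on this page.
params = 1.1e9          # parameter count from the card (1.1b)
bits_per_weight = 4     # q4 quantization
weight_gb = params * bits_per_weight / 8 / 1e9
print(f"{weight_gb:.2f} GB")  # 0.55 GB of raw 4-bit weights

# The card's 0.7 GB figure is plausible on top of that: q4_k_m keeps some
# tensors at higher precision, and the file carries embeddings plus metadata.
fp16_gb = params * 16 / 8 / 1e9
print(f"{fp16_gb:.2f} GB")  # 2.20 GB, matching the fp16 model files listed
```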

Best Alternatives to Tinyllama PY CODER 4bit Lora 4K V4

Best Alternatives | Context / RAM | Downloads | Likes
Tinyllama Coder Py 4bit V10 | 4K / 0.7 GB | 2008 | 0
Tinyllama Coder Py V13 | 4K / 0.7 GB | 112 | 0
Tinyllama Coder Py 4bit V3 | 4K / 1.2 GB | 68 | 0
Tinyllama Coder Py 4bit V4 | 4K / 1.2 GB | 25 | 0
Tinyllama Coder Py V12 | 4K / 0.7 GB | 7 | 0
...llama PY CODER 4bit Lora 4k V5 | 4K / 0.7 GB | 11 | 0
Tinyllama Coder Py V21 | 4K / 0.7 GB | 142 | 0
Tinyllama Coder Py V23 | 4K / 2.2 GB | 44 | 0
Tinyllama Coder Py V11 | 4K / 0.7 GB | 149 | 0
Tinyllama Coder Py V20 | 4K / 0.7 GB | 23 | 0


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124