| LLM Name | MM Alpaca 3B Lora |
|---|---|
| Repository 🤗 | https://huggingface.co/PahaII/MM-Alpaca-3B-lora |
| Model Size | 3B |
| Required VRAM | 0.2 GB | 
| Updated | 2025-09-23 | 
| Maintainer | PahaII | 
| Model Files | |
| Model Architecture | AutoModelForCausalLM | 
| Is Biased | none | 
| PEFT Type | LORA | 
| LoRA Model | Yes | 
| PEFT Target Modules | k_proj, gate_proj, v_proj, up_proj, o_proj, down_proj, q_proj |
| LoRA Alpha | 16 | 
| LoRA Dropout | 0.05 | 
| R Param | 64 | 
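
Given the PEFT fields above (LoRA with r=64, alpha=16, dropout=0.05, bias none, seven projection target modules), a minimal loading sketch using the Hugging Face `transformers` and `peft` libraries might look like the following. The base-model ID is an assumption (this card does not state which 3B model the adapter was trained against), so treat everything except the adapter repo ID as illustrative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, PeftModel

ADAPTER_ID = "PahaII/MM-Alpaca-3B-lora"       # adapter repo listed above (~0.2 GB of weights)
BASE_MODEL = "openlm-research/open_llama_3b"  # ASSUMPTION: substitute the actual 3B base model

# Reconstruction of the adapter config from the fields on this card, for reference only;
# PeftModel.from_pretrained reads the saved config from the adapter repo automatically.
lora_config = LoraConfig(
    r=64,
    lora_alpha=16,
    lora_dropout=0.05,
    bias="none",
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
    task_type="CAUSAL_LM",
)

# Load the base model, then attach the LoRA adapter on top of it.
tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
base = AutoModelForCausalLM.from_pretrained(BASE_MODEL, torch_dtype=torch.float16)
model = PeftModel.from_pretrained(base, ADAPTER_ID)

# Quick generation check.
inputs = tokenizer("Below is an instruction. Respond helpfully:\n", return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Because the adapter stores only the low-rank update matrices for the listed projection modules, the download is roughly 0.2 GB; the full base-model weights are still needed at inference time.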
| Best Alternatives | Context / RAM | Downloads | Likes | 
|---|---|---|---|
| Granite 3B Mup | 4K / 14 GB | 374 | 0 | 
| Llama 3.2 3B Mathdaily Chatbot | 0K / 6.5 GB | 5 | 0 | 
| Mamba GPT 3B V2 | 0K / 6.8 GB | 1749 | 16 | 
| Qwen2.5 3b Lora Model | 0K / 0.1 GB | 6 | 0 | 
| SQL Llama3.2 3b Lora Model | 0K / 0.1 GB | 6 | 0 | 
| ...ma 3.2 3B It Ecommerce ChatBot | 0K / 6.5 GB | 208 | 8 | 
Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference!