| Field | Value |
|---|---|
| LLM Name | Llama 2 70B Guanaco Qlora |
| Repository 🤗 | https://huggingface.co/Mikael110/llama-2-70b-guanaco-qlora |
| Model Size | 70b |
| Required VRAM | 1.7 GB |
| Updated | 2025-10-24 |
| Maintainer | Mikael110 |
| Supported Languages | en |
| Model Architecture | Adapter |
| Bias | none |
| PEFT Type | LoRA |
| LoRA Model | Yes |
| PEFT Target Modules | down_proj, k_proj, up_proj, gate_proj, o_proj, q_proj, v_proj |
| LoRA Alpha | 16 |
| LoRA Dropout | 0.05 |
| LoRA Rank (r) | 64 |
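
The adapter settings above map directly onto a PEFT `LoraConfig` (r=64, alpha=16, dropout=0.05, bias none, seven target projection modules). A minimal loading sketch, assuming the base checkpoint is `meta-llama/Llama-2-70b-hf` and 4-bit bitsandbytes quantization (both assumptions, not stated in the table):

```python
# Minimal sketch: apply this ~1.7 GB QLoRA adapter to its base model with PEFT.
# Assumptions: base checkpoint is meta-llama/Llama-2-70b-hf (not stated above),
# and the base is loaded in 4-bit via bitsandbytes to reduce VRAM.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

base_id = "meta-llama/Llama-2-70b-hf"               # assumed base model
adapter_id = "Mikael110/llama-2-70b-guanaco-qlora"  # repo from the table above

bnb = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id, quantization_config=bnb, device_map="auto"
)
model = PeftModel.from_pretrained(base, adapter_id)  # attaches the LoRA weights
```

For training-time reference, the listed hyperparameters correspond to `LoraConfig(r=64, lora_alpha=16, lora_dropout=0.05, bias="none", target_modules=[...])`.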
| Quantized Variant | Likes | Downloads | VRAM |
|---|---|---|---|
| Llama 2 70B Guanaco QLoRA GGUF | 0 | 27 | 29 GB | 
| Llama 2 70B Guanaco QLoRA GGUF | 5 | 568 | 29 GB | 
| Llama 2 70B Guanaco QLoRA AWQ | 0 | 5 | 36 GB | 
| Llama 2 70B Guanaco QLoRA GPTQ | 37 | 48 | 36 GB | 
| Llama 2 70B Guanaco QLoRA GGML | 19 | 18 | 28 GB | 
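
The GGUF/GGML variants in the table run through llama.cpp rather than transformers, which is how the 70B model fits in roughly 28–36 GB. A minimal sketch with `llama-cpp-python`, using an illustrative filename (download whichever quantization you need from the corresponding repo):

```python
# Minimal sketch: run a GGUF quantization of this model with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="llama-2-70b-guanaco.Q4_K_M.gguf",  # hypothetical local filename
    n_ctx=4096,       # Llama 2 context window
    n_gpu_layers=-1,  # offload all layers to GPU; set 0 for CPU-only
)

# Guanaco models use the "### Human / ### Assistant" prompt format.
out = llm("### Human: Explain QLoRA in one sentence.\n### Assistant:",
          max_tokens=128)
print(out["choices"][0]["text"])
```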
| Best Alternatives | Context / RAM | Downloads | Likes | 
|---|---|---|---|
| Llama 3.1 70B Abliterated Lora | 0K / 1.7 GB | 26757 | 3 | 
| ...aiga Llama3 70b Sft M1 D5 Lora | 0K / 5.9 GB | 0 | 1 | 
| Llama 3 70B Instruct Spider | 0K / 141.9 GB | 6 | 0 | 
| Airoboros 70B 3.3 Peft | 0K / 0.4 GB | 0 | 2 | 
| Llama3v1 | 0K / 0.1 GB | 6 | 0 | 
| Xwin LM 70B V0.1 LORA | 0K / 1.7 GB | 4 | 1 | 
| Euryale 1.3 L2 70B LORA | 0K / 1.7 GB | 4 | 1 | 
| Miqu 1 70B Hermes2.5 Qlora | 0K / 4.8 GB | 7 | 4 | 
| Limarp Miqu 1 70B Qlora | 0K / 1.7 GB | 2 | 4 | 
| Miqu Limarp 70B DPO Safefile | 0K / 38.4 GB | 3 | 1 | 