Guanaco 13B Merged 4bit by Monero


Merged Model · 4bit · Autotrain compatible · Dataset: josephuscheung/guanaco... · Dataset: timdettmers/guanaco-13... · Endpoints compatible · Llama · Quantized · Region: us

Guanaco 13B Merged 4bit Benchmarks

Benchmark scores (nn.n%) show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Guanaco 13B Merged 4bit (Monero/Guanaco-13b-Merged-4bit)

Guanaco 13B Merged 4bit Parameters and Internals

Model Type 
instruction-following, language model, multimodal
Use Cases 
Areas:
research, commercial applications
Applications:
instruction-following, role-playing, multimodal interactions
Primary Use Cases:
multilingual environments, extended dialogues, role-playing scenarios
Limitations:
Harmful, biased, or explicit content may be generated
Considerations:
Users should be cautious about model outputs that may not adhere to ethical norms.
Additional Notes 
Guanaco focuses on bridging visual and linguistic understanding, expanding its versatility.
Supported Languages 
English (High), Simplified Chinese (High), Traditional Chinese (Taiwan) (High), Traditional Chinese (Hong Kong) (High), Japanese (High), German (High)
Training Details 
Data Sources:
Alpaca model dataset, GuanacoDataset
Data Volume:
534,530 additional entries
Model Architecture:
LoRA on LLaMA 13B (see the merge sketch below)
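
The architecture above is a LoRA fine-tune whose adapter weights were merged back into the dense LLaMA 13B weights. The sketch below shows how such a merge is commonly done with the PEFT library; the repository ids and the exact procedure used for this checkpoint are assumptions, not details taken from this page.

```python
# Hypothetical merge sketch with PEFT. The base and adapter repo ids below are
# illustrative assumptions; they are not confirmed as the exact sources of
# Monero/Guanaco-13b-Merged-4bit.
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

BASE_REPO = "huggyllama/llama-13b"        # assumed LLaMA 13B base weights
ADAPTER_REPO = "timdettmers/guanaco-13b"  # assumed Guanaco LoRA adapter

base = AutoModelForCausalLM.from_pretrained(BASE_REPO, torch_dtype="auto")
lora_model = PeftModel.from_pretrained(base, ADAPTER_REPO)

# merge_and_unload() folds the low-rank LoRA updates into the dense weight
# matrices, yielding a plain LlamaForCausalLM that needs no adapter at runtime.
merged = lora_model.merge_and_unload()
merged.save_pretrained("guanaco-13b-merged")
AutoTokenizer.from_pretrained(BASE_REPO).save_pretrained("guanaco-13b-merged")
```
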
Input Output 
Input Format:
ChatGPT-like, for structured multi-turn dialogues (see the prompt sketch after this block)
Accepted Modalities:
text, image
Output Format:
Dependent on input format, includes VQA capabilities
Performance Tips:
It is useful to provide verifiable sources for knowledge-based answers.
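
Because the exact dialogue template is not documented here, the snippet below is only an illustrative way to assemble a ChatGPT-like multi-turn prompt; the "System"/"User"/"Assistant" role labels are assumptions.

```python
# Illustrative prompt builder for a ChatGPT-like multi-turn dialogue.
# The "System:"/"User:"/"Assistant:" labels are assumed, not documented here.
def build_prompt(turns, system="You are a helpful assistant."):
    """Flatten (role, text) turns into a single prompt string."""
    lines = [f"System: {system}"]
    for role, text in turns:
        lines.append(f"{role}: {text}")
    lines.append("Assistant:")  # cue the model to continue as the assistant
    return "\n".join(lines)

prompt = build_prompt([
    ("User", "Summarize the Guanaco training setup in one sentence."),
    ("Assistant", "It fine-tunes LLaMA 13B with LoRA on the Guanaco dataset."),
    ("User", "Which languages does it support?"),
])
print(prompt)
```
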
LLM Name: Guanaco 13B Merged 4bit
Repository: 🤗 https://huggingface.co/Monero/Guanaco-13b-Merged-4bit
Base Model(s): Guanaco 13B Merged (timdettmers/guanaco-13b-merged)
Merged Model: Yes
Model Size: 13b
Required VRAM: 7.5 GB
Updated: 2025-09-14
Maintainer: Monero
Model Type: llama
Model Files: 7.5 GB
Quantization Type: 4bit
Model Architecture: LlamaForCausalLM
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.28.0
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32000
Torch Data Type: float16
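
A minimal inference sketch against the repository listed above, assuming the 4-bit weights load through a GPTQ-aware Hugging Face Transformers stack (for example with auto-gptq/optimum installed); older GPTQ-for-LLaMa exports may instead need that project's own loader.

```python
# Minimal inference sketch; assumes the 4-bit checkpoint is readable by a
# GPTQ-aware Transformers installation, which may not hold for older exports.
from transformers import AutoModelForCausalLM, AutoTokenizer

REPO = "Monero/Guanaco-13b-Merged-4bit"

tokenizer = AutoTokenizer.from_pretrained(REPO)
model = AutoModelForCausalLM.from_pretrained(REPO, device_map="auto")

# Keep prompt plus generated tokens within the 2048-token context window.
inputs = tokenizer("User: What is LoRA?\nAssistant:", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```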

Best Alternatives to Guanaco 13B Merged 4bit

Best Alternatives                  | Context / RAM | Downloads / Likes
Llama13b 32K Illumeet Finetune     | 32K / 26 GB   | 90
...Maid V3 13B 32K 6.0bpw H6 EXL2  | 32K / 10 GB   | 51
...Maid V3 13B 32K 8.0bpw H8 EXL2  | 32K / 13.2 GB | 51
WhiteRabbitNeo 13B V1              | 16K / 26 GB   | 2627428
CodeLlama 13B Python Fp16          | 16K / 26 GB   | 263525
CodeLlama 13B Instruct Fp16        | 16K / 26 GB   | 265028
...Llama 13B Instruct Hf 4bit MLX  | 16K / 7.8 GB  | 12522
CodeLlama 13B Fp16                 | 16K / 26 GB   | 566
Airophin 13B Pntk 16K Fp16         | 16K / 26 GB   | 16574
Codellama 13B Bnb 4bit             | 16K / 7.2 GB  | 425
Note: a green score (e.g. "73.2") means that the model is better than Monero/Guanaco-13b-Merged-4bit.

Rank the Guanaco 13B Merged 4bit Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124