Manticore 13B Chat Pyg AWQ is an open-source language model quantized by TheBloke. Features: 13B LLM, VRAM: 7.2 GB, Context: 2K, License: other, Quantized, Instruction-Based, LLM Explorer Score: 0.1.
Manticore 13B Chat Pyg AWQ Parameters and Internals
Model Type
llama
Use Cases
Applications:
Chatbot applications
Limitations:
Manticore has not been aligned to human preferences with techniques like RLHF and can produce problematic outputs.
Considerations:
Refer to the limitations of the base Llama 13B model.
Additional Notes
AWQ is a 4-bit quantization method that enables efficient Transformers-based inference. The model has been quantized for use on smaller GPUs.
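To illustrate what 4-bit weight quantization buys in memory, here is a minimal sketch of the general idea (group-wise quantization with a per-group scale and zero point). This is a toy illustration, not AWQ's actual activation-aware algorithm or TheBloke's quantization pipeline; the function names are hypothetical.

```python
# Toy sketch of group-wise 4-bit quantization: each group of float weights
# is mapped to integers in [0, 15] with one shared scale and zero point.
# NOT the real AWQ algorithm (which also weighs activations); illustration only.

def quantize_group(weights, bits=4):
    """Quantize a list of float weights to unsigned ints plus (scale, zero)."""
    qmax = (1 << bits) - 1              # 15 for 4-bit
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / qmax or 1.0     # avoid zero scale for constant groups
    q = [round((w - lo) / scale) for w in weights]
    return q, scale, lo

def dequantize_group(q, scale, zero):
    """Reconstruct approximate float weights from the quantized group."""
    return [v * scale + zero for v in q]

weights = [0.12, -0.40, 0.95, 0.03, -0.88, 0.47, 0.00, -0.15]
q, scale, zero = quantize_group(weights)
approx = dequantize_group(q, scale, zero)
# Rounding error is bounded by half a quantization step (scale / 2).
max_err = max(abs(a - b) for a, b in zip(weights, approx))
```

Each weight now needs 4 bits instead of 16 or 32, which is roughly why a 13B model fits in about 7.2 GB of VRAM here.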
Manticore 13B Chat is a Llama 13B model fine-tuned on new datasets, including a de-duped subset of the Pygmalion dataset.
Training Time:
8 hours
Hardware Used:
8xA100 80GB
Model Architecture:
Llama
Input / Output
Input Format:
A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions. USER: {prompt} ASSISTANT:
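The template above is the Vicuna-style format, so prompts must embed the user message between `USER:` and a trailing `ASSISTANT:`. A small sketch of filling it in (the `build_prompt` helper name is hypothetical, not part of the model's tooling):

```python
# Fill the Vicuna-style prompt template used by Manticore 13B Chat Pyg.
SYSTEM = ("A chat between a curious user and an artificial intelligence "
          "assistant. The assistant gives helpful, detailed, and polite "
          "answers to the user's questions.")

def build_prompt(user_message: str) -> str:
    """Return the full prompt string; generation continues after 'ASSISTANT:'."""
    return f"{SYSTEM} USER: {user_message} ASSISTANT:"

prompt = build_prompt("What is AWQ quantization?")
```

The resulting string is what you pass to the model; its completion after `ASSISTANT:` is the reply.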