Qwen1.5 110B Chat GPTQ Int4 is an open-source language model by study-hjt. Features: 110B-parameter LLM, VRAM: 61.5 GB, Context: 32K tokens, License: other, Quantized (GPTQ, 4-bit), LLM Explorer Score: 0.13.
Qwen1.5 110B Chat GPTQ Int4 Parameters and Internals
Model Type
text-generation
Additional Notes
Qwen1.5 is the beta version of Qwen2, with improvements over previous Qwen models in performance and multilingual support.
Supported Languages
en (Full support)
Training Details
Methodology:
Pretrained on a large dataset; post-trained with supervised fine-tuning and direct preference optimization (DPO).
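The post-training recipe above can be illustrated with the standard DPO objective. This is a minimal sketch in plain Python; the log-probabilities and the beta value are illustrative placeholders, not Qwen's actual training values:

```python
import math

def dpo_loss(pi_chosen, pi_rejected, ref_chosen, ref_rejected, beta=0.1):
    """Direct Preference Optimization loss for one preference pair.

    Inputs are summed token log-probabilities of the chosen and rejected
    responses under the policy (pi_*) and the frozen reference model (ref_*).
    beta controls how strongly the policy is pushed away from the reference.
    """
    margin = (pi_chosen - ref_chosen) - (pi_rejected - ref_rejected)
    # -log sigmoid(beta * margin): small when the policy prefers the chosen answer
    return -math.log(1.0 / (1.0 + math.exp(-beta * margin)))
```

With equal log-probabilities everywhere the margin is zero and the loss is `log 2`; the loss shrinks as the policy assigns relatively more probability to the chosen response than the reference does.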
Model Architecture:
Transformer-based decoder architecture with SwiGLU activation, attention QKV bias, grouped-query attention, and a mixture of sliding-window and full attention. An improved tokenizer is adaptive to multiple natural languages and code.
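The grouped-query attention mentioned above can be sketched in plain Python. This is a toy illustration of the head-sharing idea (no projections, masking, or batching), not Qwen's implementation:

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def grouped_query_attention(queries, keys, values, n_kv_heads):
    """Toy grouped-query attention for a single query position.

    queries: one vector per query head; keys/values: one list of per-position
    vectors per KV head. Each group of len(queries) // n_kv_heads query heads
    shares one KV head, shrinking the KV cache versus full multi-head attention.
    """
    group = len(queries) // n_kv_heads
    d = len(queries[0])
    outs = []
    for h, q in enumerate(queries):
        kv = h // group  # which shared KV head this query head reads from
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys[kv]]
        w = softmax(scores)
        outs.append([sum(wi * v[j] for wi, v in zip(w, values[kv]))
                     for j in range(d)])
    return outs
```

With 4 query heads and 2 KV heads, heads 0-1 attend over one shared key/value set and heads 2-3 over the other, which is the memory saving GQA provides at 110B scale.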
Input / Output
Performance Tips:
Use the hyper-parameters provided in generation_config.json to avoid code switching and other degenerate outputs.
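Such generation defaults typically include sampling controls like a repetition penalty. As a minimal sketch of why those values matter, here is the standard repetition-penalty transform in plain Python; the function name and the 1.05 default are illustrative, not read from the model's actual generation_config.json:

```python
def apply_repetition_penalty(logits, generated_ids, penalty=1.05):
    """Down-weight tokens that were already generated.

    Standard repetition-penalty rule: positive logits are divided by the
    penalty and negative logits multiplied by it, so repeated tokens become
    less likely at the next sampling step.
    """
    out = list(logits)
    for tid in set(generated_ids):
        out[tid] = out[tid] / penalty if out[tid] > 0 else out[tid] * penalty
    return out
```

Leaving such parameters at ad-hoc values (or disabling them) is one way the "bad cases" the tip warns about, such as loops or unwanted language switching, can appear.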