Qwen2.5 7B Instruct GPTQ Int4 is an open-source language model by Qwen. Features: 7B-parameter LLM, VRAM: 5.6 GB, Context: 32K, License: Apache-2.0, Quantized (GPTQ Int4), Instruction-tuned, LLM Explorer Score: 0.26.
Qwen2.5 7B Instruct GPTQ Int4 Parameters and Internals
Model Type
Causal Language Model
Use Cases
Areas:
Research, Commercial applications
Supported Languages
Proficient: English, Chinese, French, Spanish, Portuguese, German, Italian, Russian, Japanese, Korean, Vietnamese, Thai, Arabic
Training Details
Methodology:
Pretraining & Post-training
Context Length:
131,072 tokens (full context; inputs beyond 32,768 tokens require YaRN rope scaling)
Model Architecture:
Transformers with RoPE, SwiGLU, RMSNorm, and Attention QKV bias
Input Output
Accepted Modalities:
Text
Output Format:
Text
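Since the model accepts and produces text through a chat interface, a minimal loading-and-generation sketch with Hugging Face transformers may help. The model ID below matches the official Hugging Face release; the prompt and generation settings are illustrative, and running this requires a CUDA GPU plus GPTQ kernels (e.g. auto-gptq or gptqmodel) installed alongside transformers.

```python
# Sketch: load the GPTQ Int4 checkpoint and run one chat turn.
# Assumes a CUDA GPU and GPTQ kernel support are available.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-7B-Instruct-GPTQ-Int4"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Give me a one-line summary of GPTQ."},
]
# Format the conversation with the model's chat template, then generate.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

The instruction-tuned checkpoint expects the chat template applied above; feeding raw text without it typically degrades response quality.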
Performance Tips:
Add the `rope_scaling` configuration only when processing contexts longer than 32,768 tokens; static YaRN scaling can degrade performance on shorter inputs.
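The tip above refers to the YaRN entry Qwen's documentation suggests adding to the model's `config.json` for long-context use. A sketch of that fragment, with the values published in the upstream Qwen2.5 README (verify against the release you are using):

```json
{
  "rope_scaling": {
    "factor": 4.0,
    "original_max_position_embeddings": 32768,
    "type": "yarn"
  }
}
```

With this in place the effective context extends to roughly `factor × 32768 = 131,072` tokens; remove the entry again for workloads that stay under 32K.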
Qwen2.5 7B Instruct GPTQ Int4 Capabilities
Instruction Following and Task Automation
Factuality and Completeness of Knowledge
Censorship and Alignment
Data Analysis and Insight Generation
Text Generation
Text Summarization and Feature Extraction
Code Generation
Multi-Language Support and Translation