Chinese Alpaca 33B SuperHOT 8K GPTQ is an open-source language model published by TheBloke. Features: 33B-parameter LLM, VRAM required: 17.4 GB, context length: 8K, license: other, quantized, LLM Explorer Score: 0.08.
Chinese Alpaca 33B SuperHOT 8K GPTQ Parameters and Internals
Model Type
Causal Language Model, Text Generation
Use Cases
Areas:
Research, Commercial applications
Applications:
Chinese language text generation
Primary Use Cases:
Chinese text completion, Extended context language modeling
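A minimal prompt sketch for the Chinese text-completion use case, assuming the standard Alpaca instruction template used by the Chinese-Alpaca model family; the exact template is an assumption and should be verified against the upstream model card:

```python
# Hedged sketch: builds an Alpaca-style prompt for a Chinese instruction.
# The template wording is an assumption based on the Chinese-Alpaca family,
# not confirmed by this listing.
def build_prompt(instruction: str) -> str:
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

print(build_prompt("用一句话介绍长城。"))  # "Introduce the Great Wall in one sentence."
```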
Additional Notes
This model merges Minlik's Chinese Alpaca 33B with Kaioken's SuperHOT 8K LoRA, offering experimental extended-context handling when the loader is configured appropriately (an illustrative merge sketch follows).
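For illustration, a merge of this kind is typically performed by applying the SuperHOT LoRA on top of the base weights with peft. The repo ids below are assumptions, and this is a sketch of the general process, not the author's documented pipeline:

```python
# Hedged sketch: applying a SuperHOT-style LoRA to a base model with peft.
# Both repo ids are assumptions; this is not the author's documented pipeline.
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "minlik/chinese-alpaca-33b-merged",         # assumed base-model repo id
    torch_dtype=torch.float16,
    device_map="auto",
)
merged = PeftModel.from_pretrained(
    base,
    "kaiokendev/superhot-30b-8k-no-rlhf-test",  # assumed SuperHOT LoRA repo id
).merge_and_unload()                            # fold LoRA deltas into the base weights
merged.save_pretrained("chinese-alpaca-33b-superhot-8k")
```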
Supported Languages
Chinese (High)
Training Details
Data Sources:
Chinese language corpus, Instruction datasets for finetuning
Data Volume:
Not Specified
Methodology:
Quantised to 4-bit from the base model using GPTQ-for-LLaMa (see the loading sketch after this section)
Context Length:
8192
Training Time:
Not Specified
Hardware Used:
Not Specified
Model Architecture:
GPTQ quantised LLaMA architecture
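A minimal loading sketch with AutoGPTQ, assuming the Hugging Face repo id shown in this listing; the generation settings are illustrative, and using the full 8K context additionally requires the position-embedding compression described under Performance Tips below:

```python
# Hedged sketch: loading the 4-bit GPTQ weights with AutoGPTQ.
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

model_id = "TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-GPTQ"

tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=True)
model = AutoGPTQForCausalLM.from_quantized(
    model_id,
    use_safetensors=True,  # TheBloke's GPTQ repos typically ship safetensors
    device="cuda:0",
)

inputs = tokenizer("介绍一下长城。", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```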
Input Output
Input Format:
Descriptive instruction in text format
Accepted Modalities:
text
Output Format:
Text response
Performance Tips:
Set the loader to ExLlama and set compress_pos_emb to 4 (8192 / 2048) to use the full 8K context.
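The compress_pos_emb value is simply the linear position-interpolation factor: the target context divided by LLaMA's native 2048-token context. A sketch of the arithmetic and its Hugging Face transformers equivalent (linear RoPE scaling, available since transformers 4.31; the repo id is the one from this listing):

```python
# SuperHOT extends context by compressing position embeddings linearly.
NATIVE_CONTEXT = 2048        # LLaMA's native context length
TARGET_CONTEXT = 8192        # this model's extended context
compress_pos_emb = TARGET_CONTEXT // NATIVE_CONTEXT  # = 4, the value to set in ExLlama

# Hedged equivalent in transformers: linear RoPE scaling on the model config.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-GPTQ")
config.rope_scaling = {"type": "linear", "factor": float(compress_pos_emb)}
config.max_position_embeddings = TARGET_CONTEXT
```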
Release Notes
Version:
Initial
Date:
Not Specified
Notes:
Initial release; adds extended 8K context capability and compatibility with standard GPTQ inference tooling.