Chinese Alpaca Pro 33B Merged is an open-source large language model published by minlik. Key figures: 33B parameters, 65.7 GB of required VRAM, 2K-token context window, LLM Explorer Score 0.08.
| Property | Value |
|---|---|
| LLM Name | Chinese Alpaca Pro 33B Merged |
| Repository 🤗 | https://huggingface.co/minlik/chinese-alpaca-pro-33b-merged |
| Model Size | 33b |
| Required VRAM | 65.7 GB |
| Updated | 2026-04-11 |
| Maintainer | minlik |
| Model Type | llama |
| Model Architecture | LlamaForCausalLM |
| Context Length | 2048 |
| Model Max Length | 2048 |
| Transformers Version | 4.29.0.dev0 |
| Tokenizer Class | LlamaTokenizer |
| Beginning of Sentence Token | `<s>` |
| End of Sentence Token | `</s>` |
| Unk Token | `<unk>` |
| Vocabulary Size | 49954 |
| Torch Data Type | float16 |
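For reference, below is a minimal loading sketch based on the fields above (LlamaForCausalLM architecture, LlamaTokenizer, float16 weights), using the standard Hugging Face transformers API; the model ID is taken from the repository URL. `device_map="auto"` assumes the `accelerate` package is installed, and the example prompt is illustrative only. Note that the listed VRAM figure is consistent with the weights alone: 33B parameters × 2 bytes (float16) ≈ 66 GB.

```python
# Minimal loading sketch; assumes transformers + accelerate are installed
# and that the hardware can hold ~66 GB of fp16 weights.
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer

model_id = "minlik/chinese-alpaca-pro-33b-merged"  # Repository field above

tokenizer = LlamaTokenizer.from_pretrained(model_id)
model = LlamaForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the Torch Data Type listed above
    device_map="auto",          # shard across available GPUs (needs accelerate)
)

prompt = "请介绍一下中国的四大发明。"  # hypothetical example prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
# Keep prompt + new tokens within the 2048-token context limit.
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```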
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| ...angled Llama 33M 32K Base V0.1 | 32K / 0.1 GB | 22 | 1 |
| ReflectionCoder DS 33B | 16K / 67 GB | 9742 | 4 |
| Deepseek Wizard 33B Slerp | 16K / 35.3 GB | 10 | 0 |
| ValidateAI 33B Slerp | 16K / 35.4 GB | 6 | 0 |
| WhiteRabbitNeo 33B V1 | 16K / 67 GB | 1189 | 90 |
| ValidateAI 3 33B Ties | 16K / 66.5 GB | 8 | 0 |
| ValidateAI 2 33B AT | 16K / 66.5 GB | 5 | 0 |
| Chronos Divergence 33B | 16K / 65 GB | 19 | 30 |
| ...dy Deepseekcoder 33B V16.1 32K | 16K / 67.1 GB | 768 | 0 |
| Deepseek Coder 33B Instruct | 16K / 66.5 GB | 4356 | 566 |