| Property | Value |
|---|---|
| LLM Name | InternLM2 Math Plus 7B 4bit |
| Repository 🤗 | https://huggingface.co/lee-ite/InternLM2-Math-Plus-7B-4bit |
| Base Model(s) | |
| Model Size | 7b |
| Required VRAM | 5.2 GB |
| Updated | 2024-07-04 |
| Maintainer | lee-ite |
| Model Type | internlm2 |
| Model Files | |
| Supported Languages | en, zh |
| Quantization Type | 4bit |
| Model Architecture | InternLM2ForCausalLM |
| License | apache-2.0 |
| Context Length | 8192 |
| Model Max Length | 8192 |
| Transformers Version | 4.41.1 |
| Is Biased | 0 |
| Tokenizer Class | InternLM2Tokenizer |
| Padding Token | </s> |
| Vocabulary Size | 92544 |
| Torch Data Type | float16 |
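A minimal loading sketch based on the specs above (InternLM2ForCausalLM architecture, float16 weights, Transformers 4.41.1). The model card does not state which 4-bit quantization backend the repository uses, so this assumes the weights load directly through `transformers`; an extra quantization library (e.g. bitsandbytes) may be needed in practice.

```python
# Hedged example: load the 4-bit InternLM2 Math Plus checkpoint with transformers.
# Assumes transformers >= 4.41.1 and that the repo's custom InternLM2 code is used.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "lee-ite/InternLM2-Math-Plus-7B-4bit"

# InternLM2 ships custom modeling/tokenizer code, so trust_remote_code is required.
tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # matches the card's Torch Data Type
    device_map="auto",          # requires accelerate; places layers on available GPU(s)
    trust_remote_code=True,
)

prompt = "Solve for x: 2x + 3 = 11."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

With roughly 5.2 GB of required VRAM, the quantized checkpoint should fit on a single consumer GPU; prompts are limited by the 8192-token context length listed above.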
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| InternLM2.5 7B Chat 8bit | 32K / 8.2 GB | 70 | 3 |
| InternLM2.5 7B Chat 4bit | 32K / 5.2 GB | 41 | 5 |
| InternLM2.5 7B Chat 4bit | 32K / 4.3 GB | 8 | 2 |
| InternLM2.5 7B | 256K / 15.5 GB | 4762 | 17 |
| InternLM2.5 7B Chat 1M | 256K / 15.4 GB | 1015 | 72 |
| InternLM2.5 7B Chat | 32K / 15.4 GB | 33450 | 197 |
| InternLM2 7B | 32K / 15.5 GB | 25545 | 43 |
| MD-Judge v0.2 InternLM2 7B | 32K / 15.4 GB | 4245 | 17 |
| ChemLLM 7B Chat 1.5 SFT | 32K / 15.4 GB | 1219 | 4 |
| InternLM2 Chat 7B | 32K / 15.4 GB | 44525 | 83 |