| Property | Value |
|---|---|
| LLM Name | Miqu 1 70B Sf |
| Repository 🤗 | https://huggingface.co/152334H/miqu-1-70b-sf |
| Model Size | 70b |
| Required VRAM | 138.7 GB |
| Updated | 2025-09-23 |
| Maintainer | 152334H |
| Model Type | llama |
| Model Files | |
| Supported Languages | en |
| Model Architecture | LlamaForCausalLM |
| Context Length | 32764 |
| Model Max Length | 32764 |
| Transformers Version | 4.36.0 |
| Tokenizer Class | LlamaTokenizer |
| Padding Token | <unk> |
| Vocabulary Size | 32000 |
| Torch Data Type | float16 |
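
The entries above map directly onto a standard Hugging Face `transformers` loading call. The snippet below is a minimal sketch, not an official recipe: the Mistral-style `[INST]` prompt format is an assumption, and loading the full float16 weights requires roughly the 138.7 GB of VRAM listed above (`device_map="auto"` shards them across available GPUs).

```python
# Minimal loading sketch for 152334H/miqu-1-70b-sf, based on the card above
# (LlamaForCausalLM, float16, 32764-token context, transformers >= 4.36.0).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "152334H/miqu-1-70b-sf"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # LlamaTokenizer per the card
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the card's Torch Data Type
    device_map="auto",          # shard across available GPUs
)

# Mistral-style instruction format is an assumption, not stated on the card.
prompt = "[INST] Explain mixture-of-experts in one paragraph. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```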
| Model | Likes | Downloads | VRAM |
|---|---|---|---|
| ...erated Miqu 70B 4.0bpw H6 EXL2 | 1 | 6 | 35 GB |
| ...erated Miqu 70B 3.5bpw H6 EXL2 | 1 | 6 | 30 GB |
| Miqu 1 103B 5.0bpw H6 EXL2 | 2 | 10 | 65 GB |
| Miqu 1 103B 3.5bpw H6 EXL2 | 3 | 7 | 45 GB |
| Miqu 1 103B 2.4bpw H6 EXL2 | 1 | 7 | 31 GB |
| Miqu 1 103B 3.0bpw H6 EXL2 | 1 | 6 | 39 GB |
| Miqu 1 70B Sf GPTQ | 10 | 14 | 36 GB |
| ...ram Miqu 1 120B 2.4bpw H6 EXL2 | 2 | 6 | 36 GB |
| ...am Miqu 1 120B 2.65bpw H6 EXL2 | 2 | 5 | 40 GB |
| ...ram Miqu 1 120B 4.0bpw H6 EXL2 | 1 | 8 | 60 GB |
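
The VRAM column above tracks a simple rule of thumb for quantized weights: memory ≈ parameter count × bits-per-weight / 8 bytes, plus overhead for the KV cache and activations. The quick sanity check below uses parameter counts read off the model names, which are approximations; the estimates land within a couple of GB of the listed figures.

```python
# Rough weight-memory estimate for EXL2 quantizations: params * bpw / 8 bytes.
# Parameter counts are approximations inferred from the model names above;
# real usage adds KV-cache and activation overhead on top of these figures.
def weight_gb(params_billion: float, bpw: float) -> float:
    return params_billion * 1e9 * bpw / 8 / 1e9  # bytes -> GB

for name, params, bpw in [
    ("Miqu 70B 4.0bpw",    70,  4.0),  # table lists 35 GB
    ("Miqu 70B 3.5bpw",    70,  3.5),  # table lists 30 GB
    ("Miqu 1 103B 5.0bpw", 103, 5.0),  # table lists 65 GB
    ("Miqu 1 120B 4.0bpw", 120, 4.0),  # table lists 60 GB
]:
    print(f"{name}: ~{weight_gb(params, bpw):.0f} GB")
```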
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| ... Chat 1048K Chinese Llama3 70B | 1024K / 141.9 GB | 9826 | 5 |
| ... Chat 1048K Chinese Llama3 70B | 1024K / 141.9 GB | 9069 | 5 |
| ... 3 70B Instruct Gradient 1048K | 1024K / 141.9 GB | 13 | 122 |
| Llama3 Function Calling 1048K | 1024K / 141.9 GB | 6 | 1 |
| ...a 3 70B Instruct Gradient 524K | 512K / 141.9 GB | 10 | 23 |
| ...a 3 70B Instruct Gradient 262K | 256K / 141.9 GB | 114 | 56 |
| ...ama 3 70B Arimas Story RP V2.0 | 256K / 141.1 GB | 26 | 3 |
| ...ama 3 70B Arimas Story RP V1.6 | 256K / 141.2 GB | 13 | 0 |
| ...ama 3 70B Arimas Story RP V1.5 | 256K / 141.2 GB | 46 | 3 |
| Yi 70B 200K RPMerge Franken | 195K / 142.4 GB | 6 | 1 |
Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference!