Field | Value
---|---
LLM Name | Ct2fast Pythia Chat Base 7B
Repository 🤗 | https://huggingface.co/michaelfeil/ct2fast-Pythia-Chat-Base-7B
Model Size | 7b
Required VRAM | 13.7 GB
Updated | 2025-08-21
Maintainer | michaelfeil
Supported Languages | en
Model Architecture | AutoModel
License | apache-2.0
Tokenizer Class | GPTNeoXTokenizer
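Since this repository holds CTranslate2-converted weights rather than standard PyTorch checkpoints, loading it differs from a plain `AutoModel` call. Below is a minimal sketch of one way to run it with the `ctranslate2` runtime plus the original tokenizer; it assumes the `ctranslate2`, `transformers`, and `huggingface_hub` packages are installed, that the ~13.7 GB of weights have room to download, and that a CUDA device is available (swap in `device="cpu"` otherwise). The `<human>:`/`<bot>:` prompt format follows the base Pythia-Chat-Base-7B convention; treat the exact generation parameters as illustrative, not as the maintainer's reference usage.

```python
# Sketch: running a CTranslate2-converted model such as ct2fast-Pythia-Chat-Base-7B.
import ctranslate2
from huggingface_hub import snapshot_download
from transformers import AutoTokenizer

repo = "michaelfeil/ct2fast-Pythia-Chat-Base-7B"

# Fetch the converted weights to a local directory (large download).
model_dir = snapshot_download(repo)

# int8_float16 keeps VRAM use close to the listed 13.7 GB requirement.
generator = ctranslate2.Generator(
    model_dir, device="cuda", compute_type="int8_float16"
)
tokenizer = AutoTokenizer.from_pretrained(repo)  # GPTNeoX tokenizer

prompt = "<human>: What is a language model?\n<bot>:"
tokens = tokenizer.convert_ids_to_tokens(tokenizer.encode(prompt))

# CTranslate2 generators consume token strings, not ids.
results = generator.generate_batch([tokens], max_length=64, sampling_topk=10)
print(tokenizer.decode(results[0].sequences_ids[0]))
```

Decoding from `results[0].sequences_ids[0]` returns the generated continuation; batching more prompts is just a matter of passing more token lists to `generate_batch`.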
Best Alternatives | Context / RAM | Downloads | Likes
---|---|---|---
TroL 7B | 32K / 17.3 GB | 8 | 7 |
MoAI 7B | 32K / 17.7 GB | 22 | 45 |
CoLLaVO 7B | 32K / 18.6 GB | 6 | 21 |
... 7b 448 Qinstruct Preview V0.1 | 2K / 17.3 GB | 3 | 4 |
Janus Pro 7B | 0K / 14.8 GB | 96349 | 3478 |
Autotrain Z7uyk Cwqtz | 0K / 0.2 GB | 7 | 0 |
Qwen 2.5 7B 1M RRP V1 Lora | 0K / 0.2 GB | 0 | 3 |
...2.5 7B Instruct Abliterated V3 | 0K / 0.2 GB | 0 | 1 |
Medical Mixtral 7B V2k | 0K / 0.4 GB | 425 | 0 |
Silicon Natsuki 7B | 0K / 14.4 GB | 1 | 1 |