| LLM Name | Git 2024 |
|---|---|
| Repository 🤗 | https://huggingface.co/babylm/git-2024 |
| Required VRAM | 0.8 GB |
| Updated | 2024-10-31 |
| Maintainer | babylm |
| Model Type | git |
| Model Files | |
| Model Architecture | GitForCausalLM |
| License | mit |
| Context Length | 1024 |
| Model Max Length | 1024 |
| Transformers Version | 4.26.0 |
| Tokenizer Class | PreTrainedTokenizerFast |
| Vocabulary Size | 32778 |
| Torch Data Type | float32 |
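
The table above lists what is needed to load the checkpoint with the `transformers` library (architecture `GitForCausalLM`, a fast tokenizer, Transformers ≥ 4.26.0). Below is a minimal loading sketch, assuming the repository ships standard `config.json` and tokenizer files so the stock classes can load it directly; the prompt text is purely illustrative.

```python
# Minimal loading sketch (assumption: the repo provides standard config and
# tokenizer files compatible with transformers >= 4.26.0, per the table above).
from transformers import AutoTokenizer, GitForCausalLM

model_id = "babylm/git-2024"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # PreTrainedTokenizerFast
model = GitForCausalLM.from_pretrained(model_id)     # float32 weights, ~0.8 GB

# Text-only greedy generation; the prompt is illustrative.
inputs = tokenizer("The child picked up the", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Keep the prompt plus generated tokens within the 1024-token context window listed above.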
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Med Git | 1K / 0.7 GB | 13 | 0 |
| Untitled7 Colab Checkpoint | 1K / 1.6 GB | 29 | 0 |
| Git Base One Piece | 1K / 0.7 GB | 23 | 0 |
| Git Base Next Temp | 1K / 0.7 GB | 18 | 0 |
| Git Base Next | 1K / 0.7 GB | 18 | 1 |
| Git Base Pokemon | 1K / 0.7 GB | 27 | 0 |
| Git 20 | 1K / 0.7 GB | 17 | 1 |
| General Image Captioning | 1K / 0.7 GB | 35 | 0 |
| CLIP Git GPT E6 Small | 1K / 0.6 GB | 17 | 0 |
| Git Base E6 | 1K / 0.7 GB | 14 | 0 |