LLM Name | TinyLLama V0 |
Repository 🤗 | https://huggingface.co/Maykeye/TinyLLama-v0
Model Size | 4.6M parameters
Required VRAM | 0 GB |
Updated | 2025-09-08 |
Maintainer | Maykeye |
Model Type | llama |
Model Architecture | LlamaForCausalLM |
License | apache-2.0 |
Context Length | 2048 |
Model Max Length | 2048 |
Transformers Version | 4.30.2 |
Tokenizer Class | LlamaTokenizer |
Beginning of Sentence Token | <s> |
End of Sentence Token | </s> |
Unk Token | <unk> |
Vocabulary Size | 32000 |
Torch Data Type | bfloat16 |
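
Given the details above (LlamaForCausalLM architecture, LlamaTokenizer with a 32,000-token vocabulary, bfloat16 weights, 2048-token context), a minimal sketch of loading the model with Hugging Face Transformers might look like the following. The prompt and generation settings are illustrative assumptions, not part of the model card; `transformers`, `torch`, and `sentencepiece` are assumed installed.

```python
# Minimal sketch: load TinyLLama-v0 from the Hugging Face Hub.
# Generation parameters below are illustrative, not from the model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "Maykeye/TinyLLama-v0"

# LlamaTokenizer with <s>/</s>/<unk> special tokens and a 32,000-token vocab
tokenizer = AutoTokenizer.from_pretrained(repo)

# LlamaForCausalLM weights stored in bfloat16; at ~4.6M parameters
# the model fits comfortably on CPU
model = AutoModelForCausalLM.from_pretrained(repo, torch_dtype=torch.bfloat16)

prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt")

# Context length / model max length is 2048 tokens
output = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```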
Best Alternatives | Context / RAM | Downloads | Likes
---|---|---|---
SimpleLlamaSentences | 2K / 0 GB | 5 | 0
UniversalNER TinyLLama | 2K / 0 GB | 10 | 1
Q4 Llama 3 | 8K / 6.1 GB | 12 | 0
BNB 4Bit Llama3 Finetune | 8K / 6.1 GB | 8 | 0
Llama3 Finetune 4bit | 8K / 6.1 GB | 0 | 0