Llama Halcyon 1B Token Checkpoint 10240 is an open-source language model published by halcyon-llm. Key features: a 1B-parameter LLM with a 128K context window and an LLM Explorer Score of 0.18.
| Attribute | Value |
|---|---|
| LLM Name | Llama Halcyon 1B Token Checkpoint 10240 |
| Repository 🤗 | https://huggingface.co/halcyon-llm/Llama-halcyon-1B-token-checkpoint-10240 |
| Model Size | 1B |
| Required VRAM | 0 GB |
| Updated | 2026-01-06 |
| Maintainer | halcyon-llm |
| Model Type | llama |
| Model Files | |
| Model Architecture | LlamaForCausalLM |
| Context Length | 131072 |
| Model Max Length | 131072 |
| Transformers Version | 4.49.0 |
| Tokenizer Class | PreTrainedTokenizer |
| Padding Token | [PAD] |
| Vocabulary Size | 128257 |
| Torch Data Type | bfloat16 |
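The table above contains what is needed to load the checkpoint with the Hugging Face `transformers` library. Below is a minimal loading sketch assuming the standard `AutoModelForCausalLM` path for a `LlamaForCausalLM` checkpoint; the repository ID and dtype come from the table, while the prompt and generation settings are illustrative placeholders rather than maintainer-recommended values.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repository ID taken from the table above.
repo_id = "halcyon-llm/Llama-halcyon-1B-token-checkpoint-10240"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the listed Torch Data Type
)

# Illustrative prompt; this is a base checkpoint, so expect plain continuation.
inputs = tokenizer("The quick brown fox", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Although the configured context length is 131,072 tokens, memory use grows with sequence length, so long prompts will require proportionally more VRAM than a short generation like the one above.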
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| ISA 02 Nano Llama 3.2 1B | 1024K / 2.5 GB | 2107 | 4 |
| LWM Text Chat 1M | 1024K / 13.5 GB | 737 | 174 |
| LWM Text 1M | 1024K / 13.5 GB | 790 | 29 |
| JOSIE 1M Base | 1024K / 13.5 GB | 12 | 1 |
| JOSIE 1M Base | 1024K / 13.5 GB | 6 | 1 |
| Llama 3.2 1B Instruct | 128K / 2.5 GB | 5464491 | 1116 |
| Llama 3.2 1B | 128K / 2.5 GB | 3306012 | 2082 |
| Shield Llama 3.2 1B Full FT CE | 128K / 2.5 GB | 683 | 0 |
| PG67A W Serum.Test 3.2 1B | 128K / 3 GB | 647 | 0 |
| Llama 3.2 1B Instruct Bf16 | 128K / 2.5 GB | 924 | 5 |