OLMo 1B Hf by allenai


OLMo 1B Hf is an open-source language model by allenai. Features: 1B parameters, VRAM: 4.7 GB, Context: 2K, License: apache-2.0, HF Score: 36.7, LLM Explorer Score: 0.25, ARC: 34.6, HellaSwag: 63.6, MMLU: 26.3, TruthfulQA: 32.9, WinoGrande: 61.1, GSM8K: 1.9.

  Arxiv:2302.13971   Arxiv:2402.00838   Dataset:allenai/dolma   En   Endpoints compatible   Olmo   Region:us   Safetensors
Model Card on HF 🤗: https://huggingface.co/allenai/OLMo-1B-hf

OLMo 1B Hf Benchmarks

OLMo 1B Hf (allenai/OLMo-1B-hf)

OLMo 1B Hf Parameters and Internals

Model Type: Autoregressive Language Model (Transformer)

Use Cases
Areas: Research, Commercial Applications
Applications: Language Modeling
Primary Use Cases: Text Generation
Limitations: Prone to generating false or harmful content
Considerations: Weigh application risks due to potential bias

Additional Notes: The model supports quantization for faster inference.

Supported Languages: en (English)

Training Details
Data Sources: Dolma
Data Volume: 3 trillion tokens
Context Length: 2048
Model Architecture: Transformer-style

Input/Output
Accepted Modalities: text
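Given the listing above (2,048-token context, text in/text out, architecture supported by recent transformers releases), the model loads through the standard transformers auto-classes. A minimal sketch, assuming a transformers >= 4.40 environment; the prompt and generation settings are illustrative, and `budget_new_tokens` is a hypothetical helper, not part of any library:

```python
def budget_new_tokens(prompt_len: int, requested: int, context: int = 2048) -> int:
    """Hypothetical helper: cap generation so prompt + new tokens fit the context window."""
    return max(0, min(requested, context - prompt_len))


def main() -> None:
    # transformers >= 4.40 ships OlmoForCausalLM; earlier versions will not
    # recognize this model's config.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained("allenai/OLMo-1B-hf")
    model = AutoModelForCausalLM.from_pretrained("allenai/OLMo-1B-hf")

    inputs = tok("Language modeling is ", return_tensors="pt")
    n = budget_new_tokens(inputs["input_ids"].shape[1], 64)
    out = model.generate(**inputs, max_new_tokens=n, do_sample=False)
    print(tok.decode(out[0], skip_special_tokens=True))


if __name__ == "__main__":
    main()
```

The helper simply clamps the request so the total never exceeds the 2,048-token window listed above; transformers will otherwise truncate or error depending on settings.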
LLM Name: OLMo 1B Hf
Repository 🤗: https://huggingface.co/allenai/OLMo-1B-hf
Model Size: 1B
Required VRAM: 4.7 GB
Updated: 2026-03-29
Maintainer: allenai
Model Type: olmo
Model Files: 4.7 GB
Supported Languages: en
Model Architecture: OlmoForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.40.0
Tokenizer Class: GPTNeoXTokenizer
Padding Token: <|padding|>
Vocabulary Size: 50304
Torch Data Type: float32
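The 4.7 GB VRAM figure is consistent with the float32 weights listed above: roughly 1.2 billion parameters at 4 bytes each. A quick sanity check (the exact parameter count is an assumption here, not stated on this page), which also shows why int8 quantization roughly quarters the footprint:

```python
def weight_footprint_gb(n_params: float, bytes_per_param: int) -> float:
    """Approximate weight memory in decimal GB: parameters x bytes per parameter."""
    return n_params * bytes_per_param / 1e9


APPROX_PARAMS = 1.18e9  # rough OLMo-1B parameter count (assumption)

print(weight_footprint_gb(APPROX_PARAMS, 4))  # float32: ~4.7 GB
print(weight_footprint_gb(APPROX_PARAMS, 1))  # int8:    ~1.2 GB
```

This counts weights only; activations, KV cache, and framework overhead add to the real VRAM requirement.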

Best Alternatives to OLMo 1B Hf

Best Alternatives            Context / RAM    Downloads  Likes
OLMo 1B 0724 Hf              4K / 5.1 GB           4829     23
OLMo 1B Base Shakespeare     4K / 5.1 GB              6      0
AMD OLMo 1B SFT DPO          2K / 4.7 GB           1273     23
AMD OLMo 1B SFT              2K / 4.7 GB            140     21
AMD OLMo 1B                  2K / 4.7 GB            100     25
Olmo Oasst 2e                2K / 4.7 GB              6      0
Olmo Oasst 1e                2K / 4.7 GB              6      0


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20260328a