Cerebras GPT 111M Instruction by SebastianSchramm


Cerebras GPT 111M Instruction is an open-source language model by SebastianSchramm. Features: 111M-parameter LLM, VRAM: 0.4 GB, instruction-based. Scores: HF Score 29.6, LLM Explorer Score 0.12, ARC 24.4, HellaSwag 26.1, MMLU 25.9, TruthfulQA 49.5, WinoGrande 51.6.

Base model (fine-tune of): cerebras/Cerebras-GPT-111M. Tags: cerebras, en, gpt2, instruct, pytorch, region:us, safetensors

Cerebras GPT 111M Instruction Benchmarks

Cerebras GPT 111M Instruction (SebastianSchramm/Cerebras-GPT-111M-instruction)

Cerebras GPT 111M Instruction Parameters and Internals

Model Type: Instruction fine-tuned, text generation
Supported Languages: en (fluent)
Training Details:
  Data sources: alpaca_gpt4_data, alpaca_data_cleaned
  Methodology: instruction fine-tuning
Input/Output:
  Input format: formatting input with the Stanford Alpaca prompt template is recommended
  Accepted modalities: text
  Performance tips: format input with the provided prompt template during inference for best results.
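The recommended input format follows the publicly documented Stanford Alpaca prompt template. A minimal sketch of building such a prompt (the helper name `build_alpaca_prompt` is illustrative, not part of the model's tooling):

```python
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Format a request using the Stanford Alpaca prompt template.

    The template has two variants: one for instruction-only requests and
    one for requests that also carry an input/context field.
    """
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )


# The formatted string can then be tokenized and passed to the model.
prompt = build_alpaca_prompt("Summarize the following text.",
                             "Cerebras-GPT is a family of open models.")
print(prompt)
```

The model's completion is whatever it generates after the trailing `### Response:` marker.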
LLM Name: Cerebras GPT 111M Instruction
Repository 🤗: https://huggingface.co/SebastianSchramm/Cerebras-GPT-111M-instruction
Base Model(s): Cerebras GPT 111M (cerebras/Cerebras-GPT-111M)
Model Size: 111M
Required VRAM: 0.4 GB
Updated: 2026-03-29
Maintainer: SebastianSchramm
Model Type: gpt2
Instruction-Based: Yes
Model Files: 0.4 GB, 0.5 GB
Supported Languages: en
Model Architecture: GPT2LMHeadModel
Model Max Length: 2048
Transformers Version: 4.33.0.dev0
Tokenizer Class: GPT2Tokenizer
Beginning of Sentence Token: <|endoftext|>
End of Sentence Token: <|endoftext|>
Unk Token: <|endoftext|>
Vocabulary Size: 50258
Torch Data Type: float32
Activation Function: gelu
Errors: replace
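The listed 0.4 GB VRAM figure is consistent with simple arithmetic on the table above: 111M parameters stored as float32 (4 bytes each). A quick sketch of that estimate (weights only; activations and KV cache during inference add on top):

```python
def model_weight_gb(n_params: int, bytes_per_param: int = 4) -> float:
    """Rough weight-memory footprint: parameter count times bytes per parameter,
    converted to GiB. float32 = 4 bytes, float16/bfloat16 = 2 bytes."""
    return n_params * bytes_per_param / 1024**3


# 111M float32 parameters -> roughly 0.41 GiB, matching the listed 0.4 GB.
print(round(model_weight_gb(111_000_000), 2))
```

Loading the same weights in float16 would halve this estimate to roughly 0.2 GiB.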



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20260328a