L1 16B A3B by learning-unit


L1 16B A3B is an open-source language model by learning-unit: a 16B-parameter mixture-of-experts LLM requiring 32.5 GB of VRAM, with a 32K context window, released under the Apache 2.0 license. LLM Explorer Score: 0.35.
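The listed figures are mutually consistent: bfloat16 (the model's data type, per the table below) stores 2 bytes per parameter, so a 16B-parameter model needs roughly 32 GB for its weights alone, in line with the listed 32.5 GB (the "16b" size is rounded). A quick sanity check:

```python
# Back-of-envelope VRAM estimate: weight memory is roughly
# (parameter count) * (bytes per parameter).
params = 16e9            # 16B parameters, as listed on the card (rounded)
bytes_per_param = 2      # bfloat16 stores 2 bytes per value
weights_gb = params * bytes_per_param / 1e9
print(f"{weights_gb:.1f} GB")  # -> 32.0 GB, close to the listed 32.5 GB
```

The remaining ~0.5 GB reflects the rounded parameter count and non-weight tensors; activation memory at the full 32K context comes on top of this.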

Tags: Base model:finetune:trillionla... · Base model:trillionlabs/gravit... · Clinical · Conversational · Custom code · En · Gravity moe · Medical · Mixture-of-experts · Region:us · Safetensors · Sft

L1 16B A3B Benchmarks

Benchmark scores are percentages showing how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
L1 16B A3B (learning-unit/L1-16B-A3B)

L1 16B A3B Parameters and Internals

LLM Name: L1 16B A3B
Repository: https://huggingface.co/learning-unit/L1-16B-A3B
Base Model(s): trillionlabs/Gravity-16B-A3B-Base
Model Size: 16b
Required VRAM: 32.5 GB
Updated: 2026-04-09
Maintainer: learning-unit
Model Type: gravity_moe
Model Files: 32.5 GB
Supported Languages: en
Model Architecture: GravityMoEForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|endoftext|>
Vocabulary Size: 151552
Torch Data Type: bfloat16
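The card does not show loading code, but the details above (repository id, bfloat16 dtype, and the "Custom code" tag for the GravityMoE architecture) suggest the usual Hugging Face transformers pattern. A minimal sketch, assuming the `transformers` library is available:

```python
# Sketch: loading L1 16B A3B with Hugging Face transformers.
# Repo id, dtype, and the trust_remote_code requirement are taken from
# the card above; imports are deferred so the sketch can be read (and
# the helper defined) without transformers installed.

REPO_ID = "learning-unit/L1-16B-A3B"

def load_l1_16b(repo_id: str = REPO_ID):
    """Return (tokenizer, model) for the card's repository."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        repo_id,
        torch_dtype=torch.bfloat16,  # matches "Torch Data Type: bfloat16"
        trust_remote_code=True,      # custom GravityMoEForCausalLM class
        device_map="auto",           # needs ~32.5 GB of accelerator memory
    )
    return tokenizer, model
```

Calling `load_l1_16b()` downloads the 32.5 GB of weights on first use; `trust_remote_code=True` is required because `GravityMoEForCausalLM` is not a built-in transformers architecture.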

Rank the L1 16B A3B Capabilities

Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable models for their needs.

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

The directory lists 52,687 open-source LLMs and SLMs in total.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20260328a