Ring Flash Linear 2.0 128K by inclusionAI


Ring Flash Linear 2.0 128K is an open-source language model by inclusionAI: a 104.2B-parameter Mixture-of-Experts (MoE) LLM requiring 207.2 GB of VRAM, with a 128K context window, released under the MIT license.
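The headline numbers are internally consistent: at bfloat16 (2 bytes per parameter), 104.2B parameters come out to roughly the listed 207.2 GB of weights. A quick, purely illustrative check:

```python
# Back-of-the-envelope VRAM estimate for the bare weights (illustrative only;
# actual serving needs additional memory for the KV cache and activations).
params = 104.2e9          # 104.2B parameters
bytes_per_param = 2       # bfloat16 = 16 bits = 2 bytes
weights_gb = params * bytes_per_param / 1e9
print(f"~{weights_gb:.1f} GB of weights")  # ~208.4 GB, close to the 207.2 GB shard total
```

The estimate lands slightly above 207.2 GB because the parameter count is rounded to one decimal place.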

Tags: Arxiv:2510.19338 · Bailing moe linear · Base model:finetune:inclusiona... · Base model:inclusionai/ling-fl... · Conversational · Custom code · En · Moe · Region:us · Safetensors · Sharded · Tensorflow

Ring Flash Linear 2.0 128K Parameters and Internals

LLM Name: Ring Flash Linear 2.0 128K
Repository: 🤗 https://huggingface.co/inclusionAI/Ring-flash-linear-2.0-128k
Base Model(s): inclusionAI/Ling-flash-base-2.0
Model Size: 104.2B
Required VRAM: 207.2 GB
Updated: 2026-03-27
Maintainer: inclusionAI
Model Type: bailing_moe_linear
Model Files: 8.5 GB: 1-of-31, 6.6 GB: 2-of-31, 6.6 GB: 3-of-31, 6.6 GB: 4-of-31, 6.6 GB: 5-of-31, 6.5 GB: 6-of-31, 6.6 GB: 7-of-31, 6.6 GB: 8-of-31, 6.6 GB: 9-of-31, 6.6 GB: 10-of-31, 6.6 GB: 11-of-31, 6.6 GB: 12-of-31, 6.6 GB: 13-of-31, 6.5 GB: 14-of-31, 6.6 GB: 15-of-31, 6.6 GB: 16-of-31, 6.6 GB: 17-of-31, 6.6 GB: 18-of-31, 6.6 GB: 19-of-31, 6.6 GB: 20-of-31, 6.6 GB: 21-of-31, 6.5 GB: 22-of-31, 6.6 GB: 23-of-31, 6.6 GB: 24-of-31, 6.6 GB: 25-of-31, 6.6 GB: 26-of-31, 6.6 GB: 27-of-31, 6.6 GB: 28-of-31, 6.6 GB: 29-of-31, 6.5 GB: 30-of-31, 7.7 GB: 31-of-31 (31 shards, 207.2 GB total)
Supported Languages: en
Model Architecture: BailingMoeLinearV2ForCausalLM
License: mit
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.56.1
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|endoftext|>
Vocabulary Size: 157184
Torch Data Type: bfloat16
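Given the Custom code and Sharded tags, loading presumably goes through the standard 🤗 Transformers Auto classes with `trust_remote_code` enabled, so the custom `BailingMoeLinearV2ForCausalLM` implementation can be fetched from the repository. A minimal sketch, assuming the repo follows the usual Hugging Face conventions (device placement and generation settings here are illustrative, not taken from the model card):

```python
# Sketch only: loading pulls ~207 GB of bfloat16 weights, so this needs a
# multi-GPU node; device_map="auto" lets accelerate shard layers across GPUs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "inclusionAI/Ring-flash-linear-2.0-128k"
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,  # matches the listed Torch Data Type
    device_map="auto",           # spread the 31 shards across available GPUs
    trust_remote_code=True,      # required: custom bailing_moe_linear code
)

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```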

Best Alternatives to Ring Flash Linear 2.0 128K

Best Alternatives          Context / RAM        Downloads   Likes
Ring Flash Linear 2.0      128K / 207.2 GB      66          98

Rank the Ring Flash Linear 2.0 128K Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20260327b