DeepSeek TNG R1T2 Chimera by tngtech


DeepSeek TNG R1T2 Chimera is an open-source language model by tngtech. Features: 684.5B-parameter LLM, 184.7 GB required VRAM, 160K context length, MIT license, LLM Explorer Score: 0.28.

Tags: Arxiv:2506.14794, Base model:deepseek-ai/deepsee..., Base model:merge:deepseek-ai/d..., Conversational, Custom code, Deepseek v3, Doi:10.57967/hf/5950, Endpoints compatible, Fp8, Region:us, Safetensors, Sharded, Tensorflow

DeepSeek TNG R1T2 Chimera Benchmarks

nn.n% scores show how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
DeepSeek TNG R1T2 Chimera (tngtech/DeepSeek-TNG-R1T2-Chimera)

DeepSeek TNG R1T2 Chimera Parameters and Internals

LLM Name: DeepSeek TNG R1T2 Chimera
Repository: 🤗 https://huggingface.co/tngtech/DeepSeek-TNG-R1T2-Chimera
Base Model(s): DeepSeek R1 0528 (deepseek-ai/DeepSeek-R1-0528), DeepSeek R1 (deepseek-ai/DeepSeek-R1), DeepSeek V3 0324 (deepseek-ai/DeepSeek-V3-0324)
Model Size: 684.5B
Required VRAM: 184.7 GB
Updated: 2026-03-29
Maintainer: tngtech
Model Type: deepseek_v3
Model Files: 5.2 GB: 1-of-163, 4.3 GB: 2-of-163, 4.3 GB: 3-of-163, 4.3 GB: 4-of-163, 4.3 GB: 5-of-163, 4.4 GB: 6-of-163, 4.3 GB: 7-of-163, 4.3 GB: 8-of-163, 4.3 GB: 9-of-163, 4.3 GB: 10-of-163, 4.3 GB: 11-of-163, 1.3 GB: 12-of-163, 4.3 GB: 13-of-163, 4.3 GB: 14-of-163, 4.3 GB: 15-of-163, 4.3 GB: 16-of-163, 4.3 GB: 17-of-163, 4.3 GB: 18-of-163, 4.3 GB: 19-of-163, 4.3 GB: 20-of-163, 4.3 GB: 21-of-163, 4.3 GB: 22-of-163, 4.3 GB: 23-of-163, 4.3 GB: 24-of-163, 4.3 GB: 25-of-163, 4.3 GB: 26-of-163, 4.3 GB: 27-of-163, 4.3 GB: 28-of-163, 4.3 GB: 29-of-163, 4.3 GB: 30-of-163, 4.3 GB: 31-of-163, 4.3 GB: 32-of-163, 4.3 GB: 33-of-163, 1.8 GB: 34-of-163, 4.3 GB: 35-of-163, 4.3 GB: 36-of-163, 4.3 GB: 37-of-163, 4.3 GB: 38-of-163, 4.3 GB: 39-of-163, 4.3 GB: 40-of-163, 4.3 GB: 41-of-163, 4.3 GB: 42-of-163, 4.3 GB: 43-of-163, 4.3 GB: 44-of-163
Model Architecture: DeepseekV3ForCausalLM
License: mit
Context Length: 163840
Model Max Length: 163840
Transformers Version: 4.46.3
Vocabulary Size: 129280
Torch Data Type: bfloat16
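
Given the repository, architecture, and torch dtype listed above, loading the checkpoint with Hugging Face Transformers follows the standard pattern. The sketch below is illustrative, not from the model card: the prompt is invented, trust_remote_code=True is assumed because of the "Custom code" tag, and actually running it needs hardware on the order of the 184.7 GB of VRAM listed above.

```python
# Minimal loading sketch for tngtech/DeepSeek-TNG-R1T2-Chimera.
# Assumptions (not from the model card): multi-GPU sharding via
# device_map="auto" and an invented example prompt.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tngtech/DeepSeek-TNG-R1T2-Chimera"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches "Torch Data Type: bfloat16" above
    device_map="auto",           # spread the 163 sharded safetensors files across devices
    trust_remote_code=True,      # repo ships custom DeepseekV3ForCausalLM code
)

messages = [{"role": "user", "content": "Summarize what a chimera merge of LLMs is."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```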

Best Alternatives to DeepSeek TNG R1T2 Chimera

Best Alternatives | Context / RAM | Downloads / Likes
Kimi K2 Thinking NVFP4 | 256K / 210 GB | 1508630
Kimi K2 Thinking BF16 | 256K / 220 GB | 866
Kimi K2 Thinking | 256K / 383.2 GB | 304
Kimi K2 Instruct 0905 BF16 | 256K / 1399.1 GB | 1564
DeepSeek V3.1 | 160K / 180.4 GB | 150107819
DeepSeek V3.1 Terminus | 160K / 171.8 GB | 7034
DeepSeek V3.1 Base | 160K / 180.4 GB | 106441010
DeepSeek V3.1 | 160K / 176.1 GB | 443
DeepSeek V3.0324 | 160K / 376 GB | 248
DeepSeek R1 | 160K / 358.8 GB | 8251
Note: a green score (e.g. "73.2") means the model outperforms tngtech/DeepSeek-TNG-R1T2-Chimera.
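
Download and like counts like those in the table are exposed by the Hugging Face Hub API. A small sketch, assuming the huggingface_hub client is installed and using repo ids from this page as examples:

```python
# Fetch live download/like counts from the Hugging Face Hub.
# Assumes `pip install huggingface_hub`; repo ids are examples from this page.
from huggingface_hub import model_info

for repo_id in [
    "tngtech/DeepSeek-TNG-R1T2-Chimera",
    "deepseek-ai/DeepSeek-V3.1",
    "deepseek-ai/DeepSeek-R1",
]:
    info = model_info(repo_id)
    print(f"{repo_id}: downloads={info.downloads}, likes={info.likes}")
```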

Original data from Hugging Face, OpenCompass, and various public Git repositories.