Qwen3 8B 192K Context 6X Larger by DavidAU


Tags: 192k context, Autotrain compatible, Base model:finetune:qwen/qwen3..., Base model:qwen/qwen3-8b, Conversational, Endpoints compatible, Qwen3, Reasoning, Region:us, Safetensors, Sharded, Tensorflow, Thinking

Qwen3 8B 192K Context 6X Larger Benchmarks

Benchmark scores (nn.n%) indicate how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

Qwen3 8B 192K Context 6X Larger Parameters and Internals

LLM Name: Qwen3 8B 192K Context 6X Larger
Repository 🤗: https://huggingface.co/DavidAU/Qwen3-8B-192k-Context-6X-Larger
Base Model(s): Qwen3 8B (Qwen/Qwen3-8B)
Model Size: 8b
Required VRAM: 16.4 GB
Updated: 2025-06-09
Maintainer: DavidAU
Model Type: qwen3
Model Files: 4.0 GB (1-of-5), 4.0 GB (2-of-5), 4.0 GB (3-of-5), 3.2 GB (4-of-5), 1.2 GB (5-of-5)
Model Architecture: Qwen3ForCausalLM
Context Length: 196608
Model Max Length: 196608
Transformers Version: 4.51.0
Tokenizer Class: Qwen2Tokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 151936
Torch Data Type: bfloat16
Errors: replace
Qwen3 8B 192K Context 6X Larger (DavidAU/Qwen3-8B-192k-Context-6X-Larger)
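The parameters above translate directly into a standard Transformers loading call. The sketch below is illustrative rather than taken from the model card: the repository id, bfloat16 dtype, tokenizer class, and 196,608-token context length come from the table, while the device placement and sample prompt are assumptions.

```python
# Minimal sketch: load DavidAU/Qwen3-8B-192k-Context-6X-Larger with Transformers.
# Repo id, dtype, and context length come from the parameter table above;
# device_map and the prompt are illustrative assumptions.
import torch
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

repo_id = "DavidAU/Qwen3-8B-192k-Context-6X-Larger"

# The card lists Context Length / Model Max Length as 196608 (192K).
config = AutoConfig.from_pretrained(repo_id)
print(config.max_position_embeddings)  # expected: 196608

tokenizer = AutoTokenizer.from_pretrained(repo_id)  # Qwen2Tokenizer, pad token <|endoftext|>

# Weights are five sharded safetensors files totalling ~16.4 GB in bfloat16.
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # requires the `accelerate` package
)

prompt = "Summarize the advantages of long-context language models."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The 16.4 GB VRAM figure is consistent with roughly 8.2B parameters stored at 2 bytes each in bfloat16; the KV cache for long prompts adds memory on top of that, growing with the context actually used.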

Best Alternatives to Qwen3 8B 192K Context 6X Larger

Best Alternatives | Context / RAM | Downloads / Likes
Qwen3 8B 256K Context 8X Grand | 256K / 16.4 GB | 440
DeepSeek R1 0528 Qwen3 8B | 128K / 16.4 GB | 165722707
...1 0528 Qwen3 8B Abliterated V1 | 128K / 16.4 GB | 64016
DeepSeek R1 0528 Qwen3 8B | 128K / 16.4 GB | 15748
...1 Qwen3 8B ArliAI RpR V4 Small | 128K / 16.4 GB | 473
DeepSeek R1 0528 Qwen3 8B | 128K / 32.7 GB | 1590
... R1 0528 Qwen3 8B Catgirl V2.5 | 128K / 16.4 GB | 510
...epSeek R1 0528 Qwen3 8B Esper3 | 128K / 32.7 GB | 411
...wen3 8B Think Test 100 Samples | 128K / GB | 130
...k R1 0528 Qwen3 8B Abliterated | 128K / 9.5 GB | 160
Note: a green score (e.g. "73.2") indicates that the listed alternative scores better than DavidAU/Qwen3-8B-192k-Context-6X-Larger.

Rank the Qwen3 8B 192K Context 6X Larger Capabilities

🆘 Have you tried this model? Rate its performance. This feedback will greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124