Mcqa Sft Focus 100K 2048length by coralieb7


Mcqa Sft Focus 100K 2048length is an open-source language model by coralieb7. Features: 0.6B LLM, VRAM: 2.4 GB, Context: 32K, LLM Explorer Score: 0.21.
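The VRAM figure is consistent with a rough back-of-envelope estimate for the float32 weights alone (actual usage also depends on activations and the KV cache):

    0.6 × 10⁹ parameters × 4 bytes per float32 parameter ≈ 2.4 × 10⁹ bytes ≈ 2.4 GB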

Tags: Autotrain compatible · Base model (finetune): Qwen/Qwen3-0.6B-Base · Conversational · Endpoints compatible · Generated from trainer · Qwen3 · Region: us · Safetensors · SFT · TRL

Mcqa Sft Focus 100K 2048length Benchmarks

Benchmark scores for Mcqa Sft Focus 100K 2048length (coralieb7/mcqa_sft_focus_100k_2048length) are reported as percentages ("nn.n%") relative to the reference models Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").

Mcqa Sft Focus 100K 2048length Parameters and Internals

LLM Name: Mcqa Sft Focus 100k 2048length
Repository 🤗: https://huggingface.co/coralieb7/mcqa_sft_focus_100k_2048length
Model Name: mcqa_sft_focus_100k_2048length
Base Model(s): Qwen/Qwen3-0.6B-Base
Model Size: 0.6B
Required VRAM: 2.4 GB
Updated: 2025-09-18
Maintainer: coralieb7
Model Type: qwen3
Model Files: 2.4 GB, 0.0 GB
Model Architecture: Qwen3ForCausalLM
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.51.3
Tokenizer Class: Qwen2Tokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 151936
Torch Data Type: float32
Errors: replace
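
Given the repository ID, architecture, and dtype listed above, the checkpoint can be loaded with the Hugging Face transformers library. Below is a minimal sketch, assuming transformers >= 4.51.3 and PyTorch are installed; the example prompt and generation settings are illustrative assumptions, not values from the model card.

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo_id = "coralieb7/mcqa_sft_focus_100k_2048length"

    # AutoTokenizer resolves to the Qwen2Tokenizer listed above;
    # the padding token is <|endoftext|>.
    tokenizer = AutoTokenizer.from_pretrained(repo_id)

    # float32 matches the listed Torch Data Type (~2.4 GB of weights).
    model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.float32)

    # Hypothetical multiple-choice prompt, just to exercise the fine-tuned model.
    prompt = (
        "Question: What is the capital of France?\n"
        "A. Berlin\nB. Paris\nC. Rome\nD. Madrid\n"
        "Answer:"
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=8)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Loading in torch.bfloat16 instead of float32 would roughly halve the memory footprint, at some cost in numerical precision.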

Best Alternatives to Mcqa Sft Focus 100K 2048length

Best Alternatives               Context / RAM    Downloads / Likes
Qwen3 Reranker 0.6B             40K / 1.2 GB     781059239
Qwen3 0.6B FP8                  40K / 1.1 GB     165812757
Qwen3 0.6B Gabliterated         40K / 1.2 GB     10703
Qwen3 0.6B                      40K / 1.2 GB     2925815
Qwen3 0.6B                      40K / 1.2 GB     68293
Kimina Prover Distill 0.6B      40K / 1.5 GB     13441
OuteTTS 1.0 0.6B                40K / 1.2 GB     94290
Luth 0.6B Instruct              40K / 1.2 GB     2229
RStar Coder Qwen3 0.6B          40K / 1.2 GB     1737
PARD Qwen3 0.6B                 40K / 1.5 GB     3731
Note: a green score (e.g. "73.2") means that the model is better than coralieb7/mcqa_sft_focus_100k_2048length.

Rank the Mcqa Sft Focus 100K 2048length Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.