Twscrape Prepared Trl Sft Qwen 3B Sft 1epochs by AlekseyKorshuk


Twscrape Prepared Trl Sft Qwen 3B Sft 1epochs is an open-source language model by AlekseyKorshuk. Features: 3B LLM, VRAM: 6.2 GB, Context: 32K, Instruction-Based, LLM Explorer Score: 0.18.
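The 6.2 GB VRAM figure is consistent with the bfloat16 weights alone: roughly 3.1B parameters at 2 bytes each. A back-of-the-envelope check (the 3.09e9 parameter count is an assumption based on the Qwen2.5-3B family, not a number from this page):

```python
# Rough weight-memory estimate: parameters x bytes-per-parameter.
# 3.09e9 is an assumed parameter count for Qwen2.5-3B-class models.
params = 3.09e9
bytes_per_param = 2  # bfloat16
print(f"~{params * bytes_per_param / 1e9:.1f} GB")  # prints ~6.2 GB
```

Actual serving memory will be somewhat higher once the KV cache and activations are included.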

Tags: Autotrain compatible · Base model:finetune:qwen/qwen2... · Base model:qwen/qwen2.5-3b-ins... · Conversational · Dataset:alekseykorshuk/twscrap... · Endpoints compatible · Generated from trainer · Instruct · Qwen2 · Region:us · Safetensors · Sft · Sharded · Tensorflow · Trl
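The Trl, Sft, and "Generated from trainer" tags indicate a supervised fine-tune of Qwen/Qwen2.5-3B-Instruct with Hugging Face TRL, run for one epoch (per the "1epochs" suffix in the model name). A minimal sketch of such a run is shown below; the dataset id is an assumption expanded from the truncated dataset tag, and the batch size is illustrative:

```python
# Minimal TRL SFT sketch: one epoch of supervised fine-tuning on top of
# Qwen/Qwen2.5-3B-Instruct. Dataset id and column layout are assumptions;
# this page only shows a truncated dataset tag.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

dataset = load_dataset("AlekseyKorshuk/twscrape-prepared", split="train")  # assumed id

trainer = SFTTrainer(
    model="Qwen/Qwen2.5-3B-Instruct",      # base model listed on this page
    train_dataset=dataset,
    args=SFTConfig(
        output_dir="twscrape-prepared-trl-sft-qwen-3b-sft-1epochs",
        num_train_epochs=1,                # "1epochs" in the model name
        per_device_train_batch_size=1,     # assumed; not stated on the page
        bf16=True,                         # matches the listed bfloat16 torch dtype
    ),
)
trainer.train()
```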

Twscrape Prepared Trl Sft Qwen 3B Sft 1epochs Benchmarks

Benchmark scores (shown as nn.n%) indicate how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Evaluated model: Twscrape Prepared Trl Sft Qwen 3B Sft 1epochs (AlekseyKorshuk/twscrape-prepared-trl-sft-qwen-3b-sft-1epochs)

Twscrape Prepared Trl Sft Qwen 3B Sft 1epochs Parameters and Internals

LLM Name: Twscrape Prepared Trl Sft Qwen 3B Sft 1epochs
Repository (🤗): https://huggingface.co/AlekseyKorshuk/twscrape-prepared-trl-sft-qwen-3b-sft-1epochs
Model Name: twscrape-prepared-trl-sft-qwen-3b-sft-1epochs
Base Model(s): Qwen/Qwen2.5-3B-Instruct
Model Size: 3b
Required VRAM: 6.2 GB
Updated: 2025-09-18
Maintainer: AlekseyKorshuk
Model Type: qwen2
Instruction-Based: Yes
Model Files: 5.0 GB (1-of-2), 1.2 GB (2-of-2), 0.0 GB
Model Architecture: Qwen2ForCausalLM
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.49.0
Tokenizer Class: Qwen2Tokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 151936
Torch Data Type: bfloat16
Errors: replace
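Given the Qwen2ForCausalLM architecture, Qwen2Tokenizer, and bfloat16 weights listed above, the checkpoint should load with a standard transformers setup. A minimal inference sketch (the prompt and max_new_tokens are illustrative values, not taken from this page):

```python
# Minimal loading/inference sketch for the sharded safetensors checkpoint
# listed above, using the standard transformers chat-template workflow.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "AlekseyKorshuk/twscrape-prepared-trl-sft-qwen-3b-sft-1epochs"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the listed torch data type
    device_map="auto",
)

messages = [{"role": "user", "content": "Write a short tweet about open-source LLMs."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)  # max_new_tokens is an assumed value
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```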

Best Alternatives to Twscrape Prepared Trl Sft Qwen 3B Sft 1epochs

Best Alternatives             | Context / RAM | Downloads | Likes
Saba2 3B                      | 128K / 6.2 GB | 6         | 0
Tessa T1 3B                   | 117K / 6.2 GB | 9         | 5
UIGEN T1.5 3B                 | 117K / 6.2 GB | 5         | 1
Qwen2.5 3B Instruct           | 32K / 6.2 GB  | 4746260   | 319
SmallThinker 3B Preview       | 32K / 6.8 GB  | 30529     | 412
Chirp 01                      | 32K / 6.2 GB  | 7         | 14
Menda 3B 500                  | 32K / 6.2 GB  | 6         | 0
Menda 3B 750                  | 32K / 6.2 GB  | 3         | 1
Qwen2.5 3B Model Stock V3.1   | 32K / 6.8 GB  | 9         | 3
Qwen2.5 3B Model Stock V4.1   | 32K / 6.8 GB  | 5         | 2
Note: a green Score (e.g. "73.2") means the listed alternative performs better than AlekseyKorshuk/twscrape-prepared-trl-sft-qwen-3b-sft-1epochs.



Original data from HuggingFace, OpenCompass and various public git repos.