Parrot 1 6B by maxim-saplin


  Autotrain compatible   Conversational   En   Endpoints compatible   Region:us   Ru   Safetensors   Sharded   Stablelm   Tensorflow

Parrot 1 6B Benchmarks

nn.n% — How the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Parrot 1 6B (maxim-saplin/parrot-1_6B)

Parrot 1 6B Parameters and Internals

Model Type 
causal-lm
Use Cases 
Areas:
research, experimental projects
Applications:
chatbots, text processing
Primary Use Cases:
repeating user messages in ALL CAPS
Limitations:
language coverage depends on the training data; no explicit Russian training samples were included
Additional Notes 
Trained to repeat user messages in all caps with chat structure learning.
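The intended behavior can be summarized as a one-line reference function (a sketch of what the model was trained to imitate; `parrot_reply` is a hypothetical name, not part of the model's API):

```python
def parrot_reply(user_message: str) -> str:
    # Reference behavior the model was fine-tuned to imitate:
    # echo the user's message back in ALL CAPS.
    return user_message.upper()

print(parrot_reply("Hello, world!"))  # HELLO, WORLD!
```

Python's str.upper() is Unicode-aware, so the same rule covers both English and Cyrillic input.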
Supported Languages 
en (high proficiency), ru (high proficiency)
Training Details 
Data Sources:
4k messages in each epoch
Data Volume:
8k messages total
Methodology:
LoRA fine-tuning; target replies produced with Python's str.upper()
Training Time:
11 minutes
Hardware Used:
RTX 4060 8GB
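The training data described above can be reproduced with a small script that pairs each message with its upper-cased reply in a chat-structured format. This is a sketch under assumptions: the JSONL chat layout and the names `make_sample`/`build_dataset` are hypothetical, not taken from the author's actual pipeline.

```python
import json


def make_sample(text: str) -> dict:
    # One chat-structured training pair: user message -> ALL-CAPS assistant reply.
    return {"messages": [
        {"role": "user", "content": text},
        {"role": "assistant", "content": text.upper()},
    ]}


def build_dataset(texts, path="parrot_train.jsonl"):
    # Write one JSON object per line (JSONL), a common fine-tuning input format.
    with open(path, "w", encoding="utf-8") as f:
        for t in texts:
            f.write(json.dumps(make_sample(t), ensure_ascii=False) + "\n")


# English and Russian examples, matching the two supported languages.
build_dataset(["Hello!", "Привет, как дела?"])
```

Because the mapping is deterministic, a few thousand such pairs per epoch are enough for the model to pick up both the chat structure and the upper-casing rule.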
LLM Name: Parrot 1 6B
Repository: 🤗 https://huggingface.co/maxim-saplin/parrot-1_6B
Model Size: 1.6b
Required VRAM: 3.3 GB
Updated: 2025-07-26
Maintainer: maxim-saplin
Model Type: stablelm
Model Files: 2.0 GB (1-of-2), 1.3 GB (2-of-2)
Supported Languages: en, ru
Model Architecture: StableLmForCausalLM
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.39.3
Tokenizer Class: GPT2Tokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 100289
Torch Data Type: bfloat16
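The 3.3 GB VRAM figure is consistent with the model's roughly 1.6 B parameters (per the repository name parrot-1_6B) stored in bfloat16: two bytes per weight, counting weights only. A quick back-of-the-envelope check (a sketch; `vram_gb` is a hypothetical helper, and it ignores KV cache and activation memory):

```python
def vram_gb(n_params: float, bytes_per_param: int = 2) -> float:
    # bfloat16 stores each weight in 2 bytes; this counts weights only,
    # not the KV cache or activations needed at inference time.
    return n_params * bytes_per_param / 1e9

print(round(vram_gb(1.6e9), 1))  # 3.2 -- close to the 3.3 GB reported above
```

The same estimate explains why the 6.6 GB alternatives below are float32 or float16-with-extras variants of the same 1.6 B base.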

Best Alternatives to Parrot 1 6B

Best Alternatives              Context / RAM    Downloads / Likes
Stablelm 2 1 6b Chat           4K / 6.6 GB      286832
Stablelm 2 1 6b                4K / 3.3 GB      2380191
Cot 5k                         4K / 3.3 GB      5030
Stablelm 2 1 6b Sft Full       4K / 3.3 GB      70
StableGPT4 Micro 1.6B          4K / 6.6 GB      811
StableLM FineTune GPT4         4K / 6.6 GB      71
Stablelm 2 Zephyr 1 6b         4K / —           211
Stablelm 2 1 6b                4K / —           82
Stablelm 2 Zephyr 1 6b         4K / 3.3 GB      121
Stablelm 2 Zephyr 1 6b Q4      4K / —           31
Note: green Score (e.g. "73.2") means that the model is better than maxim-saplin/parrot-1_6B.

Rank the Parrot 1 6B Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124