Dopeystableplats 3B V1 by vihangd


Tags: AutoTrain compatible · Custom code · PyTorch · Region: us · Sharded · StableLM Epoch

Dopeystableplats 3B V1 Benchmarks

Scores (nn.n%) show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Dopeystableplats 3B V1 (vihangd/dopeystableplats-3b-v1)

Dopeystableplats 3B V1 Parameters and Internals

Additional Notes 
An experimental finetune of the StableLM-3B-4E1T model.
Training Details 
Data Sources:
Alpaca-style datasets
Methodology:
Fine-tuning with Alpaca-QLoRA, followed by a DPO stage (see the sketch below)
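
The card does not publish the actual training configuration. The following is a minimal, hypothetical sketch of what an Alpaca-QLoRA setup over the StableLM-3B-4E1T base could look like with Hugging Face transformers and peft; the rank, alpha, target modules, and all hyperparameters are illustrative assumptions, and the DPO stage is only indicated in a closing comment.

import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

base_id = "stabilityai/stablelm-3b-4e1t"  # base model named in the notes above

# Load the base model in 4-bit precision (the "Q" in QLoRA).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)
model = AutoModelForCausalLM.from_pretrained(
    base_id,
    quantization_config=bnb_config,
    trust_remote_code=True,
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)

# Attach low-rank adapters; rank, alpha, and target modules are illustrative guesses.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# Supervised fine-tuning on Alpaca-style data and the subsequent DPO pass
# (e.g. with trl's DPOTrainer) would follow from here.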
Input Output 
Input Format:
Alpaca-style prompt template (example below)
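
The card does not reproduce the template itself; the standard Alpaca instruction template, which this fine-tune presumably follows, looks like the snippet below (the example instruction is made up).

# Standard Alpaca prompt template (assumed; the exact variant used for this
# fine-tune is not documented on the card).
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

prompt = ALPACA_TEMPLATE.format(
    instruction="Summarize the StableLM-3B-4E1T base model in one sentence."
)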
LLM Name: Dopeystableplats 3B V1
Repository 🤗: https://huggingface.co/vihangd/dopeystableplats-3b-v1
Model Size: 3b
Required VRAM: 5.9 GB
Updated: 2025-09-17
Maintainer: vihangd
Model Type: stablelm_epoch
Model Files: 15 shards (0.4 GB each for shards 1-14 of 15, 0.3 GB for shard 15 of 15)
Model Architecture: StableLMEpochForCausalLM
License: cc-by-sa-4.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.34.1
Tokenizer Class: GPTNeoXTokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 50304
Torch Data Type: float16
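
Given the metadata above (custom StableLMEpochForCausalLM code, float16 weights, 4096-token context), loading the checkpoint with Hugging Face transformers might look like the minimal sketch below; the repo id and dtype come from the table, while the prompt and generation settings are arbitrary examples.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "vihangd/dopeystableplats-3b-v1"

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,   # matches the float16 weights (~5.9 GB of VRAM)
    trust_remote_code=True,      # StableLMEpochForCausalLM ships as custom code
    device_map="auto",
)

prompt = "### Instruction:\nName three uses for a 3B-parameter language model.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))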

Quantized Models of the Dopeystableplats 3B V1

Model                          Likes   Downloads   VRAM
Dopeystableplats 3B V1 GGUF    1       136         1 GB
Dopeystableplats 3B V1 GPTQ    0       7           1 GB
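
For running a GGUF quantization on CPU or a modest GPU, a minimal sketch with llama-cpp-python is shown below; the local file name is an assumption, so substitute whichever quantized file you download from the GGUF repository listed above.

from llama_cpp import Llama

llm = Llama(
    model_path="dopeystableplats-3b-v1.Q4_K_M.gguf",  # assumed local file name
    n_ctx=4096,  # matches the model's 4096-token context length
)

prompt = (
    "### Instruction:\nWrite a haiku about small language models.\n\n"
    "### Response:\n"
)
out = llm(prompt, max_tokens=96, stop=["### Instruction:"])
print(out["choices"][0]["text"])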

Best Alternatives to Dopeystableplats 3B V1

Best Alternatives      Context / RAM    Downloads   Likes
Stable Code 3B Mlx     16K / 5.6 GB     22          1
Aura 3B                4K / 5.6 GB      3           2
Slim Extract           4K / 5.6 GB      13          12
Slim Boolean           4K / 5.6 GB      8           4
Slim Sa Ner            4K / 5.6 GB      20          6
Slim Tags 3B           4K / 5.6 GB      7           4
Slim Summary           4K / 5.6 GB      9           8
Slim Xsum              4K / 5.6 GB      9           6
Tofu 3B                4K / 5.6 GB      3           2
Memphis CoT 3B         4K / 5.6 GB      63          0
Note: a green score (e.g. "73.2") means that the model is better than vihangd/dopeystableplats-3b-v1.

Rank the Dopeystableplats 3B V1 Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124