StableBeluga1 Delta by stabilityai


Tags: Arxiv:2302.13971, Arxiv:2306.02707, Autotrain compatible, Dataset:conceptofmind/cot subm..., Dataset:conceptofmind/flan2021..., Dataset:conceptofmind/niv2 sub..., Dataset:conceptofmind/t0 submi..., En, Endpoints compatible, Llama, Region:us, Safetensors, Sharded, Tensorflow

StableBeluga1 Delta Benchmarks

StableBeluga1 Delta (stabilityai/StableBeluga1-Delta)

StableBeluga1 Delta Parameters and Internals

Model Type 
auto-regressive, text generation
Additional Notes 
Stable Beluga 1 cannot be used from the `stabilityai/StableBeluga1-Delta` weights alone. To obtain a working model, the delta must be applied on top of the original LLaMA 65B weights using the script provided in the repository (see the sketch after this section).
Supported Languages 
English (Proficient)
Training Details 
Data Sources:
conceptofmind/cot_submix_original, conceptofmind/flan2021_submix_original, conceptofmind/t0_submix_original, conceptofmind/niv2_submix_original
Methodology:
Supervised fine-tuning on internal Orca-style dataset
Model Architecture:
Stable Beluga 1 is an auto-regressive language model fine-tuned from LLaMA 65B.
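
The following is a minimal, hedged sketch of the delta-application step mentioned above, assuming the delta is combined with the base weights by element-wise addition (the usual scheme for LLaMA delta releases). The local paths and the use of `AutoModelForCausalLM`/`AutoTokenizer` are illustrative assumptions; the script shipped with the repository remains the authoritative procedure.

```python
# Hedged sketch: recombine the StableBeluga1 delta with the original LLaMA 65B weights.
# Assumes element-wise addition of delta tensors onto the base tensors; paths are hypothetical.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

base_path = "path/to/llama-65b"            # hypothetical path to the original LLaMA 65B checkpoint
delta_path = "stabilityai/StableBeluga1-Delta"
output_path = "path/to/stable-beluga-1"    # hypothetical output directory

# Load both checkpoints on CPU in float16 (substantial CPU RAM and disk space required).
base = AutoModelForCausalLM.from_pretrained(
    base_path, torch_dtype=torch.float16, low_cpu_mem_usage=True
)
delta = AutoModelForCausalLM.from_pretrained(
    delta_path, torch_dtype=torch.float16, low_cpu_mem_usage=True
)

# Add each delta tensor onto the matching base tensor in place.
base_state = base.state_dict()
for name, param in delta.state_dict().items():
    assert name in base_state, f"unexpected parameter: {name}"
    base_state[name] += param

# Save the recombined model along with the tokenizer from the delta repository.
base.save_pretrained(output_path, safe_serialization=True)
AutoTokenizer.from_pretrained(delta_path).save_pretrained(output_path)
```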
LLM Name: StableBeluga1 Delta
Repository: https://huggingface.co/stabilityai/StableBeluga1-Delta
Model Size: 65B
Required VRAM: 130.4 GB
Updated: 2025-07-30
Maintainer: stabilityai
Model Type: llama
Model Files (14 shards): 9.9 GB (1-of-14), 9.7 GB each (2-of-14 through 13-of-14), 4.1 GB (14-of-14)
Supported Languages: en
Model Architecture: LlamaForCausalLM
License: cc-by-nc-4.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.32.0.dev0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float16
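
For orientation, here is a minimal generation sketch consistent with the spec sheet above (LlamaForCausalLM, float16 weights, 2048-token context). It assumes the delta has already been applied and the merged model saved locally; the local path and the prompt layout are illustrative assumptions, not details taken from this page.

```python
# Minimal usage sketch, assuming a locally merged Stable Beluga 1 checkpoint
# (see the delta-application sketch above); not an official usage example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "path/to/stable-beluga-1"   # hypothetical merged-model directory

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.float16,  # matches the card's float16 torch dtype
    device_map="auto",          # ~130 GB of weights; requires accelerate and ample GPU memory
)

# Illustrative instruction-style prompt; keep total length within the 2048-token context.
prompt = (
    "### System:\nYou are a helpful assistant.\n\n"
    "### User:\nSummarize what a delta-weight release is in two sentences.\n\n"
    "### Assistant:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, top_p=0.9, temperature=0.7)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```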

Best Alternatives to StableBeluga1 Delta

Best Alternatives | Context / RAM | Downloads / Likes
Lite Oute 1 65M | 2K / 0.3 GB | 4979
Lite Oute 1 65M Instruct | 2K / 0.3 GB | 5112
Airoboros 65B Gpt4 M2.0 | 2K / 130.4 GB | 12640
Airoboros 65B Gpt4 2.0 | 2K / 130.4 GB | 12600
Openbuddy Llama 65B V8 Bf16 | 2K / 130.6 GB | 14849
UltraLM 65B | 2K / 130.4 GB | 15398
Airoboros 65B Gpt4 1.4 | 2K / 130.4 GB | 131217
Llama 65B Instruct | 2K / 130.4 GB | 119514
Airoboros 65B Gpt4 1.3 | 2K / 130.4 GB | 12601
Airoboros 65B Gpt4 1.2 | 2K / 130.4 GB | 127322


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124