Airoboros 180B 2.2.1 by jondurbin


Tags: Autotrain compatible · Dataset: jondurbin/airoboros-2.... · Endpoints compatible · Falcon · Region: us · Safetensors · Sharded · Tensorflow

Airoboros 180B 2.2.1 Benchmarks

Scores are shown as percentages (nn.n%) comparing the model to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Airoboros 180B 2.2.1 (jondurbin/airoboros-180b-2.2.1)

Airoboros 180B 2.2.1 Parameters and Internals

Additional Notes 
Designed for instruction following rather than casual chat or roleplay. Trained with heavy use of synthetic data generated by the airoboros framework.
Training Details 
Data Sources:
jondurbin/airoboros-2.2.1
Methodology:
Fine-tuned with RoPE scaling to a 4k context
Context Length:
4000
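The card lists a pretrained/config context of 2048 but a fine-tuned context of 4k via RoPE scaling. A minimal sketch of how linear RoPE scaling stretches the context window, assuming linear ("position interpolation") scaling, which the card does not specify; the dimension and base values are illustrative defaults:

```python
def rope_angles(pos, dim=8, base=10000.0, scale=1.0):
    """Rotary-embedding angles for one position index.

    Linear RoPE scaling divides the position index by `scale`,
    squeezing a 4k window into the 2k positional range the base
    model was pretrained on (scale = 4096 / 2048 = 2.0).
    """
    p = pos / scale
    return [p * base ** (-2 * i / dim) for i in range(dim // 2)]

# With scale=2.0, position 4096 produces the same angles that
# position 2048 produced during pretraining:
assert rope_angles(4096, scale=2.0) == rope_angles(2048, scale=1.0)
```

This is why the config still reports a 2048 max length while the fine-tune targets 4000 tokens: the scaling is applied at inference time, not stored as a larger positional table.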
Input Output 
Input Format:
A chat. USER: {prompt} ASSISTANT: {response}
Accepted Modalities:
text
Output Format:
text
Performance Tips:
Add a stopping criterion (early inference stopping) on 'USER:' so the model does not continue the conversation on its own. For closed-context question answering, use explicit delimiters such as BEGININPUT and BEGININSTRUCTION.
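The prompt format and the two performance tips above can be sketched in plain Python. The `ENDINPUT`/`ENDINSTRUCTION` closing markers are an assumption based on common airoboros usage; the card itself only names BEGININPUT and BEGININSTRUCTION:

```python
def build_prompt(user_msg, system="A chat."):
    # Prompt format from the card:
    # "A chat. USER: {prompt} ASSISTANT: {response}"
    return f"{system} USER: {user_msg} ASSISTANT:"

def trim_at_stop(generated, stop="USER:"):
    """Emulate a stopping criterion by truncating the generation
    at the first occurrence of the 'USER:' turn marker."""
    idx = generated.find(stop)
    return generated if idx == -1 else generated[:idx].rstrip()

def closed_context_prompt(context, instruction):
    # Delimiters named on the card: BEGININPUT / BEGININSTRUCTION.
    # The END* markers below are assumed, not stated on the card.
    body = (
        "BEGININPUT\n" + context + "\nENDINPUT\n"
        "BEGININSTRUCTION\n" + instruction + "\nENDINSTRUCTION"
    )
    return build_prompt(body)
```

For example, `trim_at_stop("Paris. USER: next question")` returns `"Paris."`, which is the behavior a token-level stopping criterion would approximate during generation.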
LLM Name: Airoboros 180B 2.2.1
Repository: 🤗 https://huggingface.co/jondurbin/airoboros-180b-2.2.1
Model Size: 180b
Required VRAM: 154.2 GB
Updated: 2025-08-20
Maintainer: jondurbin
Model Type: falcon
Model Files  2.8 GB: 1-of-107   4.0 GB: 2-of-107   4.0 GB: 3-of-107   2.7 GB: 4-of-107   2.7 GB: 5-of-107   4.0 GB: 6-of-107   4.0 GB: 7-of-107   2.7 GB: 8-of-107   2.7 GB: 9-of-107   4.0 GB: 10-of-107   4.0 GB: 11-of-107   2.7 GB: 12-of-107   2.7 GB: 13-of-107   4.0 GB: 14-of-107   4.0 GB: 15-of-107   2.7 GB: 16-of-107   2.7 GB: 17-of-107   4.0 GB: 18-of-107   4.0 GB: 19-of-107   2.7 GB: 20-of-107   2.7 GB: 21-of-107   4.0 GB: 22-of-107   4.0 GB: 23-of-107   2.7 GB: 24-of-107   2.7 GB: 25-of-107   4.0 GB: 26-of-107   4.0 GB: 27-of-107   2.7 GB: 28-of-107   2.7 GB: 29-of-107   4.0 GB: 30-of-107   4.0 GB: 31-of-107   2.7 GB: 32-of-107   2.7 GB: 33-of-107   4.0 GB: 34-of-107   4.0 GB: 35-of-107   2.7 GB: 36-of-107   2.7 GB: 37-of-107   4.0 GB: 38-of-107   4.0 GB: 39-of-107   2.7 GB: 40-of-107   2.7 GB: 41-of-107   4.0 GB: 42-of-107   4.0 GB: 43-of-107   2.7 GB: 44-of-107   2.7 GB: 45-of-107   4.0 GB: 46-of-107
Model Architecture: FalconForCausalLM
License: other
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.35.0.dev0
Is Biased: 0
Vocabulary Size: 65024
Torch Data Type: bfloat16

Quantized Models of the Airoboros 180B 2.2.1

Model                        Likes  Downloads  VRAM
Airoboros 180B 2.2.1 AWQ     6      9          96 GB

Best Alternatives to Airoboros 180B 2.2.1

Best Alternatives                    Context / RAM    Downloads  Likes
Largefalcon                          2K / 411.4 GB    5          0
...buddy Falcon 180B V13 Preview0    2K / 358.1 GB    1761       2
...buddy Falcon 180B V12 Preview0    2K / 358.1 GB    1765       0
...buddy Falcon 180B V13 Preview2    2K / 358.1 GB    7          1
...buddy Falcon 180B V13 Preview1    2K / 358.1 GB    10         4
...alcon 180B Omniquant W3a16g512    2K / 69.4 GB     5          3
Falcon 180B                          0K / 193.8 GB    2535       1147
Falcon 180B WizardLM Orca            0K / 358 GB      1736       1
... 180B Wizard Alpaca Dolly Orca    0K / 358 GB      1739       4
Falcon 180B Chat Instruct            0K / 358 GB      1736       2
Note: green Score (e.g. "73.2") means that the model is better than jondurbin/airoboros-180b-2.2.1.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124