Airoboros 65B Gpt4 1.4 by jondurbin


Tags: Autotrain compatible, Dataset: jondurbin/airoboros-gp..., Endpoints compatible, Llama, PyTorch, Region: us, Sharded

Airoboros 65B Gpt4 1.4 (jondurbin/airoboros-65b-gpt4-1.4)

Airoboros 65B Gpt4 1.4 Parameters and Internals

Model Type: text generation
Use Cases:
  Areas: Research, Development
  Applications: Chat applications, Coding assistants, Role-based dialogues
  Primary Use Cases: Context-obedient question answering, Writing and storytelling, Word games and trivia, Coding tasks
  Limitations: Cannot be used commercially due to licensing restrictions; a closed-context format is required for certain tasks
  Considerations: Explicit prompt delimiters are required for closed-context tasks to minimize hallucinations (see the sketch after this list).
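The closed-context delimiters are not reproduced on this page; the upstream airoboros model cards describe a layout along the following lines. This is a minimal sketch with placeholder metadata, context, and question, assuming the standard airoboros BEGININPUT/BEGININSTRUCTION keywords:

```python
# Sketch of the airoboros closed-context prompt layout (keywords per the upstream
# airoboros cards; date, url, context text, and question are placeholders).
closed_context_prompt = """BEGININPUT
BEGINCONTEXT
date: 2023-06-01
url: https://example.com/article
ENDCONTEXT
Blueberries are now green.
ENDINPUT
BEGININSTRUCTION
What color are blueberries? Provide the source.
ENDINSTRUCTION"""
```

Wrapping the source text and the question in explicit delimiters like this is what keeps the model answering from the supplied context rather than from memorized knowledge.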
Additional Notes: The model is uncensored and will not refuse responses on grounds of legality or morality; usage should comply with the licensing terms.
Training Details:
  Data Sources: jondurbin/airoboros-gpt4-1.4
  Methodology: QLoRA fine-tuning on synthetic data generated by GPT-4
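The QLoRA methodology is only named here, not specified. Below is a minimal sketch of that style of setup using the Hugging Face peft and bitsandbytes stack; the base checkpoint name, LoRA rank, and target modules are illustrative assumptions, not the published training recipe:

```python
# Generic QLoRA-style setup sketch (not jondurbin's exact recipe): 4-bit base
# weights via bitsandbytes plus trainable LoRA adapters via peft.
import torch
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

base = "huggyllama/llama-65b"  # assumed base checkpoint; not stated on this page

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,              # quantize the frozen base weights to 4 bits
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
    base, quantization_config=bnb_config, device_map="auto"
)

lora_config = LoraConfig(
    r=64,                           # illustrative rank, not the published value
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```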
Input Output:
  Input Format: Vicuna 1.1-style prompt
  Accepted Modalities: text
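A minimal sketch of Vicuna 1.1-style prompt construction; the system preamble wording is illustrative, not quoted from this page:

```python
# Vicuna 1.1-style prompt: a system preamble followed by "USER: ... ASSISTANT:".
def build_prompt(user_message: str) -> str:
    system = (
        "A chat between a curious user and an assistant. The assistant gives "
        "helpful, detailed, accurate responses to the user's input."
    )
    return f"{system} USER: {user_message} ASSISTANT:"

print(build_prompt("Write a limerick about context windows."))
```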
LLM Name: Airoboros 65B Gpt4 1.4
Repository: https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.4
Model Size: 65b
Required VRAM: 130.4 GB
Updated: 2025-09-13
Maintainer: jondurbin
Model Type: llama
Model Files: 9.9 GB (1-of-14), 9.7 GB (2-of-14), 9.7 GB (3-of-14), 9.7 GB (4-of-14), 9.7 GB (5-of-14), 9.7 GB (6-of-14), 9.7 GB (7-of-14), 9.7 GB (8-of-14), 9.7 GB (9-of-14), 9.7 GB (10-of-14), 9.7 GB (11-of-14), 9.7 GB (12-of-14), 9.7 GB (13-of-14), 4.1 GB (14-of-14)
Model Architecture: LlamaForCausalLM
License: cc-by-nc-4.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.28.1
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float16
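A minimal inference sketch based on the fields above (repository, LlamaForCausalLM, LlamaTokenizer, float16, 2048-token context). The generation settings are illustrative, and the ~130 GB fp16 footprint means the weights must be sharded across several GPUs or quantized:

```python
# Load and query the checkpoint with Hugging Face transformers; device_map="auto"
# lets accelerate shard the ~130 GB of fp16 weights across available devices.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "jondurbin/airoboros-65b-gpt4-1.4"

tokenizer = AutoTokenizer.from_pretrained(repo)   # LlamaTokenizer, 32000-token vocab
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,                    # matches the Torch Data Type field
    device_map="auto",
)

prompt = (
    "A chat between a curious user and an assistant. The assistant gives helpful, "
    "detailed, accurate responses to the user's input. "
    "USER: Summarize what a cc-by-nc-4.0 license allows. ASSISTANT:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```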

Best Alternatives to Airoboros 65B Gpt4 1.4

Best Alternatives                Context / RAM     Downloads   Likes
Lite Oute 1 65M                  2K / 0.3 GB       43          9
Lite Oute 1 65M Instruct         2K / 0.3 GB       91          2
StableBeluga1 Delta              2K / 130.4 GB     1837        57
Airoboros 65B Gpt4 M2.0          2K / 130.4 GB     1719        0
Airoboros 65B Gpt4 2.0           2K / 130.4 GB     1734        0
Openbuddy Llama 65B V8 Bf16      2K / 130.6 GB     1805        9
UltraLM 65B                      2K / 130.4 GB     1836        8
Llama 65B Instruct               2K / 130.4 GB     1745        14
Airoboros 65B Gpt4 1.3           2K / 130.4 GB     1710        1
Airoboros 65B Gpt4 1.2           2K / 130.4 GB     1722        22

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124