H2o Danube2 1.8B Chat by h2oai


Tags: arxiv:2401.16818, autotrain compatible, conversational, en, endpoints compatible, gpt, h2o-llmstudio, mistral, region:us, safetensors


H2o Danube2 1.8B Chat Parameters and Internals

Model Type: llm, chat, gpt
Supported Languages: en (Yes)
Training Details:
  Data Sources: internet text data
  Methodology: chat fine-tuning
  Context Length: 8192
  Model Architecture: adjusted Llama 2 architecture with the Mistral tokenizer
Input Output:
  Input Format: tokenized chat template format
  Accepted Modalities: text
  Output Format: generated text
  Performance Tips: load with quantization for efficient use (load_in_8bit=True or load_in_4bit=True); see the loading sketch below
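
The input format and performance tips above map onto standard Hugging Face transformers usage. Below is a minimal loading sketch, assuming the transformers, accelerate, and bitsandbytes packages and a CUDA GPU are available; the prompt text is only a placeholder.

```python
# Minimal sketch: load h2oai/h2o-danube2-1.8b-chat with 4-bit quantization
# and prompt it through the tokenizer's chat template.
# Assumes: transformers, accelerate, bitsandbytes, and a CUDA-capable GPU.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "h2oai/h2o-danube2-1.8b-chat"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=BitsAndBytesConfig(load_in_4bit=True),  # or load_in_8bit=True
    device_map="auto",
    torch_dtype=torch.bfloat16,
)

# Build the tokenized chat-template input the listing refers to.
messages = [{"role": "user", "content": "Why is drinking water so healthy?"}]  # placeholder prompt
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Swapping load_in_4bit=True for load_in_8bit=True uses somewhat more VRAM in exchange for slightly higher fidelity; without quantization, the bfloat16 weights need roughly the 3.7 GB listed below plus activation memory.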
LLM Name: H2o Danube2 1.8B Chat
Repository: https://huggingface.co/h2oai/h2o-danube2-1.8b-chat
Model Size: 1.8b
Required VRAM: 3.7 GB
Updated: 2025-07-26
Maintainer: h2oai
Model Type: mistral
Model Files: 3.7 GB
Supported Languages: en
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.38.2
Tokenizer Class: LlamaTokenizer
Padding Token: <unk>
Vocabulary Size: 32000
Torch Data Type: bfloat16
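
The figures above (architecture, context length, vocabulary size, tokenizer class, padding token, dtype) come from the repository's config and tokenizer files and can be read back programmatically. A small sketch, assuming transformers is installed and the Hugging Face Hub is reachable:

```python
# Sketch: confirm the listed specs directly from the model repository.
from transformers import AutoConfig, AutoTokenizer

model_id = "h2oai/h2o-danube2-1.8b-chat"

config = AutoConfig.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

print(config.architectures)             # expected per the listing: ['MistralForCausalLM']
print(config.max_position_embeddings)   # expected: 8192
print(config.vocab_size)                # expected: 32000
print(config.torch_dtype)               # expected: torch.bfloat16
print(type(tokenizer).__name__)         # LlamaTokenizer (or its fast variant)
print(tokenizer.pad_token)              # expected: <unk>
```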

Quantized Models of the H2o Danube2 1.8B Chat

Model | Likes / Downloads / VRAM
H2o Danube2 1.8B Chat Gguf | 1430 GB
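
A common way to run the GGUF variant listed above locally is llama-cpp-python. This is only a hedged sketch: the repo id and quant filename below are assumptions, so check the actual GGUF repository for the files it provides.

```python
# Sketch: run a GGUF quantization locally with llama-cpp-python
# (requires the llama-cpp-python and huggingface-hub packages).
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="h2oai/h2o-danube2-1.8b-chat-GGUF",  # hypothetical GGUF repo id
    filename="*Q4_K_M.gguf",                     # hypothetical quant file pattern
    n_ctx=8192,
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize what H2O Danube2 1.8B is."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```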

Best Alternatives to H2o Danube2 1.8B Chat

Best Alternatives | Context / RAM | Downloads / Likes
H2o Danube 1.8B Chat | 16K / 3.7 GB | 2954
H2o Danube 1.8B Base | 16K / 3.7 GB | 31143
Cypher Mini 1.8B | 16K / 3.7 GB | 52
H2o Danube 1.8B Sft | 16K / 3.7 GB | 311
PixieZehirNano | 16K / 3.7 GB | 100
Cypher CoT 1.8B | 16K / 3.7 GB | 51
...1.8B Chat Sft Merge Fourier V1 | 16K / 7.3 GB | 51
H2o Danube2 1.8B Base | 8K / 3.7 GB | 32846
H2o Danube2 1.8B Sft | 8K / 3.7 GB | 36
Binary Clumsy Bear | 8K / 7.3 GB | 60



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124