Llama 3.1 405B FP8 by meta-llama


Tags: Arxiv:2204.05149, Autotrain compatible, De, En, Endpoints compatible, Es, Facebook, Fbgemm fp8, Fr, Hi, It, Llama, Llama-3, Meta, Pt, Pytorch, Region:us, Safetensors, Sharded, Tensorflow, Th


Llama 3.1 405B FP8 Parameters and Internals

Model Type 
multilingual large language model, generative
Use Cases 
Areas:
commercial, research
Applications:
multilingual dialogue systems
Primary Use Cases:
assistant-like chat
Limitations:
Prohibited uses as described in Acceptable Use Policy and License
Considerations:
Focuses on common industry benchmarks and safety guidelines.
Supported Languages 
en (English), de (German), fr (French), it (Italian), pt (Portuguese), hi (Hindi), es (Spanish), th (Thai)
Training Details 
Data Sources:
publicly available online data
Data Volume:
15 trillion tokens
Methodology:
Pretrained and instruction-tuned using SFT and RLHF
Context Length:
128,000 tokens
Training Time:
39.3M GPU hours (a rough wall-clock conversion is sketched after this list)
Hardware Used:
H100-80GB GPUs
Model Architecture:
Optimized transformer architecture
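The training figures above invite a quick back-of-envelope check. The sketch below converts the quoted 39.3M GPU hours into approximate wall-clock time and tokens per GPU-hour; the 16,384-GPU cluster size is an illustrative assumption, not a figure from this card.

```python
# Back-of-envelope conversion of the quoted training budget into wall-clock time.
# The cluster size below is an illustrative assumption, not a figure from this card.
gpu_hours = 39.3e6             # total H100-80GB GPU hours quoted above
assumed_cluster_gpus = 16_384  # hypothetical number of GPUs training concurrently

wall_clock_hours = gpu_hours / assumed_cluster_gpus
wall_clock_days = wall_clock_hours / 24

tokens_trained = 15e12         # 15 trillion pretraining tokens quoted above
tokens_per_gpu_hour = tokens_trained / gpu_hours

print(f"~{wall_clock_days:,.0f} days of wall-clock training on {assumed_cluster_gpus:,} GPUs")
print(f"~{tokens_per_gpu_hour:,.0f} tokens processed per GPU-hour")
```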
Safety Evaluation 
Methodologies:
red teaming, adversarial testing
Risk Categories:
CBRNE (Chemical, Biological, Radiological, Nuclear, and Explosive materials), Child Safety, Cyber attack enablement
Ethical Considerations:
Potential societal impact and misuse prevention measures.
Responsible AI Considerations 
Fairness:
Commitment to inclusivity and openness.
Transparency:
Providing thorough documentation and usage guidelines.
Accountability:
Meta and developers share responsibilities based on deployment.
Mitigation Strategies:
Introduction of safety guardrails like Llama Guard 3.
Input Output 
Input Format:
Multilingual Text
Accepted Modalities:
text
Output Format:
Multilingual Text and code (a minimal loading and generation sketch follows the release notes below)
Release Notes 
Version:
3.1
Date:
July 23, 2024
Notes:
Multilingual model optimized for dialogue.
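
To make the use-case and input/output notes above concrete, here is a minimal generation sketch using the standard Hugging Face transformers API. It assumes the Llama 3.1 license has been accepted, transformers >= 4.43 with the fbgemm-gpu FP8 kernels is installed, and enough GPU memory is available to hold the checkpoint; the prompt and generation settings are illustrative only.

```python
# Minimal text-generation sketch for meta-llama/Llama-3.1-405B-FP8.
# Assumes the Llama 3.1 license has been accepted on Hugging Face and that
# enough GPU memory is available; prompt and settings are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.1-405B-FP8"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",           # shard the weight files across available GPUs
    torch_dtype=torch.bfloat16,  # non-quantized tensors in bf16, matching the card
)

# Any of the eight supported languages can be used in the prompt.
prompt = "Écris une phrase qui explique ce qu'est un modèle de langage."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

output = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
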
LLM Name: Llama 3.1 405B FP8
Repository: https://huggingface.co/meta-llama/Llama-3.1-405B-FP8
Model Size: 405B
Required VRAM: 193.4 GB
Updated: 2025-06-09
Maintainer: meta-llama
Model Type: llama
Model Files: 109 safetensors shards, ~4.0–4.8 GB each (4.8 GB: 1-of-109, 4.0 GB: 2-of-109, 4.6 GB: 3-of-109, …)
Supported Languages: en, de, fr, it, pt, hi, es, th
Model Architecture: LlamaForCausalLM
License: llama3.1
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.43.0.dev0
Vocabulary Size: 128256
Torch Data Type: bfloat16
Llama 3.1 405B FP8 (meta-llama/Llama-3.1-405B-FP8)
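
Beyond the weights, the context length and data type in the table above drive serving memory through the key/value cache. The sketch below estimates per-sequence KV-cache size from architecture figures reported for Llama 3.1 405B (126 layers, 8 KV heads, head dimension 128); these figures and the bf16 cache dtype are assumptions, not values taken from this card.

```python
# Rough KV-cache sizing for long-context serving of Llama 3.1 405B.
# Architecture figures (layers, KV heads, head dim) are from the Llama 3.1
# release and are assumptions here, not values stated on this card.
num_layers = 126
num_kv_heads = 8        # grouped-query attention
head_dim = 128
bytes_per_value = 2     # bf16 cache

context_length = 131_072  # Model Max Length from the table above

# Per token: keys + values, across all layers and KV heads.
kv_bytes_per_token = 2 * num_layers * num_kv_heads * head_dim * bytes_per_value
kv_gib_per_sequence = kv_bytes_per_token * context_length / 1024**3

print(f"KV cache: {kv_bytes_per_token / 1024:.0f} KiB per token")
print(f"KV cache: {kv_gib_per_sequence:.1f} GiB per full 128K-token sequence")
```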

Best Alternatives to Llama 3.1 405B FP8

Best Alternatives | Context / RAM | Downloads / Likes
Meta Llama 3.1 405B | 128K / 186 GB | 521433808
Meta Llama 3.1 405B Instruct | 128K / 186 GB | 55654473
Llama 3.1 405B Instruct | 128K / 183.1 GB | 36587570
Llama 3.1 405B | 128K / 183.1 GB | 9804934
Meta Llama 3.1 405B FP8 | 128K / 197.6 GB | 13149994
Shisa V2 Llama3.1 405B | 128K / 191.2 GB | 24216
...ta Llama 3.1 405B Instruct FP8 | 128K / 197.6 GB | 55370165
Llama 3.1 Tulu 3 405B | 128K / 191.2 GB | 332106
Hermes 3 Llama 3.1 405B | 128K / 195.8 GB | 1719239
Llama 3.1 405B Instruct FP8 | 128K / 209.2 GB | 600010
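
The context, RAM, download, and like figures above come from the Hugging Face Hub. A short sketch for pulling fresh counts with the huggingface_hub client (assuming the package is installed; repository ids other than this card's are written from memory and worth double-checking):

```python
# Query current download and like counts for a few of the repositories above.
# Requires: pip install huggingface_hub. Gated repos may need an access token.
from huggingface_hub import HfApi

api = HfApi()
for repo_id in [
    "meta-llama/Llama-3.1-405B-FP8",
    "meta-llama/Llama-3.1-405B-Instruct",
    "NousResearch/Hermes-3-Llama-3.1-405B",
]:
    info = api.model_info(repo_id)
    print(f"{repo_id}: {info.downloads} downloads, {info.likes} likes")
```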


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124