Llama 3 SEC Chat by arcee-ai


  Arxiv:2406.06623   Autotrain compatible   Continual pre training   Conversational   Dataset:sec filings   En   Endpoints compatible   Finance   Large language model   Llama   Region:us   Safetensors   Sec data   Sharded   Tensorflow

Llama 3 SEC Chat Benchmarks

Benchmark scores (nn.n%) show how the model compares to the reference models: Anthropic's Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").

Llama 3 SEC Chat Parameters and Internals

Model Type 
large_language_model, finance, sec_data
Use Cases 
Areas:
SEC data analysis
Primary Use Cases:
In-depth investment analysis and decision support
Comprehensive risk management and assessment
Ensuring regulatory compliance and identifying potential violations
Studying corporate governance practices and promoting transparency
Conducting market research and tracking industry trends
Additional Notes 
Llama-3-SEC is designed to follow its system prompt and should not mention a lack of access to real-time information.
Training Details 
Data Sources:
SEC_filings
Data Volume:
72B tokens of SEC filings data mixed with 1B tokens from RedPajama-Data-1T
Methodology:
Continual Pre-Training (CPT) using Megatron-Core, followed by model merging with the TIES merging technique and supervised fine-tuning on an 8xH100 node using Spectrum (an illustrative merge configuration sketch follows the Release Notes below).
Hardware Used:
AWS SageMaker HyperPod cluster with 4 nodes, each with 32 H100 GPUs
Release Notes 
Version:
20B token checkpoint
Notes:
Initial checkpoint release; the model is still undergoing training.
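
The TIES merge described in the training methodology combines the domain-adapted CPT checkpoint with an instruction-tuned model. Below is a minimal sketch of what such a merge configuration could look like, assuming the open-source mergekit toolkit is used; the checkpoint paths, densities, and weights are illustrative placeholders, not the actual recipe behind Llama-3-SEC-Chat.

```python
# Illustrative TIES merge configuration written for mergekit (requires PyYAML).
# All model paths and merge parameters here are placeholder assumptions.
import yaml

merge_config = {
    "merge_method": "ties",
    "base_model": "meta-llama/Meta-Llama-3-70B",  # shared base for the task vectors (assumed)
    "models": [
        {
            # Hypothetical path to the SEC-filings CPT checkpoint
            "model": "path/to/llama-3-70b-sec-cpt",
            "parameters": {"density": 0.5, "weight": 0.5},
        },
        {
            # Instruction-tuned model merged back in to retain chat behavior (assumed)
            "model": "meta-llama/Meta-Llama-3-70B-Instruct",
            "parameters": {"density": 0.5, "weight": 0.5},
        },
    ],
    "parameters": {"normalize": True},
    "dtype": "bfloat16",
}

with open("ties_merge.yml", "w") as f:
    yaml.safe_dump(merge_config, f, sort_keys=False)

# The merge itself would then be run with mergekit's CLI, e.g.:
#   mergekit-yaml ties_merge.yml ./llama-3-sec-merged
```

TIES keeps only the highest-magnitude parameter deltas from each model (controlled by density) and resolves sign conflicts before summing, which is why it is a common choice for folding a CPT checkpoint back into an instruction-tuned parent.
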
LLM Name: Llama 3 SEC Chat
Repository: https://huggingface.co/arcee-ai/Llama-3-SEC-Chat
Model Size: 70.6b
Required VRAM: 141.9 GB
Updated: 2025-06-09
Maintainer: arcee-ai
Model Type: llama
Model Files (30 safetensors shards): 4.6 GB (1-of-30), 4.7 GB (2-of-30), 5.0 GB (3-of-30), 5.0 GB (4-of-30), 4.7 GB (5-of-30), 4.7 GB (6-of-30), 4.7 GB (7-of-30), 5.0 GB (8-of-30), 5.0 GB (9-of-30), 4.7 GB (10-of-30), 4.7 GB (11-of-30), 4.7 GB (12-of-30), 5.0 GB (13-of-30), 5.0 GB (14-of-30), 4.7 GB (15-of-30), 4.7 GB (16-of-30), 4.7 GB (17-of-30), 5.0 GB (18-of-30), 5.0 GB (19-of-30), 4.7 GB (20-of-30), 4.7 GB (21-of-30), 4.7 GB (22-of-30), 5.0 GB (23-of-30), 5.0 GB (24-of-30), 4.7 GB (25-of-30), 4.7 GB (26-of-30), 4.7 GB (27-of-30), 5.0 GB (28-of-30), 5.0 GB (29-of-30), 2.1 GB (30-of-30)
Supported Languages: en
Model Architecture: LlamaForCausalLM
License: llama3
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.41.1
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|end_of_text|>
Vocabulary Size: 128258
Torch Data Type: bfloat16
Llama 3 SEC Chat (arcee-ai/Llama-3-SEC-Chat)
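
Because the repository ships standard sharded safetensors weights for LlamaForCausalLM, it can be loaded directly with Hugging Face Transformers. A minimal usage sketch follows; the system prompt wording is an illustrative assumption rather than the card's recommended prompt, and the full bfloat16 checkpoint (141.9 GB) needs to be sharded across several large GPUs, so a 4-bit alternative is shown as a comment.

```python
# Minimal sketch: load arcee-ai/Llama-3-SEC-Chat and run a single chat turn.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "arcee-ai/Llama-3-SEC-Chat"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the Torch Data Type listed above
    device_map="auto",           # shard the ~142 GB of weights across available GPUs
)
# 4-bit alternative for smaller setups (quality and speed trade-offs apply):
# from transformers import BitsAndBytesConfig
# model = AutoModelForCausalLM.from_pretrained(
#     model_id,
#     quantization_config=BitsAndBytesConfig(load_in_4bit=True),
#     device_map="auto",
# )

# Illustrative system prompt (an assumption); per the notes above, the model is
# expected to follow its system prompt and not mention a lack of real-time data.
messages = [
    {"role": "system", "content": "You are an expert assistant for analyzing SEC filings."},
    {"role": "user", "content": "Summarize the key risk factors typically disclosed in a 10-K."},
]

input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=512, do_sample=False)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Keep the 8192-token context length in mind when packing long filing excerpts into a single prompt.
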

Best Alternatives to Llama 3 SEC Chat

Best Alternatives | Context / RAM | Downloads | Likes
Z MODEL2 V1 FUSED | 128K / 141.9 GB | 14 | 0
Z MODEL1 V1 FUSED | 128K / 141.9 GB | 14 | 0
Z MODEL4 V1 FUSED | 128K / 141.9 GB | 14 | 0
Z MODEL4 V1 FUSED | 128K / 141.9 GB | 14 | 0
Z MODEL2 V1 FUSED | 128K / 141.9 GB | 14 | 0
Z MODEL1 V1 FUSED | 128K / 141.9 GB | 14 | 0
Z MODEL5 V1 FUSED | 128K / 141.9 GB | 13 | 0
Z MODEL5 V1 FUSED | 128K / 141.9 GB | 13 | 0
L2 STEP1 | 128K / 141.9 GB | 13 | 0
M NS STEP1 | 128K / 141.9 GB | 21 | 0
Note: a green score (e.g. "73.2") indicates that the alternative model is better than arcee-ai/Llama-3-SEC-Chat.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124