Alfred 40B 0723 by lightonai


Tags: autotrain compatible · custom code · dataset:anthropic/hh-rlhf · dataset:databricks/databricks-... · dataset:openassistant/oasst1 · de · en · endpoints compatible · es · falcon · falcon-40b · fr · it · pytorch · refinedweb · region:us · rlhf · sharded

Alfred 40B 0723 Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Alfred 40B 0723 (lightonai/alfred-40b-0723)

Alfred 40B 0723 Parameters and Internals

Model Type 
Causal decoder-only
Use Cases 
Areas:
Research, Instruct, Chat
Applications:
NLP tasks, Large language model research
Primary Use Cases:
Instruct or chat model use
Limitations:
Limited non-English language capabilities, Biases due to representative training corpora
Considerations:
Implement appropriate guardrails for production use.
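In practice, loading the model for instruct/chat use might look like the sketch below. This is a hypothetical example, not an official recipe: the function name is ours, and because the repo ships custom modeling code (the "Custom code" tag), `trust_remote_code=True` is needed. The function is only defined, not called, since the weights total ~83.6 GB; verify details against the Hugging Face model card before use.

```python
# Hypothetical loading sketch for lightonai/alfred-40b-0723.
def load_alfred(repo_id="lightonai/alfred-40b-0723"):
    # Imports are deferred so the sketch can be defined without
    # transformers installed; loading itself needs transformers,
    # torch, and (for device_map="auto") accelerate.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(
        repo_id,
        torch_dtype=torch.bfloat16,   # matches the card's Torch Data Type
        trust_remote_code=True,       # RWForCausalLM code lives in the repo
        device_map="auto",            # shard across available GPUs
    )
    return tokenizer, model
```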
Additional Notes 
Initial evaluation shows deteriorated arithmetic performance but stable performance on common-sense and reasoning tasks.
Supported Languages 
Primary: en, fr, de, es, it; additional: pt, pl, nl, ro, cz, sv
Training Details 
Data Sources:
Anthropic/hh-rlhf, OpenAssistant/oasst1, databricks/databricks-dolly-15k, NatInstV2, momentum-internal
Methodology:
Reinforcement Learning from Human Feedback (RLHF)
Training Time:
July 2023
Hardware Used:
128 A100 40GB GPUs
Model Architecture:
Causal decoder-only; value network initialized from reward model
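The last architectural note — initializing the PPO value network from the reward model — can be illustrated with a minimal PyTorch sketch. The tiny module below is a stand-in, not Alfred's actual architecture; only the weight-copying step reflects the technique named above.

```python
import torch
import torch.nn as nn

class ScalarHeadModel(nn.Module):
    """Tiny stand-in for a transformer trunk plus a scalar head,
    usable both as a reward model and as a value network."""
    def __init__(self, hidden=64):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(16, hidden), nn.Tanh())
        self.head = nn.Linear(hidden, 1)  # one scalar per input

    def forward(self, x):
        return self.head(self.trunk(x)).squeeze(-1)

# A reward model trained (here: just initialized) beforehand...
reward_model = ScalarHeadModel()

# ...whose weights seed the value network before PPO begins,
# as the card describes for Alfred's RLHF setup.
value_network = ScalarHeadModel()
value_network.load_state_dict(reward_model.state_dict())

# Until PPO updates diverge them, both produce identical outputs.
x = torch.randn(4, 16)
assert torch.allclose(reward_model(x), value_network(x))
```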
Input Output 
Input Format:
Text prompt
Accepted Modalities:
Text
Output Format:
Text response
Performance Tips:
Prepend the chat prefix to the prompt when using the model in chat mode.
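Assembling a chat-style prompt with such a prefix might look like the following sketch. The prefix string and speaker markers here are placeholders, not Alfred's actual ones — the real format is documented on the Hugging Face model card.

```python
# Hypothetical sketch of building a chat-mode prompt. The default
# prefix and the "User:"/"Assistant:" markers are placeholders.
def build_chat_prompt(turns, chat_prefix="Alfred is a helpful assistant."):
    """turns: list of (speaker, text) tuples, e.g. [("User", "Hi")]."""
    lines = [chat_prefix]
    for speaker, text in turns:
        lines.append(f"{speaker}: {text}")
    lines.append("Assistant:")  # leave the final turn open for generation
    return "\n".join(lines)

prompt = build_chat_prompt([("User", "Summarize RLHF in one sentence.")])
print(prompt)
```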
LLM Name: Alfred 40B 0723
Repository 🤗: https://huggingface.co/lightonai/alfred-40b-0723
Model Size: 40b
Required VRAM: 83.6 GB
Updated: 2025-08-21
Maintainer: lightonai
Model Type: RefinedWeb
Model Files: 9.5 GB each for shards 1-of-9 through 8-of-9; 7.6 GB for shard 9-of-9
Supported Languages: en, fr, de, es, it
Model Architecture: RWForCausalLM
License: apache-2.0
Model Max Length: 2048
Transformers Version: 4.31.0
Is Biased: 0
Tokenizer Class: PreTrainedTokenizerFast
Vocabulary Size: 65024
Torch Data Type: bfloat16
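As a sanity check, the shard sizes and data type above are mutually consistent. The nine files sum to 83.6 GB, and at 2 bytes per bfloat16 parameter that implies roughly 41.8B actual parameters behind the nominal "40B":

```python
# Back-of-the-envelope check: listed shard sizes vs. bfloat16 weights.
shard_gb = [9.5] * 8 + [7.6]        # the nine files listed above
total_gb = sum(shard_gb)             # 83.6 GB of weights on disk

bytes_per_param = 2                  # bfloat16
# GB of weights divided by bytes-per-param gives billions of params.
implied_params_b = total_gb / bytes_per_param

print(f"{total_gb:.1f} GB -> ~{implied_params_b:.1f}B parameters")
```

This also matches the "Required VRAM: 83.6 GB" figure, which assumes the weights are held entirely in bfloat16.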

Best Alternatives to Alfred 40B 0723

Best Alternatives                      | Context / RAM | Downloads | Likes
Alfred 40B 1023                        | 0K / 83.6 GB  | 2120      | 48
Vulture 40B                            | 0K / 81.8 GB  | 192       | 68
Docsgpt 40B Falcon                     | 0K / 82.5 GB  | 28        | 13
Openbuddy Falcon 40B V9 Bf16           | 0K / 82.6 GB  | 17        | 4
...alcon 40B Lora Sft Stage2 1.1K      | 0K / 82.5 GB  | 13        | 0
Falcon 40B                             | 0K / 83.6 GB  | 16        | 1
...m Oasst1 En 2048 Falcon 40B V2      | 0K / 83.6 GB  | 14        | 18
Falcon 40B Sft Top1 560                | 0K / 83.6 GB  | 123       | 50
Falcon 40B Sft Mix 1226                | 0K / 83.6 GB  | 19        | 38
...m Oasst1 En 2048 Falcon 40B V1      | 0K / 165 GB   | 17        | 31
Note: a green score (e.g. "73.2") means the model outperforms lightonai/alfred-40b-0723.



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124