Falcon Rw 1B by petals-team


Tags: arXiv:2005.14165 · arXiv:2108.12409 · arXiv:2205.14135 · arXiv:2306.01116 · AutoTrain compatible · Custom code · Dataset: tiiuae/falcon-refinedw... · en · falcon · Region: us · Safetensors

Falcon Rw 1B Benchmarks

nn.n% — How the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o") or GPT-4 ("gpt4").
Falcon Rw 1B (petals-team/falcon-rw-1b)

Falcon Rw 1B Parameters and Internals

Model Type 
Causal decoder-only, Language modeling
Use Cases 
Areas:
Research
Primary Use Cases:
Research on the influence of training on web data
Limitations:
Production use without adequate assessment of risks and mitigation
Considerations:
Model trained only on English data.
Additional Notes 
Falcon-RW-1B is made available under the Apache 2.0 license.
Supported Languages 
English (high)
Training Details 
Data Sources:
RefinedWeb
Data Volume:
350 billion tokens
Context Length:
2048
Training Time:
six days
Hardware Used:
32 A100 40GB GPUs
Model Architecture:
Causal decoder-only architecture adapted from GPT-3 (arXiv:2005.14165), with ALiBi positional biases (arXiv:2108.12409) and FlashAttention (arXiv:2205.14135); a short ALiBi sketch follows this list.
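Concretely, ALiBi removes learned positional embeddings and instead adds a per-head, distance-proportional penalty to the attention logits. A minimal sketch of the bias computation (the helper names and the 32-head example are illustrative assumptions, not the model's exact implementation):

```python
import torch

def alibi_slopes(n_heads: int) -> torch.Tensor:
    # One slope per attention head: the geometric sequence
    # 2^(-8/n), 2^(-16/n), ..., 2^(-8), as in the ALiBi paper
    # (assumes n_heads is a power of two, e.g. 32).
    start = 2.0 ** (-8.0 / n_heads)
    return torch.tensor([start ** (i + 1) for i in range(n_heads)])

def alibi_bias(n_heads: int, seq_len: int) -> torch.Tensor:
    # Additive bias for the attention logits. For a query at position i
    # attending to a key at position j <= i, the bias is -slope * (i - j),
    # so tokens farther back are penalized linearly with distance.
    pos = torch.arange(seq_len)
    distance = pos[None, :] - pos[:, None]               # (L, L): j - i
    slopes = alibi_slopes(n_heads)                       # (n_heads,)
    return slopes[:, None, None] * distance[None, :, :]  # (n_heads, L, L)

# Inside attention (the causal mask is still applied separately):
#   logits = q @ k.transpose(-1, -2) / head_dim**0.5 + alibi_bias(n_heads, L)
```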
Responsible AI Considerations 
Fairness:
Trained on a large-scale corpus representative of the web, it will carry the stereotypes and biases commonly encountered online.
Mitigation Strategies:
Consider finetuning it for a specific set of tasks and implementing guardrails and appropriate precautions for production use.
Input Output 
Accepted Modalities:
text
Performance Tips:
Finetuning for specific tasks can enhance performance. A minimal inference sketch follows below.
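As a reference for the text-in/text-out interface above, here is a minimal loading-and-generation sketch with Hugging Face Transformers (the prompt and generation settings are illustrative; given the "Custom code" tag, older Transformers releases may need `trust_remote_code=True`, while recent releases support Falcon natively):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "petals-team/falcon-rw-1b"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)  # float32 weights by default

prompt = "The RefinedWeb dataset is"  # illustrative prompt
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,  # illustrative settings, not tuned
    do_sample=True,
    top_p=0.9,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```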
LLM Name: Falcon Rw 1B
Repository: 🤗 https://huggingface.co/petals-team/falcon-rw-1b
Model Size: 1b
Required VRAM: 5.7 GB
Updated: 2025-08-19
Maintainer: petals-team
Model Type: falcon
Model Files: 5.7 GB
Supported Languages: en
Model Architecture: FalconForCausalLM
License: apache-2.0
Model Max Length: 1024
Transformers Version: 4.27.4
Is Biased: 1
Tokenizer Class: GPT2Tokenizer
Vocabulary Size: 50304
Torch Data Type: float32
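The Required VRAM figure follows directly from the Torch data type: float32 stores 4 bytes per parameter, so a 5.7 GB file corresponds to roughly 1.4B stored parameters (Falcon-RW-1B is nominally ~1.3B parameters; attributing the small surplus to file overhead and GB/GiB rounding is an assumption here). A back-of-the-envelope check:

```python
# Back-of-the-envelope weight-memory check from the table above.
file_size_gb = 5.7                # safetensors size, float32 weights
params = file_size_gb * 1e9 / 4   # 4 bytes per float32 parameter
print(f"~{params / 1e9:.2f}B parameters stored")          # about 1.4B
print(f"~{params * 2 / 1e9:.2f} GB at float16/bfloat16")  # roughly half
```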

Best Alternatives to Falcon Rw 1B

| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Neuralfalcon 1B V1 | 2K / 2.8 GB | 1770 | 0 |
| Falcon Rw 1B Chat | 2K / 2.6 GB | 1947 | 3 |
| Falcon Rw 1B Instruct Openorca | 2K / 2.6 GB | 1989 | 11 |
| Falcon 1b Stage2 | 2K / 2.6 GB | 4639 | 3 |
| Falcon 1b Stage1 | 2K / 2.6 GB | 2748 | 0 |
| Falcon 1b Stage3 | 2K / 2.6 GB | 5 | 0 |
| Falcon 1b Stage3 2 | 2K / 2.6 GB | 5 | 0 |
| INTERS Falcon 1B | 2K / 0.7 GB | 3 | 1 |
| Crow 1B Attempt1 | 2K / 5.3 GB | 65 | 3 |
| HelpingAI Lite Chat | 2K / 2.6 GB | 7 | 4 |
Note: a green score (e.g. "73.2") means the model outperforms petals-team/falcon-rw-1b.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124