| Model Type | |
| Use Cases |
| Areas: | research, commercial applications |
|
| Primary Use Cases: | |
| Limitations: | mostly trained on English data; may not generalize to other languages |
|
| Considerations: | Develop guardrails and take appropriate precautions for any production use. |
|
|
| Additional Notes | Falcon LLMs require PyTorch 2.0 for use with transformers. |
|
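The PyTorch 2.0 requirement above can be checked before loading the model. A minimal sketch, assuming the `tiiuae/falcon-7b` checkpoint id (an assumption for illustration; substitute the checkpoint you actually use):

```python
# Sketch: guard transformers loading on the PyTorch 2.0 requirement noted
# in this card. Only version parsing runs here; no model is downloaded
# until load_falcon() is called.

def meets_requirement(version: str, minimum: tuple = (2, 0)) -> bool:
    """True if a version string like '2.1.0+cu118' is >= minimum."""
    numeric = []
    for part in version.split("+")[0].split(".")[:2]:
        digits = "".join(ch for ch in part if ch.isdigit())
        numeric.append(int(digits) if digits else 0)
    return tuple(numeric) >= minimum

def load_falcon(model_id: str = "tiiuae/falcon-7b"):
    # model_id is an assumption; substitute your checkpoint.
    import torch
    if not meets_requirement(torch.__version__):
        raise RuntimeError("Falcon LLMs require PyTorch 2.0 or newer")
    from transformers import AutoModelForCausalLM, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16, device_map="auto"
    )
    return tokenizer, model
```

The lazy imports keep the version check usable even before `torch` and `transformers` are installed.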
| Supported Languages | English (high), French (medium) |
|
| Training Details |
| Data Sources: | Baize, GPT4All, GPTeacher, RefinedWeb-English |
|
| Data Volume: | |
| Context Length: | |
| Hardware Used: | |
| Model Architecture: | Causal decoder-only model with rotary positional embeddings, multiquery attention with FlashAttention, and parallel attention/MLP blocks with a single layer norm |
|
|
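The multiquery attention named in the architecture row can be sketched as follows: many query heads share a single key/value head, which shrinks the KV cache during decoding. This is an illustrative sketch with hypothetical names and dimensions, not Falcon's actual implementation (which also relies on FlashAttention kernels and rotary embeddings); the attention math is written out explicitly for clarity.

```python
# Illustrative multiquery attention: n_heads query heads, but only ONE
# shared key/value head. Names and sizes are hypothetical.
import math
import torch
import torch.nn as nn

class MultiQueryAttention(nn.Module):
    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.head_dim = d_model // n_heads
        self.q_proj = nn.Linear(d_model, d_model)
        # One key and one value head, shared by all query heads.
        self.k_proj = nn.Linear(d_model, self.head_dim)
        self.v_proj = nn.Linear(d_model, self.head_dim)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, _ = x.shape
        q = self.q_proj(x).view(b, t, self.n_heads, self.head_dim).transpose(1, 2)
        k = self.k_proj(x).view(b, t, 1, self.head_dim).transpose(1, 2)
        v = self.v_proj(x).view(b, t, 1, self.head_dim).transpose(1, 2)
        # The single K/V head broadcasts across all query heads.
        scores = (q @ k.transpose(-2, -1)) / math.sqrt(self.head_dim)
        causal = torch.triu(torch.ones(t, t, dtype=torch.bool, device=x.device), 1)
        scores = scores.masked_fill(causal, float("-inf"))  # causal decoder mask
        y = scores.softmax(dim=-1) @ v  # (b, n_heads, t, head_dim)
        return self.out_proj(y.transpose(1, 2).reshape(b, t, -1))
```

Example: `MultiQueryAttention(64, 8)` applied to a `(2, 5, 64)` input returns a `(2, 5, 64)` output, while its KV projections are 8x smaller than standard multi-head attention's.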
| Responsible AI Considerations |
| Mitigation Strategies: | Develop guardrails and take appropriate precautions for any production use. |
|
|