---
datasets:
  - tiiuae/falcon-refinedweb
language:
  - en
---

# Falcon-7B

Falcon-7B is a 7B-parameter causal decoder-only model built by [TII](https://www.tii.ae) and trained on 1,500B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) enhanced with curated corpora. It is made available under the TII Falcon LLM License.

More details coming soon.
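Until the official usage instructions land, here is a minimal, non-authoritative sketch of loading the checkpoint for inference with the 🤗 `transformers` text-generation pipeline; the generation parameters and `bfloat16`/`device_map` settings are assumptions, not a recommended configuration.

```python
# Sketch: text generation with Falcon-7B via transformers (assumed usage).
import torch
from transformers import AutoTokenizer, pipeline

model_id = "tiiuae/falcon-7b"  # Hub repository id for this card

tokenizer = AutoTokenizer.from_pretrained(model_id)
generator = pipeline(
    "text-generation",
    model=model_id,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,  # bf16 keeps the 7B weights at roughly 14 GB
    trust_remote_code=True,      # the repo ships custom modelling code
    device_map="auto",           # place layers on available GPU(s)/CPU
)

outputs = generator(
    "Girafatron is obsessed with giraffes,",
    max_new_tokens=50,
    do_sample=True,
    top_k=10,
)
print(outputs[0]["generated_text"])
```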

# Model Card for Falcon-7B

## Model Details

### Model Description

- **Developed by:** https://www.tii.ae
- **Model type:** Causal decoder-only
- **Language(s) (NLP):** English
- **License:** TII Falcon LLM License

### Model Source

- **Paper:** coming soon
- **Demo:** coming soon

## Uses

### Out-of-Scope Use

Production use without adequate assessment of risks and mitigation; any use case that may be considered irresponsible or harmful.

## Bias, Risks, and Limitations

Falcon-7B is trained on English and French data only, and will not generalize appropriately to other languages. Furthermore, as it is trained on large-scale corpora representative of the web, it will carry the stereotypes and biases commonly encountered online.

## Paper

More details coming soon in the paper.