Tags: Transformers · English · falcon · custom_code · text-generation-inference

Falcon-RW-7B

Falcon-RW-7B is a 7B parameters causal decoder-only model built by TII and trained on 350B tokens of RefinedWeb. It is made available under the Apache 2.0 license.

See the 📓 paper on arXiv for more details.

Note: this repository contains the Falcon model in JAX/Flax, making it available to fine-tune and train on free Kaggle TPUs!
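The usual Flax pattern for fine-tuning on a Kaggle TPU is data-parallel training with `jax.pmap`: replicate the parameters across the TPU cores, shard each batch, and average gradients with `jax.lax.pmean`. A minimal sketch of that pattern follows; the toy linear model stands in for the Falcon forward pass, and none of these function or parameter names come from this repository.

```python
from functools import partial

import jax
import jax.numpy as jnp

# Number of accelerator cores (8 on a Kaggle TPU v3-8; 1 on plain CPU).
n_devices = jax.local_device_count()

def loss_fn(params, batch):
    # Toy linear model standing in for the Falcon forward pass.
    preds = batch["x"] @ params["w"]
    return jnp.mean((preds - batch["y"]) ** 2)

@partial(jax.pmap, axis_name="batch")
def train_step(params, batch):
    # Each device computes loss and gradients on its own batch shard.
    loss, grads = jax.value_and_grad(loss_fn)(params, batch)
    # Average gradients (and the reported loss) across all devices.
    grads = jax.lax.pmean(grads, axis_name="batch")
    loss = jax.lax.pmean(loss, axis_name="batch")
    # Plain SGD update; a real run would use an optax optimizer instead.
    params = jax.tree_util.tree_map(lambda p, g: p - 1e-2 * g, params, grads)
    return params, loss

# Replicate parameters across devices; shard the batch on a leading
# device axis of size n_devices.
params = {"w": jnp.zeros((4, 1))}
params = jax.device_put_replicated(params, jax.local_devices())
batch = {
    "x": jnp.ones((n_devices, 2, 4)),
    "y": jnp.ones((n_devices, 2, 1)),
}
params, loss = train_step(params, batch)
```

On a TPU v3-8 the same code runs eight lock-step copies of `train_step`, one per core, so the effective batch size is eight times the per-device shard.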


Dataset used to train erfanzar/FlaxFalcon: RefinedWeb