---
language: Arabic
datasets:
- mc4
license: apache-2.0
---
## Arabic T5 Base Model
A customized T5 model for Arabic and English tasks. It can be used as an alternative to the `google/mt5-base` model, as it is much smaller and targets only Arabic and English tasks.
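
As a minimal loading sketch, the checkpoint can be pulled in with the standard `transformers` seq2seq classes. The repository id below is a placeholder, not this card's actual model id; substitute the id this model is published under.

```python
# Minimal loading sketch. "your-org/arabic-t5-base" is a placeholder id
# (an assumption), not the real repository name for this card.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "your-org/arabic-t5-base"  # placeholder id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# The same two calls also load google/mt5-base, so swapping the id is the
# only change needed when using this model as the smaller alternative.
```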
### About T5
```
T5 is an encoder-decoder model pre-trained on a multi-task mixture of unsupervised and supervised tasks and for which each task is converted into a text-to-text format.
The T5 model was presented in "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" by Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu.
```
[Read More](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html)
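
To illustrate the text-to-text format described above, here is a hedged inference sketch. The repository id and the task prefix are illustrative assumptions only, and a base (not fine-tuned) checkpoint will generally need task-specific fine-tuning before its generations are useful.

```python
# Illustrative text-to-text inference sketch. The repository id and the task
# prefix are placeholders (assumptions); a base checkpoint usually needs
# fine-tuning on a prefixed task before its outputs are meaningful.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "your-org/arabic-t5-base"  # placeholder id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Every task is expressed as plain text in, plain text out.
text = "translate Arabic to English: كيف حالك؟"  # hypothetical task prefix
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```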