Arabic T5 Small Model

A customized T5 model for Arabic and English tasks. It can be used as an alternative to the google/mt5-small model, since it is much smaller and targets only Arabic- and English-based tasks.

About T5

T5 is an encoder-decoder model pre-trained on a multi-task mixture of unsupervised and supervised tasks, in which each task is converted into a text-to-text format.

The T5 model was presented in Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer by Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu.
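The text-to-text format means every task is expressed as plain text in, plain text out, distinguished only by a task prefix on the input. A minimal sketch of this framing is below; the prefixes follow the convention from the T5 paper, and the English-to-Arabic prefix is a hypothetical example of how this model might be prompted, not a documented prefix.

```python
def to_text_to_text(task_prefix: str, source: str) -> str:
    """Cast a task into T5's text-to-text format by prepending a task prefix."""
    return f"{task_prefix}: {source}"

# Translation, summarization, and classification all become the same
# string-to-string problem for the encoder-decoder.
examples = [
    to_text_to_text("translate English to Arabic", "How are you?"),  # hypothetical prefix
    to_text_to_text("summarize", "T5 casts every NLP task as text generation."),
]
for text in examples:
    print(text)
```

The model then generates the target text (the translation, the summary, or a label word) as an ordinary decoded sequence.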


