# ArabicTransformer small model (B4-4-4 with decoder)

This model was pre-trained on 44GB of Arabic corpora using the [Funnel Transformer with the ELECTRA objective](https://arxiv.org/abs/2006.03236). It is faster than the ELECTRA-base architecture while having the same number of parameters. We will share more details about the model and our accepted paper soon.
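Below is a minimal sketch of loading the checkpoint with the Hugging Face `transformers` library. The repo id `sultan/ArabicTransformer-small` and the use of the generic `AutoModel` class are assumptions; substitute the actual checkpoint name shown on this model's Hub page.

```python
from transformers import AutoTokenizer, AutoModel

# Hypothetical repo id -- replace with the actual checkpoint name on the Hub.
model_name = "sultan/ArabicTransformer-small"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode an Arabic sentence and obtain contextual hidden states.
inputs = tokenizer("اللغة العربية جميلة", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```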