---
license: apache-2.0
---

# Overview

<p align="center">
  <img src="https://avatars.githubusercontent.com/u/12619994?s=200&v=4" width="150">
</p>

<!-- -------------------------------------------------------------------------------- -->

AT5B is an Arabic T5-base model. It is **only compatible** with the code in [this GitHub repo](https://github.com/huawei-noah/Pretrained-Language-Model/tree/master/JABER-PyTorch); it is not supported by the [Transformers](https://github.com/huggingface/transformers) library.
 
## Citation

Please cite the following [paper](https://arxiv.org/pdf/2205.10687.pdf) when using our code and model:

```bibtex
@article{ghaddar2022revisiting,
  title={Revisiting Pre-trained Language Models and their Evaluation for Arabic Natural Language Understanding},
  author={Ghaddar, Abbas and Wu, Yimeng and Bagga, Sunyam and Rashid, Ahmad and Bibi, Khalil and Rezagholizadeh, Mehdi and Xing, Chao and Wang, Yasheng and Xinyu, Duan and Wang, Zhefeng and others},
  journal={arXiv preprint arXiv:2205.10687},
  year={2022}
}
```