---
license: apache-2.0
---

# Overview

This model is **only compatible** with the code in [this GitHub repo](https://github.com/huawei-noah/Pretrained-Language-Model/tree/master/JABER-PyTorch); it is not supported by the [Transformers](https://github.com/huggingface/transformers) library.

## Citation

Please cite the following paper when using our code and model:

```bibtex
@article{ghaddar2024importance,
  title={On the importance of Data Scale in Pretraining Arabic Language Models},
  author={Ghaddar, Abbas and Langlais, Philippe and Rezagholizadeh, Mehdi and Chen, Boxing},
  journal={arXiv preprint arXiv:todo},
  year={2024}
}
```
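Since the checkpoint is hosted on the Hugging Face Hub, one way to obtain the files locally before running the scripts in the repo above is via the `huggingface_hub` library. The sketch below is not part of the official instructions, and the repo id shown is a placeholder to be replaced with this model's actual Hub id.

```python
# Minimal sketch (assumption, not the official workflow): download this checkpoint
# from the Hugging Face Hub so it can be used with the JABER-PyTorch code linked above.
from huggingface_hub import snapshot_download

# Placeholder repo id -- substitute the actual Hub id of this model.
local_dir = snapshot_download(repo_id="huawei-noah/YOUR-MODEL-ID")
print(f"Checkpoint files downloaded to: {local_dir}")
```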