Overview

JABER (Junior Arabic BERt) is a 12-layer pretrained Arabic language model. JABER ranked first on the ALUE leaderboard as of 01/09/2021. This model is only compatible with the code in this GitHub repo; it is not supported by the Transformers library.

Citation

Please cite the following paper when using our code and model:

@misc{ghaddar2021jaber,
      title={JABER: Junior Arabic BERt}, 
      author={Abbas Ghaddar and Yimeng Wu and Ahmad Rashid and Khalil Bibi and Mehdi Rezagholizadeh and Chao Xing and Yasheng Wang and Duan Xinyu and Zhefeng Wang and Baoxing Huai and Xin Jiang and Qun Liu and Philippe Langlais},
      year={2021},
      eprint={2112.04329},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}