
CAMeLBERT-CATiB-parser

Model description

The CAMeLBERT-CATiB-parser is a neural dependency parsing model for Arabic that produces analyses in the CATiB dependency formalism. It is based on the biaffine attention dependency parser introduced by Dozat and Manning (2017), as implemented in SuPar, an architecture that has proven highly effective for dependency parsing across many languages. The model is trained on the combined training sets of the CamelTB and PATB Arabic treebanks, and it uses CAMeLBERT-MSA, a language model pre-trained on a large corpus of Modern Standard Arabic text, as its word embedding layer. The model was introduced in our paper "CamelParser2.0: A State-of-the-Art Dependency Parser for Arabic", which describes it in detail and evaluates its performance on a range of Arabic dependency parsing tasks.
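
Since the parser is implemented in SuPar, a downloaded checkpoint can in principle be loaded through SuPar's generic API. The sketch below is illustrative only: the local checkpoint path is a placeholder, and the exact file layout of this model's release may differ.

from supar import Parser

# Load a locally downloaded checkpoint (the path below is a placeholder,
# not the actual file name shipped with this model).
parser = Parser.load("/path/to/camelbert-catib-parser/model.pt")

# Parse a pre-tokenized Arabic sentence; SuPar accepts a list of token lists.
sentences = [["قرأ", "الولد", "الكتاب", "."]]  # "The boy read the book."
dataset = parser.predict(sentences, verbose=False)

# Each parsed sentence is rendered in CoNLL-style columns,
# with heads and CATiB dependency labels.
print(dataset[0])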

Intended uses

The CAMeLBERT-CATiB-parser is shipped with CamelParser as one of its default parsing models and can be selected when parsing text in the CATiB formalism.

Citation

@inproceedings{Elshabrawy:2023:camelparser,
    title = "{CamelParser2.0: A State-of-the-Art Dependency Parser for Arabic}",
    author = {Ahmed Elshabrawy and
              Muhammed AbuOdeh and
              Go Inoue and
              Nizar Habash},
    booktitle = {Proceedings of The First Arabic Natural Language Processing Conference (ArabicNLP 2023)},
    year = "2023"
}