
This is a reproduction of the following paper:

```bibtex
@inproceedings{katsumata-komachi-2020-stronger,
    title = "Stronger Baselines for Grammatical Error Correction Using a Pretrained Encoder-Decoder Model",
    author = "Katsumata, Satoru  and
      Komachi, Mamoru",
    booktitle = "Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing",
    month = dec,
    year = "2020",
    address = "Suzhou, China",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2020.aacl-main.83",
    pages = "827--832",
}
```

This model achieves the following results:

| Data | Metric | gotutiyan/gec-bart-base |
|---|---|---|
| CoNLL-2014 | M2 (P/R/F0.5) | 70.0 / 38.5 / 60.2 |
| BEA19-test | ERRANT (P/R/F0.5) | 67.7 / 50.1 / 63.3 |
| JFLEG-test | GLEU | 55.2 |

The details can be found in the GitHub repository.
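The checkpoint can be tried with the standard `transformers` seq2seq interface. The snippet below is a minimal sketch that assumes the model loads as an ordinary BART encoder-decoder via `AutoModelForSeq2SeqLM`; the beam size and maximum length are illustrative choices, not values taken from the paper or this card:

```python
# Minimal usage sketch (assumes the checkpoint follows the standard
# Hugging Face BART seq2seq API; generation settings are illustrative).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "gotutiyan/gec-bart-base"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# An ungrammatical source sentence to correct.
src = "This sentences contain grammatical error ."
inputs = tokenizer(src, return_tensors="pt")

# Beam search decoding; the corrected sentence is the decoded output.
outputs = model.generate(**inputs, num_beams=5, max_length=128)
corrected = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(corrected)
```

Downloading the weights requires network access to the Hugging Face Hub on first use.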

