# anlausch/aq_bert_gaq_mt

Multi-task learning model (flat architecture) trained on the GAQCorpus for 4 epochs with a learning rate of 2e-5 (optimised via grid search), similarly to Lauscher et al. (2020; see below). The original model was TensorFlow-based; this model is a reimplementation using Hugging Face Transformers and PyTorch.
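A minimal PyTorch sketch of what a "flat" multi-task setup like the one described above could look like: a shared encoder's pooled output feeds one linear regression head per argument-quality dimension. The head count of 4 and the dimension names (cogency, effectiveness, reasonableness, overall) are assumptions based on Lauscher et al. (2020), not a guaranteed match to this checkpoint's internals.

```python
import torch
import torch.nn as nn


class FlatMultiTaskHead(nn.Module):
    """Sketch of a flat multi-task head: one linear scoring head per
    argument-quality dimension (assumed: cogency, effectiveness,
    reasonableness, overall) on top of a shared pooled representation."""

    def __init__(self, hidden_size: int = 768, num_tasks: int = 4):
        super().__init__()
        # One regression head per task, all sharing the same encoder output
        self.heads = nn.ModuleList(
            nn.Linear(hidden_size, 1) for _ in range(num_tasks)
        )

    def forward(self, pooled: torch.Tensor) -> torch.Tensor:
        # pooled: (batch, hidden_size) -> scores: (batch, num_tasks)
        return torch.cat([head(pooled) for head in self.heads], dim=-1)
```

In this sketch, the shared encoder (e.g. BERT's pooled `[CLS]` output) would be fine-tuned jointly with all heads, matching the 2e-5 learning rate noted above; the per-task losses would typically be summed or averaged before backpropagation.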

```bibtex
@inproceedings{lauscher-etal-2020-rhetoric,
    title = "Rhetoric, Logic, and Dialectic: Advancing Theory-based Argument Quality Assessment in Natural Language Processing",
    author = "Lauscher, Anne and
      Ng, Lily and
      Napoles, Courtney and
      Tetreault, Joel",
    booktitle = "Proceedings of the 28th International Conference on Computational Linguistics",
    month = dec,
    year = "2020",
}
```