---
license: apache-2.0
datasets:
- inseq/scat
- gsarti/iwslt2017_context
language:
- en
- fr
pipeline_tag: translation
tags:
- arxiv:2310.01188
- contextual-mt
- document-mt
---

This model corresponds to the [mBART 1-to-50 model](https://huggingface.co/facebook/mbart-large-50-one-to-many-mmt) further trained on English-to-French translation on the [IWSLT17 dataset](https://huggingface.co/datasets/gsarti/iwslt2017_context) with context tags, using the format:

```
Input: SOURCE_CTX SOURCE_CURR
Output: TARGET_CTX TARGET_CURR
```

and further fine-tuned on the training split of [SCAT+](https://huggingface.co/datasets/inseq/scat).

The model was used in the evaluation of the paper [Quantifying the Plausibility of Context Reliance in Neural Machine Translation](https://openreview.net/forum?id=XTHfNGI3zT), published at ICLR 2024 and also available on [arXiv](https://arxiv.org/abs/2310.01188). It can be used for English-to-French contextual and non-contextual translation.
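As a minimal sketch of preparing inputs in the `SOURCE_CTX SOURCE_CURR` format described above: the helper below simply whitespace-concatenates the context and current sentence, which is an assumption about the exact separator; the commented `transformers` usage is illustrative, and the model id placeholder must be replaced with this repository's actual id.

```python
def format_contextual_input(source_ctx: str, source_curr: str) -> str:
    """Concatenate context sentences and the current sentence, mirroring the
    `SOURCE_CTX SOURCE_CURR` training format above (separator is an assumption).
    For non-contextual translation, pass an empty context string."""
    return f"{source_ctx} {source_curr}".strip() if source_ctx else source_curr


# Hypothetical usage with transformers (model id is a placeholder, not verified):
#
# from transformers import MBartForConditionalGeneration, MBart50TokenizerFast
#
# tokenizer = MBart50TokenizerFast.from_pretrained(
#     "<this-model-repo>", src_lang="en_XX", tgt_lang="fr_XX"
# )
# model = MBartForConditionalGeneration.from_pretrained("<this-model-repo>")
# inputs = tokenizer(
#     format_contextual_input("He saw a cat.", "It was black."),
#     return_tensors="pt",
# )
# out = model.generate(
#     **inputs, forced_bos_token_id=tokenizer.lang_code_to_id["fr_XX"]
# )
# print(tokenizer.batch_decode(out, skip_special_tokens=True))
```

Passing an empty `source_ctx` yields plain sentence-level input, matching the non-contextual use case mentioned above.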