
This model corresponds to the mBART-50 one-to-many model further trained on English-to-French translation on the IWSLT17 dataset with context tags, using the format:

```
Input: SOURCE_CTX <brk> SOURCE_CURR
Output: TARGET_CURR
```

It was then further fine-tuned on the training split of the SCAT+ dataset. The model was used in the evaluation of the paper *Quantifying the Plausibility of Context Reliance in Neural Machine Translation*, published at ICLR 2024 and also available on arXiv. It can be used for both contextual and non-contextual English-to-French translation.
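The contextual input format above can be sketched as a small helper that joins the preceding source sentences and the current sentence with the `<brk>` separator. This is a minimal illustration, assuming up to 4 context sentences (suggested by the `ctx4` suffix in the model name); the helper name is hypothetical, not part of the model's released code.

```python
def format_contextual_input(context_sentences, current_sentence, max_ctx=4):
    """Build the model input `SOURCE_CTX <brk> SOURCE_CURR` from a list of
    preceding source sentences and the current source sentence.

    Assumes at most `max_ctx` context sentences are kept (hypothetical cap
    based on the `ctx4` suffix); with no context, the current sentence is
    passed through unchanged for non-contextual translation.
    """
    ctx = " ".join(context_sentences[-max_ctx:])
    if not ctx:
        return current_sentence
    return f"{ctx} <brk> {current_sentence}"

# Example: two context sentences followed by the sentence to translate.
print(format_contextual_input(
    ["The cat sat on the mat.", "It looked comfortable."],
    "Then it fell asleep.",
))
# → The cat sat on the mat. It looked comfortable. <brk> Then it fell asleep.
```

The resulting string can then be tokenized and passed to the model like any other mBART-50 source sentence; the expected translation covers only the current sentence (`TARGET_CURR`).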

