🐑🐑 PECoRe @ ICLR 2024
Resources for the paper "Quantifying the Plausibility of Context Reliance in Neural Machine Translation" (Sarti et al., 2024), published at ICLR 2024.
Paper: 2310.01188 • Published version: https://openreview.net/forum?id=XTHfNGI3zT
🐑🐑 PECoRe (demo Space)
Analyze context usage in LM generations with model internals
Note: Demo showcasing PECoRe usage with the `inseq attribute-context` CLI for decoder-only and encoder-decoder models.
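As a rough sketch of what the demo does under the hood, the `inseq attribute-context` CLI can be pointed at one of the context-aware models in this collection. The exact flag names may vary across inseq versions, and the example inputs are made up for illustration:

```shell
# Hypothetical invocation of the PECoRe CLI from inseq (flag names are
# assumptions; check `inseq attribute-context --help` for your version).
inseq attribute-context \
  --model_name_or_path context-mt/scat-marian-small-ctx4-cwd1-en-fr \
  --input_context_text "The dog chased the cat." \
  --input_current_text "It was hungry." \
  --output_path pecore_output.json
```

The output reports which generated tokens were sensitive to the presence of the context and which context tokens influenced them.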
gsarti/iwslt2017_context
Dataset • Note: IWSLT 2017 dataset with document-level IDs. The English-French portion was used for context-aware MT training.
inseq/scat
Dataset • Note: SCAT+ dataset used for further fine-tuning and evaluation on anaphoric pronouns.
inseq/disc_eval_mt
Dataset • Note: DiscEval-MT dataset used for evaluation on anaphora resolution and lexical choice.
Helsinki-NLP/opus-mt-en-fr
Translation • Note: OPUS-MT Small (default).
context-mt/scat-marian-small-ctx4-cwd1-en-fr
Translation • Note: OPUS-MT Small, source context only.
context-mt/scat-marian-small-target-ctx4-cwd0-en-fr
Translation • Note: OPUS-MT Small, source- and target-side context.
Helsinki-NLP/opus-mt-tc-big-en-fr
Translation • Note: OPUS-MT Big (default).
context-mt/scat-marian-big-ctx4-cwd1-en-fr
Translation • Note: OPUS-MT Big, source context only.
context-mt/scat-marian-big-target-ctx4-cwd0-en-fr
Translation • Note: OPUS-MT Big, source- and target-side context.
facebook/mbart-large-50-one-to-many-mmt
Text2Text Generation • Note: mBART 1-to-50 (default).
context-mt/scat-mbart50-1toM-ctx4-cwd1-en-fr
Translation • Note: mBART 1-to-50, source context only.
context-mt/scat-mbart50-1toM-target-ctx4-cwd0-en-fr
Translation • Note: mBART 1-to-50, source- and target-side context.