---
language:
- en
- nl
- multilingual
license: cc-by-4.0
tags:
- translation
- opus-mt-tc
model-index:
- name: opus-mt-tc-base-en-nl
  results:
  - task:
      type: translation
      name: Translation eng-nld
    dataset:
      name: tatoeba-test-v2021-08-07
      type: tatoeba_mt
      args: eng-nld
    metrics:
    - type: bleu
      value: 57.5
      name: BLEU
---

# Opus Tatoeba English-Dutch

*This model was obtained by running the script [convert_marian_to_pytorch.py](https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/convert_marian_to_pytorch.py). The original models were trained by [Jörg Tiedemann](https://blogs.helsinki.fi/tiedeman/) using the [MarianNMT](https://marian-nmt.github.io/) library. See all available `MarianMTModel` models on the profile of the [Helsinki NLP](https://huggingface.co/Helsinki-NLP) group.*

* dataset: opus+bt
* model: transformer-align
* source language(s): eng
* target language(s): nld
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* download: [opus+bt-2021-04-14.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-nld/opus+bt-2021-04-14.zip)
* test set translations: [opus+bt-2021-04-14.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-nld/opus+bt-2021-04-14.test.txt)
* test set scores: [opus+bt-2021-04-14.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-nld/opus+bt-2021-04-14.eval.txt)

## Benchmarks

| testset | BLEU | chr-F | #sent | #words | BP |
|---------|------|-------|-------|--------|----|
| Tatoeba-test.eng-nld | 57.5 | 0.731 | 10000 | 71436 | 0.986 |
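
## Usage

A minimal sketch of running English-to-Dutch translation with this checkpoint through the `transformers` Marian interface. The repository ID `Helsinki-NLP/opus-mt-tc-base-en-nl` and the example sentence are assumptions; adjust them to the actual model location.

```python
from transformers import MarianMTModel, MarianTokenizer

# Assumed repository ID; replace with the actual model path if different.
model_name = "Helsinki-NLP/opus-mt-tc-base-en-nl"

tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# Tokenize a batch of English source sentences.
src_text = ["How are you today?"]
batch = tokenizer(src_text, return_tensors="pt", padding=True)

# Generate Dutch translations and decode them back to text.
generated = model.generate(**batch)
print([tokenizer.decode(t, skip_special_tokens=True) for t in generated])
```

`generate` uses the decoding defaults stored with the model; beam size and length penalty can be overridden per call if needed.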