---
language:
- en
- de
- multilingual
license: cc-by-4.0
tags:
- translation
- opus-mt-tc
model-index:
- name: opus-mt-tc-base-en-de
  results:
  - task:
      type: translation
      name: Translation eng-deu
    dataset:
      name: tatoeba-test-v2021-08-07
      type: tatoeba_mt
      args: eng-deu
    metrics:
    - type: bleu
      value: 43.7
      name: BLEU
---

# Opus Tatoeba English-German

*This model was obtained by running the script [convert_marian_to_pytorch.py](https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/convert_marian_to_pytorch.py). The original models were trained by [Jörg Tiedemann](https://blogs.helsinki.fi/tiedeman/) using the [MarianNMT](https://marian-nmt.github.io/) library. See all available `MarianMTModel` models on the profile of the [Helsinki NLP](https://huggingface.co/Helsinki-NLP) group.*

* dataset: opusTCv20210807+bt
* model: transformer-big
* source language(s): eng
* target language(s): deu
* raw source language(s): eng
* raw target language(s): deu
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* download: [opusTCv20210807+bt-2021-12-08.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-deu/opusTCv20210807+bt-2021-12-08.zip)
* test set translations: [opusTCv20210807+bt-2021-12-08.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-deu/opusTCv20210807+bt-2021-12-08.test.txt)
* test set scores: [opusTCv20210807+bt-2021-12-08.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-deu/opusTCv20210807+bt-2021-12-08.eval.txt)
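
## How to use

The converted checkpoint can be loaded like any other `MarianMTModel`. Below is a minimal usage sketch; the repository name is taken from the model-index entry above and is an assumption, so substitute the identifier of the checkpoint you actually use:

```python
from transformers import MarianMTModel, MarianTokenizer

# Assumed repository name (from the model-index metadata above); adjust as needed.
model_name = "Helsinki-NLP/opus-mt-tc-base-en-de"

tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

src_text = ["I am hungry.", "The weather is nice today."]

# Tokenize the English input, generate German translations, and decode them.
batch = tokenizer(src_text, return_tensors="pt", padding=True)
generated = model.generate(**batch)
for translation in tokenizer.batch_decode(generated, skip_special_tokens=True):
    print(translation)
```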

## Benchmarks

| testset | BLEU  | chr-F | #sent | #words | BP |
|---------|-------|-------|-------|--------|----|
| newssyscomb2009.eng-deu 	| 24.3 	| 0.5462 	| 502 	| 11271 	| 0.993 |
| news-test2008.eng-deu 	| 24.7 	| 0.5412 	| 2051 	| 47427 	| 1.000 |
| newstest2009.eng-deu 	| 23.6 	| 0.5385 	| 2525 	| 62816 	| 0.999 |
| newstest2010.eng-deu 	| 26.9 	| 0.5589 	| 2489 	| 61511 	| 0.966 |
| newstest2011.eng-deu 	| 24.1 	| 0.5364 	| 3003 	| 72981 	| 0.990 |
| newstest2012.eng-deu 	| 24.6 	| 0.5375 	| 3003 	| 72886 	| 0.972 |
| newstest2013.eng-deu 	| 28.3 	| 0.5636 	| 3000 	| 63737 	| 0.988 |
| newstest2014-deen.eng-deu 	| 30.9 	| 0.6084 	| 3003 	| 62964 	| 1.000 |
| newstest2015-ende.eng-deu 	| 33.2 	| 0.6106 	| 2169 	| 44260 	| 1.000 |
| newstest2016-ende.eng-deu 	| 39.8 	| 0.6595 	| 2999 	| 62670 	| 0.993 |
| newstest2017-ende.eng-deu 	| 32.0 	| 0.6047 	| 3004 	| 61291 	| 1.000 |
| newstest2018-ende.eng-deu 	| 48.8 	| 0.7146 	| 2998 	| 64276 	| 1.000 |
| newstest2019-ende.eng-deu 	| 45.0 	| 0.6821 	| 1997 	| 48969 	| 0.995 |
| Tatoeba-test-v2021-08-07.eng-deu 	| 43.7 	| 0.6442 	| 10000 	| 85728 	| 1.000 |
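
The BLEU and chr-F values above correspond to the linked test-set translations and eval files. As a rough sketch, comparable corpus-level scores can be computed with the `sacrebleu` package; the exact scoring tool used for these numbers is not stated in this card, so treat this as an assumption:

```python
import sacrebleu

# hypotheses: system translations; references: reference translations (toy examples here).
hypotheses = ["Ich habe Hunger.", "Das Wetter ist heute schön."]
references = ["Ich bin hungrig.", "Das Wetter ist heute schön."]

# Corpus-level BLEU and chrF over the whole test set.
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
chrf = sacrebleu.corpus_chrf(hypotheses, [references])

# Note: recent sacrebleu versions report chrF on a 0-100 scale,
# while the table above uses a 0-1 scale.
print(f"BLEU = {bleu.score:.1f}")
print(f"chrF = {chrf.score:.4f}")
```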