Commit 0a10a15 (parent: 73ddfbc) by tiedeman

Initial commit
.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+*.spm filter=lfs diff=lfs merge=lfs -text
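The added rule routes SentencePiece vocabulary files (`*.spm`) through Git LFS, like the other large binary patterns above it. A minimal sketch of how to sanity-check such a rule (assumes only that `git` is installed; the repo and file name here are hypothetical scratch objects, not part of this model repo):

```shell
# Create a scratch repository containing just the new attribute rule,
# then ask git which filter a .spm path would receive.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
printf '*.spm filter=lfs diff=lfs merge=lfs -text\n' > .gitattributes
# check-attr resolves attributes without needing git-lfs itself installed
git check-attr filter -- source.spm
```

`git check-attr` reports the attribute a path matches, confirming that any `.spm` file committed after this change is handled by the LFS filter.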
README.md ADDED
@@ -0,0 +1,2833 @@
---
library_name: transformers
language:
- af
- ang
- bar
- bi
- bzj
- de
- djk
- drt
- en
- enm
- es
- fr
- frr
- fy
- gos
- gsw
- hrx
- hwc
- icr
- jam
- kri
- ksh
- lb
- li
- nds
- nl
- ofs
- pcm
- pdc
- pfl
- pih
- pis
- pt
- rop
- sco
- srm
- srn
- stq
- swg
- tcs
- tpi
- vls
- wae
- yi
- zea

tags:
- translation
- opus-mt-tc-bible

license: apache-2.0
model-index:
- name: opus-mt-tc-bible-big-deu_eng_fra_por_spa-gmw
  results:
  - task:
      name: Translation deu-afr
      type: translation
      args: deu-afr
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: deu-afr
    metrics:
    - name: BLEU
      type: bleu
      value: 26.2
    - name: chr-F
      type: chrf
      value: 0.57725
  - task:
      name: Translation deu-eng
      type: translation
      args: deu-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: deu-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 41.5
    - name: chr-F
      type: chrf
      value: 0.67043
  - task:
      name: Translation deu-ltz
      type: translation
      args: deu-ltz
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: deu-ltz
    metrics:
    - name: BLEU
      type: bleu
      value: 21.6
    - name: chr-F
      type: chrf
      value: 0.54626
  - task:
      name: Translation deu-nld
      type: translation
      args: deu-nld
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: deu-nld
    metrics:
    - name: BLEU
      type: bleu
      value: 24.0
    - name: chr-F
      type: chrf
      value: 0.55679
  - task:
      name: Translation deu-tpi
      type: translation
      args: deu-tpi
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: deu-tpi
    metrics:
    - name: BLEU
      type: bleu
      value: 13.5
    - name: chr-F
      type: chrf
      value: 0.39146
  - task:
      name: Translation eng-afr
      type: translation
      args: eng-afr
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: eng-afr
    metrics:
    - name: BLEU
      type: bleu
      value: 40.2
    - name: chr-F
      type: chrf
      value: 0.68115
  - task:
      name: Translation eng-deu
      type: translation
      args: eng-deu
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: eng-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 37.4
    - name: chr-F
      type: chrf
      value: 0.64561
  - task:
      name: Translation eng-ltz
      type: translation
      args: eng-ltz
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: eng-ltz
    metrics:
    - name: BLEU
      type: bleu
      value: 22.0
    - name: chr-F
      type: chrf
      value: 0.54932
  - task:
      name: Translation eng-nld
      type: translation
      args: eng-nld
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: eng-nld
    metrics:
    - name: BLEU
      type: bleu
      value: 26.8
    - name: chr-F
      type: chrf
      value: 0.58124
  - task:
      name: Translation eng-tpi
      type: translation
      args: eng-tpi
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: eng-tpi
    metrics:
    - name: BLEU
      type: bleu
      value: 15.9
    - name: chr-F
      type: chrf
      value: 0.40338
  - task:
      name: Translation fra-afr
      type: translation
      args: fra-afr
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: fra-afr
    metrics:
    - name: BLEU
      type: bleu
      value: 26.4
    - name: chr-F
      type: chrf
      value: 0.57320
  - task:
      name: Translation fra-deu
      type: translation
      args: fra-deu
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: fra-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 29.5
    - name: chr-F
      type: chrf
      value: 0.58974
  - task:
      name: Translation fra-eng
      type: translation
      args: fra-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: fra-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 43.7
    - name: chr-F
      type: chrf
      value: 0.68106
  - task:
      name: Translation fra-ltz
      type: translation
      args: fra-ltz
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: fra-ltz
    metrics:
    - name: BLEU
      type: bleu
      value: 17.8
    - name: chr-F
      type: chrf
      value: 0.49618
  - task:
      name: Translation fra-nld
      type: translation
      args: fra-nld
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: fra-nld
    metrics:
    - name: BLEU
      type: bleu
      value: 22.5
    - name: chr-F
      type: chrf
      value: 0.54623
  - task:
      name: Translation fra-tpi
      type: translation
      args: fra-tpi
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: fra-tpi
    metrics:
    - name: BLEU
      type: bleu
      value: 13.7
    - name: chr-F
      type: chrf
      value: 0.39334
  - task:
      name: Translation por-afr
      type: translation
      args: por-afr
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: por-afr
    metrics:
    - name: BLEU
      type: bleu
      value: 27.6
    - name: chr-F
      type: chrf
      value: 0.58408
  - task:
      name: Translation por-deu
      type: translation
      args: por-deu
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: por-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 30.4
    - name: chr-F
      type: chrf
      value: 0.59121
  - task:
      name: Translation por-eng
      type: translation
      args: por-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: por-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 48.3
    - name: chr-F
      type: chrf
      value: 0.71418
  - task:
      name: Translation por-ltz
      type: translation
      args: por-ltz
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: por-ltz
    metrics:
    - name: BLEU
      type: bleu
      value: 12.3
    - name: chr-F
      type: chrf
      value: 0.39073
  - task:
      name: Translation por-nld
      type: translation
      args: por-nld
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: por-nld
    metrics:
    - name: BLEU
      type: bleu
      value: 22.9
    - name: chr-F
      type: chrf
      value: 0.54828
  - task:
      name: Translation por-tpi
      type: translation
      args: por-tpi
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: por-tpi
    metrics:
    - name: BLEU
      type: bleu
      value: 13.7
    - name: chr-F
      type: chrf
      value: 0.38929
  - task:
      name: Translation spa-afr
      type: translation
      args: spa-afr
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: spa-afr
    metrics:
    - name: BLEU
      type: bleu
      value: 17.8
    - name: chr-F
      type: chrf
      value: 0.51514
  - task:
      name: Translation spa-deu
      type: translation
      args: spa-deu
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: spa-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 21.4
    - name: chr-F
      type: chrf
      value: 0.53603
  - task:
      name: Translation spa-eng
      type: translation
      args: spa-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: spa-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 28.2
    - name: chr-F
      type: chrf
      value: 0.58604
  - task:
      name: Translation spa-nld
      type: translation
      args: spa-nld
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: spa-nld
    metrics:
    - name: BLEU
      type: bleu
      value: 17.9
    - name: chr-F
      type: chrf
      value: 0.51244
  - task:
      name: Translation spa-tpi
      type: translation
      args: spa-tpi
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: spa-tpi
    metrics:
    - name: BLEU
      type: bleu
      value: 12.2
    - name: chr-F
      type: chrf
      value: 0.37967
  - task:
      name: Translation deu-afr
      type: translation
      args: deu-afr
    dataset:
      name: flores101-devtest
      type: flores_101
      args: deu afr devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 26.0
    - name: chr-F
      type: chrf
      value: 0.57287
  - task:
      name: Translation deu-eng
      type: translation
      args: deu-eng
    dataset:
      name: flores101-devtest
      type: flores_101
      args: deu eng devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 40.9
    - name: chr-F
      type: chrf
      value: 0.66660
  - task:
      name: Translation deu-nld
      type: translation
      args: deu-nld
    dataset:
      name: flores101-devtest
      type: flores_101
      args: deu nld devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 23.6
    - name: chr-F
      type: chrf
      value: 0.55423
  - task:
      name: Translation eng-afr
      type: translation
      args: eng-afr
    dataset:
      name: flores101-devtest
      type: flores_101
      args: eng afr devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 40.0
    - name: chr-F
      type: chrf
      value: 0.67793
  - task:
      name: Translation eng-deu
      type: translation
      args: eng-deu
    dataset:
      name: flores101-devtest
      type: flores_101
      args: eng deu devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 37.2
    - name: chr-F
      type: chrf
      value: 0.64295
  - task:
      name: Translation eng-nld
      type: translation
      args: eng-nld
    dataset:
      name: flores101-devtest
      type: flores_101
      args: eng nld devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 26.2
    - name: chr-F
      type: chrf
      value: 0.57690
  - task:
      name: Translation fra-ltz
      type: translation
      args: fra-ltz
    dataset:
      name: flores101-devtest
      type: flores_101
      args: fra ltz devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 17.3
    - name: chr-F
      type: chrf
      value: 0.49430
  - task:
      name: Translation fra-nld
      type: translation
      args: fra-nld
    dataset:
      name: flores101-devtest
      type: flores_101
      args: fra nld devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 22.2
    - name: chr-F
      type: chrf
      value: 0.54318
  - task:
      name: Translation por-deu
      type: translation
      args: por-deu
    dataset:
      name: flores101-devtest
      type: flores_101
      args: por deu devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 29.8
    - name: chr-F
      type: chrf
      value: 0.58851
  - task:
      name: Translation por-nld
      type: translation
      args: por-nld
    dataset:
      name: flores101-devtest
      type: flores_101
      args: por nld devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 22.6
    - name: chr-F
      type: chrf
      value: 0.54571
  - task:
      name: Translation spa-nld
      type: translation
      args: spa-nld
    dataset:
      name: flores101-devtest
      type: flores_101
      args: spa nld devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 17.5
    - name: chr-F
      type: chrf
      value: 0.50968
  - task:
      name: Translation deu-eng
      type: translation
      args: deu-eng
    dataset:
      name: generaltest2022
      type: generaltest2022
      args: deu-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 30.6
    - name: chr-F
      type: chrf
      value: 0.55777
  - task:
      name: Translation eng-deu
      type: translation
      args: eng-deu
    dataset:
      name: generaltest2022
      type: generaltest2022
      args: eng-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 33.0
    - name: chr-F
      type: chrf
      value: 0.60792
  - task:
      name: Translation fra-deu
      type: translation
      args: fra-deu
    dataset:
      name: generaltest2022
      type: generaltest2022
      args: fra-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 44.5
    - name: chr-F
      type: chrf
      value: 0.67039
  - task:
      name: Translation deu-eng
      type: translation
      args: deu-eng
    dataset:
      name: multi30k_test_2016_flickr
      type: multi30k-2016_flickr
      args: deu-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 40.1
    - name: chr-F
      type: chrf
      value: 0.60981
  - task:
      name: Translation eng-deu
      type: translation
      args: eng-deu
    dataset:
      name: multi30k_test_2016_flickr
      type: multi30k-2016_flickr
      args: eng-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 34.9
    - name: chr-F
      type: chrf
      value: 0.64153
  - task:
      name: Translation fra-deu
      type: translation
      args: fra-deu
    dataset:
      name: multi30k_test_2016_flickr
      type: multi30k-2016_flickr
      args: fra-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 32.1
    - name: chr-F
      type: chrf
      value: 0.61781
  - task:
      name: Translation fra-eng
      type: translation
      args: fra-eng
    dataset:
      name: multi30k_test_2016_flickr
      type: multi30k-2016_flickr
      args: fra-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 47.9
    - name: chr-F
      type: chrf
      value: 0.66703
  - task:
      name: Translation deu-eng
      type: translation
      args: deu-eng
    dataset:
      name: multi30k_test_2017_flickr
      type: multi30k-2017_flickr
      args: deu-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 41.0
    - name: chr-F
      type: chrf
      value: 0.63624
  - task:
      name: Translation eng-deu
      type: translation
      args: eng-deu
    dataset:
      name: multi30k_test_2017_flickr
      type: multi30k-2017_flickr
      args: eng-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 34.6
    - name: chr-F
      type: chrf
      value: 0.63423
  - task:
      name: Translation fra-deu
      type: translation
      args: fra-deu
    dataset:
      name: multi30k_test_2017_flickr
      type: multi30k-2017_flickr
      args: fra-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 29.7
    - name: chr-F
      type: chrf
      value: 0.60084
  - task:
      name: Translation fra-eng
      type: translation
      args: fra-eng
    dataset:
      name: multi30k_test_2017_flickr
      type: multi30k-2017_flickr
      args: fra-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 50.4
    - name: chr-F
      type: chrf
      value: 0.69254
  - task:
      name: Translation deu-eng
      type: translation
      args: deu-eng
    dataset:
      name: multi30k_test_2017_mscoco
      type: multi30k-2017_mscoco
      args: deu-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 32.5
    - name: chr-F
      type: chrf
      value: 0.55790
  - task:
      name: Translation eng-deu
      type: translation
      args: eng-deu
    dataset:
      name: multi30k_test_2017_mscoco
      type: multi30k-2017_mscoco
      args: eng-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 28.6
    - name: chr-F
      type: chrf
      value: 0.57491
  - task:
      name: Translation fra-deu
      type: translation
      args: fra-deu
    dataset:
      name: multi30k_test_2017_mscoco
      type: multi30k-2017_mscoco
      args: fra-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 26.4
    - name: chr-F
      type: chrf
      value: 0.56108
  - task:
      name: Translation fra-eng
      type: translation
      args: fra-eng
    dataset:
      name: multi30k_test_2017_mscoco
      type: multi30k-2017_mscoco
      args: fra-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 49.1
    - name: chr-F
      type: chrf
      value: 0.68212
  - task:
      name: Translation deu-eng
      type: translation
      args: deu-eng
    dataset:
      name: multi30k_test_2018_flickr
      type: multi30k-2018_flickr
      args: deu-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 36.6
    - name: chr-F
      type: chrf
      value: 0.59322
  - task:
      name: Translation eng-deu
      type: translation
      args: eng-deu
    dataset:
      name: multi30k_test_2018_flickr
      type: multi30k-2018_flickr
      args: eng-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 30.0
    - name: chr-F
      type: chrf
      value: 0.59858
  - task:
      name: Translation fra-deu
      type: translation
      args: fra-deu
    dataset:
      name: multi30k_test_2018_flickr
      type: multi30k-2018_flickr
      args: fra-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 24.7
    - name: chr-F
      type: chrf
      value: 0.55667
  - task:
      name: Translation fra-eng
      type: translation
      args: fra-eng
    dataset:
      name: multi30k_test_2018_flickr
      type: multi30k-2018_flickr
      args: fra-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 43.4
    - name: chr-F
      type: chrf
      value: 0.64702
  - task:
      name: Translation fra-eng
      type: translation
      args: fra-eng
    dataset:
      name: newsdiscusstest2015
      type: newsdiscusstest2015
      args: fra-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 38.5
    - name: chr-F
      type: chrf
      value: 0.61399
  - task:
      name: Translation deu-eng
      type: translation
      args: deu-eng
    dataset:
      name: newstestALL2020
      type: newstestALL2020
      args: deu-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 34.0
    - name: chr-F
      type: chrf
      value: 0.60403
  - task:
      name: Translation eng-deu
      type: translation
      args: eng-deu
    dataset:
      name: newstestALL2020
      type: newstestALL2020
      args: eng-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 32.3
    - name: chr-F
      type: chrf
      value: 0.60255
  - task:
      name: Translation deu-afr
      type: translation
      args: deu-afr
    dataset:
      name: ntrex128
      type: ntrex128
      args: deu-afr
    metrics:
    - name: BLEU
      type: bleu
      value: 27.9
    - name: chr-F
      type: chrf
      value: 0.57109
  - task:
      name: Translation deu-eng
      type: translation
      args: deu-eng
    dataset:
      name: ntrex128
      type: ntrex128
      args: deu-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 34.5
    - name: chr-F
      type: chrf
      value: 0.62043
  - task:
      name: Translation deu-ltz
      type: translation
      args: deu-ltz
    dataset:
      name: ntrex128
      type: ntrex128
      args: deu-ltz
    metrics:
    - name: BLEU
      type: bleu
      value: 15.4
    - name: chr-F
      type: chrf
      value: 0.47642
  - task:
      name: Translation deu-nld
      type: translation
      args: deu-nld
    dataset:
      name: ntrex128
      type: ntrex128
      args: deu-nld
    metrics:
    - name: BLEU
      type: bleu
      value: 27.6
    - name: chr-F
      type: chrf
      value: 0.56777
  - task:
      name: Translation eng-afr
      type: translation
      args: eng-afr
    dataset:
      name: ntrex128
      type: ntrex128
      args: eng-afr
    metrics:
    - name: BLEU
      type: bleu
      value: 44.1
    - name: chr-F
      type: chrf
      value: 0.68616
  - task:
      name: Translation eng-deu
      type: translation
      args: eng-deu
    dataset:
      name: ntrex128
      type: ntrex128
      args: eng-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 30.2
    - name: chr-F
      type: chrf
      value: 0.58743
  - task:
      name: Translation eng-ltz
      type: translation
      args: eng-ltz
    dataset:
      name: ntrex128
      type: ntrex128
      args: eng-ltz
    metrics:
    - name: BLEU
      type: bleu
      value: 18.0
    - name: chr-F
      type: chrf
      value: 0.50083
  - task:
      name: Translation eng-nld
      type: translation
      args: eng-nld
    dataset:
      name: ntrex128
      type: ntrex128
      args: eng-nld
    metrics:
    - name: BLEU
      type: bleu
      value: 33.8
    - name: chr-F
      type: chrf
      value: 0.61041
  - task:
      name: Translation fra-afr
      type: translation
      args: fra-afr
    dataset:
      name: ntrex128
      type: ntrex128
      args: fra-afr
    metrics:
    - name: BLEU
      type: bleu
      value: 26.5
    - name: chr-F
      type: chrf
      value: 0.55607
  - task:
      name: Translation fra-deu
      type: translation
      args: fra-deu
    dataset:
      name: ntrex128
      type: ntrex128
      args: fra-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 23.6
    - name: chr-F
      type: chrf
      value: 0.53269
  - task:
      name: Translation fra-eng
      type: translation
      args: fra-eng
    dataset:
      name: ntrex128
      type: ntrex128
      args: fra-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 34.4
    - name: chr-F
      type: chrf
      value: 0.61058
  - task:
      name: Translation fra-ltz
      type: translation
      args: fra-ltz
    dataset:
      name: ntrex128
      type: ntrex128
      args: fra-ltz
    metrics:
    - name: BLEU
      type: bleu
      value: 12.0
    - name: chr-F
      type: chrf
      value: 0.41312
  - task:
      name: Translation fra-nld
      type: translation
      args: fra-nld
    dataset:
      name: ntrex128
      type: ntrex128
      args: fra-nld
    metrics:
    - name: BLEU
      type: bleu
      value: 25.2
    - name: chr-F
      type: chrf
      value: 0.54615
  - task:
      name: Translation por-afr
      type: translation
      args: por-afr
    dataset:
      name: ntrex128
      type: ntrex128
      args: por-afr
    metrics:
    - name: BLEU
      type: bleu
      value: 29.2
    - name: chr-F
      type: chrf
      value: 0.58296
  - task:
      name: Translation por-deu
      type: translation
      args: por-deu
    dataset:
      name: ntrex128
      type: ntrex128
      args: por-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 24.7
    - name: chr-F
      type: chrf
      value: 0.54944
  - task:
      name: Translation por-eng
      type: translation
      args: por-eng
    dataset:
      name: ntrex128
      type: ntrex128
      args: por-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 39.6
    - name: chr-F
      type: chrf
      value: 0.65002
  - task:
      name: Translation por-nld
      type: translation
      args: por-nld
    dataset:
      name: ntrex128
      type: ntrex128
      args: por-nld
    metrics:
    - name: BLEU
      type: bleu
      value: 28.1
    - name: chr-F
      type: chrf
      value: 0.56384
  - task:
      name: Translation spa-afr
      type: translation
      args: spa-afr
    dataset:
      name: ntrex128
      type: ntrex128
      args: spa-afr
    metrics:
    - name: BLEU
      type: bleu
      value: 27.7
    - name: chr-F
      type: chrf
      value: 0.57772
  - task:
      name: Translation spa-deu
      type: translation
      args: spa-deu
    dataset:
      name: ntrex128
      type: ntrex128
      args: spa-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 24.0
    - name: chr-F
      type: chrf
      value: 0.54561
  - task:
      name: Translation spa-eng
      type: translation
      args: spa-eng
    dataset:
      name: ntrex128
      type: ntrex128
      args: spa-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 37.3
    - name: chr-F
      type: chrf
      value: 0.64305
  - task:
      name: Translation spa-nld
      type: translation
      args: spa-nld
    dataset:
      name: ntrex128
      type: ntrex128
      args: spa-nld
    metrics:
    - name: BLEU
      type: bleu
      value: 27.8
    - name: chr-F
      type: chrf
      value: 0.56397
1273
+   - task:
+       name: Translation deu-afr
+       type: translation
+       args: deu-afr
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: deu-afr
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 56.7
+     - name: chr-F
+       type: chrf
+       value: 0.72039
+   - task:
+       name: Translation deu-deu
+       type: translation
+       args: deu-deu
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: deu-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 33.7
+     - name: chr-F
+       type: chrf
+       value: 0.59545
+   - task:
+       name: Translation deu-eng
+       type: translation
+       args: deu-eng
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: deu-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 48.6
+     - name: chr-F
+       type: chrf
+       value: 0.66015
+   - task:
+       name: Translation deu-ltz
+       type: translation
+       args: deu-ltz
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: deu-ltz
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 34.2
+     - name: chr-F
+       type: chrf
+       value: 0.53760
+   - task:
+       name: Translation deu-nds
+       type: translation
+       args: deu-nds
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: deu-nds
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 20.1
+     - name: chr-F
+       type: chrf
+       value: 0.44534
+   - task:
+       name: Translation deu-nld
+       type: translation
+       args: deu-nld
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: deu-nld
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 54.4
+     - name: chr-F
+       type: chrf
+       value: 0.71276
+   - task:
+       name: Translation eng-afr
+       type: translation
+       args: eng-afr
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: eng-afr
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 56.6
+     - name: chr-F
+       type: chrf
+       value: 0.72087
+   - task:
+       name: Translation eng-deu
+       type: translation
+       args: eng-deu
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: eng-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 41.4
+     - name: chr-F
+       type: chrf
+       value: 0.62971
+   - task:
+       name: Translation eng-eng
+       type: translation
+       args: eng-eng
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: eng-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 58.0
+     - name: chr-F
+       type: chrf
+       value: 0.80306
+   - task:
+       name: Translation eng-ltz
+       type: translation
+       args: eng-ltz
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: eng-ltz
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 45.8
+     - name: chr-F
+       type: chrf
+       value: 0.64423
+   - task:
+       name: Translation eng-nds
+       type: translation
+       args: eng-nds
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: eng-nds
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 22.2
+     - name: chr-F
+       type: chrf
+       value: 0.46446
+   - task:
+       name: Translation eng-nld
+       type: translation
+       args: eng-nld
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: eng-nld
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 54.5
+     - name: chr-F
+       type: chrf
+       value: 0.71190
+   - task:
+       name: Translation fra-deu
+       type: translation
+       args: fra-deu
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: fra-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 50.3
+     - name: chr-F
+       type: chrf
+       value: 0.68991
+   - task:
+       name: Translation fra-eng
+       type: translation
+       args: fra-eng
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: fra-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 58.0
+     - name: chr-F
+       type: chrf
+       value: 0.72564
+   - task:
+       name: Translation fra-nld
+       type: translation
+       args: fra-nld
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: fra-nld
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 48.7
+     - name: chr-F
+       type: chrf
+       value: 0.67078
+   - task:
+       name: Translation multi-multi
+       type: translation
+       args: multi-multi
+     dataset:
+       name: tatoeba-test-v2020-07-28-v2023-09-26
+       type: tatoeba_mt
+       args: multi-multi
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 48.7
+     - name: chr-F
+       type: chrf
+       value: 0.67078
+   - task:
+       name: Translation por-deu
+       type: translation
+       args: por-deu
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: por-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 48.7
+     - name: chr-F
+       type: chrf
+       value: 0.68437
+   - task:
+       name: Translation por-eng
+       type: translation
+       args: por-eng
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: por-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 64.3
+     - name: chr-F
+       type: chrf
+       value: 0.77081
+   - task:
+       name: Translation por-nds
+       type: translation
+       args: por-nds
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: por-nds
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 20.7
+     - name: chr-F
+       type: chrf
+       value: 0.45864
+   - task:
+       name: Translation por-nld
+       type: translation
+       args: por-nld
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: por-nld
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 52.8
+     - name: chr-F
+       type: chrf
+       value: 0.69865
+   - task:
+       name: Translation spa-afr
+       type: translation
+       args: spa-afr
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: spa-afr
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 63.3
+     - name: chr-F
+       type: chrf
+       value: 0.77148
+   - task:
+       name: Translation spa-deu
+       type: translation
+       args: spa-deu
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: spa-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 49.1
+     - name: chr-F
+       type: chrf
+       value: 0.68037
+   - task:
+       name: Translation spa-eng
+       type: translation
+       args: spa-eng
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: spa-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 60.2
+     - name: chr-F
+       type: chrf
+       value: 0.74575
+   - task:
+       name: Translation spa-nld
+       type: translation
+       args: spa-nld
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: spa-nld
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 51.1
+     - name: chr-F
+       type: chrf
+       value: 0.68988
+   - task:
+       name: Translation fra-eng
+       type: translation
+       args: fra-eng
+     dataset:
+       name: tico19-test
+       type: tico19-test
+       args: fra-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 39.2
+     - name: chr-F
+       type: chrf
+       value: 0.62059
+   - task:
+       name: Translation por-eng
+       type: translation
+       args: por-eng
+     dataset:
+       name: tico19-test
+       type: tico19-test
+       args: por-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 50.3
+     - name: chr-F
+       type: chrf
+       value: 0.73896
+   - task:
+       name: Translation spa-eng
+       type: translation
+       args: spa-eng
+     dataset:
+       name: tico19-test
+       type: tico19-test
+       args: spa-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 49.6
+     - name: chr-F
+       type: chrf
+       value: 0.72923
+   - task:
+       name: Translation deu-eng
+       type: translation
+       args: deu-eng
+     dataset:
+       name: newstest2008
+       type: wmt-2008-news
+       args: deu-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 26.9
+     - name: chr-F
+       type: chrf
+       value: 0.54506
+   - task:
+       name: Translation eng-deu
+       type: translation
+       args: eng-deu
+     dataset:
+       name: newstest2008
+       type: wmt-2008-news
+       args: eng-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 23.1
+     - name: chr-F
+       type: chrf
+       value: 0.53077
+   - task:
+       name: Translation fra-deu
+       type: translation
+       args: fra-deu
+     dataset:
+       name: newstest2008
+       type: wmt-2008-news
+       args: fra-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 22.9
+     - name: chr-F
+       type: chrf
+       value: 0.53204
+   - task:
+       name: Translation fra-eng
+       type: translation
+       args: fra-eng
+     dataset:
+       name: newstest2008
+       type: wmt-2008-news
+       args: fra-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 26.4
+     - name: chr-F
+       type: chrf
+       value: 0.54320
+   - task:
+       name: Translation spa-deu
+       type: translation
+       args: spa-deu
+     dataset:
+       name: newstest2008
+       type: wmt-2008-news
+       args: spa-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 21.6
+     - name: chr-F
+       type: chrf
+       value: 0.52066
+   - task:
+       name: Translation spa-eng
+       type: translation
+       args: spa-eng
+     dataset:
+       name: newstest2008
+       type: wmt-2008-news
+       args: spa-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 27.9
+     - name: chr-F
+       type: chrf
+       value: 0.55305
+   - task:
+       name: Translation deu-eng
+       type: translation
+       args: deu-eng
+     dataset:
+       name: newstest2009
+       type: wmt-2009-news
+       args: deu-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 26.2
+     - name: chr-F
+       type: chrf
+       value: 0.53773
+   - task:
+       name: Translation eng-deu
+       type: translation
+       args: eng-deu
+     dataset:
+       name: newstest2009
+       type: wmt-2009-news
+       args: eng-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 22.3
+     - name: chr-F
+       type: chrf
+       value: 0.53217
+   - task:
+       name: Translation fra-deu
+       type: translation
+       args: fra-deu
+     dataset:
+       name: newstest2009
+       type: wmt-2009-news
+       args: fra-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 22.9
+     - name: chr-F
+       type: chrf
+       value: 0.52995
+   - task:
+       name: Translation fra-eng
+       type: translation
+       args: fra-eng
+     dataset:
+       name: newstest2009
+       type: wmt-2009-news
+       args: fra-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 30.0
+     - name: chr-F
+       type: chrf
+       value: 0.56663
+   - task:
+       name: Translation spa-deu
+       type: translation
+       args: spa-deu
+     dataset:
+       name: newstest2009
+       type: wmt-2009-news
+       args: spa-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 22.1
+     - name: chr-F
+       type: chrf
+       value: 0.52586
+   - task:
+       name: Translation spa-eng
+       type: translation
+       args: spa-eng
+     dataset:
+       name: newstest2009
+       type: wmt-2009-news
+       args: spa-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 29.9
+     - name: chr-F
+       type: chrf
+       value: 0.56756
+   - task:
+       name: Translation deu-eng
+       type: translation
+       args: deu-eng
+     dataset:
+       name: newstest2010
+       type: wmt-2010-news
+       args: deu-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 30.4
+     - name: chr-F
+       type: chrf
+       value: 0.58365
+   - task:
+       name: Translation eng-deu
+       type: translation
+       args: eng-deu
+     dataset:
+       name: newstest2010
+       type: wmt-2010-news
+       args: eng-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 25.7
+     - name: chr-F
+       type: chrf
+       value: 0.54917
+   - task:
+       name: Translation fra-deu
+       type: translation
+       args: fra-deu
+     dataset:
+       name: newstest2010
+       type: wmt-2010-news
+       args: fra-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 24.3
+     - name: chr-F
+       type: chrf
+       value: 0.53904
+   - task:
+       name: Translation fra-eng
+       type: translation
+       args: fra-eng
+     dataset:
+       name: newstest2010
+       type: wmt-2010-news
+       args: fra-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 32.4
+     - name: chr-F
+       type: chrf
+       value: 0.59241
+   - task:
+       name: Translation spa-deu
+       type: translation
+       args: spa-deu
+     dataset:
+       name: newstest2010
+       type: wmt-2010-news
+       args: spa-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 26.2
+     - name: chr-F
+       type: chrf
+       value: 0.55378
+   - task:
+       name: Translation spa-eng
+       type: translation
+       args: spa-eng
+     dataset:
+       name: newstest2010
+       type: wmt-2010-news
+       args: spa-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 35.8
+     - name: chr-F
+       type: chrf
+       value: 0.61316
+   - task:
+       name: Translation deu-eng
+       type: translation
+       args: deu-eng
+     dataset:
+       name: newstest2011
+       type: wmt-2011-news
+       args: deu-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 26.1
+     - name: chr-F
+       type: chrf
+       value: 0.54907
+   - task:
+       name: Translation eng-deu
+       type: translation
+       args: eng-deu
+     dataset:
+       name: newstest2011
+       type: wmt-2011-news
+       args: eng-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 23.0
+     - name: chr-F
+       type: chrf
+       value: 0.52873
+   - task:
+       name: Translation fra-deu
+       type: translation
+       args: fra-deu
+     dataset:
+       name: newstest2011
+       type: wmt-2011-news
+       args: fra-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 23.0
+     - name: chr-F
+       type: chrf
+       value: 0.52977
+   - task:
+       name: Translation fra-eng
+       type: translation
+       args: fra-eng
+     dataset:
+       name: newstest2011
+       type: wmt-2011-news
+       args: fra-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 32.8
+     - name: chr-F
+       type: chrf
+       value: 0.59565
+   - task:
+       name: Translation spa-deu
+       type: translation
+       args: spa-deu
+     dataset:
+       name: newstest2011
+       type: wmt-2011-news
+       args: spa-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 23.4
+     - name: chr-F
+       type: chrf
+       value: 0.53095
+   - task:
+       name: Translation spa-eng
+       type: translation
+       args: spa-eng
+     dataset:
+       name: newstest2011
+       type: wmt-2011-news
+       args: spa-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 33.3
+     - name: chr-F
+       type: chrf
+       value: 0.59513
+   - task:
+       name: Translation deu-eng
+       type: translation
+       args: deu-eng
+     dataset:
+       name: newstest2012
+       type: wmt-2012-news
+       args: deu-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 28.1
+     - name: chr-F
+       type: chrf
+       value: 0.56230
+   - task:
+       name: Translation eng-deu
+       type: translation
+       args: eng-deu
+     dataset:
+       name: newstest2012
+       type: wmt-2012-news
+       args: eng-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 23.7
+     - name: chr-F
+       type: chrf
+       value: 0.52871
+   - task:
+       name: Translation fra-deu
+       type: translation
+       args: fra-deu
+     dataset:
+       name: newstest2012
+       type: wmt-2012-news
+       args: fra-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 24.1
+     - name: chr-F
+       type: chrf
+       value: 0.53035
+   - task:
+       name: Translation fra-eng
+       type: translation
+       args: fra-eng
+     dataset:
+       name: newstest2012
+       type: wmt-2012-news
+       args: fra-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 33.0
+     - name: chr-F
+       type: chrf
+       value: 0.59137
+   - task:
+       name: Translation spa-deu
+       type: translation
+       args: spa-deu
+     dataset:
+       name: newstest2012
+       type: wmt-2012-news
+       args: spa-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 24.3
+     - name: chr-F
+       type: chrf
+       value: 0.53438
+   - task:
+       name: Translation spa-eng
+       type: translation
+       args: spa-eng
+     dataset:
+       name: newstest2012
+       type: wmt-2012-news
+       args: spa-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 37.0
+     - name: chr-F
+       type: chrf
+       value: 0.62058
+   - task:
+       name: Translation deu-eng
+       type: translation
+       args: deu-eng
+     dataset:
+       name: newstest2013
+       type: wmt-2013-news
+       args: deu-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 31.5
+     - name: chr-F
+       type: chrf
+       value: 0.57940
+   - task:
+       name: Translation eng-deu
+       type: translation
+       args: eng-deu
+     dataset:
+       name: newstest2013
+       type: wmt-2013-news
+       args: eng-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 27.5
+     - name: chr-F
+       type: chrf
+       value: 0.55718
+   - task:
+       name: Translation fra-deu
+       type: translation
+       args: fra-deu
+     dataset:
+       name: newstest2013
+       type: wmt-2013-news
+       args: fra-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 25.6
+     - name: chr-F
+       type: chrf
+       value: 0.54408
+   - task:
+       name: Translation fra-eng
+       type: translation
+       args: fra-eng
+     dataset:
+       name: newstest2013
+       type: wmt-2013-news
+       args: fra-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 33.9
+     - name: chr-F
+       type: chrf
+       value: 0.59151
+   - task:
+       name: Translation spa-deu
+       type: translation
+       args: spa-deu
+     dataset:
+       name: newstest2013
+       type: wmt-2013-news
+       args: spa-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 26.2
+     - name: chr-F
+       type: chrf
+       value: 0.55215
+   - task:
+       name: Translation spa-eng
+       type: translation
+       args: spa-eng
+     dataset:
+       name: newstest2013
+       type: wmt-2013-news
+       args: spa-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 34.4
+     - name: chr-F
+       type: chrf
+       value: 0.60465
+   - task:
+       name: Translation deu-eng
+       type: translation
+       args: deu-eng
+     dataset:
+       name: newstest2014
+       type: wmt-2014-news
+       args: deu-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 33.1
+     - name: chr-F
+       type: chrf
+       value: 0.59723
+   - task:
+       name: Translation eng-deu
+       type: translation
+       args: eng-deu
+     dataset:
+       name: newstest2014
+       type: wmt-2014-news
+       args: eng-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 28.5
+     - name: chr-F
+       type: chrf
+       value: 0.59127
+   - task:
+       name: Translation fra-eng
+       type: translation
+       args: fra-eng
+     dataset:
+       name: newstest2014
+       type: wmt-2014-news
+       args: fra-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 38.0
+     - name: chr-F
+       type: chrf
+       value: 0.63411
+   - task:
+       name: Translation deu-eng
+       type: translation
+       args: deu-eng
+     dataset:
+       name: newstest2015
+       type: wmt-2015-news
+       args: deu-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 33.7
+     - name: chr-F
+       type: chrf
+       value: 0.59799
+   - task:
+       name: Translation eng-deu
+       type: translation
+       args: eng-deu
+     dataset:
+       name: newstest2015
+       type: wmt-2015-news
+       args: eng-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 32.0
+     - name: chr-F
+       type: chrf
+       value: 0.59977
+   - task:
+       name: Translation deu-eng
+       type: translation
+       args: deu-eng
+     dataset:
+       name: newstest2016
+       type: wmt-2016-news
+       args: deu-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 40.4
+     - name: chr-F
+       type: chrf
+       value: 0.65039
+   - task:
+       name: Translation eng-deu
+       type: translation
+       args: eng-deu
+     dataset:
+       name: newstest2016
+       type: wmt-2016-news
+       args: eng-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 37.9
+     - name: chr-F
+       type: chrf
+       value: 0.64144
+   - task:
+       name: Translation deu-eng
+       type: translation
+       args: deu-eng
+     dataset:
+       name: newstest2017
+       type: wmt-2017-news
+       args: deu-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 35.3
+     - name: chr-F
+       type: chrf
+       value: 0.60921
+   - task:
+       name: Translation eng-deu
+       type: translation
+       args: eng-deu
+     dataset:
+       name: newstest2017
+       type: wmt-2017-news
+       args: eng-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 30.4
+     - name: chr-F
+       type: chrf
+       value: 0.59114
+   - task:
+       name: Translation deu-eng
+       type: translation
+       args: deu-eng
+     dataset:
+       name: newstest2018
+       type: wmt-2018-news
+       args: deu-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 42.6
+     - name: chr-F
+       type: chrf
+       value: 0.66680
+   - task:
+       name: Translation eng-deu
+       type: translation
+       args: eng-deu
+     dataset:
+       name: newstest2018
+       type: wmt-2018-news
+       args: eng-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 45.8
+     - name: chr-F
+       type: chrf
+       value: 0.69428
+   - task:
+       name: Translation deu-eng
+       type: translation
+       args: deu-eng
+     dataset:
+       name: newstest2019
+       type: wmt-2019-news
+       args: deu-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 39.1
+     - name: chr-F
+       type: chrf
+       value: 0.63482
+   - task:
+       name: Translation eng-deu
+       type: translation
+       args: eng-deu
+     dataset:
+       name: newstest2019
+       type: wmt-2019-news
+       args: eng-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 42.0
+     - name: chr-F
+       type: chrf
+       value: 0.66430
+   - task:
+       name: Translation fra-deu
+       type: translation
+       args: fra-deu
+     dataset:
+       name: newstest2019
+       type: wmt-2019-news
+       args: fra-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 29.4
+     - name: chr-F
+       type: chrf
+       value: 0.60993
+   - task:
+       name: Translation deu-eng
+       type: translation
+       args: deu-eng
+     dataset:
+       name: newstest2020
+       type: wmt-2020-news
+       args: deu-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 34.0
+     - name: chr-F
+       type: chrf
+       value: 0.60403
+   - task:
+       name: Translation eng-deu
+       type: translation
+       args: eng-deu
+     dataset:
+       name: newstest2020
+       type: wmt-2020-news
+       args: eng-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 32.3
+     - name: chr-F
+       type: chrf
+       value: 0.60255
+   - task:
+       name: Translation fra-deu
+       type: translation
+       args: fra-deu
+     dataset:
+       name: newstest2020
+       type: wmt-2020-news
+       args: fra-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 29.2
+     - name: chr-F
+       type: chrf
+       value: 0.61470
+   - task:
+       name: Translation deu-eng
+       type: translation
+       args: deu-eng
+     dataset:
+       name: newstest2021
+       type: wmt-2021-news
+       args: deu-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 31.9
+     - name: chr-F
+       type: chrf
+       value: 0.59738
+   - task:
+       name: Translation eng-deu
+       type: translation
+       args: eng-deu
+     dataset:
+       name: newstest2021
+       type: wmt-2021-news
+       args: eng-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 26.1
+     - name: chr-F
+       type: chrf
+       value: 0.56399
+   - task:
+       name: Translation fra-deu
+       type: translation
+       args: fra-deu
+     dataset:
+       name: newstest2021
+       type: wmt-2021-news
+       args: fra-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 40.0
+     - name: chr-F
+       type: chrf
+       value: 0.66155
+ ---
+ # opus-mt-tc-bible-big-deu_eng_fra_por_spa-gmw
+
+ ## Table of Contents
+ - [Model Details](#model-details)
+ - [Uses](#uses)
+ - [Risks, Limitations and Biases](#risks-limitations-and-biases)
+ - [How to Get Started With the Model](#how-to-get-started-with-the-model)
+ - [Training](#training)
+ - [Evaluation](#evaluation)
+ - [Citation Information](#citation-information)
+ - [Acknowledgements](#acknowledgements)
+
+ ## Model Details
+
+ Neural machine translation model for translating from German, English, French, Portuguese and Spanish (deu+eng+fra+por+spa) to West Germanic languages (gmw).
+
+ This model is part of the [OPUS-MT project](https://github.com/Helsinki-NLP/Opus-MT), an effort to make neural machine translation models widely available and accessible for many languages in the world. All models were originally trained with [Marian NMT](https://marian-nmt.github.io/), an efficient NMT implementation written in pure C++, and have been converted to PyTorch using the Hugging Face transformers library. Training data is taken from [OPUS](https://opus.nlpl.eu/) and training pipelines use the procedures of [OPUS-MT-train](https://github.com/Helsinki-NLP/Opus-MT-train).
+ **Model Description:**
+ - **Developed by:** Language Technology Research Group at the University of Helsinki
+ - **Model Type:** Translation (transformer-big)
+ - **Release:** 2024-05-30
+ - **License:** Apache-2.0
+ - **Language(s):**
+   - Source Language(s): deu eng fra por spa
+   - Target Language(s): afr ang bar bis bzj deu djk drt eng enm frr fry gos gsw hrx hwc icr jam kri ksh lim ltz nds nld ofs pcm pdc pfl pih pis rop sco srm srn stq swg tcs tpi vls wae yid zea
+   - Valid Target Language Labels: >>act<< >>afr<< >>afs<< >>aig<< >>ang<< >>ang_Latn<< >>bah<< >>bar<< >>bis<< >>bjs<< >>brc<< >>bzj<< >>bzk<< >>cim<< >>dcr<< >>deu<< >>djk<< >>djk_Latn<< >>drt<< >>drt_Latn<< >>dum<< >>eng<< >>enm<< >>enm_Latn<< >>fpe<< >>frk<< >>frr<< >>fry<< >>gcl<< >>gct<< >>geh<< >>gmh<< >>gml<< >>goh<< >>gos<< >>gpe<< >>gsw<< >>gul<< >>gyn<< >>hrx<< >>hrx_Latn<< >>hwc<< >>icr<< >>jam<< >>jvd<< >>kri<< >>ksh<< >>kww<< >>lim<< >>lng<< >>ltz<< >>mhn<< >>nds<< >>nld<< >>odt<< >>ofs<< >>ofs_Latn<< >>oor<< >>osx<< >>pcm<< >>pdc<< >>pdt<< >>pey<< >>pfl<< >>pih<< >>pih_Latn<< >>pis<< >>rop<< >>sco<< >>sdz<< >>skw<< >>sli<< >>srm<< >>srn<< >>stl<< >>stq<< >>svc<< >>swg<< >>sxu<< >>tch<< >>tcs<< >>tgh<< >>tpi<< >>trf<< >>twd<< >>uln<< >>vel<< >>vic<< >>vls<< >>vmf<< >>wae<< >>wep<< >>wes<< >>wym<< >>xxx<< >>yec<< >>yid<< >>zea<<
+ - **Original Model**: [opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-30.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/deu+eng+fra+por+spa-gmw/opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-30.zip)
+ - **Resources for more information:**
+   - [OPUS-MT dashboard](https://opus.nlpl.eu/dashboard/index.php?pkg=opusmt&test=all&scoreslang=all&chart=standard&model=Tatoeba-MT-models/deu%2Beng%2Bfra%2Bpor%2Bspa-gmw/opusTCv20230926max50%2Bbt%2Bjhubc_transformer-big_2024-05-30)
+   - [OPUS-MT-train GitHub Repo](https://github.com/Helsinki-NLP/OPUS-MT-train)
+   - [More information about MarianNMT models in the transformers library](https://huggingface.co/docs/transformers/model_doc/marian)
+   - [Tatoeba Translation Challenge](https://github.com/Helsinki-NLP/Tatoeba-Challenge/)
+   - [HPLT bilingual data v1 (as part of the Tatoeba Translation Challenge dataset)](https://hplt-project.org/datasets/v1)
+   - [A massively parallel Bible corpus](https://aclanthology.org/L14-1215/)
+
+ This is a multilingual translation model with multiple target languages. A sentence-initial language token is required in the form of `>>id<<` (id = valid target-language ID), e.g. `>>afr<<`.
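The target-language token is just a plain-text prefix on each source sentence, so preparing a batch can be sketched as below (the helper name `tag_for_target` is illustrative, not part of the transformers API):

```python
def tag_for_target(sentences, target_lang):
    """Prefix each source sentence with the >>id<< target-language label."""
    return [f">>{target_lang}<< {s}" for s in sentences]

batch = tag_for_target(["How are you?", "Good morning."], "nld")
print(batch)  # ['>>nld<< How are you?', '>>nld<< Good morning.']
```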
2555
+
2556
+ ## Uses
2557
+
2558
+ This model can be used for translation and text-to-text generation.
2559
+
2560
+ ## Risks, Limitations and Biases
2561
+
2562
+ **CONTENT WARNING: Readers should be aware that the model is trained on various public data sets that may contain content that is disturbing, offensive, and can propagate historical and current stereotypes.**
2563
+
2564
+ Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)).
2565
+
2566
+ ## How to Get Started With the Model
2567
+
2568
+ A short example code:
2569
+
2570
+ ```python
+ from transformers import MarianMTModel, MarianTokenizer
+
+ src_text = [
+     ">>afr<< Replace this with text in an accepted source language.",
+     ">>zea<< This is the second sentence."
+ ]
+
+ model_name = "pytorch-models/opus-mt-tc-bible-big-deu_eng_fra_por_spa-gmw"
+ tokenizer = MarianTokenizer.from_pretrained(model_name)
+ model = MarianMTModel.from_pretrained(model_name)
+ translated = model.generate(**tokenizer(src_text, return_tensors="pt", padding=True))
+
+ for t in translated:
+     print(tokenizer.decode(t, skip_special_tokens=True))
+ ```
+
+ You can also use OPUS-MT models with the transformers `pipeline` API, for example:
+
+ ```python
+ from transformers import pipeline
+
+ pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-bible-big-deu_eng_fra_por_spa-gmw")
+ print(pipe(">>afr<< Replace this with text in an accepted source language."))
+ ```
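Every source sentence needs the target-language token prepended before it reaches the tokenizer. A minimal helper for preparing such batches (the function name `add_target_token` is illustrative, not part of the transformers API):

```python
def add_target_token(texts, target_lang):
    """Prepend the OPUS-MT target-language token (e.g. '>>afr<<') to each input sentence."""
    token = f">>{target_lang}<<"
    return [f"{token} {text}" for text in texts]

batch = add_target_token(["Replace this with text in an accepted source language."], "nld")
# each element of batch now starts with '>>nld<<'
```

The prepared batch can then be passed to the tokenizer and `model.generate` as usual.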
+
+ ## Training
+
+ - **Data**: opusTCv20230926max50+bt+jhubc ([source](https://github.com/Helsinki-NLP/Tatoeba-Challenge))
+ - **Pre-processing**: SentencePiece (spm32k,spm32k)
+ - **Model Type**: transformer-big
+ - **Original MarianNMT Model**: [opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-30.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/deu+eng+fra+por+spa-gmw/opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-30.zip)
+ - **Training Scripts**: [GitHub Repo](https://github.com/Helsinki-NLP/OPUS-MT-train)
+
+ ## Evaluation
+
+ * [Model scores at the OPUS-MT dashboard](https://opus.nlpl.eu/dashboard/index.php?pkg=opusmt&test=all&scoreslang=all&chart=standard&model=Tatoeba-MT-models/deu%2Beng%2Bfra%2Bpor%2Bspa-gmw/opusTCv20230926max50%2Bbt%2Bjhubc_transformer-big_2024-05-30)
+ * test set translations: [opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-29.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/deu+eng+fra+por+spa-gmw/opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-29.test.txt)
+ * test set scores: [opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-29.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/deu+eng+fra+por+spa-gmw/opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-29.eval.txt)
+ * benchmark results: [benchmark_results.txt](benchmark_results.txt)
+ * benchmark output: [benchmark_translations.zip](benchmark_translations.zip)
+
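The linked `benchmark_results.txt` stores one whitespace-separated line per result (language pair, test set, chr-F, BLEU, number of sentences, number of words). A small sketch for reading such lines (the helper name `parse_benchmark_line` is hypothetical):

```python
def parse_benchmark_line(line):
    """Split one whitespace-separated benchmark line into a typed record."""
    langpair, testset, chrf, bleu, n_sent, n_words = line.split()
    return {
        "langpair": langpair,
        "testset": testset,
        "chr-F": float(chrf),
        "BLEU": float(bleu),
        "sentences": int(n_sent),
        "words": int(n_words),
    }

record = parse_benchmark_line("spa-eng tico19-test 0.72923 49.6 2100 56315")
# record["BLEU"] == 49.6
```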
2611
+ | langpair | testset | chr-F | BLEU | #sent | #words |
2612
+ |----------|---------|-------|-------|-------|--------|
2613
+ | deu-afr | tatoeba-test-v2021-08-07 | 0.72039 | 56.7 | 1583 | 9507 |
2614
+ | deu-deu | tatoeba-test-v2021-08-07 | 0.59545 | 33.7 | 2500 | 20806 |
2615
+ | deu-eng | tatoeba-test-v2021-08-07 | 0.66015 | 48.6 | 17565 | 149462 |
2616
+ | deu-ltz | tatoeba-test-v2021-08-07 | 0.53760 | 34.2 | 347 | 2206 |
2617
+ | deu-nds | tatoeba-test-v2021-08-07 | 0.44534 | 20.1 | 9999 | 76137 |
2618
+ | deu-nld | tatoeba-test-v2021-08-07 | 0.71276 | 54.4 | 10218 | 75235 |
2619
+ | eng-afr | tatoeba-test-v2021-08-07 | 0.72087 | 56.6 | 1374 | 10317 |
2620
+ | eng-deu | tatoeba-test-v2021-08-07 | 0.62971 | 41.4 | 17565 | 151568 |
2621
+ | eng-eng | tatoeba-test-v2021-08-07 | 0.80306 | 58.0 | 12062 | 115106 |
2622
+ | eng-fry | tatoeba-test-v2021-08-07 | 0.40324 | 13.8 | 220 | 1600 |
2623
+ | eng-ltz | tatoeba-test-v2021-08-07 | 0.64423 | 45.8 | 293 | 1828 |
2624
+ | eng-nds | tatoeba-test-v2021-08-07 | 0.46446 | 22.2 | 2500 | 18264 |
2625
+ | eng-nld | tatoeba-test-v2021-08-07 | 0.71190 | 54.5 | 12696 | 91796 |
2626
+ | fra-deu | tatoeba-test-v2021-08-07 | 0.68991 | 50.3 | 12418 | 100545 |
2627
+ | fra-eng | tatoeba-test-v2021-08-07 | 0.72564 | 58.0 | 12681 | 101754 |
2628
+ | fra-nld | tatoeba-test-v2021-08-07 | 0.67078 | 48.7 | 11548 | 82164 |
2629
+ | por-deu | tatoeba-test-v2021-08-07 | 0.68437 | 48.7 | 10000 | 81246 |
2630
+ | por-eng | tatoeba-test-v2021-08-07 | 0.77081 | 64.3 | 13222 | 105351 |
2631
+ | por-nds | tatoeba-test-v2021-08-07 | 0.45864 | 20.7 | 207 | 1292 |
2632
+ | por-nld | tatoeba-test-v2021-08-07 | 0.69865 | 52.8 | 2500 | 17816 |
2633
+ | spa-afr | tatoeba-test-v2021-08-07 | 0.77148 | 63.3 | 448 | 3044 |
2634
+ | spa-deu | tatoeba-test-v2021-08-07 | 0.68037 | 49.1 | 10521 | 86430 |
2635
+ | spa-eng | tatoeba-test-v2021-08-07 | 0.74575 | 60.2 | 16583 | 138123 |
2636
+ | spa-nds | tatoeba-test-v2021-08-07 | 0.43154 | 18.5 | 923 | 5941 |
2637
+ | spa-nld | tatoeba-test-v2021-08-07 | 0.68988 | 51.1 | 10113 | 79162 |
2638
+ | deu-afr | flores101-devtest | 0.57287 | 26.0 | 1012 | 25740 |
2639
+ | deu-eng | flores101-devtest | 0.66660 | 40.9 | 1012 | 24721 |
2640
+ | deu-nld | flores101-devtest | 0.55423 | 23.6 | 1012 | 25467 |
2641
+ | eng-afr | flores101-devtest | 0.67793 | 40.0 | 1012 | 25740 |
2642
+ | eng-deu | flores101-devtest | 0.64295 | 37.2 | 1012 | 25094 |
2643
+ | eng-nld | flores101-devtest | 0.57690 | 26.2 | 1012 | 25467 |
2644
+ | fra-ltz | flores101-devtest | 0.49430 | 17.3 | 1012 | 25087 |
2645
+ | fra-nld | flores101-devtest | 0.54318 | 22.2 | 1012 | 25467 |
2646
+ | por-deu | flores101-devtest | 0.58851 | 29.8 | 1012 | 25094 |
2647
+ | por-nld | flores101-devtest | 0.54571 | 22.6 | 1012 | 25467 |
2648
+ | spa-nld | flores101-devtest | 0.50968 | 17.5 | 1012 | 25467 |
2649
+ | deu-afr | flores200-devtest | 0.57725 | 26.2 | 1012 | 25740 |
2650
+ | deu-eng | flores200-devtest | 0.67043 | 41.5 | 1012 | 24721 |
2651
+ | deu-ltz | flores200-devtest | 0.54626 | 21.6 | 1012 | 25087 |
2652
+ | deu-nld | flores200-devtest | 0.55679 | 24.0 | 1012 | 25467 |
2653
+ | eng-afr | flores200-devtest | 0.68115 | 40.2 | 1012 | 25740 |
2654
+ | eng-deu | flores200-devtest | 0.64561 | 37.4 | 1012 | 25094 |
2655
+ | eng-ltz | flores200-devtest | 0.54932 | 22.0 | 1012 | 25087 |
2656
+ | eng-nld | flores200-devtest | 0.58124 | 26.8 | 1012 | 25467 |
2657
+ | eng-tpi | flores200-devtest | 0.40338 | 15.9 | 1012 | 35240 |
2658
+ | fra-afr | flores200-devtest | 0.57320 | 26.4 | 1012 | 25740 |
2659
+ | fra-deu | flores200-devtest | 0.58974 | 29.5 | 1012 | 25094 |
2660
+ | fra-eng | flores200-devtest | 0.68106 | 43.7 | 1012 | 24721 |
2661
+ | fra-ltz | flores200-devtest | 0.49618 | 17.8 | 1012 | 25087 |
2662
+ | fra-nld | flores200-devtest | 0.54623 | 22.5 | 1012 | 25467 |
2663
+ | por-afr | flores200-devtest | 0.58408 | 27.6 | 1012 | 25740 |
2664
+ | por-deu | flores200-devtest | 0.59121 | 30.4 | 1012 | 25094 |
2665
+ | por-eng | flores200-devtest | 0.71418 | 48.3 | 1012 | 24721 |
2666
+ | por-nld | flores200-devtest | 0.54828 | 22.9 | 1012 | 25467 |
2667
+ | spa-afr | flores200-devtest | 0.51514 | 17.8 | 1012 | 25740 |
2668
+ | spa-deu | flores200-devtest | 0.53603 | 21.4 | 1012 | 25094 |
2669
+ | spa-eng | flores200-devtest | 0.58604 | 28.2 | 1012 | 24721 |
2670
+ | spa-nld | flores200-devtest | 0.51244 | 17.9 | 1012 | 25467 |
2671
+ | deu-eng | generaltest2022 | 0.55777 | 30.6 | 1984 | 37634 |
2672
+ | eng-deu | generaltest2022 | 0.60792 | 33.0 | 2037 | 38914 |
2673
+ | fra-deu | generaltest2022 | 0.67039 | 44.5 | 2006 | 37696 |
2674
+ | deu-eng | multi30k_test_2016_flickr | 0.60981 | 40.1 | 1000 | 12955 |
2675
+ | eng-deu | multi30k_test_2016_flickr | 0.64153 | 34.9 | 1000 | 12106 |
2676
+ | fra-deu | multi30k_test_2016_flickr | 0.61781 | 32.1 | 1000 | 12106 |
2677
+ | fra-eng | multi30k_test_2016_flickr | 0.66703 | 47.9 | 1000 | 12955 |
2678
+ | deu-eng | multi30k_test_2017_flickr | 0.63624 | 41.0 | 1000 | 11374 |
2679
+ | eng-deu | multi30k_test_2017_flickr | 0.63423 | 34.6 | 1000 | 10755 |
2680
+ | fra-deu | multi30k_test_2017_flickr | 0.60084 | 29.7 | 1000 | 10755 |
2681
+ | fra-eng | multi30k_test_2017_flickr | 0.69254 | 50.4 | 1000 | 11374 |
2682
+ | deu-eng | multi30k_test_2017_mscoco | 0.55790 | 32.5 | 461 | 5231 |
2683
+ | eng-deu | multi30k_test_2017_mscoco | 0.57491 | 28.6 | 461 | 5158 |
2684
+ | fra-deu | multi30k_test_2017_mscoco | 0.56108 | 26.4 | 461 | 5158 |
2685
+ | fra-eng | multi30k_test_2017_mscoco | 0.68212 | 49.1 | 461 | 5231 |
2686
+ | deu-eng | multi30k_test_2018_flickr | 0.59322 | 36.6 | 1071 | 14689 |
2687
+ | eng-deu | multi30k_test_2018_flickr | 0.59858 | 30.0 | 1071 | 13703 |
2688
+ | fra-deu | multi30k_test_2018_flickr | 0.55667 | 24.7 | 1071 | 13703 |
2689
+ | fra-eng | multi30k_test_2018_flickr | 0.64702 | 43.4 | 1071 | 14689 |
2690
+ | fra-eng | newsdiscusstest2015 | 0.61399 | 38.5 | 1500 | 26982 |
2691
+ | deu-eng | newssyscomb2009 | 0.55180 | 28.8 | 502 | 11818 |
2692
+ | eng-deu | newssyscomb2009 | 0.53676 | 22.9 | 502 | 11271 |
2693
+ | fra-deu | newssyscomb2009 | 0.53733 | 23.9 | 502 | 11271 |
2694
+ | fra-eng | newssyscomb2009 | 0.57219 | 31.1 | 502 | 11818 |
2695
+ | spa-deu | newssyscomb2009 | 0.53056 | 22.0 | 502 | 11271 |
2696
+ | spa-eng | newssyscomb2009 | 0.57225 | 30.8 | 502 | 11818 |
2697
+ | deu-eng | newstest2008 | 0.54506 | 26.9 | 2051 | 49380 |
2698
+ | eng-deu | newstest2008 | 0.53077 | 23.1 | 2051 | 47447 |
2699
+ | fra-deu | newstest2008 | 0.53204 | 22.9 | 2051 | 47447 |
2700
+ | fra-eng | newstest2008 | 0.54320 | 26.4 | 2051 | 49380 |
2701
+ | spa-deu | newstest2008 | 0.52066 | 21.6 | 2051 | 47447 |
2702
+ | spa-eng | newstest2008 | 0.55305 | 27.9 | 2051 | 49380 |
2703
+ | deu-eng | newstest2009 | 0.53773 | 26.2 | 2525 | 65399 |
2704
+ | eng-deu | newstest2009 | 0.53217 | 22.3 | 2525 | 62816 |
2705
+ | fra-deu | newstest2009 | 0.52995 | 22.9 | 2525 | 62816 |
2706
+ | fra-eng | newstest2009 | 0.56663 | 30.0 | 2525 | 65399 |
2707
+ | spa-deu | newstest2009 | 0.52586 | 22.1 | 2525 | 62816 |
2708
+ | spa-eng | newstest2009 | 0.56756 | 29.9 | 2525 | 65399 |
2709
+ | deu-eng | newstest2010 | 0.58365 | 30.4 | 2489 | 61711 |
2710
+ | eng-deu | newstest2010 | 0.54917 | 25.7 | 2489 | 61503 |
2711
+ | fra-deu | newstest2010 | 0.53904 | 24.3 | 2489 | 61503 |
2712
+ | fra-eng | newstest2010 | 0.59241 | 32.4 | 2489 | 61711 |
2713
+ | spa-deu | newstest2010 | 0.55378 | 26.2 | 2489 | 61503 |
2714
+ | spa-eng | newstest2010 | 0.61316 | 35.8 | 2489 | 61711 |
2715
+ | deu-eng | newstest2011 | 0.54907 | 26.1 | 3003 | 74681 |
2716
+ | eng-deu | newstest2011 | 0.52873 | 23.0 | 3003 | 72981 |
2717
+ | fra-deu | newstest2011 | 0.52977 | 23.0 | 3003 | 72981 |
2718
+ | fra-eng | newstest2011 | 0.59565 | 32.8 | 3003 | 74681 |
2719
+ | spa-deu | newstest2011 | 0.53095 | 23.4 | 3003 | 72981 |
2720
+ | spa-eng | newstest2011 | 0.59513 | 33.3 | 3003 | 74681 |
2721
+ | deu-eng | newstest2012 | 0.56230 | 28.1 | 3003 | 72812 |
2722
+ | eng-deu | newstest2012 | 0.52871 | 23.7 | 3003 | 72886 |
2723
+ | fra-deu | newstest2012 | 0.53035 | 24.1 | 3003 | 72886 |
2724
+ | fra-eng | newstest2012 | 0.59137 | 33.0 | 3003 | 72812 |
2725
+ | spa-deu | newstest2012 | 0.53438 | 24.3 | 3003 | 72886 |
2726
+ | spa-eng | newstest2012 | 0.62058 | 37.0 | 3003 | 72812 |
2727
+ | deu-eng | newstest2013 | 0.57940 | 31.5 | 3000 | 64505 |
2728
+ | eng-deu | newstest2013 | 0.55718 | 27.5 | 3000 | 63737 |
2729
+ | fra-deu | newstest2013 | 0.54408 | 25.6 | 3000 | 63737 |
2730
+ | fra-eng | newstest2013 | 0.59151 | 33.9 | 3000 | 64505 |
2731
+ | spa-deu | newstest2013 | 0.55215 | 26.2 | 3000 | 63737 |
2732
+ | spa-eng | newstest2013 | 0.60465 | 34.4 | 3000 | 64505 |
2733
+ | deu-eng | newstest2014 | 0.59723 | 33.1 | 3003 | 67337 |
2734
+ | eng-deu | newstest2014 | 0.59127 | 28.5 | 3003 | 62688 |
2735
+ | fra-eng | newstest2014 | 0.63411 | 38.0 | 3003 | 70708 |
2736
+ | deu-eng | newstest2015 | 0.59799 | 33.7 | 2169 | 46443 |
2737
+ | eng-deu | newstest2015 | 0.59977 | 32.0 | 2169 | 44260 |
2738
+ | deu-eng | newstest2016 | 0.65039 | 40.4 | 2999 | 64119 |
2739
+ | eng-deu | newstest2016 | 0.64144 | 37.9 | 2999 | 62669 |
2740
+ | deu-eng | newstest2017 | 0.60921 | 35.3 | 3004 | 64399 |
2741
+ | eng-deu | newstest2017 | 0.59114 | 30.4 | 3004 | 61287 |
2742
+ | deu-eng | newstest2018 | 0.66680 | 42.6 | 2998 | 67012 |
2743
+ | eng-deu | newstest2018 | 0.69428 | 45.8 | 2998 | 64276 |
2744
+ | deu-eng | newstest2019 | 0.63482 | 39.1 | 2000 | 39227 |
2745
+ | eng-deu | newstest2019 | 0.66430 | 42.0 | 1997 | 48746 |
2746
+ | fra-deu | newstest2019 | 0.60993 | 29.4 | 1701 | 36446 |
2747
+ | deu-eng | newstest2020 | 0.60403 | 34.0 | 785 | 38220 |
2748
+ | eng-deu | newstest2020 | 0.60255 | 32.3 | 1418 | 52383 |
2749
+ | fra-deu | newstest2020 | 0.61470 | 29.2 | 1619 | 30265 |
2750
+ | deu-eng | newstest2021 | 0.59738 | 31.9 | 1000 | 20180 |
2751
+ | eng-deu | newstest2021 | 0.56399 | 26.1 | 1002 | 27970 |
2752
+ | fra-deu | newstest2021 | 0.66155 | 40.0 | 1026 | 26077 |
2753
+ | deu-eng | newstestALL2020 | 0.60403 | 34.0 | 785 | 38220 |
2754
+ | eng-deu | newstestALL2020 | 0.60255 | 32.3 | 1418 | 52383 |
2755
+ | deu-eng | newstestB2020 | 0.60520 | 34.2 | 785 | 37696 |
2756
+ | eng-deu | newstestB2020 | 0.59226 | 31.6 | 1418 | 53092 |
2757
+ | deu-afr | ntrex128 | 0.57109 | 27.9 | 1997 | 50050 |
2758
+ | deu-eng | ntrex128 | 0.62043 | 34.5 | 1997 | 47673 |
2759
+ | deu-ltz | ntrex128 | 0.47642 | 15.4 | 1997 | 49763 |
2760
+ | deu-nld | ntrex128 | 0.56777 | 27.6 | 1997 | 51884 |
2761
+ | eng-afr | ntrex128 | 0.68616 | 44.1 | 1997 | 50050 |
2762
+ | eng-deu | ntrex128 | 0.58743 | 30.2 | 1997 | 48761 |
2763
+ | eng-ltz | ntrex128 | 0.50083 | 18.0 | 1997 | 49763 |
2764
+ | eng-nld | ntrex128 | 0.61041 | 33.8 | 1997 | 51884 |
2765
+ | fra-afr | ntrex128 | 0.55607 | 26.5 | 1997 | 50050 |
2766
+ | fra-deu | ntrex128 | 0.53269 | 23.6 | 1997 | 48761 |
2767
+ | fra-eng | ntrex128 | 0.61058 | 34.4 | 1997 | 47673 |
2768
+ | fra-ltz | ntrex128 | 0.41312 | 12.0 | 1997 | 49763 |
2769
+ | fra-nld | ntrex128 | 0.54615 | 25.2 | 1997 | 51884 |
2770
+ | por-afr | ntrex128 | 0.58296 | 29.2 | 1997 | 50050 |
2771
+ | por-deu | ntrex128 | 0.54944 | 24.7 | 1997 | 48761 |
2772
+ | por-eng | ntrex128 | 0.65002 | 39.6 | 1997 | 47673 |
2773
+ | por-nld | ntrex128 | 0.56384 | 28.1 | 1997 | 51884 |
2774
+ | spa-afr | ntrex128 | 0.57772 | 27.7 | 1997 | 50050 |
2775
+ | spa-deu | ntrex128 | 0.54561 | 24.0 | 1997 | 48761 |
2776
+ | spa-eng | ntrex128 | 0.64305 | 37.3 | 1997 | 47673 |
2777
+ | spa-nld | ntrex128 | 0.56397 | 27.8 | 1997 | 51884 |
2778
+ | fra-eng | tico19-test | 0.62059 | 39.2 | 2100 | 56323 |
2779
+ | por-eng | tico19-test | 0.73896 | 50.3 | 2100 | 56315 |
2780
+ | spa-eng | tico19-test | 0.72923 | 49.6 | 2100 | 56315 |
2781
+
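To pull the scores for a single language pair out of the table above programmatically, one possible sketch (the helper name `scores_for` is hypothetical) over the raw markdown rows:

```python
def scores_for(langpair, table_text):
    """Collect chr-F and BLEU per test set for one language pair from markdown table rows."""
    results = {}
    for row in table_text.strip().splitlines():
        cells = [c.strip() for c in row.strip().strip("|").split("|")]
        if len(cells) == 6 and cells[0] == langpair:
            results[cells[1]] = {"chr-F": float(cells[2]), "BLEU": float(cells[3])}
    return results

rows = """
| deu-eng | tatoeba-test-v2021-08-07 | 0.66015 | 48.6 | 17565 | 149462 |
| deu-eng | flores200-devtest | 0.67043 | 41.5 | 1012 | 24721 |
"""
scores = scores_for("deu-eng", rows)
# scores["flores200-devtest"]["BLEU"] == 41.5
```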
+ ## Citation Information
+
+ * Publications: [Democratizing neural machine translation with OPUS-MT](https://doi.org/10.1007/s10579-023-09704-w), [OPUS-MT – Building open translation services for the World](https://aclanthology.org/2020.eamt-1.61/), and [The Tatoeba Translation Challenge – Realistic Data Sets for Low Resource and Multilingual MT](https://aclanthology.org/2020.wmt-1.139/) (please cite if you use this model)
+
+ ```bibtex
+ @article{tiedemann2023democratizing,
+   title={Democratizing neural machine translation with {OPUS-MT}},
+   author={Tiedemann, J{\"o}rg and Aulamo, Mikko and Bakshandaeva, Daria and Boggia, Michele and Gr{\"o}nroos, Stig-Arne and Nieminen, Tommi and Raganato, Alessandro and Scherrer, Yves and Vazquez, Raul and Virpioja, Sami},
+   journal={Language Resources and Evaluation},
+   number={58},
+   pages={713--755},
+   year={2023},
+   publisher={Springer Nature},
+   issn={1574-0218},
+   doi={10.1007/s10579-023-09704-w}
+ }
+
+ @inproceedings{tiedemann-thottingal-2020-opus,
+   title = "{OPUS}-{MT} {--} Building open translation services for the World",
+   author = {Tiedemann, J{\"o}rg and Thottingal, Santhosh},
+   booktitle = "Proceedings of the 22nd Annual Conference of the European Association for Machine Translation",
+   month = nov,
+   year = "2020",
+   address = "Lisboa, Portugal",
+   publisher = "European Association for Machine Translation",
+   url = "https://aclanthology.org/2020.eamt-1.61",
+   pages = "479--480",
+ }
+
+ @inproceedings{tiedemann-2020-tatoeba,
+   title = "The Tatoeba Translation Challenge {--} Realistic Data Sets for Low Resource and Multilingual {MT}",
+   author = {Tiedemann, J{\"o}rg},
+   booktitle = "Proceedings of the Fifth Conference on Machine Translation",
+   month = nov,
+   year = "2020",
+   address = "Online",
+   publisher = "Association for Computational Linguistics",
+   url = "https://aclanthology.org/2020.wmt-1.139",
+   pages = "1174--1182",
+ }
+ ```
+
+ ## Acknowledgements
+
+ The work is supported by the [HPLT project](https://hplt-project.org/), funded by the European Union’s Horizon Europe research and innovation programme under grant agreement No 101070350. We are also grateful for the generous computational resources and IT infrastructure provided by [CSC -- IT Center for Science](https://www.csc.fi/), Finland, and the [EuroHPC supercomputer LUMI](https://www.lumi-supercomputer.eu/).
+
+ ## Model conversion info
+
+ * transformers version: 4.45.1
+ * OPUS-MT git hash: 0882077
+ * port time: Tue Oct 8 10:01:07 EEST 2024
+ * port machine: LM0-400-22516.local
benchmark_results.txt ADDED
@@ -0,0 +1,227 @@
1
+ multi-multi tatoeba-test-v2020-07-28-v2023-09-26 0.67078 48.7 10000 86220
2
+ deu-afr flores101-devtest 0.57287 26.0 1012 25740
3
+ deu-eng flores101-devtest 0.66660 40.9 1012 24721
4
+ deu-nld flores101-devtest 0.55423 23.6 1012 25467
5
+ eng-afr flores101-devtest 0.67793 40.0 1012 25740
6
+ eng-deu flores101-devtest 0.64295 37.2 1012 25094
7
+ eng-nld flores101-devtest 0.57690 26.2 1012 25467
8
+ fra-ltz flores101-devtest 0.49430 17.3 1012 25087
9
+ fra-nld flores101-devtest 0.54318 22.2 1012 25467
10
+ por-deu flores101-devtest 0.58851 29.8 1012 25094
11
+ por-nld flores101-devtest 0.54571 22.6 1012 25467
12
+ spa-nld flores101-devtest 0.50968 17.5 1012 25467
13
+ deu-afr flores200-devtest 0.57725 26.2 1012 25740
14
+ deu-eng flores200-devtest 0.67043 41.5 1012 24721
15
+ deu-lim flores200-devtest 0.21909 2.4 1012 24889
16
+ deu-ltz flores200-devtest 0.54626 21.6 1012 25087
17
+ deu-nld flores200-devtest 0.55679 24.0 1012 25467
18
+ deu-tpi flores200-devtest 0.39146 13.5 1012 35240
19
+ eng-afr flores200-devtest 0.68115 40.2 1012 25740
20
+ eng-deu flores200-devtest 0.64561 37.4 1012 25094
21
+ eng-lim flores200-devtest 0.30658 4.7 1012 24889
22
+ eng-ltz flores200-devtest 0.54932 22.0 1012 25087
23
+ eng-nld flores200-devtest 0.58124 26.8 1012 25467
24
+ eng-tpi flores200-devtest 0.40338 15.9 1012 35240
25
+ fra-afr flores200-devtest 0.57320 26.4 1012 25740
26
+ fra-deu flores200-devtest 0.58974 29.5 1012 25094
27
+ fra-eng flores200-devtest 0.68106 43.7 1012 24721
28
+ fra-lim flores200-devtest 0.31100 4.5 1012 24889
29
+ fra-ltz flores200-devtest 0.49618 17.8 1012 25087
30
+ fra-nld flores200-devtest 0.54623 22.5 1012 25467
31
+ fra-tpi flores200-devtest 0.39334 13.7 1012 35240
32
+ por-afr flores200-devtest 0.58408 27.6 1012 25740
33
+ por-deu flores200-devtest 0.59121 30.4 1012 25094
34
+ por-eng flores200-devtest 0.71418 48.3 1012 24721
35
+ por-lim flores200-devtest 0.30653 4.6 1012 24889
36
+ por-ltz flores200-devtest 0.39073 12.3 1012 25087
37
+ por-nld flores200-devtest 0.54828 22.9 1012 25467
38
+ por-tpi flores200-devtest 0.38929 13.7 1012 35240
39
+ spa-afr flores200-devtest 0.51514 17.8 1012 25740
40
+ spa-deu flores200-devtest 0.53603 21.4 1012 25094
41
+ spa-eng flores200-devtest 0.58604 28.2 1012 24721
42
+ spa-lim flores200-devtest 0.28237 3.2 1012 24889
43
+ spa-ltz flores200-devtest 0.35047 8.5 1012 25087
44
+ spa-nld flores200-devtest 0.51244 17.9 1012 25467
45
+ spa-tpi flores200-devtest 0.37967 12.2 1012 35240
46
+ deu-eng generaltest2022 0.55777 30.6 1984 37634
47
+ eng-deu generaltest2022 0.60792 33.0 2037 38914
48
+ fra-deu generaltest2022 0.67039 44.5 2006 37696
49
+ deu-eng multi30k_task2_test_2016 0.21562 4.1 5000 67382
50
+ eng-deu multi30k_task2_test_2016 0.26922 2.6 5000 51501
51
+ deu-eng multi30k_test_2016_flickr 0.60981 40.1 1000 12955
52
+ eng-deu multi30k_test_2016_flickr 0.64153 34.9 1000 12106
53
+ fra-deu multi30k_test_2016_flickr 0.61781 32.1 1000 12106
54
+ fra-eng multi30k_test_2016_flickr 0.66703 47.9 1000 12955
55
+ deu-eng multi30k_test_2017_flickr 0.63624 41.0 1000 11374
56
+ eng-deu multi30k_test_2017_flickr 0.63423 34.6 1000 10755
57
+ fra-deu multi30k_test_2017_flickr 0.60084 29.7 1000 10755
58
+ fra-eng multi30k_test_2017_flickr 0.69254 50.4 1000 11374
59
+ deu-eng multi30k_test_2017_mscoco 0.55790 32.5 461 5231
60
+ eng-deu multi30k_test_2017_mscoco 0.57491 28.6 461 5158
61
+ fra-deu multi30k_test_2017_mscoco 0.56108 26.4 461 5158
62
+ fra-eng multi30k_test_2017_mscoco 0.68212 49.1 461 5231
63
+ deu-eng multi30k_test_2018_flickr 0.59322 36.6 1071 14689
64
+ eng-deu multi30k_test_2018_flickr 0.59858 30.0 1071 13703
65
+ fra-deu multi30k_test_2018_flickr 0.55667 24.7 1071 13703
66
+ fra-eng multi30k_test_2018_flickr 0.64702 43.4 1071 14689
67
+ fra-eng newsdiscusstest2015 0.61399 38.5 1500 26982
68
+ deu-eng newssyscomb2009 0.55180 28.8 502 11818
69
+ eng-deu newssyscomb2009 0.53676 22.9 502 11271
70
+ fra-deu newssyscomb2009 0.53733 23.9 502 11271
71
+ fra-eng newssyscomb2009 0.57219 31.1 502 11818
72
+ spa-deu newssyscomb2009 0.53056 22.0 502 11271
73
+ spa-eng newssyscomb2009 0.57225 30.8 502 11818
74
+ deu-eng newstest2008 0.54506 26.9 2051 49380
75
+ eng-deu newstest2008 0.53077 23.1 2051 47447
76
+ fra-deu newstest2008 0.53204 22.9 2051 47447
77
+ fra-eng newstest2008 0.54320 26.4 2051 49380
78
+ spa-deu newstest2008 0.52066 21.6 2051 47447
79
+ spa-eng newstest2008 0.55305 27.9 2051 49380
80
+ deu-eng newstest2009 0.53773 26.2 2525 65399
81
+ eng-deu newstest2009 0.53217 22.3 2525 62816
82
+ fra-deu newstest2009 0.52995 22.9 2525 62816
83
+ fra-eng newstest2009 0.56663 30.0 2525 65399
84
+ spa-deu newstest2009 0.52586 22.1 2525 62816
85
+ spa-eng newstest2009 0.56756 29.9 2525 65399
86
+ deu-eng newstest2010 0.58365 30.4 2489 61711
87
+ eng-deu newstest2010 0.54917 25.7 2489 61503
88
+ fra-deu newstest2010 0.53904 24.3 2489 61503
89
+ fra-eng newstest2010 0.59241 32.4 2489 61711
90
+ spa-deu newstest2010 0.55378 26.2 2489 61503
91
+ spa-eng newstest2010 0.61316 35.8 2489 61711
92
+ deu-eng newstest2011 0.54907 26.1 3003 74681
93
+ eng-deu newstest2011 0.52873 23.0 3003 72981
94
+ fra-deu newstest2011 0.52977 23.0 3003 72981
95
+ fra-eng newstest2011 0.59565 32.8 3003 74681
96
+ spa-deu newstest2011 0.53095 23.4 3003 72981
97
+ spa-eng newstest2011 0.59513 33.3 3003 74681
98
+ deu-eng newstest2012 0.56230 28.1 3003 72812
99
+ eng-deu newstest2012 0.52871 23.7 3003 72886
100
+ fra-deu newstest2012 0.53035 24.1 3003 72886
101
+ fra-eng newstest2012 0.59137 33.0 3003 72812
102
+ spa-deu newstest2012 0.53438 24.3 3003 72886
103
+ spa-eng newstest2012 0.62058 37.0 3003 72812
104
+ deu-eng newstest2013 0.57940 31.5 3000 64505
105
+ eng-deu newstest2013 0.55718 27.5 3000 63737
106
+ fra-deu newstest2013 0.54408 25.6 3000 63737
107
+ fra-eng newstest2013 0.59151 33.9 3000 64505
108
+ spa-deu newstest2013 0.55215 26.2 3000 63737
109
+ spa-eng newstest2013 0.60465 34.4 3000 64505
110
+ deu-eng newstest2014 0.59723 33.1 3003 67337
111
+ eng-deu newstest2014 0.59127 28.5 3003 62688
112
+ fra-eng newstest2014 0.63411 38.0 3003 70708
113
+ deu-eng newstest2015 0.59799 33.7 2169 46443
114
+ eng-deu newstest2015 0.59977 32.0 2169 44260
115
+ deu-eng newstest2016 0.65039 40.4 2999 64119
116
+ eng-deu newstest2016 0.64144 37.9 2999 62669
117
+ deu-eng newstest2017 0.60921 35.3 3004 64399
118
+ eng-deu newstest2017 0.59114 30.4 3004 61287
119
+ deu-eng newstest2018 0.66680 42.6 2998 67012
120
+ eng-deu newstest2018 0.69428 45.8 2998 64276
121
+ deu-eng newstest2019 0.63482 39.1 2000 39227
122
+ eng-deu newstest2019 0.66430 42.0 1997 48746
123
+ fra-deu newstest2019 0.60993 29.4 1701 36446
124
+ deu-eng newstest2020 0.60403 34.0 785 38220
125
+ eng-deu newstest2020 0.60255 32.3 1418 52383
126
+ fra-deu newstest2020 0.61470 29.2 1619 30265
127
+ deu-eng newstest2021 0.59738 31.9 1000 20180
128
+ eng-deu newstest2021 0.56399 26.1 1002 27970
129
+ fra-deu newstest2021 0.66155 40.0 1026 26077
130
+ deu-eng newstestALL2020 0.60403 34.0 785 38220
131
+ eng-deu newstestALL2020 0.60255 32.3 1418 52383
132
+ deu-eng newstestB2020 0.60520 34.2 785 37696
133
+ eng-deu newstestB2020 0.59226 31.6 1418 53092
134
+ deu-afr ntrex128 0.57109 27.9 1997 50050
135
+ deu-eng ntrex128 0.62043 34.5 1997 47673
136
+ deu-ltz ntrex128 0.47642 15.4 1997 49763
137
+ deu-nld ntrex128 0.56777 27.6 1997 51884
138
+ eng-afr ntrex128 0.68616 44.1 1997 50050
139
+ eng-deu ntrex128 0.58743 30.2 1997 48761
140
+ eng-ltz ntrex128 0.50083 18.0 1997 49763
141
+ eng-nld ntrex128 0.61041 33.8 1997 51884
142
+ fra-afr ntrex128 0.55607 26.5 1997 50050
143
+ fra-deu ntrex128 0.53269 23.6 1997 48761
144
+ fra-eng ntrex128 0.61058 34.4 1997 47673
145
+ fra-ltz ntrex128 0.41312 12.0 1997 49763
146
+ fra-nld ntrex128 0.54615 25.2 1997 51884
147
+ por-afr ntrex128 0.58296 29.2 1997 50050
148
+ por-deu ntrex128 0.54944 24.7 1997 48761
149
+ por-eng ntrex128 0.65002 39.6 1997 47673
150
+ por-ltz ntrex128 0.35212 9.5 1997 49763
151
+ por-nld ntrex128 0.56384 28.1 1997 51884
152
+ spa-afr ntrex128 0.57772 27.7 1997 50050
153
+ spa-deu ntrex128 0.54561 24.0 1997 48761
154
+ spa-eng ntrex128 0.64305 37.3 1997 47673
155
+ spa-ltz ntrex128 0.35540 9.7 1997 49763
156
+ spa-nld ntrex128 0.56397 27.8 1997 51884
157
+ deu-ltz tatoeba-test-v2020-07-28 0.52660 32.3 337 2135
158
+ deu-nds tatoeba-test-v2020-07-28 0.44577 19.8 10000 76144
159
+ deu-yid tatoeba-test-v2020-07-28 0.15568 2.3 556 3425
160
+ eng-deu tatoeba-test-v2020-07-28 0.63613 43.0 10000 83347
161
+ eng-fry tatoeba-test-v2020-07-28 0.40596 12.8 205 1529
162
+ eng-ltz tatoeba-test-v2020-07-28 0.64082 45.2 283 1733
163
+ eng-nld tatoeba-test-v2020-07-28 0.71526 55.2 10000 71436
164
+ eng-yid tatoeba-test-v2020-07-28 0.37185 9.0 1168 8094
165
+ fra-nds tatoeba-test-v2020-07-28 0.39471 15.9 831 5571
166
+ fra-nld tatoeba-test-v2020-07-28 0.66599 48.8 10000 69845
167
+ fra-yid tatoeba-test-v2020-07-28 0.40041 9.8 230 1358
168
+ por-eng tatoeba-test-v2020-07-28 0.76063 62.8 10000 75240
169
+ spa-deu tatoeba-test-v2020-07-28 0.67395 48.3 10000 81214
170
+ spa-eng tatoeba-test-v2020-07-28 0.73258 58.9 10000 79376
171
+ spa-nld tatoeba-test-v2020-07-28 0.68457 50.6 10000 78395
172
+ deu-deu tatoeba-test-v2021-03-30 0.59934 35.0 2500 20806
173
+ deu-frr tatoeba-test-v2021-03-30 0.18528 0.7 279 1861
174
+ deu-gos tatoeba-test-v2021-03-30 0.27291 5.1 210 1140
175
+ deu-ltz tatoeba-test-v2021-03-30 0.52966 33.0 350 2227
176
+ deu-nds tatoeba-test-v2021-03-30 0.44577 19.8 10000 76144
177
+ deu-nld tatoeba-test-v2021-03-30 0.71040 54.1 10124 74568
178
+ deu-yid tatoeba-test-v2021-03-30 0.15883 2.2 830 5207
179
+ eng-deu tatoeba-test-v2021-03-30 0.62946 41.8 12664 107460
180
+ eng-gos tatoeba-test-v2021-03-30 0.21072 2.9 1193 5711
181
+ eng-gsw tatoeba-test-v2021-03-30 0.20552 0.9 210 1013
182
+ eng-ltz tatoeba-test-v2021-03-30 0.64514 45.8 299 1833
183
+ eng-nld tatoeba-test-v2021-03-30 0.70834 54.3 11660 83811
184
+ fra-eng tatoeba-test-v2021-03-30 0.71919 57.3 10892 85143
185
+ por-eng tatoeba-test-v2021-03-30 0.76398 63.4 11574 87523
186
+ spa-deu tatoeba-test-v2021-03-30 0.67408 48.3 10138 82525
187
+ spa-eng tatoeba-test-v2021-03-30 0.73396 59.0 11940 96122
188
+ spa-nld tatoeba-test-v2021-03-30 0.68521 50.7 10083 78945
189
+ spa-yid tatoeba-test-v2021-03-30 0.37513 10.4 336 2076
190
+ deu-afr tatoeba-test-v2021-08-07 0.72039 56.7 1583 9507
191
+ deu-deu tatoeba-test-v2021-08-07 0.59545 33.7 2500 20806
192
+ deu-eng tatoeba-test-v2021-08-07 0.66015 48.6 17565 149462
193
+ deu-frr tatoeba-test-v2021-08-07 0.18304 0.7 278 1855
194
+ deu-gos tatoeba-test-v2021-08-07 0.27546 5.4 207 1135
195
+ deu-ltz tatoeba-test-v2021-08-07 0.53760 34.2 347 2206
196
+ deu-nds tatoeba-test-v2021-08-07 0.44534 20.1 9999 76137
197
+ deu-nld tatoeba-test-v2021-08-07 0.71276 54.4 10218 75235
198
+ deu-swg tatoeba-test-v2021-08-07 0.16214 0.2 1523 15448
199
+ deu-yid tatoeba-test-v2021-08-07 0.15599 1.8 853 5355
200
+ eng-afr tatoeba-test-v2021-08-07 0.72087 56.6 1374 10317
201
+ eng-deu tatoeba-test-v2021-08-07 0.62971 41.4 17565 151568
202
+ eng-eng tatoeba-test-v2021-08-07 0.80306 58.0 12062 115106
203
+ eng-fry tatoeba-test-v2021-08-07 0.40324 13.8 220 1600
204
+ eng-gos tatoeba-test-v2021-08-07 0.20426 1.1 1154 5525
205
+ eng-gsw tatoeba-test-v2021-08-07 0.20266 1.0 205 984
206
+ eng-ltz tatoeba-test-v2021-08-07 0.64423 45.8 293 1828
207
+ eng-nds tatoeba-test-v2021-08-07 0.46446 22.2 2500 18264
208
+ eng-nld tatoeba-test-v2021-08-07 0.71190 54.5 12696 91796
209
+ eng-yid tatoeba-test-v2021-08-07 0.37401 9.5 2483 16395
210
+ fra-deu tatoeba-test-v2021-08-07 0.68991 50.3 12418 100545
211
+ fra-eng tatoeba-test-v2021-08-07 0.72564 58.0 12681 101754
212
+ fra-nds tatoeba-test-v2021-08-07 0.39555 16.2 857 5760
213
+ fra-nld tatoeba-test-v2021-08-07 0.67078 48.7 11548 82164
214
+ fra-yid tatoeba-test-v2021-08-07 0.39100 10.0 384 2381
215
+ por-deu tatoeba-test-v2021-08-07 0.68437 48.7 10000 81246
216
+ por-eng tatoeba-test-v2021-08-07 0.77081 64.3 13222 105351
217
+ por-nds tatoeba-test-v2021-08-07 0.45864 20.7 207 1292
218
+ por-nld tatoeba-test-v2021-08-07 0.69865 52.8 2500 17816
219
+ spa-afr tatoeba-test-v2021-08-07 0.77148 63.3 448 3044
220
+ spa-deu tatoeba-test-v2021-08-07 0.68037 49.1 10521 86430
221
+ spa-eng tatoeba-test-v2021-08-07 0.74575 60.2 16583 138123
222
+ spa-nds tatoeba-test-v2021-08-07 0.43154 18.5 923 5941
223
+ spa-nld tatoeba-test-v2021-08-07 0.68988 51.1 10113 79162
224
+ spa-yid tatoeba-test-v2021-08-07 0.34472 8.9 407 2599
225
+ fra-eng tico19-test 0.62059 39.2 2100 56323
226
+ por-eng tico19-test 0.73896 50.3 2100 56315
227
+ spa-eng tico19-test 0.72923 49.6 2100 56315
benchmark_translations.zip ADDED
File without changes
config.json ADDED
@@ -0,0 +1,41 @@
+ {
+   "_name_or_path": "pytorch-models/opus-mt-tc-bible-big-deu_eng_fra_por_spa-gmw",
+   "activation_dropout": 0.0,
+   "activation_function": "relu",
+   "architectures": [
+     "MarianMTModel"
+   ],
+   "attention_dropout": 0.0,
+   "bos_token_id": 0,
+   "classifier_dropout": 0.0,
+   "d_model": 1024,
+   "decoder_attention_heads": 16,
+   "decoder_ffn_dim": 4096,
+   "decoder_layerdrop": 0.0,
+   "decoder_layers": 6,
+   "decoder_start_token_id": 48182,
+   "decoder_vocab_size": 48183,
+   "dropout": 0.1,
+   "encoder_attention_heads": 16,
+   "encoder_ffn_dim": 4096,
+   "encoder_layerdrop": 0.0,
+   "encoder_layers": 6,
+   "eos_token_id": 453,
+   "forced_eos_token_id": null,
+   "init_std": 0.02,
+   "is_encoder_decoder": true,
+   "max_length": null,
+   "max_position_embeddings": 1024,
+   "model_type": "marian",
+   "normalize_embedding": false,
+   "num_beams": null,
+   "num_hidden_layers": 6,
+   "pad_token_id": 48182,
+   "scale_embedding": true,
+   "share_encoder_decoder_embeddings": true,
+   "static_position_embeddings": true,
+   "torch_dtype": "float32",
+   "transformers_version": "4.45.1",
+   "use_cache": true,
+   "vocab_size": 48183
+ }
generation_config.json ADDED
@@ -0,0 +1,16 @@
+ {
+   "_from_model_config": true,
+   "bad_words_ids": [
+     [
+       48182
+     ]
+   ],
+   "bos_token_id": 0,
+   "decoder_start_token_id": 48182,
+   "eos_token_id": 453,
+   "forced_eos_token_id": 453,
+   "max_length": 512,
+   "num_beams": 4,
+   "pad_token_id": 48182,
+   "transformers_version": "4.45.1"
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a53ea4daf925e53c698deebc12c60052e70a4eb4ba747c7e627ee1fb7468c444
+ size 903009420
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:bf509f969068542a44b9c20dd069e0b141703c029543f1cd4e00a8e7a249b6b6
+ size 903060677
source.spm ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1fc2d5cb53cd6257b69be3f6a959ff6e35bf99c1f14d45ccc2a3c8546650e197
+ size 806669
special_tokens_map.json ADDED
@@ -0,0 +1 @@
+ {"eos_token": "</s>", "unk_token": "<unk>", "pad_token": "<pad>"}
target.spm ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:dc24a79926c7fe53bcdd4dcdef68fbd96d8bc74f5fc41f5d8d4c45ce0207dc9c
+ size 795621
tokenizer_config.json ADDED
@@ -0,0 +1 @@
+ {"source_lang": "deu+eng+fra+por+spa", "target_lang": "gmw", "unk_token": "<unk>", "eos_token": "</s>", "pad_token": "<pad>", "model_max_length": 512, "sp_model_kwargs": {}, "separate_vocabs": false, "special_tokens_map_file": null, "name_or_path": "marian-models/opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-30/deu+eng+fra+por+spa-gmw", "tokenizer_class": "MarianTokenizer"}
vocab.json ADDED
The diff for this file is too large to render. See raw diff