Add multilingual to the language tag
Hi! A PR to add `multilingual` to the language tag so the model is easier to find and reference on the Hub.
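For context: the `language:` front matter changed below is what the Hub's language filter and card parser read, so adding `multilingual` is what makes the model surface under that filter. A minimal sketch of how to verify the change once merged, using the public `huggingface_hub` API:

```python
# Sketch: check how the language metadata edited by this PR is surfaced.
from huggingface_hub import HfApi, ModelCard

# The YAML front matter of README.md is exposed as structured card data.
card = ModelCard.load("Helsinki-NLP/opus-mt-tc-big-gmq-itc")
print(card.data.language)  # should now include "multilingual"

# The same tag drives the Hub's language filter.
api = HfApi()
hits = api.list_models(language="multilingual", search="opus-mt-tc-big-gmq-itc")
print([m.id for m in hits])
```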
README.md
CHANGED
@@ -11,615 +11,270 @@ language:
 - pt
 - ro
 - sv
-
+- multilingual
+license: cc-by-4.0
 tags:
 - translation
 - opus-mt-tc
-
-license: cc-by-4.0
 model-index:
 - name: opus-mt-tc-big-gmq-itc
   results:
   - task:
-      name: Translation dan-cat
       type: translation
-      args: dan-cat
+      name: Translation dan-cat
     dataset:
       name: flores101-devtest
       type: flores_101
       args: dan cat devtest
     metrics:
[... removed old lines 32-142 are truncated in the original diff view: the old-format BLEU/chr-F entries for dan-cat and seven further removed "- task:" blocks of the same shape ...]
+    - type: bleu
+      value: 33.4
+      name: BLEU
+    - type: chrf
+      value: 0.59224
+      name: chr-F
[... the added metrics continue in the same type/value/name shape for all 28 flores101-devtest BLEU/chr-F pairs: 38.3/0.63387, 26.4/0.54446, 25.7/0.55237, 36.9/0.62233, 31.8/0.58235, 24.3/0.52453, 22.7/0.4893, 26.2/0.52704, 18.0/0.45387, 18.6/0.47303, 24.9/0.51381, 21.6/0.48224, 18.1/0.45786, 28.9/0.55984, 33.8/0.60102, 23.4/0.52145, 22.2/0.52619, 32.2/0.58836, 27.6/0.54845, 21.8/0.50661, 32.4/0.58542, 39.3/0.63688, 26.0/0.53989, 25.9/0.55232, 36.5/0.61882, 31.0/0.57419, 23.8/0.52175 ...]
-  - task:
-      name: Translation isl-fra
-      type: translation
-      args: isl-fra
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: isl fra devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 26.2
-    - name: chr-F
-      type: chrf
-      value: 0.52704
[... nineteen more removed flores101-devtest "- task:" blocks of the same shape, each with BLEU/chr-F: isl-glg (18.0/0.45387), isl-ita (18.6/0.47303), isl-por (24.9/0.51381), isl-ron (21.6/0.48224), isl-spa (18.1/0.45786), nob-cat (28.9/0.55984), nob-fra (33.8/0.60102), nob-glg (23.4/0.52145), nob-ita (22.2/0.52619), nob-por (32.2/0.58836), nob-ron (27.6/0.54845), nob-spa (21.8/0.50661), swe-cat (32.4/0.58542), swe-fra (39.3/0.63688), swe-glg (26.0/0.53989), swe-ita (25.9/0.55232), swe-por (36.5/0.61882), swe-ron (31.0/0.57419), swe-spa (23.8/0.52175) ...]
   - task:
-      name: Translation dan-fra
       type: translation
-      args: dan-fra
+      name: Translation dan-fra
     dataset:
       name: tatoeba-test-v2021-08-07
       type: tatoeba_mt
       args: dan-fra
     metrics:
[... removed old lines 452-523 are truncated in the original diff view: the old-format BLEU/chr-F entries for dan-fra and four further removed "- task:" blocks ...]
+    - type: bleu
+      value: 63.8
+      name: BLEU
+    - type: chrf
+      value: 0.76671
+      name: chr-F
[... the added metrics continue for all 12 tatoeba-test-v2021-08-07 BLEU/chr-F pairs: 56.2/0.74658, 57.8/0.74944, 54.8/0.72328, 51.0/0.69354, 49.2/0.66008, 54.4/0.70854, 55.9/0.73672, 59.2/0.73014, 56.6/0.73211, 48.7/0.68146, 55.3/0.71373 ...]
-  - task:
-      name: Translation isl-spa
-      type: translation
-      args: isl-spa
-    dataset:
-      name: tatoeba-test-v2021-08-07
-      type: tatoeba_mt
-      args: isl-spa
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 49.2
-    - name: chr-F
-      type: chrf
-      value: 0.66008
[... six more removed tatoeba-test-v2021-08-07 "- task:" blocks of the same shape: nob-fra (54.4/0.70854), nob-spa (55.9/0.73672), swe-fra (59.2/0.73014), swe-ita (56.6/0.73211), swe-por (48.7/0.68146), swe-spa (55.3/0.71373) ...]
 ---
 # opus-mt-tc-big-gmq-itc
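A note on the `model-index` rewrite above: the Hub parses this YAML into the card's evaluation-results widget, and `huggingface_hub` exposes it as structured data, so the reordered fields can be sanity-checked after merging. A minimal sketch, assuming `ModelCard.load` and `eval_results` from the public card-reading API:

```python
# Sketch: read the card's model-index back as structured eval results.
from huggingface_hub import ModelCard

card = ModelCard.load("Helsinki-NLP/opus-mt-tc-big-gmq-itc")
for r in card.data.eval_results or []:
    # e.g. "translation flores101-devtest bleu 33.4"
    print(r.task_type, r.dataset_name, r.metric_type, r.metric_value)
```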
@@ -675,8 +330,8 @@ A short example code:

 from transformers import MarianMTModel, MarianTokenizer

 src_text = [
-    ">>spa<< Jag är inte religiös.",
-    ">>por<< Livet er for kort til å lære seg tysk."
+    ">>spa<< Jag är inte religiös.",
+    ">>por<< Livet er for kort til å lære seg tysk."
 ]

 model_name = "pytorch-models/opus-mt-tc-big-gmq-itc"
@@ -689,7 +344,7 @@ for t in translated:

 # expected output:
 # No soy religioso.
-# A vida é muito curta para aprender alemão.
+# A vida é muito curta para aprender alemão.
 ```
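With the encoding repaired, the card's complete usage example reads as below. The loading and generation lines in the middle (old lines 683-688) are unchanged by this PR and therefore absent from the diff; they are filled in here from the standard OPUS-MT model-card template, so read this as a sketch rather than a byte-exact copy of the file:

```python
from transformers import MarianMTModel, MarianTokenizer

src_text = [
    ">>spa<< Jag är inte religiös.",
    ">>por<< Livet er for kort til å lære seg tysk."
]

# "pytorch-models/..." is the card's local checkout path; use the Hub id
# "Helsinki-NLP/opus-mt-tc-big-gmq-itc" to download the model directly.
model_name = "pytorch-models/opus-mt-tc-big-gmq-itc"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)
translated = model.generate(**tokenizer(src_text, return_tensors="pt", padding=True))

for t in translated:
    print(tokenizer.decode(t, skip_special_tokens=True))

# expected output:
# No soy religioso.
# A vida é muito curta para aprender alemão.
```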
You can also use OPUS-MT models with the transformers pipelines, for example:
@@ -697,7 +352,7 @@ You can also use OPUS-MT models with the transformers pipelines, for example:

 ```python
 from transformers import pipeline
 pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-big-gmq-itc")
-print(pipe(">>spa<< Jag är inte religiös."))
+print(pipe(">>spa<< Jag är inte religiös."))

 # expected output: No soy religioso.
 ```
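One point worth noting alongside the tag change: a single pipeline instance serves every target language of this multilingual model, selected by the `>>lang<<` prefix. A small sketch; `>>spa<<` and `>>por<<` come from the card's own examples, while `>>ita<<` and `>>fra<<` are assumed from the model's gmq-itc target set rather than shown in this diff:

```python
from transformers import pipeline

# One multilingual pipeline; the >>lang<< token picks the target language.
pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-big-gmq-itc")
for tgt in ("spa", "por", "ita", "fra"):
    out = pipe(f">>{tgt}<< Jag är inte religiös.")
    print(tgt, "->", out[0]["translation_text"])
```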
@@ -762,7 +417,7 @@ print(pipe(">>spa<< Jag är inte religiös."))

 ## Citation Information

-* Publications: [OPUS-MT – Building open translation services for the World](https://aclanthology.org/2020.eamt-1.61/) and [The Tatoeba Translation Challenge – Realistic Data Sets for Low Resource and Multilingual MT](https://aclanthology.org/2020.wmt-1.139/) (Please, cite if you use this model.)
+* Publications: [OPUS-MT – Building open translation services for the World](https://aclanthology.org/2020.eamt-1.61/) and [The Tatoeba Translation Challenge – Realistic Data Sets for Low Resource and Multilingual MT](https://aclanthology.org/2020.wmt-1.139/) (Please, cite if you use this model.)

 ```
 @inproceedings{tiedemann-thottingal-2020-opus,
@@ -792,7 +447,7 @@ print(pipe(">>spa<< Jag är inte religiös."))

 ## Acknowledgements

-The work is supported by the [European Language Grid](https://www.european-language-grid.eu/) as [pilot project 2866](https://live.european-language-grid.eu/catalogue/#/resource/projects/2866), by the [FoTran project](https://www.helsinki.fi/en/researchgroups/natural-language-understanding-with-cross-lingual-grounding), funded by the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement No 771113), and the [MeMAD project](https://memad.eu/), funded by the European Union's Horizon 2020 Research and Innovation Programme under grant agreement No 780069. We are also grateful for the generous computational resources and IT infrastructure provided by [CSC -- IT Center for Science](https://www.csc.fi/), Finland.
+The work is supported by the [European Language Grid](https://www.european-language-grid.eu/) as [pilot project 2866](https://live.european-language-grid.eu/catalogue/#/resource/projects/2866), by the [FoTran project](https://www.helsinki.fi/en/researchgroups/natural-language-understanding-with-cross-lingual-grounding), funded by the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement No 771113), and the [MeMAD project](https://memad.eu/), funded by the European Union's Horizon 2020 Research and Innovation Programme under grant agreement No 780069. We are also grateful for the generous computational resources and IT infrastructure provided by [CSC -- IT Center for Science](https://www.csc.fi/), Finland.

 ## Model conversion info