lbourdois committed on
Commit 7fc2adf
1 Parent(s): 5a09125

Add multilingual to the language tag

Hi! A PR to add multilingual to the language tag to improve referencing.

Files changed (1):
1. README.md +249 -594
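For context: the Hub's language filter only returns models whose front-matter `language` list contains the queried tag, which is what the added `multilingual` entry enables. A minimal sketch, assuming a recent `huggingface_hub` release whose `list_models` accepts `language` and `search` filters:

```python
from huggingface_hub import HfApi

api = HfApi()
# With the tag added, the model is surfaced when filtering for "multilingual".
for m in api.list_models(language="multilingual", search="opus-mt-tc-big-gmq-itc"):
    print(m.id)
```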
README.md CHANGED
@@ -11,615 +11,270 @@ language:
 - pt
 - ro
 - sv
-
 tags:
 - translation
 - opus-mt-tc
-
-license: cc-by-4.0
 model-index:
 - name: opus-mt-tc-big-gmq-itc
   results:
   - task:
-      name: Translation dan-cat
       type: translation
-      args: dan-cat
     dataset:
       name: flores101-devtest
       type: flores_101
       args: dan cat devtest
     metrics:
-    - name: BLEU
-      type: bleu
-      value: 33.4
-    - name: chr-F
-      type: chrf
-      value: 0.59224
-  - task:
-      name: Translation dan-fra
-      type: translation
-      args: dan-fra
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: dan fra devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 38.3
-    - name: chr-F
-      type: chrf
-      value: 0.63387
-  - task:
-      name: Translation dan-glg
-      type: translation
-      args: dan-glg
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: dan glg devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 26.4
-    - name: chr-F
-      type: chrf
-      value: 0.54446
-  - task:
-      name: Translation dan-ita
-      type: translation
-      args: dan-ita
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: dan ita devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 25.7
-    - name: chr-F
-      type: chrf
-      value: 0.55237
-  - task:
-      name: Translation dan-por
-      type: translation
-      args: dan-por
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: dan por devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 36.9
-    - name: chr-F
-      type: chrf
-      value: 0.62233
-  - task:
-      name: Translation dan-ron
-      type: translation
-      args: dan-ron
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: dan ron devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 31.8
-    - name: chr-F
-      type: chrf
-      value: 0.58235
-  - task:
-      name: Translation dan-spa
-      type: translation
-      args: dan-spa
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: dan spa devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 24.3
-    - name: chr-F
-      type: chrf
-      value: 0.52453
-  - task:
-      name: Translation isl-cat
-      type: translation
-      args: isl-cat
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: isl cat devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 22.7
-    - name: chr-F
-      type: chrf
-      value: 0.48930
   - task:
-      name: Translation isl-fra
       type: translation
-      args: isl-fra
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: isl fra devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 26.2
-    - name: chr-F
-      type: chrf
-      value: 0.52704
-  - task:
-      name: Translation isl-glg
-      type: translation
-      args: isl-glg
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: isl glg devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 18.0
-    - name: chr-F
-      type: chrf
-      value: 0.45387
-  - task:
-      name: Translation isl-ita
-      type: translation
-      args: isl-ita
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: isl ita devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 18.6
-    - name: chr-F
-      type: chrf
-      value: 0.47303
-  - task:
-      name: Translation isl-por
-      type: translation
-      args: isl-por
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: isl por devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 24.9
-    - name: chr-F
-      type: chrf
-      value: 0.51381
-  - task:
-      name: Translation isl-ron
-      type: translation
-      args: isl-ron
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: isl ron devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 21.6
-    - name: chr-F
-      type: chrf
-      value: 0.48224
-  - task:
-      name: Translation isl-spa
-      type: translation
-      args: isl-spa
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: isl spa devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 18.1
-    - name: chr-F
-      type: chrf
-      value: 0.45786
-  - task:
-      name: Translation nob-cat
-      type: translation
-      args: nob-cat
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: nob cat devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 28.9
-    - name: chr-F
-      type: chrf
-      value: 0.55984
-  - task:
-      name: Translation nob-fra
-      type: translation
-      args: nob-fra
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: nob fra devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 33.8
-    - name: chr-F
-      type: chrf
-      value: 0.60102
-  - task:
-      name: Translation nob-glg
-      type: translation
-      args: nob-glg
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: nob glg devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 23.4
-    - name: chr-F
-      type: chrf
-      value: 0.52145
-  - task:
-      name: Translation nob-ita
-      type: translation
-      args: nob-ita
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: nob ita devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 22.2
-    - name: chr-F
-      type: chrf
-      value: 0.52619
-  - task:
-      name: Translation nob-por
-      type: translation
-      args: nob-por
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: nob por devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 32.2
-    - name: chr-F
-      type: chrf
-      value: 0.58836
-  - task:
-      name: Translation nob-ron
-      type: translation
-      args: nob-ron
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: nob ron devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 27.6
-    - name: chr-F
-      type: chrf
-      value: 0.54845
-  - task:
-      name: Translation nob-spa
-      type: translation
-      args: nob-spa
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: nob spa devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 21.8
-    - name: chr-F
-      type: chrf
-      value: 0.50661
-  - task:
-      name: Translation swe-cat
-      type: translation
-      args: swe-cat
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: swe cat devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 32.4
-    - name: chr-F
-      type: chrf
-      value: 0.58542
-  - task:
-      name: Translation swe-fra
-      type: translation
-      args: swe-fra
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: swe fra devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 39.3
-    - name: chr-F
-      type: chrf
-      value: 0.63688
-  - task:
-      name: Translation swe-glg
-      type: translation
-      args: swe-glg
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: swe glg devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 26.0
-    - name: chr-F
-      type: chrf
-      value: 0.53989
-  - task:
-      name: Translation swe-ita
-      type: translation
-      args: swe-ita
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: swe ita devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 25.9
-    - name: chr-F
-      type: chrf
-      value: 0.55232
-  - task:
-      name: Translation swe-por
-      type: translation
-      args: swe-por
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: swe por devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 36.5
-    - name: chr-F
-      type: chrf
-      value: 0.61882
-  - task:
-      name: Translation swe-ron
-      type: translation
-      args: swe-ron
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: swe ron devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 31.0
-    - name: chr-F
-      type: chrf
-      value: 0.57419
-  - task:
-      name: Translation swe-spa
-      type: translation
-      args: swe-spa
-    dataset:
-      name: flores101-devtest
-      type: flores_101
-      args: swe spa devtest
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 23.8
-    - name: chr-F
-      type: chrf
-      value: 0.52175
-  - task:
       name: Translation dan-fra
-      type: translation
-      args: dan-fra
     dataset:
       name: tatoeba-test-v2021-08-07
       type: tatoeba_mt
       args: dan-fra
     metrics:
-    - name: BLEU
-      type: bleu
-      value: 63.8
-    - name: chr-F
-      type: chrf
-      value: 0.76671
-  - task:
-      name: Translation dan-ita
-      type: translation
-      args: dan-ita
-    dataset:
-      name: tatoeba-test-v2021-08-07
-      type: tatoeba_mt
-      args: dan-ita
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 56.2
-    - name: chr-F
-      type: chrf
-      value: 0.74658
-  - task:
-      name: Translation dan-por
-      type: translation
-      args: dan-por
-    dataset:
-      name: tatoeba-test-v2021-08-07
-      type: tatoeba_mt
-      args: dan-por
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 57.8
-    - name: chr-F
-      type: chrf
-      value: 0.74944
-  - task:
-      name: Translation dan-spa
-      type: translation
-      args: dan-spa
-    dataset:
-      name: tatoeba-test-v2021-08-07
-      type: tatoeba_mt
-      args: dan-spa
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 54.8
-    - name: chr-F
-      type: chrf
-      value: 0.72328
-  - task:
-      name: Translation isl-ita
-      type: translation
-      args: isl-ita
-    dataset:
-      name: tatoeba-test-v2021-08-07
-      type: tatoeba_mt
-      args: isl-ita
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 51.0
-    - name: chr-F
-      type: chrf
-      value: 0.69354
-  - task:
-      name: Translation isl-spa
-      type: translation
-      args: isl-spa
-    dataset:
-      name: tatoeba-test-v2021-08-07
-      type: tatoeba_mt
-      args: isl-spa
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 49.2
-    - name: chr-F
-      type: chrf
-      value: 0.66008
-  - task:
-      name: Translation nob-fra
-      type: translation
-      args: nob-fra
-    dataset:
-      name: tatoeba-test-v2021-08-07
-      type: tatoeba_mt
-      args: nob-fra
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 54.4
-    - name: chr-F
-      type: chrf
-      value: 0.70854
-  - task:
-      name: Translation nob-spa
-      type: translation
-      args: nob-spa
-    dataset:
-      name: tatoeba-test-v2021-08-07
-      type: tatoeba_mt
-      args: nob-spa
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 55.9
-    - name: chr-F
-      type: chrf
-      value: 0.73672
-  - task:
-      name: Translation swe-fra
-      type: translation
-      args: swe-fra
-    dataset:
-      name: tatoeba-test-v2021-08-07
-      type: tatoeba_mt
-      args: swe-fra
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 59.2
-    - name: chr-F
-      type: chrf
-      value: 0.73014
-  - task:
-      name: Translation swe-ita
-      type: translation
-      args: swe-ita
-    dataset:
-      name: tatoeba-test-v2021-08-07
-      type: tatoeba_mt
-      args: swe-ita
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 56.6
-    - name: chr-F
-      type: chrf
-      value: 0.73211
-  - task:
-      name: Translation swe-por
-      type: translation
-      args: swe-por
-    dataset:
-      name: tatoeba-test-v2021-08-07
-      type: tatoeba_mt
-      args: swe-por
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 48.7
-    - name: chr-F
-      type: chrf
-      value: 0.68146
-  - task:
-      name: Translation swe-spa
-      type: translation
-      args: swe-spa
-    dataset:
-      name: tatoeba-test-v2021-08-07
-      type: tatoeba_mt
-      args: swe-spa
-    metrics:
-    - name: BLEU
-      type: bleu
-      value: 55.3
-    - name: chr-F
-      type: chrf
-      value: 0.71373
 ---
 # opus-mt-tc-big-gmq-itc
 
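Most of this hunk reorders the keys inside each metric entry (`name`/`type`/`value` becomes `type`/`value`/`name`) and merges the per-pair metric lists under a single task block per dataset. Since YAML mappings are unordered, the key reordering itself leaves the parsed metadata unchanged, as a quick check shows (a sketch assuming PyYAML is installed):

```python
import yaml

# The same metric entry before and after the PR's key reordering.
old = yaml.safe_load("- name: BLEU\n  type: bleu\n  value: 33.4")
new = yaml.safe_load("- type: bleu\n  value: 33.4\n  name: BLEU")
assert old == new  # key order is irrelevant to the parsed mapping
```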
@@ -675,8 +330,8 @@ A short example code:
 from transformers import MarianMTModel, MarianTokenizer
 
 src_text = [
-    ">>spa<< Jag är inte religiös.",
-    ">>por<< Livet er for kort til å lære seg tysk."
 ]
 
 model_name = "pytorch-models/opus-mt-tc-big-gmq-itc"
@@ -689,7 +344,7 @@ for t in translated:
 
 # expected output:
 # No soy religioso.
-# A vida é muito curta para aprender alemão.
 ```
 
 You can also use OPUS-MT models with the transformers pipelines, for example:
@@ -697,7 +352,7 @@ You can also use OPUS-MT models with the transformers pipelines, for example:
 ```python
 from transformers import pipeline
 pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-big-gmq-itc")
-print(pipe(">>spa<< Jag är inte religiös."))
 
 # expected output: No soy religioso.
 ```
@@ -762,7 +417,7 @@ print(pipe(">>spa<< Jag är inte religiös."))
 
 ## Citation Information
 
-* Publications: [OPUS-MT – Building open translation services for the World](https://aclanthology.org/2020.eamt-1.61/) and [The Tatoeba Translation Challenge – Realistic Data Sets for Low Resource and Multilingual MT](https://aclanthology.org/2020.wmt-1.139/) (Please, cite if you use this model.)
 
 ```
 @inproceedings{tiedemann-thottingal-2020-opus,
@@ -792,7 +447,7 @@ print(pipe(">>spa<< Jag är inte religiös."))
 
 ## Acknowledgements
 
-The work is supported by the [European Language Grid](https://www.european-language-grid.eu/) as [pilot project 2866](https://live.european-language-grid.eu/catalogue/#/resource/projects/2866), by the [FoTran project](https://www.helsinki.fi/en/researchgroups/natural-language-understanding-with-cross-lingual-grounding), funded by the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement No 771113), and the [MeMAD project](https://memad.eu/), funded by the European Union's Horizon 2020 Research and Innovation Programme under grant agreement No 780069. We are also grateful for the generous computational resources and IT infrastructure provided by [CSC -- IT Center for Science](https://www.csc.fi/), Finland.
 
 ## Model conversion info
 
 
 - pt
 - ro
 - sv
+- multilingual
+license: cc-by-4.0
 tags:
 - translation
 - opus-mt-tc
 model-index:
 - name: opus-mt-tc-big-gmq-itc
   results:
   - task:
       type: translation
+      name: Translation dan-cat
     dataset:
       name: flores101-devtest
       type: flores_101
       args: dan cat devtest
     metrics:
+    - type: bleu
+      value: 33.4
+      name: BLEU
+    - type: chrf
+      value: 0.59224
+      name: chr-F
+    - type: bleu
+      value: 38.3
+      name: BLEU
+    - type: chrf
+      value: 0.63387
+      name: chr-F
+    - type: bleu
+      value: 26.4
+      name: BLEU
+    - type: chrf
+      value: 0.54446
+      name: chr-F
+    - type: bleu
+      value: 25.7
+      name: BLEU
+    - type: chrf
+      value: 0.55237
+      name: chr-F
+    - type: bleu
+      value: 36.9
+      name: BLEU
+    - type: chrf
+      value: 0.62233
+      name: chr-F
+    - type: bleu
+      value: 31.8
+      name: BLEU
+    - type: chrf
+      value: 0.58235
+      name: chr-F
+    - type: bleu
+      value: 24.3
+      name: BLEU
+    - type: chrf
+      value: 0.52453
+      name: chr-F
+    - type: bleu
+      value: 22.7
+      name: BLEU
+    - type: chrf
+      value: 0.4893
+      name: chr-F
+    - type: bleu
+      value: 26.2
+      name: BLEU
+    - type: chrf
+      value: 0.52704
+      name: chr-F
+    - type: bleu
+      value: 18.0
+      name: BLEU
+    - type: chrf
+      value: 0.45387
+      name: chr-F
+    - type: bleu
+      value: 18.6
+      name: BLEU
+    - type: chrf
+      value: 0.47303
+      name: chr-F
+    - type: bleu
+      value: 24.9
+      name: BLEU
+    - type: chrf
+      value: 0.51381
+      name: chr-F
+    - type: bleu
+      value: 21.6
+      name: BLEU
+    - type: chrf
+      value: 0.48224
+      name: chr-F
+    - type: bleu
+      value: 18.1
+      name: BLEU
+    - type: chrf
+      value: 0.45786
+      name: chr-F
+    - type: bleu
+      value: 28.9
+      name: BLEU
+    - type: chrf
+      value: 0.55984
+      name: chr-F
+    - type: bleu
+      value: 33.8
+      name: BLEU
+    - type: chrf
+      value: 0.60102
+      name: chr-F
+    - type: bleu
+      value: 23.4
+      name: BLEU
+    - type: chrf
+      value: 0.52145
+      name: chr-F
+    - type: bleu
+      value: 22.2
+      name: BLEU
+    - type: chrf
+      value: 0.52619
+      name: chr-F
+    - type: bleu
+      value: 32.2
+      name: BLEU
+    - type: chrf
+      value: 0.58836
+      name: chr-F
+    - type: bleu
+      value: 27.6
+      name: BLEU
+    - type: chrf
+      value: 0.54845
+      name: chr-F
+    - type: bleu
+      value: 21.8
+      name: BLEU
+    - type: chrf
+      value: 0.50661
+      name: chr-F
+    - type: bleu
+      value: 32.4
+      name: BLEU
+    - type: chrf
+      value: 0.58542
+      name: chr-F
+    - type: bleu
+      value: 39.3
+      name: BLEU
+    - type: chrf
+      value: 0.63688
+      name: chr-F
+    - type: bleu
+      value: 26.0
+      name: BLEU
+    - type: chrf
+      value: 0.53989
+      name: chr-F
+    - type: bleu
+      value: 25.9
+      name: BLEU
+    - type: chrf
+      value: 0.55232
+      name: chr-F
+    - type: bleu
+      value: 36.5
+      name: BLEU
+    - type: chrf
+      value: 0.61882
+      name: chr-F
+    - type: bleu
+      value: 31.0
+      name: BLEU
+    - type: chrf
+      value: 0.57419
+      name: chr-F
+    - type: bleu
+      value: 23.8
+      name: BLEU
+    - type: chrf
+      value: 0.52175
+      name: chr-F
   - task:
       type: translation
       name: Translation dan-fra
     dataset:
       name: tatoeba-test-v2021-08-07
       type: tatoeba_mt
       args: dan-fra
     metrics:
+    - type: bleu
+      value: 63.8
+      name: BLEU
+    - type: chrf
+      value: 0.76671
+      name: chr-F
+    - type: bleu
+      value: 56.2
+      name: BLEU
+    - type: chrf
+      value: 0.74658
+      name: chr-F
+    - type: bleu
+      value: 57.8
+      name: BLEU
+    - type: chrf
+      value: 0.74944
+      name: chr-F
+    - type: bleu
+      value: 54.8
+      name: BLEU
+    - type: chrf
+      value: 0.72328
+      name: chr-F
+    - type: bleu
+      value: 51.0
+      name: BLEU
+    - type: chrf
+      value: 0.69354
+      name: chr-F
+    - type: bleu
+      value: 49.2
+      name: BLEU
+    - type: chrf
+      value: 0.66008
+      name: chr-F
+    - type: bleu
+      value: 54.4
+      name: BLEU
+    - type: chrf
+      value: 0.70854
+      name: chr-F
+    - type: bleu
+      value: 55.9
+      name: BLEU
+    - type: chrf
+      value: 0.73672
+      name: chr-F
+    - type: bleu
+      value: 59.2
+      name: BLEU
+    - type: chrf
+      value: 0.73014
+      name: chr-F
+    - type: bleu
+      value: 56.6
+      name: BLEU
+    - type: chrf
+      value: 0.73211
+      name: chr-F
+    - type: bleu
+      value: 48.7
+      name: BLEU
+    - type: chrf
+      value: 0.68146
+      name: chr-F
+    - type: bleu
+      value: 55.3
+      name: BLEU
+    - type: chrf
+      value: 0.71373
+      name: chr-F
 ---
 # opus-mt-tc-big-gmq-itc
 
 from transformers import MarianMTModel, MarianTokenizer
 
 src_text = [
+    ">>spa<< Jag är inte religiös.",
+    ">>por<< Livet er for kort til å lære seg tysk."
 ]
 
 model_name = "pytorch-models/opus-mt-tc-big-gmq-itc"
 
 
 # expected output:
 # No soy religioso.
+# A vida é muito curta para aprender alemão.
 ```
 
 You can also use OPUS-MT models with the transformers pipelines, for example:
 
 ```python
 from transformers import pipeline
 pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-big-gmq-itc")
+print(pipe(">>spa<< Jag är inte religiös."))
 
 # expected output: No soy religioso.
 ```
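The hunks above show only the changed lines of the card's example code. For reference, a self-contained version of the full snippet (a sketch: it loads the published Helsinki-NLP checkpoint rather than the local `pytorch-models/` path used in the card, and needs `transformers` plus `sentencepiece` installed):

```python
from transformers import MarianMTModel, MarianTokenizer

src_text = [
    ">>spa<< Jag är inte religiös.",
    ">>por<< Livet er for kort til å lære seg tysk.",
]

# The card loads a local conversion ("pytorch-models/opus-mt-tc-big-gmq-itc");
# the published Hub checkpoint works the same way.
model_name = "Helsinki-NLP/opus-mt-tc-big-gmq-itc"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# The >>xxx<< prefix tokens select the target language of this multilingual model.
translated = model.generate(**tokenizer(src_text, return_tensors="pt", padding=True))

for t in translated:
    print(tokenizer.decode(t, skip_special_tokens=True))

# expected output:
# No soy religioso.
# A vida é muito curta para aprender alemão.
```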
 
 
 ## Citation Information
 
+* Publications: [OPUS-MT – Building open translation services for the World](https://aclanthology.org/2020.eamt-1.61/) and [The Tatoeba Translation Challenge – Realistic Data Sets for Low Resource and Multilingual MT](https://aclanthology.org/2020.wmt-1.139/) (Please, cite if you use this model.)
 
 ```
 @inproceedings{tiedemann-thottingal-2020-opus,
 
 
 ## Acknowledgements
 
+The work is supported by the [European Language Grid](https://www.european-language-grid.eu/) as [pilot project 2866](https://live.european-language-grid.eu/catalogue/#/resource/projects/2866), by the [FoTran project](https://www.helsinki.fi/en/researchgroups/natural-language-understanding-with-cross-lingual-grounding), funded by the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement No 771113), and the [MeMAD project](https://memad.eu/), funded by the European Union's Horizon 2020 Research and Innovation Programme under grant agreement No 780069. We are also grateful for the generous computational resources and IT infrastructure provided by [CSC -- IT Center for Science](https://www.csc.fi/), Finland.
 
 ## Model conversion info