lbourdois committed on
Commit 4f67e31
1 Parent(s): 1d0213b

Add multilingual to the language tag

Hi! A PR to add `multilingual` to the language tag to improve referencing.

Files changed (1)
  1. README.md +838 -1991
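Once the PR is merged, the effect of the change can be checked programmatically. The snippet below is a minimal sketch only, assuming the `huggingface_hub` `ModelCard` API; it is not part of this diff and simply illustrates how the new `multilingual` entry would show up in the card metadata.

```python
# Sketch only: assumes the huggingface_hub ModelCard API is available
# and that this PR has been merged into the model repository.
from huggingface_hub import ModelCard

card = ModelCard.load("Helsinki-NLP/opus-mt-tc-big-itc-itc")
languages = card.data.language or []

# After this PR, "multilingual" should appear alongside the language codes.
print("multilingual" in languages)
```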
README.md CHANGED
@@ -11,2055 +11,902 @@ language:
11
  - pms
12
  - pt
13
  - ro
14
-
15
- tags:
16
- - translation
17
- - opus-mt-tc
18
-
19
- license: cc-by-4.0
20
- model-index:
21
- - name: opus-mt-tc-big-itc-itc
22
- results:
23
- - task:
24
- name: Translation ast-cat
25
- type: translation
26
- args: ast-cat
27
- dataset:
28
- name: flores101-devtest
29
- type: flores_101
30
- args: ast cat devtest
31
- metrics:
32
- - name: BLEU
33
- type: bleu
34
- value: 31.8
35
- - name: chr-F
36
- type: chrf
37
- value: 0.57870
38
- - task:
39
- name: Translation ast-fra
40
- type: translation
41
- args: ast-fra
42
- dataset:
43
- name: flores101-devtest
44
- type: flores_101
45
- args: ast fra devtest
46
- metrics:
47
- - name: BLEU
48
- type: bleu
49
- value: 31.1
50
- - name: chr-F
51
- type: chrf
52
- value: 0.56761
53
- - task:
54
- name: Translation ast-glg
55
- type: translation
56
- args: ast-glg
57
- dataset:
58
- name: flores101-devtest
59
- type: flores_101
60
- args: ast glg devtest
61
- metrics:
62
- - name: BLEU
63
- type: bleu
64
- value: 27.9
65
- - name: chr-F
66
- type: chrf
67
- value: 0.55161
68
- - task:
69
- name: Translation ast-ita
70
- type: translation
71
- args: ast-ita
72
- dataset:
73
- name: flores101-devtest
74
- type: flores_101
75
- args: ast ita devtest
76
- metrics:
77
- - name: BLEU
78
- type: bleu
79
- value: 22.1
80
- - name: chr-F
81
- type: chrf
82
- value: 0.51764
83
- - task:
84
- name: Translation ast-oci
85
- type: translation
86
- args: ast-oci
87
- dataset:
88
- name: flores101-devtest
89
- type: flores_101
90
- args: ast oci devtest
91
- metrics:
92
- - name: BLEU
93
- type: bleu
94
- value: 20.6
95
- - name: chr-F
96
- type: chrf
97
- value: 0.49545
98
- - task:
99
- name: Translation ast-por
100
- type: translation
101
- args: ast-por
102
- dataset:
103
- name: flores101-devtest
104
- type: flores_101
105
- args: ast por devtest
106
- metrics:
107
- - name: BLEU
108
- type: bleu
109
- value: 31.5
110
- - name: chr-F
111
- type: chrf
112
- value: 0.57347
113
- - task:
114
- name: Translation ast-ron
115
- type: translation
116
- args: ast-ron
117
- dataset:
118
- name: flores101-devtest
119
- type: flores_101
120
- args: ast ron devtest
121
- metrics:
122
- - name: BLEU
123
- type: bleu
124
- value: 24.8
125
- - name: chr-F
126
- type: chrf
127
- value: 0.52317
128
- - task:
129
- name: Translation ast-spa
130
- type: translation
131
- args: ast-spa
132
- dataset:
133
- name: flores101-devtest
134
- type: flores_101
135
- args: ast spa devtest
136
- metrics:
137
- - name: BLEU
138
- type: bleu
139
- value: 21.2
140
- - name: chr-F
141
- type: chrf
142
- value: 0.49741
143
- - task:
144
- name: Translation cat-ast
145
- type: translation
146
- args: cat-ast
147
- dataset:
148
- name: flores101-devtest
149
- type: flores_101
150
- args: cat ast devtest
151
- metrics:
152
- - name: BLEU
153
- type: bleu
154
- value: 24.7
155
- - name: chr-F
156
- type: chrf
157
- value: 0.56754
158
- - task:
159
- name: Translation cat-fra
160
- type: translation
161
- args: cat-fra
162
- dataset:
163
- name: flores101-devtest
164
- type: flores_101
165
- args: cat fra devtest
166
- metrics:
167
- - name: BLEU
168
- type: bleu
169
- value: 38.4
170
- - name: chr-F
171
- type: chrf
172
- value: 0.63368
173
- - task:
174
- name: Translation cat-glg
175
- type: translation
176
- args: cat-glg
177
- dataset:
178
- name: flores101-devtest
179
- type: flores_101
180
- args: cat glg devtest
181
- metrics:
182
- - name: BLEU
183
- type: bleu
184
- value: 32.2
185
- - name: chr-F
186
- type: chrf
187
- value: 0.59596
188
- - task:
189
- name: Translation cat-ita
190
- type: translation
191
- args: cat-ita
192
- dataset:
193
- name: flores101-devtest
194
- type: flores_101
195
- args: cat ita devtest
196
- metrics:
197
- - name: BLEU
198
- type: bleu
199
- value: 26.3
200
- - name: chr-F
201
- type: chrf
202
- value: 0.55886
203
- - task:
204
- name: Translation cat-oci
205
- type: translation
206
- args: cat-oci
207
- dataset:
208
- name: flores101-devtest
209
- type: flores_101
210
- args: cat oci devtest
211
- metrics:
212
- - name: BLEU
213
- type: bleu
214
- value: 24.6
215
- - name: chr-F
216
- type: chrf
217
- value: 0.54285
218
- - task:
219
- name: Translation cat-por
220
- type: translation
221
- args: cat-por
222
- dataset:
223
- name: flores101-devtest
224
- type: flores_101
225
- args: cat por devtest
226
- metrics:
227
- - name: BLEU
228
- type: bleu
229
- value: 37.7
230
- - name: chr-F
231
- type: chrf
232
- value: 0.62913
233
- - task:
234
- name: Translation cat-ron
235
- type: translation
236
- args: cat-ron
237
- dataset:
238
- name: flores101-devtest
239
- type: flores_101
240
- args: cat ron devtest
241
- metrics:
242
- - name: BLEU
243
- type: bleu
244
- value: 29.5
245
- - name: chr-F
246
- type: chrf
247
- value: 0.56885
248
- - task:
249
- name: Translation cat-spa
250
- type: translation
251
- args: cat-spa
252
- dataset:
253
- name: flores101-devtest
254
- type: flores_101
255
- args: cat spa devtest
256
- metrics:
257
- - name: BLEU
258
- type: bleu
259
- value: 24.6
260
- - name: chr-F
261
- type: chrf
262
- value: 0.53372
263
- - task:
264
- name: Translation fra-ast
265
- type: translation
266
- args: fra-ast
267
- dataset:
268
- name: flores101-devtest
269
- type: flores_101
270
- args: fra ast devtest
271
- metrics:
272
- - name: BLEU
273
- type: bleu
274
- value: 20.7
275
- - name: chr-F
276
- type: chrf
277
- value: 0.52696
278
- - task:
279
- name: Translation fra-cat
280
- type: translation
281
- args: fra-cat
282
- dataset:
283
- name: flores101-devtest
284
- type: flores_101
285
- args: fra cat devtest
286
- metrics:
287
- - name: BLEU
288
- type: bleu
289
- value: 34.6
290
- - name: chr-F
291
- type: chrf
292
- value: 0.60492
293
- - task:
294
- name: Translation fra-glg
295
- type: translation
296
- args: fra-glg
297
- dataset:
298
- name: flores101-devtest
299
- type: flores_101
300
- args: fra glg devtest
301
- metrics:
302
- - name: BLEU
303
- type: bleu
304
- value: 30.3
305
- - name: chr-F
306
- type: chrf
307
- value: 0.57485
308
- - task:
309
- name: Translation fra-ita
310
- type: translation
311
- args: fra-ita
312
- dataset:
313
- name: flores101-devtest
314
- type: flores_101
315
- args: fra ita devtest
316
- metrics:
317
- - name: BLEU
318
- type: bleu
319
- value: 27.3
320
- - name: chr-F
321
- type: chrf
322
- value: 0.56493
323
- - task:
324
- name: Translation fra-oci
325
- type: translation
326
- args: fra-oci
327
- dataset:
328
- name: flores101-devtest
329
- type: flores_101
330
- args: fra oci devtest
331
- metrics:
332
- - name: BLEU
333
- type: bleu
334
- value: 28.2
335
- - name: chr-F
336
- type: chrf
337
- value: 0.57449
338
- - task:
339
- name: Translation fra-por
340
- type: translation
341
- args: fra-por
342
- dataset:
343
- name: flores101-devtest
344
- type: flores_101
345
- args: fra por devtest
346
- metrics:
347
- - name: BLEU
348
- type: bleu
349
- value: 36.9
350
- - name: chr-F
351
- type: chrf
352
- value: 0.62211
353
- - task:
354
- name: Translation fra-ron
355
- type: translation
356
- args: fra-ron
357
- dataset:
358
- name: flores101-devtest
359
- type: flores_101
360
- args: fra ron devtest
361
- metrics:
362
- - name: BLEU
363
- type: bleu
364
- value: 29.4
365
- - name: chr-F
366
- type: chrf
367
- value: 0.56998
368
- - task:
369
- name: Translation fra-spa
370
- type: translation
371
- args: fra-spa
372
- dataset:
373
- name: flores101-devtest
374
- type: flores_101
375
- args: fra spa devtest
376
- metrics:
377
- - name: BLEU
378
- type: bleu
379
- value: 24.2
380
- - name: chr-F
381
- type: chrf
382
- value: 0.52880
383
- - task:
384
- name: Translation glg-ast
385
- type: translation
386
- args: glg-ast
387
- dataset:
388
- name: flores101-devtest
389
- type: flores_101
390
- args: glg ast devtest
391
- metrics:
392
- - name: BLEU
393
- type: bleu
394
- value: 22.4
395
- - name: chr-F
396
- type: chrf
397
- value: 0.55090
398
- - task:
399
- name: Translation glg-cat
400
- type: translation
401
- args: glg-cat
402
- dataset:
403
- name: flores101-devtest
404
- type: flores_101
405
- args: glg cat devtest
406
- metrics:
407
- - name: BLEU
408
- type: bleu
409
- value: 32.6
410
- - name: chr-F
411
- type: chrf
412
- value: 0.60550
413
- - task:
414
- name: Translation glg-fra
415
- type: translation
416
- args: glg-fra
417
- dataset:
418
- name: flores101-devtest
419
- type: flores_101
420
- args: glg fra devtest
421
- metrics:
422
- - name: BLEU
423
- type: bleu
424
- value: 36.0
425
- - name: chr-F
426
- type: chrf
427
- value: 0.62026
428
- - task:
429
- name: Translation glg-ita
430
- type: translation
431
- args: glg-ita
432
- dataset:
433
- name: flores101-devtest
434
- type: flores_101
435
- args: glg ita devtest
436
- metrics:
437
- - name: BLEU
438
- type: bleu
439
- value: 25.9
440
- - name: chr-F
441
- type: chrf
442
- value: 0.55834
443
- - task:
444
- name: Translation glg-oci
445
- type: translation
446
- args: glg-oci
447
- dataset:
448
- name: flores101-devtest
449
- type: flores_101
450
- args: glg oci devtest
451
- metrics:
452
- - name: BLEU
453
- type: bleu
454
- value: 21.9
455
- - name: chr-F
456
- type: chrf
457
- value: 0.52520
458
- - task:
459
- name: Translation glg-por
460
- type: translation
461
- args: glg-por
462
- dataset:
463
- name: flores101-devtest
464
- type: flores_101
465
- args: glg por devtest
466
- metrics:
467
- - name: BLEU
468
- type: bleu
469
- value: 32.7
470
- - name: chr-F
471
- type: chrf
472
- value: 0.60027
473
- - task:
474
- name: Translation glg-ron
475
- type: translation
476
- args: glg-ron
477
- dataset:
478
- name: flores101-devtest
479
- type: flores_101
480
- args: glg ron devtest
481
- metrics:
482
- - name: BLEU
483
- type: bleu
484
- value: 27.8
485
- - name: chr-F
486
- type: chrf
487
- value: 0.55621
488
- - task:
489
- name: Translation glg-spa
490
- type: translation
491
- args: glg-spa
492
- dataset:
493
- name: flores101-devtest
494
- type: flores_101
495
- args: glg spa devtest
496
- metrics:
497
- - name: BLEU
498
- type: bleu
499
- value: 24.4
500
- - name: chr-F
501
- type: chrf
502
- value: 0.53219
503
- - task:
504
- name: Translation ita-ast
505
- type: translation
506
- args: ita-ast
507
- dataset:
508
- name: flores101-devtest
509
- type: flores_101
510
- args: ita ast devtest
511
- metrics:
512
- - name: BLEU
513
- type: bleu
514
- value: 17.1
515
- - name: chr-F
516
- type: chrf
517
- value: 0.50741
518
- - task:
519
- name: Translation ita-cat
520
- type: translation
521
- args: ita-cat
522
- dataset:
523
- name: flores101-devtest
524
- type: flores_101
525
- args: ita cat devtest
526
- metrics:
527
- - name: BLEU
528
- type: bleu
529
- value: 27.9
530
- - name: chr-F
531
- type: chrf
532
- value: 0.57061
533
- - task:
534
- name: Translation ita-fra
535
- type: translation
536
- args: ita-fra
537
- dataset:
538
- name: flores101-devtest
539
- type: flores_101
540
- args: ita fra devtest
541
- metrics:
542
- - name: BLEU
543
- type: bleu
544
- value: 32.0
545
- - name: chr-F
546
- type: chrf
547
- value: 0.60199
548
- - task:
549
- name: Translation ita-glg
550
- type: translation
551
- args: ita-glg
552
- dataset:
553
- name: flores101-devtest
554
- type: flores_101
555
- args: ita glg devtest
556
- metrics:
557
- - name: BLEU
558
- type: bleu
559
- value: 25.9
560
- - name: chr-F
561
- type: chrf
562
- value: 0.55312
563
- - task:
564
- name: Translation ita-oci
565
- type: translation
566
- args: ita-oci
567
- dataset:
568
- name: flores101-devtest
569
- type: flores_101
570
- args: ita oci devtest
571
- metrics:
572
- - name: BLEU
573
- type: bleu
574
- value: 18.1
575
- - name: chr-F
576
- type: chrf
577
- value: 0.48447
578
- - task:
579
- name: Translation ita-por
580
- type: translation
581
- args: ita-por
582
- dataset:
583
- name: flores101-devtest
584
- type: flores_101
585
- args: ita por devtest
586
- metrics:
587
- - name: BLEU
588
- type: bleu
589
- value: 29.0
590
- - name: chr-F
591
- type: chrf
592
- value: 0.58162
593
- - task:
594
- name: Translation ita-ron
595
- type: translation
596
- args: ita-ron
597
- dataset:
598
- name: flores101-devtest
599
- type: flores_101
600
- args: ita ron devtest
601
- metrics:
602
- - name: BLEU
603
- type: bleu
604
- value: 24.2
605
- - name: chr-F
606
- type: chrf
607
- value: 0.53703
608
- - task:
609
- name: Translation ita-spa
610
- type: translation
611
- args: ita-spa
612
- dataset:
613
- name: flores101-devtest
614
- type: flores_101
615
- args: ita spa devtest
616
- metrics:
617
- - name: BLEU
618
- type: bleu
619
- value: 23.1
620
- - name: chr-F
621
- type: chrf
622
- value: 0.52238
623
- - task:
624
- name: Translation oci-ast
625
- type: translation
626
- args: oci-ast
627
- dataset:
628
- name: flores101-devtest
629
- type: flores_101
630
- args: oci ast devtest
631
- metrics:
632
- - name: BLEU
633
- type: bleu
634
- value: 20.2
635
- - name: chr-F
636
- type: chrf
637
- value: 0.53010
638
- - task:
639
- name: Translation oci-cat
640
- type: translation
641
- args: oci-cat
642
- dataset:
643
- name: flores101-devtest
644
- type: flores_101
645
- args: oci cat devtest
646
- metrics:
647
- - name: BLEU
648
- type: bleu
649
- value: 32.2
650
- - name: chr-F
651
- type: chrf
652
- value: 0.59946
653
- - task:
654
- name: Translation oci-fra
655
- type: translation
656
- args: oci-fra
657
- dataset:
658
- name: flores101-devtest
659
- type: flores_101
660
- args: oci fra devtest
661
- metrics:
662
- - name: BLEU
663
- type: bleu
664
- value: 39.0
665
- - name: chr-F
666
- type: chrf
667
- value: 0.64290
668
- - task:
669
- name: Translation oci-glg
670
- type: translation
671
- args: oci-glg
672
- dataset:
673
- name: flores101-devtest
674
- type: flores_101
675
- args: oci glg devtest
676
- metrics:
677
- - name: BLEU
678
- type: bleu
679
- value: 28.0
680
- - name: chr-F
681
- type: chrf
682
- value: 0.56737
683
- - task:
684
- name: Translation oci-ita
685
- type: translation
686
- args: oci-ita
687
- dataset:
688
- name: flores101-devtest
689
- type: flores_101
690
- args: oci ita devtest
691
- metrics:
692
- - name: BLEU
693
- type: bleu
694
- value: 24.2
695
- - name: chr-F
696
- type: chrf
697
- value: 0.54220
698
- - task:
699
- name: Translation oci-por
700
- type: translation
701
- args: oci-por
702
- dataset:
703
- name: flores101-devtest
704
- type: flores_101
705
- args: oci por devtest
706
- metrics:
707
- - name: BLEU
708
- type: bleu
709
- value: 35.7
710
- - name: chr-F
711
- type: chrf
712
- value: 0.62127
713
- - task:
714
- name: Translation oci-ron
715
- type: translation
716
- args: oci-ron
717
- dataset:
718
- name: flores101-devtest
719
- type: flores_101
720
- args: oci ron devtest
721
- metrics:
722
- - name: BLEU
723
- type: bleu
724
- value: 28.0
725
- - name: chr-F
726
- type: chrf
727
- value: 0.55906
728
- - task:
729
- name: Translation oci-spa
730
- type: translation
731
- args: oci-spa
732
- dataset:
733
- name: flores101-devtest
734
- type: flores_101
735
- args: oci spa devtest
736
- metrics:
737
- - name: BLEU
738
- type: bleu
739
- value: 22.8
740
- - name: chr-F
741
- type: chrf
742
- value: 0.52110
743
- - task:
744
- name: Translation por-ast
745
- type: translation
746
- args: por-ast
747
- dataset:
748
- name: flores101-devtest
749
- type: flores_101
750
- args: por ast devtest
751
- metrics:
752
- - name: BLEU
753
- type: bleu
754
- value: 22.5
755
- - name: chr-F
756
- type: chrf
757
- value: 0.54539
758
- - task:
759
- name: Translation por-cat
760
- type: translation
761
- args: por-cat
762
- dataset:
763
- name: flores101-devtest
764
- type: flores_101
765
- args: por cat devtest
766
- metrics:
767
- - name: BLEU
768
- type: bleu
769
- value: 36.4
770
- - name: chr-F
771
- type: chrf
772
- value: 0.61809
773
- - task:
774
- name: Translation por-fra
775
- type: translation
776
- args: por-fra
777
- dataset:
778
- name: flores101-devtest
779
- type: flores_101
780
- args: por fra devtest
781
- metrics:
782
- - name: BLEU
783
- type: bleu
784
- value: 39.7
785
- - name: chr-F
786
- type: chrf
787
- value: 0.64343
788
- - task:
789
- name: Translation por-glg
790
- type: translation
791
- args: por-glg
792
- dataset:
793
- name: flores101-devtest
794
- type: flores_101
795
- args: por glg devtest
796
- metrics:
797
- - name: BLEU
798
- type: bleu
799
- value: 30.4
800
- - name: chr-F
801
- type: chrf
802
- value: 0.57965
803
- - task:
804
- name: Translation por-ita
805
- type: translation
806
- args: por-ita
807
- dataset:
808
- name: flores101-devtest
809
- type: flores_101
810
- args: por ita devtest
811
- metrics:
812
- - name: BLEU
813
- type: bleu
814
- value: 26.3
815
- - name: chr-F
816
- type: chrf
817
- value: 0.55841
818
- - task:
819
- name: Translation por-oci
820
- type: translation
821
- args: por-oci
822
- dataset:
823
- name: flores101-devtest
824
- type: flores_101
825
- args: por oci devtest
826
- metrics:
827
- - name: BLEU
828
- type: bleu
829
- value: 25.3
830
- - name: chr-F
831
- type: chrf
832
- value: 0.54829
833
- - task:
834
- name: Translation por-ron
835
- type: translation
836
- args: por-ron
837
- dataset:
838
- name: flores101-devtest
839
- type: flores_101
840
- args: por ron devtest
841
- metrics:
842
- - name: BLEU
843
- type: bleu
844
- value: 29.8
845
- - name: chr-F
846
- type: chrf
847
- value: 0.57283
848
- - task:
849
- name: Translation por-spa
850
- type: translation
851
- args: por-spa
852
- dataset:
853
- name: flores101-devtest
854
- type: flores_101
855
- args: por spa devtest
856
- metrics:
857
- - name: BLEU
858
- type: bleu
859
- value: 25.2
860
- - name: chr-F
861
- type: chrf
862
- value: 0.53513
863
- - task:
864
- name: Translation ron-ast
865
- type: translation
866
- args: ron-ast
867
- dataset:
868
- name: flores101-devtest
869
- type: flores_101
870
- args: ron ast devtest
871
- metrics:
872
- - name: BLEU
873
- type: bleu
874
- value: 20.1
875
- - name: chr-F
876
- type: chrf
877
- value: 0.52265
878
- - task:
879
- name: Translation ron-cat
880
- type: translation
881
- args: ron-cat
882
- dataset:
883
- name: flores101-devtest
884
- type: flores_101
885
- args: ron cat devtest
886
- metrics:
887
- - name: BLEU
888
- type: bleu
889
- value: 32.6
890
- - name: chr-F
891
- type: chrf
892
- value: 0.59689
893
- - task:
894
- name: Translation ron-fra
895
- type: translation
896
- args: ron-fra
897
- dataset:
898
- name: flores101-devtest
899
- type: flores_101
900
- args: ron fra devtest
901
- metrics:
902
- - name: BLEU
903
- type: bleu
904
- value: 37.4
905
- - name: chr-F
906
- type: chrf
907
- value: 0.63060
908
- - task:
909
- name: Translation ron-glg
910
- type: translation
911
- args: ron-glg
912
- dataset:
913
- name: flores101-devtest
914
- type: flores_101
915
- args: ron glg devtest
916
- metrics:
917
- - name: BLEU
918
- type: bleu
919
- value: 29.3
920
- - name: chr-F
921
- type: chrf
922
- value: 0.56677
923
- - task:
924
- name: Translation ron-ita
925
- type: translation
926
- args: ron-ita
927
- dataset:
928
- name: flores101-devtest
929
- type: flores_101
930
- args: ron ita devtest
931
- metrics:
932
- - name: BLEU
933
- type: bleu
934
- value: 25.6
935
- - name: chr-F
936
- type: chrf
937
- value: 0.55485
938
- - task:
939
- name: Translation ron-oci
940
- type: translation
941
- args: ron-oci
942
- dataset:
943
- name: flores101-devtest
944
- type: flores_101
945
- args: ron oci devtest
946
- metrics:
947
- - name: BLEU
948
- type: bleu
949
- value: 21.8
950
- - name: chr-F
951
- type: chrf
952
- value: 0.52433
953
- - task:
954
- name: Translation ron-por
955
- type: translation
956
- args: ron-por
957
- dataset:
958
- name: flores101-devtest
959
- type: flores_101
960
- args: ron por devtest
961
- metrics:
962
- - name: BLEU
963
- type: bleu
964
- value: 36.1
965
- - name: chr-F
966
- type: chrf
967
- value: 0.61831
968
- - task:
969
- name: Translation ron-spa
970
- type: translation
971
- args: ron-spa
972
- dataset:
973
- name: flores101-devtest
974
- type: flores_101
975
- args: ron spa devtest
976
- metrics:
977
- - name: BLEU
978
- type: bleu
979
- value: 24.1
980
- - name: chr-F
981
- type: chrf
982
- value: 0.52712
983
- - task:
984
- name: Translation spa-ast
985
- type: translation
986
- args: spa-ast
987
- dataset:
988
- name: flores101-devtest
989
- type: flores_101
990
- args: spa ast devtest
991
- metrics:
992
- - name: BLEU
993
- type: bleu
994
- value: 15.7
995
- - name: chr-F
996
- type: chrf
997
- value: 0.49008
998
- - task:
999
- name: Translation spa-cat
1000
- type: translation
1001
- args: spa-cat
1002
- dataset:
1003
- name: flores101-devtest
1004
- type: flores_101
1005
- args: spa cat devtest
1006
- metrics:
1007
- - name: BLEU
1008
- type: bleu
1009
- value: 23.2
1010
- - name: chr-F
1011
- type: chrf
1012
- value: 0.53905
1013
- - task:
1014
- name: Translation spa-fra
1015
- type: translation
1016
- args: spa-fra
1017
- dataset:
1018
- name: flores101-devtest
1019
- type: flores_101
1020
- args: spa fra devtest
1021
- metrics:
1022
- - name: BLEU
1023
- type: bleu
1024
- value: 27.4
1025
- - name: chr-F
1026
- type: chrf
1027
- value: 0.57078
1028
- - task:
1029
- name: Translation spa-glg
1030
- type: translation
1031
- args: spa-glg
1032
- dataset:
1033
- name: flores101-devtest
1034
- type: flores_101
1035
- args: spa glg devtest
1036
- metrics:
1037
- - name: BLEU
1038
- type: bleu
1039
- value: 22.0
1040
- - name: chr-F
1041
- type: chrf
1042
- value: 0.52563
1043
- - task:
1044
- name: Translation spa-ita
1045
- type: translation
1046
- args: spa-ita
1047
- dataset:
1048
- name: flores101-devtest
1049
- type: flores_101
1050
- args: spa ita devtest
1051
- metrics:
1052
- - name: BLEU
1053
- type: bleu
1054
- value: 22.3
1055
- - name: chr-F
1056
- type: chrf
1057
- value: 0.52783
1058
- - task:
1059
- name: Translation spa-oci
1060
- type: translation
1061
- args: spa-oci
1062
- dataset:
1063
- name: flores101-devtest
1064
- type: flores_101
1065
- args: spa oci devtest
1066
- metrics:
1067
- - name: BLEU
1068
- type: bleu
1069
- value: 16.3
1070
- - name: chr-F
1071
- type: chrf
1072
- value: 0.48064
1073
- - task:
1074
- name: Translation spa-por
1075
- type: translation
1076
- args: spa-por
1077
- dataset:
1078
- name: flores101-devtest
1079
- type: flores_101
1080
- args: spa por devtest
1081
- metrics:
1082
- - name: BLEU
1083
- type: bleu
1084
- value: 25.8
1085
- - name: chr-F
1086
- type: chrf
1087
- value: 0.55736
1088
- - task:
1089
- name: Translation spa-ron
1090
- type: translation
1091
- args: spa-ron
1092
- dataset:
1093
- name: flores101-devtest
1094
- type: flores_101
1095
- args: spa ron devtest
1096
- metrics:
1097
- - name: BLEU
1098
- type: bleu
1099
- value: 21.4
1100
- - name: chr-F
1101
- type: chrf
1102
- value: 0.51623
1103
- - task:
1104
- name: Translation fra-spa
1105
- type: translation
1106
- args: fra-spa
1107
- dataset:
1108
- name: news-test2008
1109
- type: news-test2008
1110
- args: fra-spa
1111
- metrics:
1112
- - name: BLEU
1113
- type: bleu
1114
- value: 33.9
1115
- - name: chr-F
1116
- type: chrf
1117
- value: 0.58939
1118
- - task:
1119
- name: Translation spa-fra
1120
- type: translation
1121
- args: spa-fra
1122
- dataset:
1123
- name: news-test2008
1124
- type: news-test2008
1125
- args: spa-fra
1126
- metrics:
1127
- - name: BLEU
1128
- type: bleu
1129
- value: 32.4
1130
- - name: chr-F
1131
- type: chrf
1132
- value: 0.58695
1133
- - task:
1134
- name: Translation cat-fra
1135
- type: translation
1136
- args: cat-fra
1137
- dataset:
1138
- name: tatoeba-test-v2021-08-07
1139
- type: tatoeba_mt
1140
- args: cat-fra
1141
- metrics:
1142
- - name: BLEU
1143
- type: bleu
1144
- value: 54.6
1145
- - name: chr-F
1146
- type: chrf
1147
- value: 0.71201
1148
- - task:
1149
- name: Translation cat-ita
1150
- type: translation
1151
- args: cat-ita
1152
- dataset:
1153
- name: tatoeba-test-v2021-08-07
1154
- type: tatoeba_mt
1155
- args: cat-ita
1156
- metrics:
1157
- - name: BLEU
1158
- type: bleu
1159
- value: 58.4
1160
- - name: chr-F
1161
- type: chrf
1162
- value: 0.74198
1163
- - task:
1164
- name: Translation cat-por
1165
- type: translation
1166
- args: cat-por
1167
- dataset:
1168
- name: tatoeba-test-v2021-08-07
1169
- type: tatoeba_mt
1170
- args: cat-por
1171
- metrics:
1172
- - name: BLEU
1173
- type: bleu
1174
- value: 57.4
1175
- - name: chr-F
1176
- type: chrf
1177
- value: 0.74930
1178
- - task:
1179
- name: Translation cat-spa
1180
- type: translation
1181
- args: cat-spa
1182
- dataset:
1183
- name: tatoeba-test-v2021-08-07
1184
- type: tatoeba_mt
1185
- args: cat-spa
1186
- metrics:
1187
- - name: BLEU
1188
- type: bleu
1189
- value: 78.1
1190
- - name: chr-F
1191
- type: chrf
1192
- value: 0.87844
1193
- - task:
1194
- name: Translation fra-cat
1195
- type: translation
1196
- args: fra-cat
1197
- dataset:
1198
- name: tatoeba-test-v2021-08-07
1199
- type: tatoeba_mt
1200
- args: fra-cat
1201
- metrics:
1202
- - name: BLEU
1203
- type: bleu
1204
- value: 46.2
1205
- - name: chr-F
1206
- type: chrf
1207
- value: 0.66525
1208
- - task:
1209
- name: Translation fra-ita
1210
- type: translation
1211
- args: fra-ita
1212
- dataset:
1213
- name: tatoeba-test-v2021-08-07
1214
- type: tatoeba_mt
1215
- args: fra-ita
1216
- metrics:
1217
- - name: BLEU
1218
- type: bleu
1219
- value: 53.8
1220
- - name: chr-F
1221
- type: chrf
1222
- value: 0.72742
1223
- - task:
1224
- name: Translation fra-por
1225
- type: translation
1226
- args: fra-por
1227
- dataset:
1228
- name: tatoeba-test-v2021-08-07
1229
- type: tatoeba_mt
1230
- args: fra-por
1231
- metrics:
1232
- - name: BLEU
1233
- type: bleu
1234
- value: 48.6
1235
- - name: chr-F
1236
- type: chrf
1237
- value: 0.68413
1238
- - task:
1239
- name: Translation fra-ron
1240
- type: translation
1241
- args: fra-ron
1242
- dataset:
1243
- name: tatoeba-test-v2021-08-07
1244
- type: tatoeba_mt
1245
- args: fra-ron
1246
- metrics:
1247
- - name: BLEU
1248
- type: bleu
1249
- value: 44.0
1250
- - name: chr-F
1251
- type: chrf
1252
- value: 0.65009
1253
- - task:
1254
- name: Translation fra-spa
1255
- type: translation
1256
- args: fra-spa
1257
- dataset:
1258
- name: tatoeba-test-v2021-08-07
1259
- type: tatoeba_mt
1260
- args: fra-spa
1261
- metrics:
1262
- - name: BLEU
1263
- type: bleu
1264
- value: 54.8
1265
- - name: chr-F
1266
- type: chrf
1267
- value: 0.72080
1268
- - task:
1269
- name: Translation glg-por
1270
- type: translation
1271
- args: glg-por
1272
- dataset:
1273
- name: tatoeba-test-v2021-08-07
1274
- type: tatoeba_mt
1275
- args: glg-por
1276
- metrics:
1277
- - name: BLEU
1278
- type: bleu
1279
- value: 61.1
1280
- - name: chr-F
1281
- type: chrf
1282
- value: 0.76720
1283
- - task:
1284
- name: Translation glg-spa
1285
- type: translation
1286
- args: glg-spa
1287
- dataset:
1288
- name: tatoeba-test-v2021-08-07
1289
- type: tatoeba_mt
1290
- args: glg-spa
1291
- metrics:
1292
- - name: BLEU
1293
- type: bleu
1294
- value: 71.7
1295
- - name: chr-F
1296
- type: chrf
1297
- value: 0.82362
1298
- - task:
1299
- name: Translation ita-cat
1300
- type: translation
1301
- args: ita-cat
1302
- dataset:
1303
- name: tatoeba-test-v2021-08-07
1304
- type: tatoeba_mt
1305
- args: ita-cat
1306
- metrics:
1307
- - name: BLEU
1308
- type: bleu
1309
- value: 56.4
1310
- - name: chr-F
1311
- type: chrf
1312
- value: 0.72529
1313
- - task:
1314
- name: Translation ita-fra
1315
- type: translation
1316
- args: ita-fra
1317
- dataset:
1318
- name: tatoeba-test-v2021-08-07
1319
- type: tatoeba_mt
1320
- args: ita-fra
1321
- metrics:
1322
- - name: BLEU
1323
- type: bleu
1324
- value: 65.2
1325
- - name: chr-F
1326
- type: chrf
1327
- value: 0.77932
1328
- - task:
1329
- name: Translation ita-por
1330
- type: translation
1331
- args: ita-por
1332
- dataset:
1333
- name: tatoeba-test-v2021-08-07
1334
- type: tatoeba_mt
1335
- args: ita-por
1336
- metrics:
1337
- - name: BLEU
1338
- type: bleu
1339
- value: 54.0
1340
- - name: chr-F
1341
- type: chrf
1342
- value: 0.72798
1343
- - task:
1344
- name: Translation ita-ron
1345
- type: translation
1346
- args: ita-ron
1347
- dataset:
1348
- name: tatoeba-test-v2021-08-07
1349
- type: tatoeba_mt
1350
- args: ita-ron
1351
- metrics:
1352
- - name: BLEU
1353
- type: bleu
1354
- value: 51.1
1355
- - name: chr-F
1356
- type: chrf
1357
- value: 0.70814
1358
- - task:
1359
- name: Translation ita-spa
1360
- type: translation
1361
- args: ita-spa
1362
- dataset:
1363
- name: tatoeba-test-v2021-08-07
1364
- type: tatoeba_mt
1365
- args: ita-spa
1366
- metrics:
1367
- - name: BLEU
1368
- type: bleu
1369
- value: 62.9
1370
- - name: chr-F
1371
- type: chrf
1372
- value: 0.77455
1373
- - task:
1374
- name: Translation lad-spa
1375
- type: translation
1376
- args: lad-spa
1377
- dataset:
1378
- name: tatoeba-test-v2021-08-07
1379
- type: tatoeba_mt
1380
- args: lad-spa
1381
- metrics:
1382
- - name: BLEU
1383
- type: bleu
1384
- value: 34.7
1385
- - name: chr-F
1386
- type: chrf
1387
- value: 0.52243
1388
- - task:
1389
- name: Translation lad_Latn-spa
1390
- type: translation
1391
- args: lad_Latn-spa
1392
- dataset:
1393
- name: tatoeba-test-v2021-08-07
1394
- type: tatoeba_mt
1395
- args: lad_Latn-spa
1396
- metrics:
1397
- - name: BLEU
1398
- type: bleu
1399
- value: 42.6
1400
- - name: chr-F
1401
- type: chrf
1402
- value: 0.59363
1403
- - task:
1404
- name: Translation oci-fra
1405
- type: translation
1406
- args: oci-fra
1407
- dataset:
1408
- name: tatoeba-test-v2021-08-07
1409
- type: tatoeba_mt
1410
- args: oci-fra
1411
- metrics:
1412
- - name: BLEU
1413
- type: bleu
1414
- value: 29.6
1415
- - name: chr-F
1416
- type: chrf
1417
- value: 0.49660
1418
- - task:
1419
- name: Translation pms-ita
1420
- type: translation
1421
- args: pms-ita
1422
- dataset:
1423
- name: tatoeba-test-v2021-08-07
1424
- type: tatoeba_mt
1425
- args: pms-ita
1426
- metrics:
1427
- - name: BLEU
1428
- type: bleu
1429
- value: 20.0
1430
- - name: chr-F
1431
- type: chrf
1432
- value: 0.40221
1433
- - task:
1434
- name: Translation por-cat
1435
- type: translation
1436
- args: por-cat
1437
- dataset:
1438
- name: tatoeba-test-v2021-08-07
1439
- type: tatoeba_mt
1440
- args: por-cat
1441
- metrics:
1442
- - name: BLEU
1443
- type: bleu
1444
- value: 52.2
1445
- - name: chr-F
1446
- type: chrf
1447
- value: 0.71146
1448
- - task:
1449
- name: Translation por-fra
1450
- type: translation
1451
- args: por-fra
1452
- dataset:
1453
- name: tatoeba-test-v2021-08-07
1454
- type: tatoeba_mt
1455
- args: por-fra
1456
- metrics:
1457
- - name: BLEU
1458
- type: bleu
1459
- value: 60.9
1460
- - name: chr-F
1461
- type: chrf
1462
- value: 0.75565
1463
- - task:
1464
- name: Translation por-glg
1465
- type: translation
1466
- args: por-glg
1467
- dataset:
1468
- name: tatoeba-test-v2021-08-07
1469
- type: tatoeba_mt
1470
- args: por-glg
1471
- metrics:
1472
- - name: BLEU
1473
- type: bleu
1474
- value: 59.0
1475
- - name: chr-F
1476
- type: chrf
1477
- value: 0.75348
1478
- - task:
1479
- name: Translation por-ita
1480
- type: translation
1481
- args: por-ita
1482
- dataset:
1483
- name: tatoeba-test-v2021-08-07
1484
- type: tatoeba_mt
1485
- args: por-ita
1486
- metrics:
1487
- - name: BLEU
1488
- type: bleu
1489
- value: 58.8
1490
- - name: chr-F
1491
- type: chrf
1492
- value: 0.76883
1493
- - task:
1494
- name: Translation por-ron
1495
- type: translation
1496
- args: por-ron
1497
- dataset:
1498
- name: tatoeba-test-v2021-08-07
1499
- type: tatoeba_mt
1500
- args: por-ron
1501
- metrics:
1502
- - name: BLEU
1503
- type: bleu
1504
- value: 46.6
1505
- - name: chr-F
1506
- type: chrf
1507
- value: 0.67838
1508
- - task:
1509
- name: Translation por-spa
1510
- type: translation
1511
- args: por-spa
1512
- dataset:
1513
- name: tatoeba-test-v2021-08-07
1514
- type: tatoeba_mt
1515
- args: por-spa
1516
- metrics:
1517
- - name: BLEU
1518
- type: bleu
1519
- value: 64.8
1520
- - name: chr-F
1521
- type: chrf
1522
- value: 0.79336
1523
- - task:
1524
- name: Translation ron-fra
1525
- type: translation
1526
- args: ron-fra
1527
- dataset:
1528
- name: tatoeba-test-v2021-08-07
1529
- type: tatoeba_mt
1530
- args: ron-fra
1531
- metrics:
1532
- - name: BLEU
1533
- type: bleu
1534
- value: 55.0
1535
- - name: chr-F
1536
- type: chrf
1537
- value: 0.70307
1538
- - task:
1539
- name: Translation ron-ita
1540
- type: translation
1541
- args: ron-ita
1542
- dataset:
1543
- name: tatoeba-test-v2021-08-07
1544
- type: tatoeba_mt
1545
- args: ron-ita
1546
- metrics:
1547
- - name: BLEU
1548
- type: bleu
1549
- value: 53.7
1550
- - name: chr-F
1551
- type: chrf
1552
- value: 0.73862
1553
- - task:
1554
- name: Translation ron-por
1555
- type: translation
1556
- args: ron-por
1557
- dataset:
1558
- name: tatoeba-test-v2021-08-07
1559
- type: tatoeba_mt
1560
- args: ron-por
1561
- metrics:
1562
- - name: BLEU
1563
- type: bleu
1564
- value: 50.7
1565
- - name: chr-F
1566
- type: chrf
1567
- value: 0.70889
1568
- - task:
1569
- name: Translation ron-spa
1570
- type: translation
1571
- args: ron-spa
1572
- dataset:
1573
- name: tatoeba-test-v2021-08-07
1574
- type: tatoeba_mt
1575
- args: ron-spa
1576
- metrics:
1577
- - name: BLEU
1578
- type: bleu
1579
- value: 57.2
1580
- - name: chr-F
1581
- type: chrf
1582
- value: 0.73529
1583
- - task:
1584
- name: Translation spa-cat
1585
- type: translation
1586
- args: spa-cat
1587
- dataset:
1588
- name: tatoeba-test-v2021-08-07
1589
- type: tatoeba_mt
1590
- args: spa-cat
1591
- metrics:
1592
- - name: BLEU
1593
- type: bleu
1594
- value: 67.9
1595
- - name: chr-F
1596
- type: chrf
1597
- value: 0.82758
1598
- - task:
1599
- name: Translation spa-fra
1600
- type: translation
1601
- args: spa-fra
1602
- dataset:
1603
- name: tatoeba-test-v2021-08-07
1604
- type: tatoeba_mt
1605
- args: spa-fra
1606
- metrics:
1607
- - name: BLEU
1608
- type: bleu
1609
- value: 57.3
1610
- - name: chr-F
1611
- type: chrf
1612
- value: 0.73113
1613
  - task:
1614
- name: Translation spa-glg
1615
  type: translation
1616
- args: spa-glg
1617
  dataset:
1618
- name: tatoeba-test-v2021-08-07
1619
- type: tatoeba_mt
1620
- args: spa-glg
1621
  metrics:
1622
- - name: BLEU
1623
- type: bleu
1624
- value: 63.0
1625
- - name: chr-F
1626
- type: chrf
1627
- value: 0.77332
1628
  - task:
1629
- name: Translation spa-ita
1630
  type: translation
1631
- args: spa-ita
1632
  dataset:
1633
- name: tatoeba-test-v2021-08-07
1634
- type: tatoeba_mt
1635
- args: spa-ita
1636
  metrics:
1637
- - name: BLEU
1638
- type: bleu
1639
- value: 60.3
1640
- - name: chr-F
1641
- type: chrf
1642
- value: 0.77046
1643
  - task:
1644
- name: Translation spa-por
1645
  type: translation
1646
- args: spa-por
1647
  dataset:
1648
  name: tatoeba-test-v2021-08-07
1649
  type: tatoeba_mt
1650
- args: spa-por
1651
  metrics:
1652
- - name: BLEU
1653
- type: bleu
1654
- value: 59.1
1655
- - name: chr-F
1656
- type: chrf
1657
- value: 0.75854
1658
  - task:
1659
- name: Translation spa-ron
1660
  type: translation
1661
- args: spa-ron
1662
- dataset:
1663
- name: tatoeba-test-v2021-08-07
1664
- type: tatoeba_mt
1665
- args: spa-ron
1666
- metrics:
1667
- - name: BLEU
1668
- type: bleu
1669
- value: 45.5
1670
- - name: chr-F
1671
- type: chrf
1672
- value: 0.66679
1673
- - task:
1674
  name: Translation fra-ita
1675
- type: translation
1676
- args: fra-ita
1677
  dataset:
1678
  name: newstest2009
1679
  type: wmt-2009-news
1680
  args: fra-ita
1681
  metrics:
1682
- - name: BLEU
1683
- type: bleu
1684
- value: 31.2
1685
- - name: chr-F
1686
- type: chrf
1687
- value: 0.59764
1688
  - task:
1689
- name: Translation fra-spa
1690
- type: translation
1691
- args: fra-spa
1692
- dataset:
1693
- name: newstest2009
1694
- type: wmt-2009-news
1695
- args: fra-spa
1696
- metrics:
1697
- - name: BLEU
1698
- type: bleu
1699
- value: 32.5
1700
- - name: chr-F
1701
- type: chrf
1702
- value: 0.58829
1703
- - task:
1704
- name: Translation ita-fra
1705
- type: translation
1706
- args: ita-fra
1707
- dataset:
1708
- name: newstest2009
1709
- type: wmt-2009-news
1710
- args: ita-fra
1711
- metrics:
1712
- - name: BLEU
1713
- type: bleu
1714
- value: 31.6
1715
- - name: chr-F
1716
- type: chrf
1717
- value: 0.59084
1718
- - task:
1719
- name: Translation ita-spa
1720
- type: translation
1721
- args: ita-spa
1722
- dataset:
1723
- name: newstest2009
1724
- type: wmt-2009-news
1725
- args: ita-spa
1726
- metrics:
1727
- - name: BLEU
1728
- type: bleu
1729
- value: 33.5
1730
- - name: chr-F
1731
- type: chrf
1732
- value: 0.59669
1733
- - task:
1734
- name: Translation spa-fra
1735
- type: translation
1736
- args: spa-fra
1737
- dataset:
1738
- name: newstest2009
1739
- type: wmt-2009-news
1740
- args: spa-fra
1741
- metrics:
1742
- - name: BLEU
1743
- type: bleu
1744
- value: 32.3
1745
- - name: chr-F
1746
- type: chrf
1747
- value: 0.59096
1748
- - task:
1749
- name: Translation spa-ita
1750
  type: translation
1751
- args: spa-ita
1752
- dataset:
1753
- name: newstest2009
1754
- type: wmt-2009-news
1755
- args: spa-ita
1756
- metrics:
1757
- - name: BLEU
1758
- type: bleu
1759
- value: 33.2
1760
- - name: chr-F
1761
- type: chrf
1762
- value: 0.60783
1763
- - task:
1764
  name: Translation fra-spa
1765
- type: translation
1766
- args: fra-spa
1767
  dataset:
1768
  name: newstest2010
1769
  type: wmt-2010-news
1770
  args: fra-spa
1771
  metrics:
1772
- - name: BLEU
1773
- type: bleu
1774
- value: 37.8
1775
- - name: chr-F
1776
- type: chrf
1777
- value: 0.62250
1778
  - task:
1779
- name: Translation spa-fra
1780
  type: translation
1781
- args: spa-fra
1782
- dataset:
1783
- name: newstest2010
1784
- type: wmt-2010-news
1785
- args: spa-fra
1786
- metrics:
1787
- - name: BLEU
1788
- type: bleu
1789
- value: 36.2
1790
- - name: chr-F
1791
- type: chrf
1792
- value: 0.61953
1793
- - task:
1794
  name: Translation fra-spa
1795
- type: translation
1796
- args: fra-spa
1797
  dataset:
1798
  name: newstest2011
1799
  type: wmt-2011-news
1800
  args: fra-spa
1801
  metrics:
1802
- - name: BLEU
1803
- type: bleu
1804
- value: 39.8
1805
- - name: chr-F
1806
- type: chrf
1807
- value: 0.62953
1808
  - task:
1809
- name: Translation spa-fra
1810
  type: translation
1811
- args: spa-fra
1812
- dataset:
1813
- name: newstest2011
1814
- type: wmt-2011-news
1815
- args: spa-fra
1816
- metrics:
1817
- - name: BLEU
1818
- type: bleu
1819
- value: 34.9
1820
- - name: chr-F
1821
- type: chrf
1822
- value: 0.61130
1823
- - task:
1824
  name: Translation fra-spa
1825
- type: translation
1826
- args: fra-spa
1827
  dataset:
1828
  name: newstest2012
1829
  type: wmt-2012-news
1830
  args: fra-spa
1831
  metrics:
1832
- - name: BLEU
1833
- type: bleu
1834
- value: 39.0
1835
- - name: chr-F
1836
- type: chrf
1837
- value: 0.62397
1838
  - task:
1839
- name: Translation spa-fra
1840
  type: translation
1841
- args: spa-fra
1842
- dataset:
1843
- name: newstest2012
1844
- type: wmt-2012-news
1845
- args: spa-fra
1846
- metrics:
1847
- - name: BLEU
1848
- type: bleu
1849
- value: 34.3
1850
- - name: chr-F
1851
- type: chrf
1852
- value: 0.60927
1853
- - task:
1854
  name: Translation fra-spa
1855
- type: translation
1856
- args: fra-spa
1857
  dataset:
1858
  name: newstest2013
1859
  type: wmt-2013-news
1860
  args: fra-spa
1861
  metrics:
1862
- - name: BLEU
1863
- type: bleu
1864
- value: 34.9
1865
- - name: chr-F
1866
- type: chrf
1867
- value: 0.59312
1868
  - task:
1869
- name: Translation spa-fra
1870
  type: translation
1871
- args: spa-fra
1872
- dataset:
1873
- name: newstest2013
1874
- type: wmt-2013-news
1875
- args: spa-fra
1876
- metrics:
1877
- - name: BLEU
1878
- type: bleu
1879
- value: 33.6
1880
- - name: chr-F
1881
- type: chrf
1882
- value: 0.59468
1883
- - task:
1884
  name: Translation cat-ita
1885
- type: translation
1886
- args: cat-ita
1887
  dataset:
1888
  name: wmt21-ml-wp
1889
  type: wmt21-ml-wp
1890
  args: cat-ita
1891
  metrics:
1892
- - name: BLEU
1893
- type: bleu
1894
- value: 47.8
1895
- - name: chr-F
1896
- type: chrf
1897
- value: 0.69968
1898
- - task:
1899
- name: Translation cat-oci
1900
- type: translation
1901
- args: cat-oci
1902
- dataset:
1903
- name: wmt21-ml-wp
1904
- type: wmt21-ml-wp
1905
- args: cat-oci
1906
- metrics:
1907
- - name: BLEU
1908
- type: bleu
1909
- value: 51.6
1910
- - name: chr-F
1911
- type: chrf
1912
- value: 0.73808
1913
- - task:
1914
- name: Translation cat-ron
1915
- type: translation
1916
- args: cat-ron
1917
- dataset:
1918
- name: wmt21-ml-wp
1919
- type: wmt21-ml-wp
1920
- args: cat-ron
1921
- metrics:
1922
- - name: BLEU
1923
- type: bleu
1924
- value: 29.0
1925
- - name: chr-F
1926
- type: chrf
1927
- value: 0.51178
1928
- - task:
1929
- name: Translation ita-cat
1930
- type: translation
1931
- args: ita-cat
1932
- dataset:
1933
- name: wmt21-ml-wp
1934
- type: wmt21-ml-wp
1935
- args: ita-cat
1936
- metrics:
1937
- - name: BLEU
1938
- type: bleu
1939
- value: 48.9
1940
- - name: chr-F
1941
- type: chrf
1942
- value: 0.70538
1943
- - task:
1944
- name: Translation ita-oci
1945
- type: translation
1946
- args: ita-oci
1947
- dataset:
1948
- name: wmt21-ml-wp
1949
- type: wmt21-ml-wp
1950
- args: ita-oci
1951
- metrics:
1952
- - name: BLEU
1953
- type: bleu
1954
- value: 32.0
1955
- - name: chr-F
1956
- type: chrf
1957
- value: 0.59025
1958
- - task:
1959
- name: Translation ita-ron
1960
- type: translation
1961
- args: ita-ron
1962
- dataset:
1963
- name: wmt21-ml-wp
1964
- type: wmt21-ml-wp
1965
- args: ita-ron
1966
- metrics:
1967
- - name: BLEU
1968
- type: bleu
1969
- value: 28.9
1970
- - name: chr-F
1971
- type: chrf
1972
- value: 0.51261
1973
- - task:
1974
- name: Translation oci-cat
1975
- type: translation
1976
- args: oci-cat
1977
- dataset:
1978
- name: wmt21-ml-wp
1979
- type: wmt21-ml-wp
1980
- args: oci-cat
1981
- metrics:
1982
- - name: BLEU
1983
- type: bleu
1984
- value: 66.1
1985
- - name: chr-F
1986
- type: chrf
1987
- value: 0.80908
1988
- - task:
1989
- name: Translation oci-ita
1990
- type: translation
1991
- args: oci-ita
1992
- dataset:
1993
- name: wmt21-ml-wp
1994
- type: wmt21-ml-wp
1995
- args: oci-ita
1996
- metrics:
1997
- - name: BLEU
1998
- type: bleu
1999
- value: 39.6
2000
- - name: chr-F
2001
- type: chrf
2002
- value: 0.63584
2003
- - task:
2004
- name: Translation oci-ron
2005
- type: translation
2006
- args: oci-ron
2007
- dataset:
2008
- name: wmt21-ml-wp
2009
- type: wmt21-ml-wp
2010
- args: oci-ron
2011
- metrics:
2012
- - name: BLEU
2013
- type: bleu
2014
- value: 24.6
2015
- - name: chr-F
2016
- type: chrf
2017
- value: 0.47384
2018
- - task:
2019
- name: Translation ron-cat
2020
- type: translation
2021
- args: ron-cat
2022
- dataset:
2023
- name: wmt21-ml-wp
2024
- type: wmt21-ml-wp
2025
- args: ron-cat
2026
- metrics:
2027
- - name: BLEU
2028
- type: bleu
2029
- value: 31.1
2030
- - name: chr-F
2031
- type: chrf
2032
- value: 0.52994
2033
- - task:
2034
- name: Translation ron-ita
2035
- type: translation
2036
- args: ron-ita
2037
- dataset:
2038
- name: wmt21-ml-wp
2039
- type: wmt21-ml-wp
2040
- args: ron-ita
2041
- metrics:
2042
- - name: BLEU
2043
- type: bleu
2044
- value: 29.6
2045
- - name: chr-F
2046
- type: chrf
2047
- value: 0.52714
2048
- - task:
2049
- name: Translation ron-oci
2050
- type: translation
2051
- args: ron-oci
2052
- dataset:
2053
- name: wmt21-ml-wp
2054
- type: wmt21-ml-wp
2055
- args: ron-oci
2056
- metrics:
2057
- - name: BLEU
2058
- type: bleu
2059
- value: 21.3
2060
- - name: chr-F
2061
- type: chrf
2062
- value: 0.45932
  ---
  # opus-mt-tc-big-itc-itc
 
@@ -2115,7 +962,7 @@ A short example code:
  from transformers import MarianMTModel, MarianTokenizer
 
  src_text = [
- ">>fra<< Charras anglés?",
  ">>fra<< Vull veure't."
  ]
 
@@ -2137,7 +984,7 @@ You can also use OPUS-MT models with the transformers pipelines, for example:
  ```python
  from transformers import pipeline
  pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-big-itc-itc")
- print(pipe(">>fra<< Charras anglés?"))
 
  # expected output: Conversations anglaises ?
  ```
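For context on the hunks above, which trim the card's short usage example, a minimal end-to-end sketch of the same MarianMT usage is given below. It assumes the standard `transformers` MarianMT API and reuses the repository id and the `>>fra<<` target-language token that appear in the diff; it is illustrative, not the card's verbatim example.

```python
# Minimal sketch of the usage shown in the model card's example
# (assumed transformers MarianMT API; not the card's verbatim code).
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-tc-big-itc-itc"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# The leading >>fra<< token selects French as the target language.
src_text = [">>fra<< Vull veure't."]

batch = tokenizer(src_text, return_tensors="pt", padding=True)
generated = model.generate(**batch)
print([tokenizer.decode(t, skip_special_tokens=True) for t in generated])
```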
@@ -2305,7 +1152,7 @@ print(pipe(">>fra<< Charras anglés?"))
 
  ## Citation Information
 
- * Publications: [OPUS-MT Building open translation services for the World](https://aclanthology.org/2020.eamt-1.61/) and [The Tatoeba Translation Challenge Realistic Data Sets for Low Resource and Multilingual MT](https://aclanthology.org/2020.wmt-1.139/) (Please, cite if you use this model.)
 
  ```
  @inproceedings{tiedemann-thottingal-2020-opus,
@@ -2335,7 +1182,7 @@ print(pipe(">>fra<< Charras anglés?"))
 
  ## Acknowledgements
 
- The work is supported by the [European Language Grid](https://www.european-language-grid.eu/) as [pilot project 2866](https://live.european-language-grid.eu/catalogue/#/resource/projects/2866), by the [FoTran project](https://www.helsinki.fi/en/researchgroups/natural-language-understanding-with-cross-lingual-grounding), funded by the European Research Council (ERC) under the European Unions Horizon 2020 research and innovation programme (grant agreement No 771113), and the [MeMAD project](https://memad.eu/), funded by the European Unions Horizon 2020 Research and Innovation Programme under grant agreement No 780069. We are also grateful for the generous computational resources and IT infrastructure provided by [CSC -- IT Center for Science](https://www.csc.fi/), Finland.
 
  ## Model conversion info
 
 
  - pms
  - pt
  - ro
+ - multilingual
+ license: cc-by-4.0
+ tags:
+ - translation
+ - opus-mt-tc
+ model-index:
+ - name: opus-mt-tc-big-itc-itc
+ results:
22
  - task:
 
23
  type: translation
24
+ name: Translation ast-cat
25
  dataset:
26
+ name: flores101-devtest
27
+ type: flores_101
28
+ args: ast cat devtest
29
  metrics:
30
+ - type: bleu
31
+ value: 31.8
32
+ name: BLEU
33
+ - type: chrf
34
+ value: 0.5787
35
+ name: chr-F
36
+ - type: bleu
37
+ value: 31.1
38
+ name: BLEU
39
+ - type: chrf
40
+ value: 0.56761
41
+ name: chr-F
42
+ - type: bleu
43
+ value: 27.9
44
+ name: BLEU
45
+ - type: chrf
46
+ value: 0.55161
47
+ name: chr-F
48
+ - type: bleu
49
+ value: 22.1
50
+ name: BLEU
51
+ - type: chrf
52
+ value: 0.51764
53
+ name: chr-F
54
+ - type: bleu
55
+ value: 20.6
56
+ name: BLEU
57
+ - type: chrf
58
+ value: 0.49545
59
+ name: chr-F
60
+ - type: bleu
61
+ value: 31.5
62
+ name: BLEU
63
+ - type: chrf
64
+ value: 0.57347
65
+ name: chr-F
66
+ - type: bleu
67
+ value: 24.8
68
+ name: BLEU
69
+ - type: chrf
70
+ value: 0.52317
71
+ name: chr-F
72
+ - type: bleu
73
+ value: 21.2
74
+ name: BLEU
75
+ - type: chrf
76
+ value: 0.49741
77
+ name: chr-F
78
+ - type: bleu
79
+ value: 24.7
80
+ name: BLEU
81
+ - type: chrf
82
+ value: 0.56754
83
+ name: chr-F
84
+ - type: bleu
85
+ value: 38.4
86
+ name: BLEU
87
+ - type: chrf
88
+ value: 0.63368
89
+ name: chr-F
90
+ - type: bleu
91
+ value: 32.2
92
+ name: BLEU
93
+ - type: chrf
94
+ value: 0.59596
95
+ name: chr-F
96
+ - type: bleu
97
+ value: 26.3
98
+ name: BLEU
99
+ - type: chrf
100
+ value: 0.55886
101
+ name: chr-F
102
+ - type: bleu
103
+ value: 24.6
104
+ name: BLEU
105
+ - type: chrf
106
+ value: 0.54285
107
+ name: chr-F
108
+ - type: bleu
109
+ value: 37.7
110
+ name: BLEU
111
+ - type: chrf
112
+ value: 0.62913
113
+ name: chr-F
114
+ - type: bleu
115
+ value: 29.5
116
+ name: BLEU
117
+ - type: chrf
118
+ value: 0.56885
119
+ name: chr-F
120
+ - type: bleu
121
+ value: 24.6
122
+ name: BLEU
123
+ - type: chrf
124
+ value: 0.53372
125
+ name: chr-F
126
+ - type: bleu
127
+ value: 20.7
128
+ name: BLEU
129
+ - type: chrf
130
+ value: 0.52696
131
+ name: chr-F
132
+ - type: bleu
133
+ value: 34.6
134
+ name: BLEU
135
+ - type: chrf
136
+ value: 0.60492
137
+ name: chr-F
138
+ - type: bleu
139
+ value: 30.3
140
+ name: BLEU
141
+ - type: chrf
142
+ value: 0.57485
143
+ name: chr-F
144
+ - type: bleu
145
+ value: 27.3
146
+ name: BLEU
147
+ - type: chrf
148
+ value: 0.56493
149
+ name: chr-F
150
+ - type: bleu
151
+ value: 28.2
152
+ name: BLEU
153
+ - type: chrf
154
+ value: 0.57449
155
+ name: chr-F
156
+ - type: bleu
157
+ value: 36.9
158
+ name: BLEU
159
+ - type: chrf
160
+ value: 0.62211
161
+ name: chr-F
162
+ - type: bleu
163
+ value: 29.4
164
+ name: BLEU
165
+ - type: chrf
166
+ value: 0.56998
167
+ name: chr-F
168
+ - type: bleu
169
+ value: 24.2
170
+ name: BLEU
171
+ - type: chrf
172
+ value: 0.5288
173
+ name: chr-F
174
+ - type: bleu
175
+ value: 22.4
176
+ name: BLEU
177
+ - type: chrf
178
+ value: 0.5509
179
+ name: chr-F
180
+ - type: bleu
181
+ value: 32.6
182
+ name: BLEU
183
+ - type: chrf
184
+ value: 0.6055
185
+ name: chr-F
186
+ - type: bleu
187
+ value: 36.0
188
+ name: BLEU
189
+ - type: chrf
190
+ value: 0.62026
191
+ name: chr-F
192
+ - type: bleu
193
+ value: 25.9
194
+ name: BLEU
195
+ - type: chrf
196
+ value: 0.55834
197
+ name: chr-F
198
+ - type: bleu
199
+ value: 21.9
200
+ name: BLEU
201
+ - type: chrf
202
+ value: 0.5252
203
+ name: chr-F
204
+ - type: bleu
205
+ value: 32.7
206
+ name: BLEU
207
+ - type: chrf
208
+ value: 0.60027
209
+ name: chr-F
210
+ - type: bleu
211
+ value: 27.8
212
+ name: BLEU
213
+ - type: chrf
214
+ value: 0.55621
215
+ name: chr-F
216
+ - type: bleu
217
+ value: 24.4
218
+ name: BLEU
219
+ - type: chrf
220
+ value: 0.53219
221
+ name: chr-F
222
+ - type: bleu
223
+ value: 17.1
224
+ name: BLEU
225
+ - type: chrf
226
+ value: 0.50741
227
+ name: chr-F
228
+ - type: bleu
229
+ value: 27.9
230
+ name: BLEU
231
+ - type: chrf
232
+ value: 0.57061
233
+ name: chr-F
234
+ - type: bleu
235
+ value: 32.0
236
+ name: BLEU
237
+ - type: chrf
238
+ value: 0.60199
239
+ name: chr-F
240
+ - type: bleu
241
+ value: 25.9
242
+ name: BLEU
243
+ - type: chrf
244
+ value: 0.55312
245
+ name: chr-F
246
+ - type: bleu
247
+ value: 18.1
248
+ name: BLEU
249
+ - type: chrf
250
+ value: 0.48447
251
+ name: chr-F
252
+ - type: bleu
253
+ value: 29.0
254
+ name: BLEU
255
+ - type: chrf
256
+ value: 0.58162
257
+ name: chr-F
258
+ - type: bleu
259
+ value: 24.2
260
+ name: BLEU
261
+ - type: chrf
262
+ value: 0.53703
263
+ name: chr-F
264
+ - type: bleu
265
+ value: 23.1
266
+ name: BLEU
267
+ - type: chrf
268
+ value: 0.52238
269
+ name: chr-F
270
+ - type: bleu
271
+ value: 20.2
272
+ name: BLEU
273
+ - type: chrf
274
+ value: 0.5301
275
+ name: chr-F
276
+ - type: bleu
277
+ value: 32.2
278
+ name: BLEU
279
+ - type: chrf
280
+ value: 0.59946
281
+ name: chr-F
282
+ - type: bleu
283
+ value: 39.0
284
+ name: BLEU
285
+ - type: chrf
286
+ value: 0.6429
287
+ name: chr-F
288
+ - type: bleu
289
+ value: 28.0
290
+ name: BLEU
291
+ - type: chrf
292
+ value: 0.56737
293
+ name: chr-F
294
+ - type: bleu
295
+ value: 24.2
296
+ name: BLEU
297
+ - type: chrf
298
+ value: 0.5422
299
+ name: chr-F
300
+ - type: bleu
301
+ value: 35.7
302
+ name: BLEU
303
+ - type: chrf
304
+ value: 0.62127
305
+ name: chr-F
306
+ - type: bleu
307
+ value: 28.0
308
+ name: BLEU
309
+ - type: chrf
310
+ value: 0.55906
311
+ name: chr-F
312
+ - type: bleu
313
+ value: 22.8
314
+ name: BLEU
315
+ - type: chrf
316
+ value: 0.5211
317
+ name: chr-F
318
+ - type: bleu
319
+ value: 22.5
320
+ name: BLEU
321
+ - type: chrf
322
+ value: 0.54539
323
+ name: chr-F
324
+ - type: bleu
325
+ value: 36.4
326
+ name: BLEU
327
+ - type: chrf
328
+ value: 0.61809
329
+ name: chr-F
330
+ - type: bleu
331
+ value: 39.7
332
+ name: BLEU
333
+ - type: chrf
334
+ value: 0.64343
335
+ name: chr-F
336
+ - type: bleu
337
+ value: 30.4
338
+ name: BLEU
339
+ - type: chrf
340
+ value: 0.57965
341
+ name: chr-F
342
+ - type: bleu
343
+ value: 26.3
344
+ name: BLEU
345
+ - type: chrf
346
+ value: 0.55841
347
+ name: chr-F
348
+ - type: bleu
349
+ value: 25.3
350
+ name: BLEU
351
+ - type: chrf
352
+ value: 0.54829
353
+ name: chr-F
354
+ - type: bleu
355
+ value: 29.8
356
+ name: BLEU
357
+ - type: chrf
358
+ value: 0.57283
359
+ name: chr-F
360
+ - type: bleu
361
+ value: 25.2
362
+ name: BLEU
363
+ - type: chrf
364
+ value: 0.53513
365
+ name: chr-F
366
+ - type: bleu
367
+ value: 20.1
368
+ name: BLEU
369
+ - type: chrf
370
+ value: 0.52265
371
+ name: chr-F
372
+ - type: bleu
373
+ value: 32.6
374
+ name: BLEU
375
+ - type: chrf
376
+ value: 0.59689
377
+ name: chr-F
378
+ - type: bleu
379
+ value: 37.4
380
+ name: BLEU
381
+ - type: chrf
382
+ value: 0.6306
383
+ name: chr-F
384
+ - type: bleu
385
+ value: 29.3
386
+ name: BLEU
387
+ - type: chrf
388
+ value: 0.56677
389
+ name: chr-F
390
+ - type: bleu
391
+ value: 25.6
392
+ name: BLEU
393
+ - type: chrf
394
+ value: 0.55485
395
+ name: chr-F
396
+ - type: bleu
397
+ value: 21.8
398
+ name: BLEU
399
+ - type: chrf
400
+ value: 0.52433
401
+ name: chr-F
402
+ - type: bleu
403
+ value: 36.1
404
+ name: BLEU
405
+ - type: chrf
406
+ value: 0.61831
407
+ name: chr-F
408
+ - type: bleu
409
+ value: 24.1
410
+ name: BLEU
411
+ - type: chrf
412
+ value: 0.52712
413
+ name: chr-F
414
+ - type: bleu
415
+ value: 15.7
416
+ name: BLEU
417
+ - type: chrf
418
+ value: 0.49008
419
+ name: chr-F
420
+ - type: bleu
421
+ value: 23.2
422
+ name: BLEU
423
+ - type: chrf
424
+ value: 0.53905
425
+ name: chr-F
426
+ - type: bleu
427
+ value: 27.4
428
+ name: BLEU
429
+ - type: chrf
430
+ value: 0.57078
431
+ name: chr-F
432
+ - type: bleu
433
+ value: 22.0
434
+ name: BLEU
435
+ - type: chrf
436
+ value: 0.52563
437
+ name: chr-F
438
+ - type: bleu
439
+ value: 22.3
440
+ name: BLEU
441
+ - type: chrf
442
+ value: 0.52783
443
+ name: chr-F
444
+ - type: bleu
445
+ value: 16.3
446
+ name: BLEU
447
+ - type: chrf
448
+ value: 0.48064
449
+ name: chr-F
450
+ - type: bleu
451
+ value: 25.8
452
+ name: BLEU
453
+ - type: chrf
454
+ value: 0.55736
455
+ name: chr-F
456
+ - type: bleu
457
+ value: 21.4
458
+ name: BLEU
459
+ - type: chrf
460
+ value: 0.51623
461
+ name: chr-F
462
  - task:
 
463
  type: translation
464
+ name: Translation fra-spa
465
  dataset:
466
+ name: news-test2008
467
+ type: news-test2008
468
+ args: fra-spa
469
  metrics:
470
+ - type: bleu
471
+ value: 33.9
472
+ name: BLEU
473
+ - type: chrf
474
+ value: 0.58939
475
+ name: chr-F
476
+ - type: bleu
477
+ value: 32.4
478
+ name: BLEU
479
+ - type: chrf
480
+ value: 0.58695
481
+ name: chr-F
482
  - task:
 
483
  type: translation
484
+ name: Translation cat-fra
485
  dataset:
486
  name: tatoeba-test-v2021-08-07
487
  type: tatoeba_mt
488
+ args: cat-fra
489
  metrics:
490
+ - type: bleu
491
+ value: 54.6
492
+ name: BLEU
493
+ - type: chrf
494
+ value: 0.71201
495
+ name: chr-F
496
+ - type: bleu
497
+ value: 58.4
498
+ name: BLEU
499
+ - type: chrf
500
+ value: 0.74198
501
+ name: chr-F
502
+ - type: bleu
503
+ value: 57.4
504
+ name: BLEU
505
+ - type: chrf
506
+ value: 0.7493
507
+ name: chr-F
508
+ - type: bleu
509
+ value: 78.1
510
+ name: BLEU
511
+ - type: chrf
512
+ value: 0.87844
513
+ name: chr-F
514
+ - type: bleu
515
+ value: 46.2
516
+ name: BLEU
517
+ - type: chrf
518
+ value: 0.66525
519
+ name: chr-F
520
+ - type: bleu
521
+ value: 53.8
522
+ name: BLEU
523
+ - type: chrf
524
+ value: 0.72742
525
+ name: chr-F
526
+ - type: bleu
527
+ value: 48.6
528
+ name: BLEU
529
+ - type: chrf
530
+ value: 0.68413
531
+ name: chr-F
532
+ - type: bleu
533
+ value: 44.0
534
+ name: BLEU
535
+ - type: chrf
536
+ value: 0.65009
537
+ name: chr-F
538
+ - type: bleu
539
+ value: 54.8
540
+ name: BLEU
541
+ - type: chrf
542
+ value: 0.7208
543
+ name: chr-F
544
+ - type: bleu
545
+ value: 61.1
546
+ name: BLEU
547
+ - type: chrf
548
+ value: 0.7672
549
+ name: chr-F
550
+ - type: bleu
551
+ value: 71.7
552
+ name: BLEU
553
+ - type: chrf
554
+ value: 0.82362
555
+ name: chr-F
556
+ - type: bleu
557
+ value: 56.4
558
+ name: BLEU
559
+ - type: chrf
560
+ value: 0.72529
561
+ name: chr-F
562
+ - type: bleu
563
+ value: 65.2
564
+ name: BLEU
565
+ - type: chrf
566
+ value: 0.77932
567
+ name: chr-F
568
+ - type: bleu
569
+ value: 54.0
570
+ name: BLEU
571
+ - type: chrf
572
+ value: 0.72798
573
+ name: chr-F
574
+ - type: bleu
575
+ value: 51.1
576
+ name: BLEU
577
+ - type: chrf
578
+ value: 0.70814
579
+ name: chr-F
580
+ - type: bleu
581
+ value: 62.9
582
+ name: BLEU
583
+ - type: chrf
584
+ value: 0.77455
585
+ name: chr-F
586
+ - type: bleu
587
+ value: 34.7
588
+ name: BLEU
589
+ - type: chrf
590
+ value: 0.52243
591
+ name: chr-F
592
+ - type: bleu
593
+ value: 42.6
594
+ name: BLEU
595
+ - type: chrf
596
+ value: 0.59363
597
+ name: chr-F
598
+ - type: bleu
599
+ value: 29.6
600
+ name: BLEU
601
+ - type: chrf
602
+ value: 0.4966
603
+ name: chr-F
604
+ - type: bleu
605
+ value: 20.0
606
+ name: BLEU
607
+ - type: chrf
608
+ value: 0.40221
609
+ name: chr-F
610
+ - type: bleu
611
+ value: 52.2
612
+ name: BLEU
613
+ - type: chrf
614
+ value: 0.71146
615
+ name: chr-F
616
+ - type: bleu
617
+ value: 60.9
618
+ name: BLEU
619
+ - type: chrf
620
+ value: 0.75565
621
+ name: chr-F
622
+ - type: bleu
623
+ value: 59.0
624
+ name: BLEU
625
+ - type: chrf
626
+ value: 0.75348
627
+ name: chr-F
628
+ - type: bleu
629
+ value: 58.8
630
+ name: BLEU
631
+ - type: chrf
632
+ value: 0.76883
633
+ name: chr-F
634
+ - type: bleu
635
+ value: 46.6
636
+ name: BLEU
637
+ - type: chrf
638
+ value: 0.67838
639
+ name: chr-F
640
+ - type: bleu
641
+ value: 64.8
642
+ name: BLEU
643
+ - type: chrf
644
+ value: 0.79336
645
+ name: chr-F
646
+ - type: bleu
647
+ value: 55.0
648
+ name: BLEU
649
+ - type: chrf
650
+ value: 0.70307
651
+ name: chr-F
652
+ - type: bleu
653
+ value: 53.7
654
+ name: BLEU
655
+ - type: chrf
656
+ value: 0.73862
657
+ name: chr-F
658
+ - type: bleu
659
+ value: 50.7
660
+ name: BLEU
661
+ - type: chrf
662
+ value: 0.70889
663
+ name: chr-F
664
+ - type: bleu
665
+ value: 57.2
666
+ name: BLEU
667
+ - type: chrf
668
+ value: 0.73529
669
+ name: chr-F
670
+ - type: bleu
671
+ value: 67.9
672
+ name: BLEU
673
+ - type: chrf
674
+ value: 0.82758
675
+ name: chr-F
676
+ - type: bleu
677
+ value: 57.3
678
+ name: BLEU
679
+ - type: chrf
680
+ value: 0.73113
681
+ name: chr-F
682
+ - type: bleu
683
+ value: 63.0
684
+ name: BLEU
685
+ - type: chrf
686
+ value: 0.77332
687
+ name: chr-F
688
+ - type: bleu
689
+ value: 60.3
690
+ name: BLEU
691
+ - type: chrf
692
+ value: 0.77046
693
+ name: chr-F
694
+ - type: bleu
695
+ value: 59.1
696
+ name: BLEU
697
+ - type: chrf
698
+ value: 0.75854
699
+ name: chr-F
700
+ - type: bleu
701
+ value: 45.5
702
+ name: BLEU
703
+ - type: chrf
704
+ value: 0.66679
705
+ name: chr-F
706
  - task:
 
707
  type: translation
708
  name: Translation fra-ita
709
  dataset:
710
  name: newstest2009
711
  type: wmt-2009-news
712
  args: fra-ita
713
  metrics:
714
+ - type: bleu
715
+ value: 31.2
716
+ name: BLEU
717
+ - type: chrf
718
+ value: 0.59764
719
+ name: chr-F
720
+ - type: bleu
721
+ value: 32.5
722
+ name: BLEU
723
+ - type: chrf
724
+ value: 0.58829
725
+ name: chr-F
726
+ - type: bleu
727
+ value: 31.6
728
+ name: BLEU
729
+ - type: chrf
730
+ value: 0.59084
731
+ name: chr-F
732
+ - type: bleu
733
+ value: 33.5
734
+ name: BLEU
735
+ - type: chrf
736
+ value: 0.59669
737
+ name: chr-F
738
+ - type: bleu
739
+ value: 32.3
740
+ name: BLEU
741
+ - type: chrf
742
+ value: 0.59096
743
+ name: chr-F
744
+ - type: bleu
745
+ value: 33.2
746
+ name: BLEU
747
+ - type: chrf
748
+ value: 0.60783
749
+ name: chr-F
750
  - task:
 
751
  type: translation
752
  name: Translation fra-spa
753
  dataset:
754
  name: newstest2010
755
  type: wmt-2010-news
756
  args: fra-spa
757
  metrics:
758
+ - type: bleu
759
+ value: 37.8
760
+ name: BLEU
761
+ - type: chrf
762
+ value: 0.6225
763
+ name: chr-F
764
+ - type: bleu
765
+ value: 36.2
766
+ name: BLEU
767
+ - type: chrf
768
+ value: 0.61953
769
+ name: chr-F
770
  - task:
 
771
  type: translation
772
  name: Translation fra-spa
773
  dataset:
774
  name: newstest2011
775
  type: wmt-2011-news
776
  args: fra-spa
777
  metrics:
778
+ - type: bleu
779
+ value: 39.8
780
+ name: BLEU
781
+ - type: chrf
782
+ value: 0.62953
783
+ name: chr-F
784
+ - type: bleu
785
+ value: 34.9
786
+ name: BLEU
787
+ - type: chrf
788
+ value: 0.6113
789
+ name: chr-F
790
  - task:
 
791
  type: translation
792
  name: Translation fra-spa
793
  dataset:
794
  name: newstest2012
795
  type: wmt-2012-news
796
  args: fra-spa
797
  metrics:
798
+ - type: bleu
799
+ value: 39.0
800
+ name: BLEU
801
+ - type: chrf
802
+ value: 0.62397
803
+ name: chr-F
804
+ - type: bleu
805
+ value: 34.3
806
+ name: BLEU
807
+ - type: chrf
808
+ value: 0.60927
809
+ name: chr-F
810
  - task:
 
811
  type: translation
812
  name: Translation fra-spa
813
  dataset:
814
  name: newstest2013
815
  type: wmt-2013-news
816
  args: fra-spa
817
  metrics:
818
+ - type: bleu
819
+ value: 34.9
820
+ name: BLEU
821
+ - type: chrf
822
+ value: 0.59312
823
+ name: chr-F
824
+ - type: bleu
825
+ value: 33.6
826
+ name: BLEU
827
+ - type: chrf
828
+ value: 0.59468
829
+ name: chr-F
830
  - task:
 
831
  type: translation
832
  name: Translation cat-ita
833
  dataset:
834
  name: wmt21-ml-wp
835
  type: wmt21-ml-wp
836
  args: cat-ita
837
  metrics:
838
+ - type: bleu
839
+ value: 47.8
840
+ name: BLEU
841
+ - type: chrf
842
+ value: 0.69968
843
+ name: chr-F
844
+ - type: bleu
845
+ value: 51.6
846
+ name: BLEU
847
+ - type: chrf
848
+ value: 0.73808
849
+ name: chr-F
850
+ - type: bleu
851
+ value: 29.0
852
+ name: BLEU
853
+ - type: chrf
854
+ value: 0.51178
855
+ name: chr-F
856
+ - type: bleu
857
+ value: 48.9
858
+ name: BLEU
859
+ - type: chrf
860
+ value: 0.70538
861
+ name: chr-F
862
+ - type: bleu
863
+ value: 32.0
864
+ name: BLEU
865
+ - type: chrf
866
+ value: 0.59025
867
+ name: chr-F
868
+ - type: bleu
869
+ value: 28.9
870
+ name: BLEU
871
+ - type: chrf
872
+ value: 0.51261
873
+ name: chr-F
874
+ - type: bleu
875
+ value: 66.1
876
+ name: BLEU
877
+ - type: chrf
878
+ value: 0.80908
879
+ name: chr-F
880
+ - type: bleu
881
+ value: 39.6
882
+ name: BLEU
883
+ - type: chrf
884
+ value: 0.63584
885
+ name: chr-F
886
+ - type: bleu
887
+ value: 24.6
888
+ name: BLEU
889
+ - type: chrf
890
+ value: 0.47384
891
+ name: chr-F
892
+ - type: bleu
893
+ value: 31.1
894
+ name: BLEU
895
+ - type: chrf
896
+ value: 0.52994
897
+ name: chr-F
898
+ - type: bleu
899
+ value: 29.6
900
+ name: BLEU
901
+ - type: chrf
902
+ value: 0.52714
903
+ name: chr-F
904
+ - type: bleu
905
+ value: 21.3
906
+ name: BLEU
907
+ - type: chrf
908
+ value: 0.45932
909
+ name: chr-F
910
  ---
911
  # opus-mt-tc-big-itc-itc
912
 
 
962
  from transformers import MarianMTModel, MarianTokenizer
963
 
964
  src_text = [
965
+ ">>fra<< Charras anglés?",
966
  ">>fra<< Vull veure't."
967
  ]
968
 
 
984
  ```python
985
  from transformers import pipeline
986
  pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-big-itc-itc")
987
+ print(pipe(">>fra<< Charras anglés?"))
988
 
989
  # expected output: Conversations anglaises ?
990
  ```
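
This checkpoint is a many-to-many model, so the target language is selected entirely by the `>>xxx<<` prefix on each source sentence. As a minimal sketch (the target codes below, `fra`, `spa`, `por` and `ita`, are taken from the evaluation entries above; any other code should be checked against the supported language list first), the same pipeline object can be reused to translate one sentence into several target languages:

```python
from transformers import pipeline

# Reuse a single translation pipeline for several target languages.
# The ">>xxx<<" prefix tells the multilingual model which language to produce.
pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-big-itc-itc")

sentence = "Vull veure't."  # Catalan source: "I want to see you."
for tgt in ("fra", "spa", "por", "ita"):  # codes assumed valid for this model
    result = pipe(f">>{tgt}<< {sentence}")
    print(tgt, "->", result[0]["translation_text"])
```

A prefix the model was not trained with will not raise an error, so unexpected output is usually a sign of an unsupported target code.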
 
1152
 
1153
  ## Citation Information
1154
 
1155
+ * Publications: [OPUS-MT – Building open translation services for the World](https://aclanthology.org/2020.eamt-1.61/) and [The Tatoeba Translation Challenge – Realistic Data Sets for Low Resource and Multilingual MT](https://aclanthology.org/2020.wmt-1.139/) (Please cite them if you use this model.)
1156
 
1157
  ```
1158
  @inproceedings{tiedemann-thottingal-2020-opus,
 
1182
 
1183
  ## Acknowledgements
1184
 
1185
+ The work is supported by the [European Language Grid](https://www.european-language-grid.eu/) as [pilot project 2866](https://live.european-language-grid.eu/catalogue/#/resource/projects/2866), by the [FoTran project](https://www.helsinki.fi/en/researchgroups/natural-language-understanding-with-cross-lingual-grounding), funded by the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement No 771113), and by the [MeMAD project](https://memad.eu/), funded by the European Union's Horizon 2020 Research and Innovation Programme under grant agreement No 780069. We are also grateful for the generous computational resources and IT infrastructure provided by [CSC -- IT Center for Science](https://www.csc.fi/), Finland.
1186
 
1187
  ## Model conversion info
1188