---
language: en
tags:
- translation

license: apache-2.0
---

### eng-itc

* source group: English
* target group: Italic languages
* OPUS readme: [eng-itc](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/eng-itc/README.md)

* model: transformer
* source language(s): eng
* target language(s): arg ast cat cos egl ext fra frm_Latn gcf_Latn glg hat ind ita lad lad_Latn lat_Latn lij lld_Latn lmo max_Latn mfe min mwl oci pap pms por roh ron scn spa tmw_Latn vec wln zlm_Latn zsm_Latn
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* a sentence-initial language token is required in the form `>>id<<` (id = valid target-language ID)
* download original weights: [opus2m-2020-08-01.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-itc/opus2m-2020-08-01.zip)
* test set translations: [opus2m-2020-08-01.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-itc/opus2m-2020-08-01.test.txt)
* test set scores: [opus2m-2020-08-01.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-itc/opus2m-2020-08-01.eval.txt)

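The required `>>id<<` token is plain text prepended to the source sentence before tokenization. A minimal sketch of the prefixing step (`add_lang_token` is a hypothetical helper, not part of the released model; with `transformers`, the prefixed string is passed to the Marian tokenizer as-is):

```python
def add_lang_token(text: str, lang_id: str) -> str:
    """Prefix a source sentence with a Marian target-language token,
    e.g. >>ita<< for Italian or >>fra<< for French."""
    return f">>{lang_id}<< {text}"

# Select Italian as the target language for this English input.
print(add_lang_token("The weather is nice today.", "ita"))
```

The prefixed string is what the SentencePiece-based tokenizer sees; no other per-language configuration is needed at inference time.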
## Benchmarks

| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| newsdev2016-enro-engron.eng.ron | 27.1 | 0.565 |
| newsdiscussdev2015-enfr-engfra.eng.fra | 29.9 | 0.574 |
| newsdiscusstest2015-enfr-engfra.eng.fra | 35.3 | 0.609 |
| newssyscomb2009-engfra.eng.fra | 27.7 | 0.567 |
| newssyscomb2009-engita.eng.ita | 28.6 | 0.586 |
| newssyscomb2009-engspa.eng.spa | 29.8 | 0.569 |
| news-test2008-engfra.eng.fra | 25.0 | 0.536 |
| news-test2008-engspa.eng.spa | 27.1 | 0.548 |
| newstest2009-engfra.eng.fra | 26.7 | 0.557 |
| newstest2009-engita.eng.ita | 28.9 | 0.583 |
| newstest2009-engspa.eng.spa | 28.9 | 0.567 |
| newstest2010-engfra.eng.fra | 29.6 | 0.574 |
| newstest2010-engspa.eng.spa | 33.8 | 0.598 |
| newstest2011-engfra.eng.fra | 30.9 | 0.590 |
| newstest2011-engspa.eng.spa | 34.8 | 0.598 |
| newstest2012-engfra.eng.fra | 29.1 | 0.574 |
| newstest2012-engspa.eng.spa | 34.9 | 0.600 |
| newstest2013-engfra.eng.fra | 30.1 | 0.567 |
| newstest2013-engspa.eng.spa | 31.8 | 0.576 |
| newstest2016-enro-engron.eng.ron | 25.9 | 0.548 |
| Tatoeba-test.eng-arg.eng.arg | 1.6 | 0.120 |
| Tatoeba-test.eng-ast.eng.ast | 17.2 | 0.389 |
| Tatoeba-test.eng-cat.eng.cat | 47.6 | 0.668 |
| Tatoeba-test.eng-cos.eng.cos | 4.3 | 0.287 |
| Tatoeba-test.eng-egl.eng.egl | 0.9 | 0.101 |
| Tatoeba-test.eng-ext.eng.ext | 8.7 | 0.287 |
| Tatoeba-test.eng-fra.eng.fra | 44.9 | 0.635 |
| Tatoeba-test.eng-frm.eng.frm | 1.0 | 0.225 |
| Tatoeba-test.eng-gcf.eng.gcf | 0.7 | 0.115 |
| Tatoeba-test.eng-glg.eng.glg | 44.9 | 0.648 |
| Tatoeba-test.eng-hat.eng.hat | 30.9 | 0.533 |
| Tatoeba-test.eng-ita.eng.ita | 45.4 | 0.673 |
| Tatoeba-test.eng-lad.eng.lad | 5.6 | 0.279 |
| Tatoeba-test.eng-lat.eng.lat | 12.1 | 0.380 |
| Tatoeba-test.eng-lij.eng.lij | 1.4 | 0.183 |
| Tatoeba-test.eng-lld.eng.lld | 0.5 | 0.199 |
| Tatoeba-test.eng-lmo.eng.lmo | 0.7 | 0.187 |
| Tatoeba-test.eng-mfe.eng.mfe | 83.6 | 0.909 |
| Tatoeba-test.eng-msa.eng.msa | 31.3 | 0.549 |
| Tatoeba-test.eng.multi | 38.0 | 0.588 |
| Tatoeba-test.eng-mwl.eng.mwl | 2.7 | 0.322 |
| Tatoeba-test.eng-oci.eng.oci | 8.2 | 0.293 |
| Tatoeba-test.eng-pap.eng.pap | 46.7 | 0.663 |
| Tatoeba-test.eng-pms.eng.pms | 2.1 | 0.194 |
| Tatoeba-test.eng-por.eng.por | 41.2 | 0.635 |
| Tatoeba-test.eng-roh.eng.roh | 2.6 | 0.237 |
| Tatoeba-test.eng-ron.eng.ron | 40.6 | 0.632 |
| Tatoeba-test.eng-scn.eng.scn | 1.6 | 0.181 |
| Tatoeba-test.eng-spa.eng.spa | 49.5 | 0.685 |
| Tatoeba-test.eng-vec.eng.vec | 1.6 | 0.223 |
| Tatoeba-test.eng-wln.eng.wln | 7.1 | 0.250 |

### System Info:
- hf_name: eng-itc
- source_languages: eng
- target_languages: itc
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/eng-itc/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- prepro: normalization + SentencePiece (spm32k,spm32k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/eng-itc/opus2m-2020-08-01.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/eng-itc/opus2m-2020-08-01.test.txt
- src_alpha3: eng
- tgt_alpha3: itc
- short_pair: en-itc
- chrF2_score: 0.588
- bleu: 38.0
- brevity_penalty: 0.967
- ref_len: 73951.0
- src_name: English
- tgt_name: Italic languages
- train_date: 2020-08-01
- src_alpha2: en
- tgt_alpha2: itc
- prefer_old: False
- long_pair: eng-itc
- helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535
- transformers_git_sha: 46e9f53347bbe9e989f0335f98465f30886d8173
- port_machine: brutasse
- port_time: 2020-08-18-01:48
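The `brevity_penalty` and `ref_len` fields follow BLEU's standard definition: the penalty is 1 when the hypothesis corpus is at least as long as the reference, and exp(1 - r/c) otherwise (r = reference length, c = hypothesis length). A minimal sketch of that formula:

```python
import math

def brevity_penalty(ref_len: float, hyp_len: float) -> float:
    """Standard BLEU brevity penalty: penalizes hypothesis corpora
    shorter than the reference; never exceeds 1."""
    if hyp_len >= ref_len:
        return 1.0
    return math.exp(1.0 - ref_len / hyp_len)

# A penalty below 1 (0.967 reported above) means the system's output
# was, in aggregate, slightly shorter than the reference corpus.
print(brevity_penalty(100.0, 95.0))
```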