---
language: en
tags:
- translation
license: apache-2.0
---

### eng-iir

* source group: English
* target group: Indo-Iranian languages
* OPUS readme: [eng-iir](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/eng-iir/README.md)

* model: transformer
* source language(s): eng
* target language(s): asm awa ben bho gom guj hif_Latn hin jdt_Cyrl kur_Arab kur_Latn mai mar npi ori oss pan_Guru pes pes_Latn pes_Thaa pnb pus rom san_Deva sin snd_Arab tgk_Cyrl tly_Latn urd zza
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* a sentence-initial language token of the form `>>id<<` is required, where `id` is a valid target language ID (see the usage sketch below the list)
* download original weights: [opus2m-2020-08-01.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-iir/opus2m-2020-08-01.zip)
* test set translations: [opus2m-2020-08-01.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-iir/opus2m-2020-08-01.test.txt)
* test set scores: [opus2m-2020-08-01.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-iir/opus2m-2020-08-01.eval.txt)

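A minimal usage sketch with 🤗 Transformers, assuming the model is available on the Hub as `Helsinki-NLP/opus-mt-en-iir` (derived from the `short_pair: en-iir` entry below); `>>hin<<` (Hindi) is used here as an example target-language token, and any ID from the target language list above is selected the same way.

```python
# Usage sketch, assuming the Hub id "Helsinki-NLP/opus-mt-en-iir" for this model.
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-en-iir"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# The sentence-initial >>id<< token selects the target language (here Hindi).
src_texts = [">>hin<< The weather is nice today."]

batch = tokenizer(src_texts, return_tensors="pt", padding=True)
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```

Because the decoder is shared across all target languages, omitting the `>>id<<` prefix leaves the output language unconstrained.
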
## Benchmarks

| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| newsdev2014-enghin.eng.hin | 6.7 | 0.326 |
| newsdev2019-engu-engguj.eng.guj | 6.0 | 0.283 |
| newstest2014-hien-enghin.eng.hin | 10.4 | 0.353 |
| newstest2019-engu-engguj.eng.guj | 6.6 | 0.282 |
| Tatoeba-test.eng-asm.eng.asm | 2.7 | 0.249 |
| Tatoeba-test.eng-awa.eng.awa | 0.4 | 0.122 |
| Tatoeba-test.eng-ben.eng.ben | 15.3 | 0.459 |
| Tatoeba-test.eng-bho.eng.bho | 3.7 | 0.161 |
| Tatoeba-test.eng-fas.eng.fas | 3.4 | 0.227 |
| Tatoeba-test.eng-guj.eng.guj | 18.5 | 0.365 |
| Tatoeba-test.eng-hif.eng.hif | 1.0 | 0.064 |
| Tatoeba-test.eng-hin.eng.hin | 17.0 | 0.461 |
| Tatoeba-test.eng-jdt.eng.jdt | 3.9 | 0.122 |
| Tatoeba-test.eng-kok.eng.kok | 5.5 | 0.059 |
| Tatoeba-test.eng-kur.eng.kur | 4.0 | 0.125 |
| Tatoeba-test.eng-lah.eng.lah | 0.3 | 0.008 |
| Tatoeba-test.eng-mai.eng.mai | 9.3 | 0.445 |
| Tatoeba-test.eng-mar.eng.mar | 20.7 | 0.473 |
| Tatoeba-test.eng.multi | 13.7 | 0.392 |
| Tatoeba-test.eng-nep.eng.nep | 0.6 | 0.060 |
| Tatoeba-test.eng-ori.eng.ori | 2.4 | 0.193 |
| Tatoeba-test.eng-oss.eng.oss | 2.1 | 0.174 |
| Tatoeba-test.eng-pan.eng.pan | 9.7 | 0.355 |
| Tatoeba-test.eng-pus.eng.pus | 1.0 | 0.126 |
| Tatoeba-test.eng-rom.eng.rom | 1.3 | 0.230 |
| Tatoeba-test.eng-san.eng.san | 1.3 | 0.101 |
| Tatoeba-test.eng-sin.eng.sin | 11.7 | 0.384 |
| Tatoeba-test.eng-snd.eng.snd | 2.8 | 0.180 |
| Tatoeba-test.eng-tgk.eng.tgk | 8.1 | 0.353 |
| Tatoeba-test.eng-tly.eng.tly | 0.5 | 0.015 |
| Tatoeba-test.eng-urd.eng.urd | 12.3 | 0.409 |
| Tatoeba-test.eng-zza.eng.zza | 0.5 | 0.025 |

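The aggregate `Tatoeba-test.eng.multi` row (BLEU 13.7, chr-F 0.392) is the figure repeated in the System Info block below. Scores in this style can be recomputed with `sacrebleu`; the sketch below assumes you have already paired the model's outputs with the references from the test set linked above and uses placeholder strings rather than real data. Note that sacrebleu 2.x reports chrF on a 0-100 scale, whereas the table above uses 0-1.

```python
# Scoring sketch with sacrebleu 2.x; the sentence lists are placeholders only.
import sacrebleu

hypotheses = ["model output sentence 1", "model output sentence 2"]   # system translations
references = [["reference sentence 1", "reference sentence 2"]]       # one inner list per reference set

bleu = sacrebleu.corpus_bleu(hypotheses, references)
chrf = sacrebleu.corpus_chrf(hypotheses, references)
print(f"BLEU = {bleu.score:.1f}, chrF = {chrf.score / 100:.3f}")  # /100 to match the 0-1 scale above
```
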
### System Info:
- hf_name: eng-iir
- source_languages: eng
- target_languages: iir
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/eng-iir/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- prepro: normalization + SentencePiece (spm32k,spm32k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/eng-iir/opus2m-2020-08-01.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/eng-iir/opus2m-2020-08-01.test.txt
- src_alpha3: eng
- tgt_alpha3: iir
- short_pair: en-iir
- chrF2_score: 0.392
- bleu: 13.7
- brevity_penalty: 1.0
- ref_len: 63351.0
- src_name: English
- tgt_name: Indo-Iranian languages
- train_date: 2020-08-01
- src_alpha2: en
- tgt_alpha2: iir
- prefer_old: False
- long_pair: eng-iir
- helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535
- transformers_git_sha: 46e9f53347bbe9e989f0335f98465f30886d8173
- port_machine: brutasse
- port_time: 2020-08-18-01:48
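
Note that `iir` above is the aggregate code for the Indo-Iranian target group, not a single language, so the usable `>>id<<` values come from the per-language codes in the target language list. A minimal sketch, assuming the same Hub id as in the usage example above, that lists the valid tokens straight from the tokenizer vocabulary rather than guessing their spelling:

```python
# Sketch: enumerate the >>id<< target-language tokens from the tokenizer vocabulary.
# Assumes the Hub id "Helsinki-NLP/opus-mt-en-iir" as in the usage sketch above.
import re
from transformers import MarianTokenizer

tokenizer = MarianTokenizer.from_pretrained("Helsinki-NLP/opus-mt-en-iir")
lang_tokens = sorted(tok for tok in tokenizer.get_vocab() if re.fullmatch(r">>.+<<", tok))
print(lang_tokens)  # expected to mirror the target language list above, e.g. '>>hin<<', '>>pan_Guru<<'
```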