---
language:
- zh
- ms

tags:
- translation

license: apache-2.0
---

### zho-msa

* source group: Chinese
* target group: Malay (macrolanguage)
* OPUS readme: [zho-msa](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/zho-msa/README.md)

* model: transformer-align
* source language(s): cmn_Bopo cmn_Hani cmn_Latn hak_Hani yue_Bopo yue_Hani
* target language(s): ind zsm_Latn
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* a sentence-initial language token is required in the form `>>id<<` (id = a valid target language ID; for this model, `>>ind<<` or `>>zsm_Latn<<`); see the usage sketch below
* download original weights: [opus-2020-06-17.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/zho-msa/opus-2020-06-17.zip)
* test set translations: [opus-2020-06-17.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/zho-msa/opus-2020-06-17.test.txt)
* test set scores: [opus-2020-06-17.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/zho-msa/opus-2020-06-17.eval.txt)
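
The release covers two target varieties (`ind`, `zsm_Latn`), so every input must carry the corresponding sentence-initial token. Below is a minimal usage sketch with Hugging Face Transformers, assuming the converted checkpoint is published on the Hub as `Helsinki-NLP/opus-mt-zh-ms` (inferred from the `short_pair` field in the System Info below; verify the exact model ID):

```python
# Minimal usage sketch; the Hub ID is inferred from short_pair "zh-ms" and should be verified.
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-zh-ms"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# The sentence-initial token selects the target variety:
# ">>zsm_Latn<<" for Standard Malay, ">>ind<<" for Indonesian.
src_texts = [">>zsm_Latn<< 你好，世界。"]

batch = tokenizer(src_texts, return_tensors="pt", padding=True)
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```

SentencePiece segmentation (spm32k) is handled inside the tokenizer, so raw text plus the language token is all that is needed.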

## Benchmarks

| testset               | BLEU | chr-F |
|-----------------------|------|-------|
| Tatoeba-test.zho.msa  | 13.9 | 0.390 |
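
These scores come from the released evaluation files linked above. A sketch of how to recompute them with `sacrebleu`, assuming the reference and hypothesis sides of `opus-2020-06-17.test.txt` have first been split into the (hypothetical) files `ref.msa.txt` and `hyp.msa.txt`:

```python
# Recompute BLEU / chr-F with sacrebleu (pip install sacrebleu).
# The file names are placeholders; the released .test.txt bundles source,
# reference and hypothesis, and must be split into per-side files first.
import sacrebleu

with open("hyp.msa.txt", encoding="utf-8") as f:
    hypotheses = [line.rstrip("\n") for line in f]
with open("ref.msa.txt", encoding="utf-8") as f:
    references = [line.rstrip("\n") for line in f]

bleu = sacrebleu.corpus_bleu(hypotheses, [references])
chrf = sacrebleu.corpus_chrf(hypotheses, [references])
print(f"BLEU  = {bleu.score:.1f}")  # reported above as 13.9
print(f"chr-F = {chrf.score:.3f}")  # reported above as 0.390 (0-1 scale)
```

Note that recent `sacrebleu` releases print chrF on a 0-100 scale, so the value may need dividing by 100 to match the 0.390 in the table.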


### System Info:
- hf_name: zho-msa
- source_languages: zho
- target_languages: msa
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/zho-msa/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- languages: ['zh', 'ms']
- src_constituents: {'cmn_Hans', 'nan', 'nan_Hani', 'gan', 'yue', 'cmn_Kana', 'yue_Hani', 'wuu_Bopo', 'cmn_Latn', 'yue_Hira', 'cmn_Hani', 'cjy_Hans', 'cmn', 'lzh_Hang', 'lzh_Hira', 'cmn_Hant', 'lzh_Bopo', 'zho', 'zho_Hans', 'zho_Hant', 'lzh_Hani', 'yue_Hang', 'wuu', 'yue_Kana', 'wuu_Latn', 'yue_Bopo', 'cjy_Hant', 'yue_Hans', 'lzh', 'cmn_Hira', 'lzh_Yiii', 'lzh_Hans', 'cmn_Bopo', 'cmn_Hang', 'hak_Hani', 'cmn_Yiii', 'yue_Hant', 'lzh_Kana', 'wuu_Hani'}
- tgt_constituents: {'zsm_Latn', 'ind', 'max_Latn', 'zlm_Latn', 'min'}
- src_multilingual: False
- tgt_multilingual: False
- prepro: normalization + SentencePiece (spm32k,spm32k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/zho-msa/opus-2020-06-17.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/zho-msa/opus-2020-06-17.test.txt
- src_alpha3: zho
- tgt_alpha3: msa
- short_pair: zh-ms
- chrF2_score: 0.39
- bleu: 13.9
- brevity_penalty: 0.923
- ref_len: 2762.0
- src_name: Chinese
- tgt_name: Malay (macrolanguage)
- train_date: 2020-06-17
- src_alpha2: zh
- tgt_alpha2: ms
- prefer_old: False
- long_pair: zho-msa
- helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535
- transformers_git_sha: 6bdf998dffa70030e42f512a586f33a15e648edd
- port_machine: brutasse
- port_time: 2020-08-19-00:09
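
For context, the `brevity_penalty` and `ref_len` fields are tied together by the standard BLEU brevity penalty, BP = exp(1 - ref_len / sys_len) when the system output is shorter than the reference. The system output length is not listed in the card, but a rough, illustrative-only check recovers it:

```python
# Illustrative only: invert the standard BLEU brevity penalty to estimate the
# system output length implied by the fields above (not listed in the card).
import math

bp = 0.923        # brevity_penalty
ref_len = 2762.0  # ref_len

# BP = exp(1 - ref_len / sys_len)  =>  sys_len = ref_len / (1 - ln(BP))
sys_len = ref_len / (1 - math.log(bp))
print(round(sys_len))  # roughly 2557 tokens of system output
```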