imdbo committed
Commit aaf356b
1 Parent(s): b761ab1

Update README_English.md

Files changed (1): README_English.md (+14 -11)
@@ -13,40 +13,43 @@ metrics:
  License: MIT
  ---

- **Model Description**

- OpenNMT model for English-Galician using a transformer architecture.

  **How to translate**

  + Open a bash terminal
- + Install [Python 3.9](https://www.python.org/downloads/release/python-390/)
  + Install the [OpenNMT toolkit v.2.2](https://github.com/OpenNMT/OpenNMT-py)
  + Translate an input_text using the NOS-MT-gl-es model with the following command:

  ```bash
- onmt_translate -src input_text -model NOS-MT-gl-es.pt -output ./output_file.txt -replace_unk -gpu 0
  ```
- + The result of the translation will be in the PATH indicated by the -output flag.

  **Training**

- In the training we have used authentic and synthetic corpora from [ProxectoNós](https://github.com/proxectonos/corpora). The former are corpora of translations directly produced by human translators. The latter are corpora of English-Portuguese translations, which we have converted into English-Galician by means of Portuguese-Galician translation with Opentrad/Apertium and transliteration for out-of-vocabulary words.

  **Training process**

- + Tokenization of the datasets made with linguakit tokeniser https://github.com/citiususc/Linguakit
- + The vocabulary for the models was generated through the script [learn_bpe.py](https://github.com/OpenNMT/OpenNMT-py/blob/master/tools/learn_bpe.py) of OpenNMT
- + Using .yaml in this repository you can replicate the training process as follows

  ```bash
  onmt_build_vocab -config bpe-gl-es_emb.yaml -n_sample 100000
  onmt_train -config bpe-gl-es_emb.yaml
  ```

- **Hyper-parameters**

- The parameters used for the development of the model can be directly consulted in the same .yaml file bpe-en-gl_emb.yaml

  **Evaluation**
 
  License: MIT
  ---

+ **Model description**

+ Model developed with OpenNMT for the Galician-Spanish pair, using the transformer architecture.

  **How to translate**

  + Open a bash terminal
+ + Install [Python 3.9](https://www.python.org/downloads/release/python-390/)
  + Install the [OpenNMT toolkit v.2.2](https://github.com/OpenNMT/OpenNMT-py)
  + Translate an input_text using the NOS-MT-gl-es model with the following command:

  ```bash
+ onmt_translate -src input_text -model NOS-MT-gl-es.pt -output ./output_file.txt -replace_unk -phrase_table phrase_table-gl-es.txt -gpu 0
  ```
+ + The resulting translation will be written to the path given by the -output flag.
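
As a compact illustration of the steps above, here is one possible setup and a CPU-only run. The virtual environment, the pinned pip version (an assumption based on the v.2.2 requirement), and the file names are illustrative, not part of this repository:

```bash
# Hypothetical walk-through of the installation and translation steps;
# file names are placeholders and the version pin is an assumption.
python3.9 -m venv nos-mt && source nos-mt/bin/activate
pip install OpenNMT-py==2.2.0

# Without the -gpu flag, onmt_translate runs on CPU (slower, no CUDA needed).
onmt_translate -src input_text.txt -model NOS-MT-gl-es.pt \
    -output ./output_file.txt -replace_unk
cat ./output_file.txt
```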

  **Training**

+ To train this model, we used authentic and synthetic corpora from [ProxectoNós](https://github.com/proxectonos/corpora).
+
+ Authentic corpora are translations produced directly by human translators. Synthetic corpora are Spanish-Portuguese corpora which we converted into Spanish-Galician by translating the Portuguese side into Galician with Opentrad/Apertium and transliterating out-of-vocabulary words; an illustrative conversion step is sketched below.
+
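
A minimal sketch of that conversion, assuming the Apertium pt-gl language pair is installed locally and using placeholder corpus names:

```bash
# Hypothetical conversion of the Portuguese side of a Spanish-Portuguese
# corpus into Galician, yielding synthetic Spanish-Galician sentence pairs.
# Assumes the apertium pt-gl pair is installed; file names are placeholders.
apertium -u pt-gl corpus.pt corpus.gl    # -u: do not mark unknown words
paste corpus.es corpus.gl > synthetic.es-gl.tsv
```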

  **Training process**

+ + Tokenisation was performed with a modified version of the [linguakit](https://github.com/citiususc/Linguakit) tokeniser (tokenizer.pl) that does not append a newline after each token; an illustrative invocation follows the commands below.
+ + All BPE models were generated with the OpenNMT script [learn_bpe.py](https://github.com/OpenNMT/OpenNMT-py/blob/master/tools/learn_bpe.py).
+ + Using the .yaml file in this repository, you can replicate the original training process. Before training, verify that the paths to the source (src) and target (tgt) files are correct. Once this is done, proceed as follows:

  ```bash
  onmt_build_vocab -config bpe-gl-es_emb.yaml -n_sample 100000
  onmt_train -config bpe-gl-es_emb.yaml
  ```
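
A rough sketch of those two preprocessing steps, assuming tokenizer.pl reads plain text on stdin like the stock Linguakit tokeniser; the file names and the merge count are placeholders, not values taken from this repository:

```bash
# Hypothetical invocations; the tokeniser call and the 32000 merge count
# are assumptions.
perl tokenizer.pl < train.gl > train.tok.gl

# learn_bpe.py follows the subword-nmt interface: corpus on stdin,
# learned BPE merge operations on stdout.
python learn_bpe.py -s 32000 < train.tok.gl > bpe.gl.codes
```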

+ **Hyperparameters**

+ The parameters used for this model can be consulted directly in the file bpe-gl-es_emb.yaml.
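
For orientation, here is a purely illustrative skeleton of how an OpenNMT-py v2 config such as bpe-gl-es_emb.yaml declares the corpus paths checked above; every path below is a placeholder, and the actual file in this repository remains the authoritative reference:

```yaml
# Illustrative OpenNMT-py v2 config skeleton; all paths are placeholders.
save_data: run/gl-es
src_vocab: run/gl-es.vocab.src
tgt_vocab: run/gl-es.vocab.tgt

data:
    corpus_1:
        path_src: corpora/train.gl    # source (src) side: Galician
        path_tgt: corpora/train.es    # target (tgt) side: Spanish
    valid:
        path_src: corpora/dev.gl
        path_tgt: corpora/dev.es
```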

  **Evaluation**