guocheng98 committed
Commit 5cd8050
Parent: 7b6ccdc

Update README.md

Files changed (1)
README.md +5 -14
README.md CHANGED
@@ -11,17 +11,11 @@ license: apache-2.0
 
 # HelsinkiNLP-FineTuned-Legal-es-zh
 
-This model is a fine-tuned version of [Helsinki-NLP/opus-tatoeba-es-zh](https://huggingface.co/Helsinki-NLP/opus-tatoeba-es-zh) on a dataset constructed by the author himself.
-It achieves the following results on the evaluation set:
-- Loss: 2.0905
-
-## Model description
-
-Transformer-based NMT model to translate from Spanish to Simplified Chinese, fine-tuned for legal domain.
+This model is a fine-tuned version of [Helsinki-NLP/opus-tatoeba-es-zh](https://huggingface.co/Helsinki-NLP/opus-tatoeba-es-zh) on a legal-domain dataset constructed by the author himself.
 
 ## Intended uses & limitations
 
-This model is the result of the master graduation thesis for the Tradumatics: Translation Technologies program at the Autonomous University of Barcelona.
+This model is the result of the author's master's thesis for the Tradumatics: Translation Technologies program at the Autonomous University of Barcelona. Please refer to the GitHub repo created for this thesis for the full text and related open-source materials: https://github.com/guocheng98/MUTTT2020_TFM_ZGC
 
 The thesis intends to explain various theories and certain algorithm details about neural machine translation, thus this fine-tuned model only serves as a hands-on practice example for that objective, without any intention of productive usage.
 
@@ -31,9 +25,7 @@ The dataset is constructed from the Chinese translation of Spanish Civil Code, S
 
 There are 9972 sentence pairs constructed. 1000 are used for evaluation and the rest for training.
 
-## Training procedure
-
-### Training hyperparameters
+## Training hyperparameters
 
 The following hyperparameters were used during training:
 - learning_rate: 2e-05
@@ -48,7 +40,7 @@ The following hyperparameters were used during training:
 - weight_decay: 0.01
 - early_stopping_patience: 8
 
-### Training results
+## Training results
 
 | Training Loss | Epoch | Step | Validation Loss |
 |:-------------:|:-----:|:----:|:---------------:|
@@ -75,8 +67,7 @@ The following hyperparameters were used during training:
 | 1.1238 | 7.49 | 8400 | 2.1102 |
 | 1.1417 | 7.84 | 8800 | 2.1078 |
 
-
-### Framework versions
+## Framework versions
 
 - Transformers 4.7.0
 - Pytorch 1.8.1+cu101
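As a usage note: the fine-tuned checkpoint is a standard Marian seq2seq model, so it loads through the usual `transformers` API. Below is a minimal sketch, assuming the weights are hosted under the hypothetical repo id `guocheng98/HelsinkiNLP-FineTuned-Legal-es-zh` and that `sentencepiece` is installed alongside `transformers`:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Hypothetical repo id -- replace with wherever this checkpoint is actually hosted.
model_id = "guocheng98/HelsinkiNLP-FineTuned-Legal-es-zh"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Illustrative Spanish legal sentence (not taken from the dataset).
text = "El contrato se perfecciona por el mero consentimiento."
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```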
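The hyperparameter list above maps onto `Seq2SeqTrainingArguments` plus an `EarlyStoppingCallback` in the Transformers 4.7 API that the card pins. A minimal sketch using only the values visible in this diff; the output path is hypothetical, and the 400-step evaluation interval is an assumption inferred from the step spacing of the results table:

```python
from transformers import EarlyStoppingCallback, Seq2SeqTrainingArguments

args = Seq2SeqTrainingArguments(
    output_dir="finetuned-legal-es-zh",  # hypothetical output path
    learning_rate=2e-5,                  # from the list above
    weight_decay=0.01,                   # from the list above
    evaluation_strategy="steps",
    eval_steps=400,   # assumption: matches the 400-step spacing of the results table
    save_steps=400,   # keep checkpointing aligned with evaluation
    load_best_model_at_end=True,         # lets early stopping restore the best checkpoint
    metric_for_best_model="eval_loss",
)

# Early stopping with the patience listed above; passed to Seq2SeqTrainer via callbacks=[...].
early_stopping = EarlyStoppingCallback(early_stopping_patience=8)
```

A `Seq2SeqTrainer` built with these arguments, the callback, and the tokenized sentence pairs would reproduce the setup described here; the hyperparameters elided from the diff (batch size, scheduler, epochs) are not guessed at.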
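For reproducibility, a matching environment can be pinned with, e.g., `pip install transformers==4.7.0` and `pip install torch==1.8.1+cu101 -f https://download.pytorch.org/whl/torch_stable.html`; the extra wheel index is needed because CUDA-specific PyTorch builds of that vintage are not on PyPI.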