mekjr1 committed on
Commit 1bcf70c
1 Parent(s): 0771eba

update model card README.md

Files changed (1)
  1. README.md +16 -106
README.md CHANGED
@@ -17,9 +17,9 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-es](https://huggingface.co/Helsinki-NLP/opus-mt-en-es) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 1.8177
- - Bleu: 10.6846
- - Gen Len: 80.192
+ - Loss: 1.9026
+ - Bleu: 6.2542
+ - Gen Len: 85.8168
 
  ## Model description
 
@@ -44,112 +44,22 @@ The following hyperparameters were used during training:
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 100
+ - num_epochs: 10
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
- |:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|
- | No log | 1.0 | 194 | 2.4716 | 2.7812 | 104.418 |
- | No log | 2.0 | 388 | 2.2062 | 4.7912 | 90.8804 |
- | 2.5909 | 3.0 | 582 | 2.0808 | 4.7359 | 89.7445 |
- | 2.5909 | 4.0 | 776 | 1.9838 | 5.4938 | 90.2541 |
- | 2.5909 | 5.0 | 970 | 1.9263 | 5.9674 | 88.1477 |
- | 1.8846 | 6.0 | 1164 | 1.8758 | 6.5445 | 85.1374 |
- | 1.8846 | 7.0 | 1358 | 1.8382 | 6.7454 | 85.5761 |
- | 1.6936 | 8.0 | 1552 | 1.8062 | 7.2213 | 84.6795 |
- | 1.6936 | 9.0 | 1746 | 1.7869 | 7.5251 | 82.096 |
- | 1.6936 | 10.0 | 1940 | 1.7595 | 7.6866 | 80.4313 |
- | 1.5678 | 11.0 | 2134 | 1.7422 | 7.8867 | 82.4653 |
- | 1.5678 | 12.0 | 2328 | 1.7329 | 7.7104 | 82.7578 |
- | 1.4739 | 13.0 | 2522 | 1.7160 | 8.2123 | 82.0177 |
- | 1.4739 | 14.0 | 2716 | 1.6995 | 8.4257 | 82.0635 |
- | 1.4739 | 15.0 | 2910 | 1.6958 | 8.4004 | 81.777 |
- | 1.3942 | 16.0 | 3104 | 1.6836 | 8.7362 | 82.8154 |
- | 1.3942 | 17.0 | 3298 | 1.6832 | 8.7795 | 83.514 |
- | 1.3942 | 18.0 | 3492 | 1.6652 | 9.0184 | 81.6455 |
- | 1.333 | 19.0 | 3686 | 1.6649 | 8.8325 | 82.8301 |
- | 1.333 | 20.0 | 3880 | 1.6597 | 9.6337 | 78.9956 |
- | 1.2724 | 21.0 | 4074 | 1.6613 | 9.092 | 82.9749 |
- | 1.2724 | 22.0 | 4268 | 1.6594 | 9.4435 | 80.8582 |
- | 1.2724 | 23.0 | 4462 | 1.6551 | 9.6593 | 79.1374 |
- | 1.2217 | 24.0 | 4656 | 1.6556 | 9.462 | 82.1108 |
- | 1.2217 | 25.0 | 4850 | 1.6547 | 9.9703 | 79.5554 |
- | 1.1765 | 26.0 | 5044 | 1.6479 | 9.9437 | 79.4047 |
- | 1.1765 | 27.0 | 5238 | 1.6452 | 9.9039 | 79.3309 |
- | 1.1765 | 28.0 | 5432 | 1.6468 | 10.0735 | 79.6987 |
- | 1.1318 | 29.0 | 5626 | 1.6581 | 9.6459 | 81.4948 |
- | 1.1318 | 30.0 | 5820 | 1.6525 | 10.0369 | 79.5185 |
- | 1.0966 | 31.0 | 6014 | 1.6599 | 9.8151 | 80.6662 |
- | 1.0966 | 32.0 | 6208 | 1.6610 | 9.7488 | 81.452 |
- | 1.0966 | 33.0 | 6402 | 1.6562 | 10.1888 | 79.5126 |
- | 1.0591 | 34.0 | 6596 | 1.6592 | 10.225 | 79.7297 |
- | 1.0591 | 35.0 | 6790 | 1.6561 | 10.0509 | 80.4018 |
- | 1.0591 | 36.0 | 6984 | 1.6640 | 10.3803 | 79.5672 |
- | 1.0272 | 37.0 | 7178 | 1.6624 | 10.1816 | 79.7873 |
- | 1.0272 | 38.0 | 7372 | 1.6586 | 10.2491 | 80.0768 |
- | 0.9947 | 39.0 | 7566 | 1.6655 | 10.4109 | 80.5391 |
- | 0.9947 | 40.0 | 7760 | 1.6641 | 10.5554 | 80.1418 |
- | 0.9947 | 41.0 | 7954 | 1.6703 | 10.3092 | 80.5716 |
- | 0.9679 | 42.0 | 8148 | 1.6731 | 10.4295 | 79.5303 |
- | 0.9679 | 43.0 | 8342 | 1.6778 | 10.8462 | 78.7843 |
- | 0.9401 | 44.0 | 8536 | 1.6781 | 10.5304 | 79.1019 |
- | 0.9401 | 45.0 | 8730 | 1.6861 | 10.7039 | 80.0399 |
- | 0.9401 | 46.0 | 8924 | 1.6854 | 10.8367 | 79.4801 |
- | 0.9144 | 47.0 | 9118 | 1.6932 | 10.5007 | 79.9749 |
- | 0.9144 | 48.0 | 9312 | 1.6904 | 10.5349 | 79.26 |
- | 0.8919 | 49.0 | 9506 | 1.7008 | 10.2924 | 80.3722 |
- | 0.8919 | 50.0 | 9700 | 1.6994 | 10.7679 | 80.2304 |
- | 0.8919 | 51.0 | 9894 | 1.7049 | 10.8375 | 78.3914 |
- | 0.8677 | 52.0 | 10088 | 1.7083 | 10.7612 | 79.1315 |
- | 0.8677 | 53.0 | 10282 | 1.7119 | 10.9224 | 79.6381 |
- | 0.8677 | 54.0 | 10476 | 1.7194 | 10.5454 | 79.7001 |
- | 0.8497 | 55.0 | 10670 | 1.7252 | 10.5037 | 80.5731 |
- | 0.8497 | 56.0 | 10864 | 1.7241 | 10.5646 | 79.7179 |
- | 0.8278 | 57.0 | 11058 | 1.7282 | 10.3493 | 80.5096 |
- | 0.8278 | 58.0 | 11252 | 1.7302 | 10.6999 | 80.031 |
- | 0.8278 | 59.0 | 11446 | 1.7241 | 10.8572 | 79.1078 |
- | 0.8116 | 60.0 | 11640 | 1.7355 | 10.8868 | 80.322 |
- | 0.8116 | 61.0 | 11834 | 1.7386 | 10.7791 | 80.325 |
- | 0.7945 | 62.0 | 12028 | 1.7487 | 10.4076 | 80.5406 |
- | 0.7945 | 63.0 | 12222 | 1.7534 | 10.7947 | 80.0414 |
- | 0.7945 | 64.0 | 12416 | 1.7494 | 10.6789 | 80.0916 |
- | 0.7776 | 65.0 | 12610 | 1.7529 | 10.6775 | 80.2201 |
- | 0.7776 | 66.0 | 12804 | 1.7537 | 10.5656 | 79.195 |
- | 0.7776 | 67.0 | 12998 | 1.7557 | 10.7947 | 80.1123 |
- | 0.7646 | 68.0 | 13192 | 1.7620 | 10.727 | 81.2127 |
- | 0.7646 | 69.0 | 13386 | 1.7681 | 10.6704 | 80.1462 |
- | 0.7512 | 70.0 | 13580 | 1.7704 | 10.6575 | 80.904 |
- | 0.7512 | 71.0 | 13774 | 1.7733 | 10.583 | 80.4845 |
- | 0.7512 | 72.0 | 13968 | 1.7748 | 10.503 | 80.6617 |
- | 0.7381 | 73.0 | 14162 | 1.7707 | 10.6811 | 80.0694 |
- | 0.7381 | 74.0 | 14356 | 1.7866 | 10.4964 | 80.7489 |
- | 0.7265 | 75.0 | 14550 | 1.7888 | 10.7888 | 80.2038 |
- | 0.7265 | 76.0 | 14744 | 1.7900 | 10.5229 | 80.5569 |
- | 0.7265 | 77.0 | 14938 | 1.7871 | 10.7917 | 79.7164 |
- | 0.7171 | 78.0 | 15132 | 1.7901 | 10.6404 | 80.4535 |
- | 0.7171 | 79.0 | 15326 | 1.7986 | 10.6813 | 81.0207 |
- | 0.7088 | 80.0 | 15520 | 1.7975 | 10.7134 | 80.3693 |
- | 0.7088 | 81.0 | 15714 | 1.7939 | 10.7821 | 79.9823 |
- | 0.7088 | 82.0 | 15908 | 1.7988 | 10.7032 | 80.288 |
- | 0.6977 | 83.0 | 16102 | 1.8014 | 10.6628 | 80.6662 |
- | 0.6977 | 84.0 | 16296 | 1.8063 | 10.8345 | 80.7829 |
- | 0.6977 | 85.0 | 16490 | 1.8100 | 10.5796 | 80.3929 |
- | 0.6916 | 86.0 | 16684 | 1.8074 | 10.5306 | 79.8597 |
- | 0.6916 | 87.0 | 16878 | 1.8101 | 10.6093 | 80.4328 |
- | 0.6851 | 88.0 | 17072 | 1.8076 | 10.7705 | 80.164 |
- | 0.6851 | 89.0 | 17266 | 1.8116 | 10.7186 | 80.3146 |
- | 0.6851 | 90.0 | 17460 | 1.8122 | 10.6272 | 80.3929 |
- | 0.6819 | 91.0 | 17654 | 1.8124 | 10.7119 | 80.4254 |
- | 0.6819 | 92.0 | 17848 | 1.8139 | 10.6502 | 80.1034 |
- | 0.6729 | 93.0 | 18042 | 1.8133 | 10.4559 | 80.4047 |
- | 0.6729 | 94.0 | 18236 | 1.8156 | 10.6593 | 80.4874 |
- | 0.6729 | 95.0 | 18430 | 1.8175 | 10.5293 | 80.4549 |
- | 0.6731 | 96.0 | 18624 | 1.8152 | 10.7104 | 80.1905 |
- | 0.6731 | 97.0 | 18818 | 1.8176 | 10.6327 | 80.1078 |
- | 0.6686 | 98.0 | 19012 | 1.8177 | 10.7107 | 80.1019 |
- | 0.6686 | 99.0 | 19206 | 1.8185 | 10.6076 | 80.3516 |
- | 0.6686 | 100.0 | 19400 | 1.8177 | 10.6846 | 80.192 |
+ | Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
+ |:-------------:|:-----:|:----:|:---------------:|:------:|:--------:|
+ | No log | 1.0 | 194 | 2.4894 | 2.7357 | 106.7282 |
+ | No log | 2.0 | 388 | 2.2333 | 4.5116 | 90.6795 |
+ | 2.6136 | 3.0 | 582 | 2.1176 | 4.7062 | 91.1743 |
+ | 2.6136 | 4.0 | 776 | 2.0314 | 5.3422 | 89.0635 |
+ | 2.6136 | 5.0 | 970 | 1.9925 | 5.5413 | 90.2363 |
+ | 1.9444 | 6.0 | 1164 | 1.9523 | 6.1301 | 84.5288 |
+ | 1.9444 | 7.0 | 1358 | 1.9292 | 5.8244 | 88.6617 |
+ | 1.8038 | 8.0 | 1552 | 1.9137 | 6.2976 | 88.0 |
+ | 1.8038 | 9.0 | 1746 | 1.9062 | 6.2889 | 85.0679 |
+ | 1.8038 | 10.0 | 1940 | 1.9026 | 6.2542 | 85.8168 |
 
 
  ### Framework versions
 
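For readers reproducing this setup, the hyperparameters recorded in the card map onto the `transformers` Seq2Seq training API roughly as sketched below. This is a minimal sketch: only the seed, Adam betas and epsilon, scheduler type, and epoch count come from the card; the learning rate, batch sizes, and output path are not shown in this diff, so those values are placeholders.

```python
# Minimal sketch of the training configuration described in the card.
# Placeholder values are marked; everything else is taken from the card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="opus-mt-en-es-finetuned",  # hypothetical output path
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    learning_rate=2e-5,           # placeholder; not recorded in this diff
    evaluation_strategy="epoch",  # the table reports metrics once per epoch
    predict_with_generate=True,   # needed to compute Bleu and Gen Len at eval time
)
```

For inference, a standard `transformers` pipeline call should work once the fine-tuned weights are published. The repo id below is hypothetical, since the actual repository name is not shown on this page.

```python
from transformers import pipeline

# Hypothetical repo id; substitute the actual model repository.
translator = pipeline("translation", model="mekjr1/opus-mt-en-es-finetuned")
print(translator("The weather is nice today.")[0]["translation_text"])
```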