marciovbarbosa committed on
Commit 81485df
1 Parent(s): 796e369

update model card README.md

Files changed (1):
  1. README.md +17 -57
README.md CHANGED

@@ -19,7 +19,7 @@ model-index:
   metrics:
   - name: Bleu
     type: bleu
-    value: 11.3921
+    value: 9.2166
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -29,9 +29,9 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the wmt16 dataset.
 It achieves the following results on the evaluation set:
-- Loss: 1.8219
-- Bleu: 11.3921
-- Gen Len: 17.2471
+- Loss: 1.9417
+- Bleu: 9.2166
+- Gen Len: 17.3404
 
 ## Model description
 
@@ -56,62 +56,22 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs: 50
+- num_epochs: 10
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
-|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|
-| No log | 1.0 | 272 | 2.1014 | 5.5136 | 17.4975 |
-| 2.5302 | 2.0 | 544 | 2.0258 | 7.4515 | 17.3941 |
-| 2.5302 | 3.0 | 816 | 1.9866 | 8.3061 | 17.3441 |
-| 2.3778 | 4.0 | 1088 | 1.9602 | 8.9169 | 17.3588 |
-| 2.3778 | 5.0 | 1360 | 1.9382 | 9.3651 | 17.3204 |
-| 2.2676 | 6.0 | 1632 | 1.9215 | 9.6428 | 17.3588 |
-| 2.2676 | 7.0 | 1904 | 1.9067 | 9.8039 | 17.3418 |
-| 2.2096 | 8.0 | 2176 | 1.8984 | 9.8545 | 17.3264 |
-| 2.2096 | 9.0 | 2448 | 1.8883 | 10.03 | 17.3278 |
-| 2.1501 | 10.0 | 2720 | 1.8797 | 10.2398 | 17.3358 |
-| 2.1501 | 11.0 | 2992 | 1.8738 | 10.3086 | 17.3258 |
-| 2.1025 | 12.0 | 3264 | 1.8677 | 10.3851 | 17.3181 |
-| 2.0638 | 13.0 | 3536 | 1.8623 | 10.489 | 17.3014 |
-| 2.0638 | 14.0 | 3808 | 1.8574 | 10.4969 | 17.3204 |
-| 2.034 | 15.0 | 4080 | 1.8528 | 10.7067 | 17.3178 |
-| 2.034 | 16.0 | 4352 | 1.8493 | 10.6867 | 17.3408 |
-| 1.9852 | 17.0 | 4624 | 1.8473 | 10.8333 | 17.3198 |
-| 1.9852 | 18.0 | 4896 | 1.8429 | 10.8907 | 17.3001 |
-| 1.9646 | 19.0 | 5168 | 1.8405 | 10.9049 | 17.3154 |
-| 1.9646 | 20.0 | 5440 | 1.8385 | 10.9549 | 17.3124 |
-| 1.9264 | 21.0 | 5712 | 1.8361 | 11.0046 | 17.3068 |
-| 1.9264 | 22.0 | 5984 | 1.8338 | 11.1415 | 17.2954 |
-| 1.9161 | 23.0 | 6256 | 1.8333 | 11.1041 | 17.2938 |
-| 1.882 | 24.0 | 6528 | 1.8323 | 11.0801 | 17.2651 |
-| 1.882 | 25.0 | 6800 | 1.8309 | 11.157 | 17.2921 |
-| 1.8751 | 26.0 | 7072 | 1.8290 | 11.1713 | 17.2951 |
-| 1.8751 | 27.0 | 7344 | 1.8279 | 11.2006 | 17.2861 |
-| 1.8425 | 28.0 | 7616 | 1.8267 | 11.1761 | 17.2658 |
-| 1.8425 | 29.0 | 7888 | 1.8278 | 11.148 | 17.2841 |
-| 1.8306 | 30.0 | 8160 | 1.8261 | 11.1765 | 17.2748 |
-| 1.8306 | 31.0 | 8432 | 1.8255 | 11.2723 | 17.2454 |
-| 1.8229 | 32.0 | 8704 | 1.8247 | 11.2715 | 17.2621 |
-| 1.8229 | 33.0 | 8976 | 1.8231 | 11.2896 | 17.2698 |
-| 1.7975 | 34.0 | 9248 | 1.8245 | 11.322 | 17.2491 |
-| 1.7919 | 35.0 | 9520 | 1.8238 | 11.3854 | 17.2711 |
-| 1.7919 | 36.0 | 9792 | 1.8237 | 11.3304 | 17.2634 |
-| 1.7781 | 37.0 | 10064 | 1.8225 | 11.3184 | 17.2644 |
-| 1.7781 | 38.0 | 10336 | 1.8230 | 11.3382 | 17.2651 |
-| 1.7819 | 39.0 | 10608 | 1.8228 | 11.3656 | 17.2658 |
-| 1.7819 | 40.0 | 10880 | 1.8221 | 11.3934 | 17.2544 |
-| 1.7592 | 41.0 | 11152 | 1.8223 | 11.3625 | 17.2421 |
-| 1.7592 | 42.0 | 11424 | 1.8221 | 11.4068 | 17.2511 |
-| 1.7529 | 43.0 | 11696 | 1.8224 | 11.4199 | 17.2541 |
-| 1.7529 | 44.0 | 11968 | 1.8224 | 11.4051 | 17.2561 |
-| 1.7482 | 45.0 | 12240 | 1.8223 | 11.4195 | 17.2504 |
-| 1.7461 | 46.0 | 12512 | 1.8220 | 11.3873 | 17.2497 |
-| 1.7461 | 47.0 | 12784 | 1.8220 | 11.4214 | 17.2431 |
-| 1.739 | 48.0 | 13056 | 1.8218 | 11.3972 | 17.2441 |
-| 1.739 | 49.0 | 13328 | 1.8219 | 11.3952 | 17.2457 |
-| 1.7362 | 50.0 | 13600 | 1.8219 | 11.3921 | 17.2471 |
+| Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
+|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|
+| No log | 1.0 | 272 | 2.1660 | 3.8515 | 17.6289 |
+| 2.6678 | 2.0 | 544 | 2.0656 | 6.4422 | 17.4842 |
+| 2.6678 | 3.0 | 816 | 2.0203 | 7.4348 | 17.3741 |
+| 2.4316 | 4.0 | 1088 | 1.9926 | 8.0914 | 17.3658 |
+| 2.4316 | 5.0 | 1360 | 1.9739 | 8.6535 | 17.3461 |
+| 2.3307 | 6.0 | 1632 | 1.9603 | 8.8757 | 17.3768 |
+| 2.3307 | 7.0 | 1904 | 1.9509 | 9.0744 | 17.3511 |
+| 2.2945 | 8.0 | 2176 | 1.9466 | 9.1111 | 17.3418 |
+| 2.2945 | 9.0 | 2448 | 1.9427 | 9.1969 | 17.3351 |
+| 2.2666 | 10.0 | 2720 | 1.9417 | 9.2166 | 17.3404 |
 
 
 ### Framework versions
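
The hyperparameters in the diff imply a fixed step budget: the results table shows 272 optimizer steps per epoch (the epoch 1.0 row), so the new `num_epochs: 10` gives the 2720 total steps of the final row. A minimal sketch of the decay implied by `lr_scheduler_type: linear`, assuming no warmup and a placeholder initial learning rate (the actual learning rate is not shown in this hunk):

```python
# Sketch of the linear LR schedule named in the card's hyperparameters.
# Assumptions: no warmup steps, and BASE_LR = 2e-5 is a placeholder --
# the real initial learning rate is not part of this diff.

STEPS_PER_EPOCH = 272                        # from the table: epoch 1.0 -> step 272
NUM_EPOCHS = 10                              # new value in the diff
TOTAL_STEPS = STEPS_PER_EPOCH * NUM_EPOCHS   # 2720, matching the final table row

def linear_lr(step, base_lr=2e-5, total_steps=TOTAL_STEPS):
    """Learning rate after `step` updates: linear decay from base_lr to 0."""
    remaining = max(0.0, (total_steps - step) / total_steps)
    return base_lr * remaining
```

At the halfway point (step 1360, epoch 5 in the table) this schedule has decayed the learning rate to exactly half its initial value, and it reaches 0 at step 2720.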