ombarki345
committed on
End of training
README.md
CHANGED
@@ -17,8 +17,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss:
-- Bleu: 0.
+- Loss: 3.3107
+- Bleu: 0.0143
 - Gen Len: 19.0
 
 ## Model description
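The evaluation block above reports a final loss of 3.3107, a BLEU of 0.0143, and an average generation length of 19 tokens. For readers who want to try the checkpoint, here is a minimal inference sketch with the transformers library; the Hub repo id and the task prefix are assumptions for illustration, since the card does not state the training task or where the model is published.

```python
# Minimal sketch: loading the fine-tuned t5-small checkpoint for generation.
# "ombarki345/t5-small-finetuned" is a hypothetical repo id; replace it with
# the actual Hub path (or a local directory containing the saved model).
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "ombarki345/t5-small-finetuned"  # assumption, not stated in the card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# T5 models expect a task prefix; "translate English to French:" is only an
# example, since the card does not say which task the model was trained on.
text = "translate English to French: The house is wonderful."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```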
@@ -51,106 +51,106 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
 |:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|
-| No log | 1.0 | 15 |
-| No log | 2.0 | 30 |
-| No log | 3.0 | 45 |
-| No log | 4.0 | 60 |
-| No log | 5.0 | 75 |
-| No log | 6.0 | 90 |
-| No log | 7.0 | 105 |
-| No log | 8.0 | 120 |
-| No log | 9.0 | 135 |
-| No log | 10.0 | 150 |
-| No log | 11.0 | 165 |
-| No log | 12.0 | 180 |
-| No log | 13.0 | 195 |
-| No log | 14.0 | 210 |
-| No log | 15.0 | 225 |
-| No log | 16.0 | 240 |
-| No log | 17.0 | 255 |
-| No log | 18.0 | 270 |
-| No log | 19.0 | 285 |
-| No log | 20.0 | 300 |
-| No log | 21.0 | 315 |
-| No log | 22.0 | 330 |
-| No log | 23.0 | 345 |
-| No log | 24.0 | 360 |
-| No log | 25.0 | 375 |
-| No log | 26.0 | 390 |
-| No log | 27.0 | 405 |
-| No log | 28.0 | 420 |
-| No log | 29.0 | 435 |
-| No log | 30.0 | 450 |
-| No log | 31.0 | 465 |
-| No log | 32.0 | 480 |
-| No log | 33.0 | 495 |
+| No log | 1.0 | 15 | 6.3635 | 0.0008 | 19.0 |
+| No log | 2.0 | 30 | 5.8107 | 0.0011 | 19.0 |
+| No log | 3.0 | 45 | 5.5407 | 0.0006 | 19.0 |
+| No log | 4.0 | 60 | 5.3596 | 0.0003 | 19.0 |
+| No log | 5.0 | 75 | 5.1147 | 0.0004 | 19.0 |
+| No log | 6.0 | 90 | 5.0018 | 0.0008 | 19.0 |
+| No log | 7.0 | 105 | 4.8949 | 0.0008 | 19.0 |
+| No log | 8.0 | 120 | 4.8042 | 0.0007 | 19.0 |
+| No log | 9.0 | 135 | 4.7144 | 0.0007 | 19.0 |
+| No log | 10.0 | 150 | 4.6325 | 0.001 | 19.0 |
+| No log | 11.0 | 165 | 4.5552 | 0.002 | 19.0 |
+| No log | 12.0 | 180 | 4.4837 | 0.0029 | 19.0 |
+| No log | 13.0 | 195 | 4.4080 | 0.0077 | 19.0 |
+| No log | 14.0 | 210 | 4.3378 | 0.0111 | 19.0 |
+| No log | 15.0 | 225 | 4.2740 | 0.0044 | 19.0 |
+| No log | 16.0 | 240 | 4.2069 | 0.0035 | 19.0 |
+| No log | 17.0 | 255 | 4.1457 | 0.0042 | 19.0 |
+| No log | 18.0 | 270 | 4.0861 | 0.0045 | 19.0 |
+| No log | 19.0 | 285 | 4.0249 | 0.0049 | 19.0 |
+| No log | 20.0 | 300 | 3.9686 | 0.0048 | 19.0 |
+| No log | 21.0 | 315 | 3.9081 | 0.005 | 19.0 |
+| No log | 22.0 | 330 | 3.8500 | 0.0028 | 19.0 |
+| No log | 23.0 | 345 | 3.8008 | 0.0012 | 19.0 |
+| No log | 24.0 | 360 | 3.7656 | 0.0011 | 19.0 |
+| No log | 25.0 | 375 | 3.7282 | 0.0016 | 19.0 |
+| No log | 26.0 | 390 | 3.6922 | 0.0017 | 19.0 |
+| No log | 27.0 | 405 | 3.6588 | 0.0019 | 19.0 |
+| No log | 28.0 | 420 | 3.6310 | 0.0009 | 19.0 |
+| No log | 29.0 | 435 | 3.6112 | 0.0091 | 19.0 |
+| No log | 30.0 | 450 | 3.5963 | 0.0055 | 19.0 |
+| No log | 31.0 | 465 | 3.5821 | 0.0021 | 19.0 |
+| No log | 32.0 | 480 | 3.5709 | 0.002 | 19.0 |
+| No log | 33.0 | 495 | 3.5629 | 0.001 | 19.0 |
+| 4.6251 | 34.0 | 510 | 3.5503 | 0.0008 | 19.0 |
+| 4.6251 | 35.0 | 525 | 3.5372 | 0.0008 | 19.0 |
+| 4.6251 | 36.0 | 540 | 3.5277 | 0.0008 | 19.0 |
+| 4.6251 | 37.0 | 555 | 3.5173 | 0.0014 | 19.0 |
+| 4.6251 | 38.0 | 570 | 3.5105 | 0.0032 | 19.0 |
+| 4.6251 | 39.0 | 585 | 3.5020 | 0.0043 | 19.0 |
+| 4.6251 | 40.0 | 600 | 3.4959 | 0.0045 | 19.0 |
+| 4.6251 | 41.0 | 615 | 3.4887 | 0.0039 | 19.0 |
+| 4.6251 | 42.0 | 630 | 3.4751 | 0.0044 | 19.0 |
+| 4.6251 | 43.0 | 645 | 3.4673 | 0.0038 | 19.0 |
+| 4.6251 | 44.0 | 660 | 3.4621 | 0.0047 | 19.0 |
+| 4.6251 | 45.0 | 675 | 3.4514 | 0.0048 | 19.0 |
+| 4.6251 | 46.0 | 690 | 3.4448 | 0.0066 | 19.0 |
+| 4.6251 | 47.0 | 705 | 3.4384 | 0.0067 | 19.0 |
+| 4.6251 | 48.0 | 720 | 3.4352 | 0.0092 | 19.0 |
+| 4.6251 | 49.0 | 735 | 3.4242 | 0.0106 | 19.0 |
+| 4.6251 | 50.0 | 750 | 3.4186 | 0.0129 | 19.0 |
+| 4.6251 | 51.0 | 765 | 3.4135 | 0.0117 | 19.0 |
+| 4.6251 | 52.0 | 780 | 3.4100 | 0.0128 | 19.0 |
+| 4.6251 | 53.0 | 795 | 3.4033 | 0.0046 | 19.0 |
+| 4.6251 | 54.0 | 810 | 3.3985 | 0.0064 | 19.0 |
+| 4.6251 | 55.0 | 825 | 3.3941 | 0.009 | 19.0 |
+| 4.6251 | 56.0 | 840 | 3.3896 | 0.0083 | 19.0 |
+| 4.6251 | 57.0 | 855 | 3.3888 | 0.0082 | 19.0 |
+| 4.6251 | 58.0 | 870 | 3.3827 | 0.0064 | 19.0 |
+| 4.6251 | 59.0 | 885 | 3.3775 | 0.0081 | 19.0 |
+| 4.6251 | 60.0 | 900 | 3.3726 | 0.0074 | 19.0 |
+| 4.6251 | 61.0 | 915 | 3.3689 | 0.0065 | 19.0 |
+| 4.6251 | 62.0 | 930 | 3.3671 | 0.0109 | 19.0 |
+| 4.6251 | 63.0 | 945 | 3.3640 | 0.0125 | 19.0 |
+| 4.6251 | 64.0 | 960 | 3.3586 | 0.0164 | 19.0 |
+| 4.6251 | 65.0 | 975 | 3.3527 | 0.0144 | 19.0 |
+| 4.6251 | 66.0 | 990 | 3.3526 | 0.0221 | 19.0 |
+| 3.5088 | 67.0 | 1005 | 3.3534 | 0.0136 | 19.0 |
+| 3.5088 | 68.0 | 1020 | 3.3490 | 0.0142 | 19.0 |
+| 3.5088 | 69.0 | 1035 | 3.3479 | 0.0162 | 19.0 |
+| 3.5088 | 70.0 | 1050 | 3.3452 | 0.0155 | 19.0 |
+| 3.5088 | 71.0 | 1065 | 3.3422 | 0.016 | 19.0 |
+| 3.5088 | 72.0 | 1080 | 3.3387 | 0.0104 | 19.0 |
+| 3.5088 | 73.0 | 1095 | 3.3349 | 0.0119 | 19.0 |
+| 3.5088 | 74.0 | 1110 | 3.3351 | 0.0096 | 19.0 |
+| 3.5088 | 75.0 | 1125 | 3.3349 | 0.0166 | 19.0 |
+| 3.5088 | 76.0 | 1140 | 3.3329 | 0.0132 | 19.0 |
+| 3.5088 | 77.0 | 1155 | 3.3299 | 0.0131 | 19.0 |
+| 3.5088 | 78.0 | 1170 | 3.3272 | 0.012 | 19.0 |
+| 3.5088 | 79.0 | 1185 | 3.3244 | 0.014 | 19.0 |
+| 3.5088 | 80.0 | 1200 | 3.3239 | 0.0133 | 19.0 |
+| 3.5088 | 81.0 | 1215 | 3.3239 | 0.0152 | 19.0 |
+| 3.5088 | 82.0 | 1230 | 3.3231 | 0.0158 | 19.0 |
+| 3.5088 | 83.0 | 1245 | 3.3209 | 0.0143 | 19.0 |
+| 3.5088 | 84.0 | 1260 | 3.3198 | 0.0155 | 19.0 |
+| 3.5088 | 85.0 | 1275 | 3.3185 | 0.0139 | 19.0 |
+| 3.5088 | 86.0 | 1290 | 3.3176 | 0.0133 | 19.0 |
+| 3.5088 | 87.0 | 1305 | 3.3164 | 0.0128 | 19.0 |
+| 3.5088 | 88.0 | 1320 | 3.3175 | 0.0159 | 19.0 |
+| 3.5088 | 89.0 | 1335 | 3.3166 | 0.0113 | 19.0 |
+| 3.5088 | 90.0 | 1350 | 3.3156 | 0.0142 | 19.0 |
+| 3.5088 | 91.0 | 1365 | 3.3151 | 0.0131 | 19.0 |
+| 3.5088 | 92.0 | 1380 | 3.3146 | 0.0111 | 19.0 |
+| 3.5088 | 93.0 | 1395 | 3.3131 | 0.0112 | 18.9833 |
+| 3.5088 | 94.0 | 1410 | 3.3121 | 0.0096 | 19.0 |
+| 3.5088 | 95.0 | 1425 | 3.3117 | 0.012 | 19.0 |
+| 3.5088 | 96.0 | 1440 | 3.3112 | 0.0151 | 19.0 |
+| 3.5088 | 97.0 | 1455 | 3.3106 | 0.0133 | 19.0 |
+| 3.5088 | 98.0 | 1470 | 3.3104 | 0.0115 | 19.0 |
+| 3.5088 | 99.0 | 1485 | 3.3107 | 0.0132 | 19.0 |
+| 3.3133 | 100.0 | 1500 | 3.3107 | 0.0143 | 19.0 |
 
 
 ### Framework versions
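The Bleu and Gen Len columns in the table above are the kind of numbers a `compute_metrics` hook produces during seq2seq evaluation. The card does not say which BLEU implementation was used; the sketch below uses the `evaluate` library's `bleu` metric, whose 0-1 scores match the magnitude of the values in the table, together with a rough generation-length estimate. The prediction and reference strings are placeholders, not data from this run.

```python
# Sketch of how Bleu / Gen Len columns like the ones above are typically computed.
import evaluate
import numpy as np

# Placeholder decoded outputs and references; in a real run these come from
# model.generate() and the evaluation set.
predictions = ["the cat sat on the mat"]
references = [["the cat is sitting on the mat"]]

bleu = evaluate.load("bleu")  # scores are in the 0-1 range
result = bleu.compute(predictions=predictions, references=references)
print(round(result["bleu"], 4))

# "Gen Len" is usually the mean length of the generated sequences; counting
# tokenizer tokens is more faithful, but a whitespace split gives the idea.
gen_len = np.mean([len(p.split()) for p in predictions])
print(round(float(gen_len), 4))
```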
runs/Mar19_13-05-37_8c0752404c4a/events.out.tfevents.1710853538.8c0752404c4a.34.0
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:c8c729a0dd85afd367d1cae64a0c2cae1236872dcdc904df03a54e6ee5f9b3f8
+size 43523
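The `events.out.tfevents` entry above is a Git LFS pointer file rather than the TensorBoard log itself: `oid` is the SHA-256 of the actual file content and `size` is its length in bytes. A small sketch for checking a downloaded copy of the log against the new pointer values follows; the local path is assumed to mirror the repo layout.

```python
# Verify a downloaded TensorBoard log against the Git LFS pointer above.
import hashlib
import os

path = "runs/Mar19_13-05-37_8c0752404c4a/events.out.tfevents.1710853538.8c0752404c4a.34.0"
expected_oid = "c8c729a0dd85afd367d1cae64a0c2cae1236872dcdc904df03a54e6ee5f9b3f8"
expected_size = 43523

with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("oid matches: ", digest == expected_oid)          # oid is the SHA-256 of the file content
print("size matches:", os.path.getsize(path) == expected_size)  # size is the byte count
```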