Kielak2 committed
Commit 68314d7 (1 parent: 2a1156f)

End of training

README.md CHANGED
@@ -1,4 +1,5 @@
  ---
+ base_model: Kielak2/calculator_model_test
  tags:
  - generated_from_trainer
  model-index:
@@ -11,9 +12,9 @@ should probably proofread and complete it, then remove this comment. -->

  # calculator_model_test

- This model is a fine-tuned version of [](https://huggingface.co/) on the None dataset.
+ This model is a fine-tuned version of [Kielak2/calculator_model_test](https://huggingface.co/Kielak2/calculator_model_test) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.7602
+ - Loss: 0.2680

  ## Model description

@@ -33,58 +34,218 @@ More information needed

  The following hyperparameters were used during training:
  - learning_rate: 0.001
- - train_batch_size: 64
- - eval_batch_size: 64
+ - train_batch_size: 512
+ - eval_batch_size: 512
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 40
+ - num_epochs: 200
  - mixed_precision_training: Native AMP

  ### Training results

  | Training Loss | Epoch | Step | Validation Loss |
  |:-------------:|:-----:|:----:|:---------------:|
- | 1.6045 | 1.0 | 3 | 1.5511 |
- | 1.3068 | 2.0 | 6 | 1.6045 |
- | 1.4803 | 3.0 | 9 | 1.3825 |
- | 1.3351 | 4.0 | 12 | 1.3222 |
- | 1.259 | 5.0 | 15 | 1.4317 |
- | 1.237 | 6.0 | 18 | 1.3714 |
- | 1.2416 | 7.0 | 21 | 1.3284 |
- | 1.1719 | 8.0 | 24 | 1.2900 |
- | 1.1413 | 9.0 | 27 | 1.2300 |
- | 1.0976 | 10.0 | 30 | 1.2427 |
- | 1.1137 | 11.0 | 33 | 1.1367 |
- | 1.0512 | 12.0 | 36 | 1.0890 |
- | 1.0176 | 13.0 | 39 | 1.1537 |
- | 1.0737 | 14.0 | 42 | 1.1180 |
- | 0.9744 | 15.0 | 45 | 1.0935 |
- | 0.9448 | 16.0 | 48 | 0.9942 |
- | 0.9564 | 17.0 | 51 | 1.0161 |
- | 0.9895 | 18.0 | 54 | 0.9486 |
- | 0.9088 | 19.0 | 57 | 0.9694 |
- | 0.9169 | 20.0 | 60 | 0.9542 |
- | 0.8814 | 21.0 | 63 | 0.9105 |
- | 0.8952 | 22.0 | 66 | 0.9043 |
- | 0.8635 | 23.0 | 69 | 0.8930 |
- | 0.8357 | 24.0 | 72 | 0.9285 |
- | 0.8266 | 25.0 | 75 | 0.8613 |
- | 0.8271 | 26.0 | 78 | 0.9072 |
- | 0.7921 | 27.0 | 81 | 0.8674 |
- | 0.775 | 28.0 | 84 | 0.8476 |
- | 0.7766 | 29.0 | 87 | 0.8600 |
- | 0.7682 | 30.0 | 90 | 0.8438 |
- | 0.7844 | 31.0 | 93 | 0.8022 |
- | 0.7114 | 32.0 | 96 | 0.8131 |
- | 0.7391 | 33.0 | 99 | 0.7957 |
- | 0.7347 | 34.0 | 102 | 0.7810 |
- | 0.7104 | 35.0 | 105 | 0.7740 |
- | 0.7248 | 36.0 | 108 | 0.7665 |
- | 0.7359 | 37.0 | 111 | 0.7819 |
- | 0.7358 | 38.0 | 114 | 0.7668 |
- | 0.7235 | 39.0 | 117 | 0.7601 |
- | 0.7203 | 40.0 | 120 | 0.7602 |
+ | 0.5001 | 1.0 | 1 | 3.3999 |
+ | 3.3159 | 2.0 | 2 | 2.5866 |
+ | 2.3974 | 3.0 | 3 | 2.3937 |
+ | 2.0971 | 4.0 | 4 | 2.2596 |
+ | 2.1181 | 5.0 | 5 | 2.1882 |
+ | 2.0458 | 6.0 | 6 | 1.9196 |
+ | 1.8142 | 7.0 | 7 | 1.6198 |
+ | 1.5353 | 8.0 | 8 | 1.3971 |
+ | 1.3998 | 9.0 | 9 | 1.2959 |
+ | 1.2674 | 10.0 | 10 | 1.2443 |
+ | 1.2303 | 11.0 | 11 | 1.2443 |
+ | 1.242 | 12.0 | 12 | 1.1784 |
+ | 1.1823 | 13.0 | 13 | 1.1184 |
+ | 1.1237 | 14.0 | 14 | 1.1045 |
+ | 1.091 | 15.0 | 15 | 1.0504 |
+ | 1.0599 | 16.0 | 16 | 1.0647 |
+ | 1.0767 | 17.0 | 17 | 1.0645 |
+ | 1.0676 | 18.0 | 18 | 1.0431 |
+ | 1.035 | 19.0 | 19 | 0.9963 |
+ | 0.9796 | 20.0 | 20 | 1.0182 |
+ | 1.0002 | 21.0 | 21 | 1.0211 |
+ | 1.0016 | 22.0 | 22 | 1.0027 |
+ | 0.9601 | 23.0 | 23 | 0.9614 |
+ | 0.9323 | 24.0 | 24 | 0.9258 |
+ | 0.8823 | 25.0 | 25 | 0.9227 |
+ | 0.8984 | 26.0 | 26 | 0.9107 |
+ | 0.9054 | 27.0 | 27 | 0.8881 |
+ | 0.9126 | 28.0 | 28 | 0.9379 |
+ | 0.9166 | 29.0 | 29 | 0.9273 |
+ | 0.9053 | 30.0 | 30 | 0.8998 |
+ | 0.8671 | 31.0 | 31 | 0.8555 |
+ | 0.829 | 32.0 | 32 | 0.8551 |
+ | 0.8544 | 33.0 | 33 | 0.8491 |
+ | 0.8128 | 34.0 | 34 | 0.8184 |
+ | 0.7961 | 35.0 | 35 | 0.8312 |
+ | 0.7854 | 36.0 | 36 | 0.8337 |
+ | 0.808 | 37.0 | 37 | 0.8184 |
+ | 0.8211 | 38.0 | 38 | 0.8191 |
+ | 0.7993 | 39.0 | 39 | 0.7743 |
+ | 0.7789 | 40.0 | 40 | 0.7454 |
+ | 0.7924 | 41.0 | 41 | 0.7314 |
+ | 0.7243 | 42.0 | 42 | 0.8436 |
+ | 0.7537 | 43.0 | 43 | 0.8050 |
+ | 0.7622 | 44.0 | 44 | 0.7724 |
+ | 0.7694 | 45.0 | 45 | 0.7963 |
+ | 0.7819 | 46.0 | 46 | 0.7872 |
+ | 0.739 | 47.0 | 47 | 0.8100 |
+ | 0.7456 | 48.0 | 48 | 0.7989 |
+ | 0.7214 | 49.0 | 49 | 0.7234 |
+ | 0.6545 | 50.0 | 50 | 0.6993 |
+ | 0.6834 | 51.0 | 51 | 0.6556 |
+ | 0.6664 | 52.0 | 52 | 0.6544 |
+ | 0.6141 | 53.0 | 53 | 0.6489 |
+ | 0.5929 | 54.0 | 54 | 0.6268 |
+ | 0.566 | 55.0 | 55 | 0.6311 |
+ | 0.6577 | 56.0 | 56 | 0.5828 |
+ | 0.598 | 57.0 | 57 | 0.6526 |
+ | 0.6056 | 58.0 | 58 | 0.7250 |
+ | 0.6204 | 59.0 | 59 | 0.6612 |
+ | 0.5968 | 60.0 | 60 | 0.5759 |
+ | 0.5823 | 61.0 | 61 | 0.5836 |
+ | 0.5986 | 62.0 | 62 | 0.5375 |
+ | 0.5247 | 63.0 | 63 | 0.5993 |
+ | 0.5891 | 64.0 | 64 | 0.6175 |
+ | 0.6142 | 65.0 | 65 | 0.5691 |
+ | 0.5602 | 66.0 | 66 | 0.5180 |
+ | 0.5017 | 67.0 | 67 | 0.5726 |
+ | 0.5304 | 68.0 | 68 | 0.5362 |
+ | 0.4935 | 69.0 | 69 | 0.5311 |
+ | 0.5167 | 70.0 | 70 | 0.5698 |
+ | 0.526 | 71.0 | 71 | 0.5837 |
+ | 0.5538 | 72.0 | 72 | 0.5436 |
+ | 0.4825 | 73.0 | 73 | 0.5253 |
+ | 0.4596 | 74.0 | 74 | 0.5132 |
+ | 0.4722 | 75.0 | 75 | 0.4970 |
+ | 0.4662 | 76.0 | 76 | 0.4983 |
+ | 0.4991 | 77.0 | 77 | 0.4886 |
+ | 0.4613 | 78.0 | 78 | 0.4791 |
+ | 0.4589 | 79.0 | 79 | 0.4654 |
+ | 0.4617 | 80.0 | 80 | 0.4532 |
+ | 0.4491 | 81.0 | 81 | 0.4617 |
+ | 0.4471 | 82.0 | 82 | 0.4416 |
+ | 0.4216 | 83.0 | 83 | 0.4841 |
+ | 0.4516 | 84.0 | 84 | 0.4817 |
+ | 0.4372 | 85.0 | 85 | 0.4570 |
+ | 0.4385 | 86.0 | 86 | 0.4801 |
+ | 0.4546 | 87.0 | 87 | 0.4929 |
+ | 0.4381 | 88.0 | 88 | 0.4646 |
+ | 0.4314 | 89.0 | 89 | 0.4338 |
+ | 0.3989 | 90.0 | 90 | 0.4458 |
+ | 0.4442 | 91.0 | 91 | 0.4365 |
+ | 0.4316 | 92.0 | 92 | 0.4116 |
+ | 0.4012 | 93.0 | 93 | 0.4157 |
+ | 0.4116 | 94.0 | 94 | 0.4185 |
+ | 0.4101 | 95.0 | 95 | 0.4026 |
+ | 0.3975 | 96.0 | 96 | 0.4144 |
+ | 0.3985 | 97.0 | 97 | 0.4438 |
+ | 0.424 | 98.0 | 98 | 0.4383 |
+ | 0.3901 | 99.0 | 99 | 0.4320 |
+ | 0.4301 | 100.0 | 100 | 0.4259 |
+ | 0.428 | 101.0 | 101 | 0.3934 |
+ | 0.3836 | 102.0 | 102 | 0.4109 |
+ | 0.3887 | 103.0 | 103 | 0.4203 |
+ | 0.423 | 104.0 | 104 | 0.3942 |
+ | 0.3722 | 105.0 | 105 | 0.3682 |
+ | 0.3909 | 106.0 | 106 | 0.3681 |
+ | 0.3776 | 107.0 | 107 | 0.3945 |
+ | 0.392 | 108.0 | 108 | 0.3728 |
+ | 0.3536 | 109.0 | 109 | 0.3862 |
+ | 0.4197 | 110.0 | 110 | 0.4024 |
+ | 0.3988 | 111.0 | 111 | 0.3919 |
+ | 0.4064 | 112.0 | 112 | 0.4617 |
+ | 0.4446 | 113.0 | 113 | 0.5055 |
+ | 0.4482 | 114.0 | 114 | 0.4476 |
+ | 0.3832 | 115.0 | 115 | 0.3900 |
+ | 0.3675 | 116.0 | 116 | 0.4018 |
+ | 0.3782 | 117.0 | 117 | 0.3880 |
+ | 0.352 | 118.0 | 118 | 0.3790 |
+ | 0.3458 | 119.0 | 119 | 0.3794 |
+ | 0.3427 | 120.0 | 120 | 0.3671 |
+ | 0.3223 | 121.0 | 121 | 0.3703 |
+ | 0.3161 | 122.0 | 122 | 0.3888 |
+ | 0.3211 | 123.0 | 123 | 0.4134 |
+ | 0.3247 | 124.0 | 124 | 0.4017 |
+ | 0.333 | 125.0 | 125 | 0.3822 |
+ | 0.3227 | 126.0 | 126 | 0.3792 |
+ | 0.3264 | 127.0 | 127 | 0.3783 |
+ | 0.3284 | 128.0 | 128 | 0.3735 |
+ | 0.3199 | 129.0 | 129 | 0.3614 |
+ | 0.3344 | 130.0 | 130 | 0.3755 |
+ | 0.3148 | 131.0 | 131 | 0.3901 |
+ | 0.3592 | 132.0 | 132 | 0.3819 |
+ | 0.3358 | 133.0 | 133 | 0.3764 |
+ | 0.3047 | 134.0 | 134 | 0.3779 |
+ | 0.3538 | 135.0 | 135 | 0.3580 |
+ | 0.3257 | 136.0 | 136 | 0.3419 |
+ | 0.2865 | 137.0 | 137 | 0.3402 |
+ | 0.3037 | 138.0 | 138 | 0.3470 |
+ | 0.3098 | 139.0 | 139 | 0.3432 |
+ | 0.3087 | 140.0 | 140 | 0.3354 |
+ | 0.2926 | 141.0 | 141 | 0.3469 |
+ | 0.2811 | 142.0 | 142 | 0.3526 |
+ | 0.3072 | 143.0 | 143 | 0.3465 |
+ | 0.3092 | 144.0 | 144 | 0.3487 |
+ | 0.3048 | 145.0 | 145 | 0.3465 |
+ | 0.2961 | 146.0 | 146 | 0.3384 |
+ | 0.3149 | 147.0 | 147 | 0.3383 |
+ | 0.3147 | 148.0 | 148 | 0.3326 |
+ | 0.2927 | 149.0 | 149 | 0.3306 |
+ | 0.2765 | 150.0 | 150 | 0.3331 |
+ | 0.2755 | 151.0 | 151 | 0.3255 |
+ | 0.304 | 152.0 | 152 | 0.3027 |
+ | 0.3011 | 153.0 | 153 | 0.3018 |
+ | 0.2842 | 154.0 | 154 | 0.3092 |
+ | 0.2936 | 155.0 | 155 | 0.3037 |
+ | 0.2852 | 156.0 | 156 | 0.3044 |
+ | 0.2726 | 157.0 | 157 | 0.3143 |
+ | 0.2577 | 158.0 | 158 | 0.3330 |
+ | 0.2904 | 159.0 | 159 | 0.3436 |
+ | 0.2619 | 160.0 | 160 | 0.3452 |
+ | 0.276 | 161.0 | 161 | 0.3475 |
+ | 0.2608 | 162.0 | 162 | 0.3454 |
+ | 0.2529 | 163.0 | 163 | 0.3336 |
+ | 0.2685 | 164.0 | 164 | 0.3183 |
+ | 0.2571 | 165.0 | 165 | 0.3048 |
+ | 0.2641 | 166.0 | 166 | 0.2957 |
+ | 0.2549 | 167.0 | 167 | 0.2926 |
+ | 0.243 | 168.0 | 168 | 0.2904 |
+ | 0.2574 | 169.0 | 169 | 0.2917 |
+ | 0.2597 | 170.0 | 170 | 0.2987 |
+ | 0.2512 | 171.0 | 171 | 0.2979 |
+ | 0.247 | 172.0 | 172 | 0.2906 |
+ | 0.2485 | 173.0 | 173 | 0.2851 |
+ | 0.2512 | 174.0 | 174 | 0.2869 |
+ | 0.2481 | 175.0 | 175 | 0.2838 |
+ | 0.268 | 176.0 | 176 | 0.2866 |
+ | 0.2477 | 177.0 | 177 | 0.2902 |
+ | 0.2498 | 178.0 | 178 | 0.2963 |
+ | 0.2535 | 179.0 | 179 | 0.2963 |
+ | 0.2658 | 180.0 | 180 | 0.2939 |
+ | 0.2506 | 181.0 | 181 | 0.2943 |
+ | 0.251 | 182.0 | 182 | 0.2894 |
+ | 0.2491 | 183.0 | 183 | 0.2818 |
+ | 0.2484 | 184.0 | 184 | 0.2767 |
+ | 0.2373 | 185.0 | 185 | 0.2740 |
+ | 0.2481 | 186.0 | 186 | 0.2718 |
+ | 0.2438 | 187.0 | 187 | 0.2690 |
+ | 0.2168 | 188.0 | 188 | 0.2658 |
+ | 0.237 | 189.0 | 189 | 0.2639 |
+ | 0.2505 | 190.0 | 190 | 0.2625 |
+ | 0.2448 | 191.0 | 191 | 0.2622 |
+ | 0.2366 | 192.0 | 192 | 0.2639 |
+ | 0.2394 | 193.0 | 193 | 0.2681 |
+ | 0.2537 | 194.0 | 194 | 0.2727 |
+ | 0.2259 | 195.0 | 195 | 0.2753 |
+ | 0.2314 | 196.0 | 196 | 0.2750 |
+ | 0.2398 | 197.0 | 197 | 0.2730 |
+ | 0.2515 | 198.0 | 198 | 0.2707 |
+ | 0.2244 | 199.0 | 199 | 0.2690 |
+ | 0.2458 | 200.0 | 200 | 0.2680 |


  ### Framework versions
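
The updated card records a further fine-tuning pass of Kielak2/calculator_model_test on its own earlier checkpoint, bringing the evaluation loss from 0.7602 down to 0.2680. A minimal inference sketch for the published checkpoint follows; it assumes the repo ships a compatible tokenizer and that generation settings such as `decoder_start_token_id` are already stored in the config, and the `"21+40"` input format is only a guess at the calculator task, which this card does not document.

```python
# Minimal sketch: load the published encoder-decoder checkpoint and generate.
# Assumptions: the repo includes a tokenizer, the config carries the needed
# generation settings, and inputs look like plain arithmetic expressions.
from transformers import AutoTokenizer, EncoderDecoderModel

repo = "Kielak2/calculator_model_test"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = EncoderDecoderModel.from_pretrained(repo)

inputs = tokenizer("21+40", return_tensors="pt")  # assumed input format
output_ids = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```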
config.json CHANGED
@@ -1,4 +1,5 @@
  {
+ "_name_or_path": "Kielak2/calculator_model_test",
  "architectures": [
  "EncoderDecoderModel"
  ],
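
The added `_name_or_path` key is written automatically by `save_pretrained` and records the checkpoint the weights were initialized from, confirming the base_model change in the README. A quick check using the standard transformers config API, as a sketch:

```python
# Read the config published in this commit; the printed architecture comes
# straight from the config.json shown above.
from transformers import AutoConfig

cfg = AutoConfig.from_pretrained("Kielak2/calculator_model_test")
print(cfg.architectures)   # ["EncoderDecoderModel"]
print(cfg._name_or_path)   # repo or path this config was loaded from
```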
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:df9c10b98f96c659b466edd2d980791d10d4deb5d08b741932448182b3aeaae4
+ oid sha256:97b31f1d11f35ac6b9077eeb87b7858a91521a1949ad3fea8ed167f1dc9df118
  size 31168616
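
The weight file is stored as a git-LFS pointer: the repository tracks only `version`, `oid`, and `size`, while the blob itself lives in LFS storage; the unchanged size with a new oid means the weights were overwritten in place. A sketch for verifying a download against the pointer's sha256, using the standard huggingface_hub download API:

```python
# Verify the downloaded blob against the sha256 oid in the new LFS pointer.
import hashlib
from huggingface_hub import hf_hub_download

path = hf_hub_download("Kielak2/calculator_model_test", "model.safetensors")
with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print(digest == "97b31f1d11f35ac6b9077eeb87b7858a91521a1949ad3fea8ed167f1dc9df118")
```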
runs/Mar04_10-17-18_c60a5c456cbd/events.out.tfevents.1709547438.c60a5c456cbd.796.4 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:96737fedf15abb79910de2d7108b4a73ec6ee163966e9b82e605e8dc27c88355
+ size 27948
runs/Mar04_10-17-41_c60a5c456cbd/events.out.tfevents.1709547461.c60a5c456cbd.796.5 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:57fa3c3e9d378358812c522beb4508bf1871e190bd88ebbaa1f8be041e852e80
+ size 27949
runs/Mar04_10-18-01_c60a5c456cbd/events.out.tfevents.1709547482.c60a5c456cbd.796.6 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:57c9546e2911334929fca3e5cf613f45859894d3782e8b7bbeda9c506daeac61
+ size 27951
runs/Mar04_10-18-16_c60a5c456cbd/events.out.tfevents.1709547497.c60a5c456cbd.796.7 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ff932eaba46cd8693ad4bc2d39ae25c62a3bd8187ec1efd5e82b0964ce6d0e82
+ size 104295
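
The four added files are TensorBoard event logs under runs/, one per trainer invocation on host c60a5c456cbd. A sketch for inspecting them locally with the tensorboard package; the scalar tag names are the Trainer's usual defaults and are an assumption here, so list the tags first:

```python
# Inspect a TensorBoard event log from the runs/ directory added above.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

acc = EventAccumulator("runs/Mar04_10-18-16_c60a5c456cbd")
acc.Reload()
print(acc.Tags()["scalars"])             # discover the logged scalar tags
for event in acc.Scalars("train/loss"):  # assumed tag name; check Tags() first
    print(event.step, event.value)
```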
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:f3c4bf818af65907f79e24432ebc4058fd991b8135d7bde889b44d524743e7d0
+ oid sha256:cebe9638621e159942e15d5eff9381d25a3a2c37f9462c3e4d00b0b340bd9fe6
  size 5112
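
training_args.bin is the pickled TrainingArguments object the Trainer saves next to each run; its hash changes here because the hyperparameters changed (batch size 64 → 512, epochs 40 → 200). A sketch for cross-checking it against the card; `weights_only=False` is needed on recent PyTorch because this file is a full pickle, not a tensor checkpoint:

```python
# Load the pickled TrainingArguments and cross-check the card's hyperparameters.
import torch
from huggingface_hub import hf_hub_download

path = hf_hub_download("Kielak2/calculator_model_test", "training_args.bin")
args = torch.load(path, weights_only=False)  # full pickle, not a tensor file
print(args.learning_rate)                # expect 0.001
print(args.num_train_epochs)             # expect 200
print(args.per_device_train_batch_size)  # expect 512, matching the card
```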