DouglasPontes committed on
Commit
46eb42d
1 Parent(s): 5c37bb3

Training in progress, step 32000

README.md ADDED
@@ -0,0 +1,357 @@
+ ---
+ license: mit
+ base_model: cardiffnlp/twitter-roberta-base-2019-90m
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: 2020-Q1-90p-filtered
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # 2020-Q1-90p-filtered
+
+ This model is a fine-tuned version of [cardiffnlp/twitter-roberta-base-2019-90m](https://huggingface.co/cardiffnlp/twitter-roberta-base-2019-90m) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 3.2574
+
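The usage sections below are still placeholders, so here is a minimal fill-mask sketch for this masked-LM checkpoint. The repository id `DouglasPontes/2020-Q1-90p-filtered` is an assumption inferred from the committer and model name; adjust it to wherever the model is actually hosted.

```python
from transformers import pipeline

# Hypothetical repo id inferred from the commit; not confirmed by the card itself.
fill_mask = pipeline("fill-mask", model="DouglasPontes/2020-Q1-90p-filtered")

# RoBERTa-style tokenizers use <mask> as the mask token (see tokenizer_config.json below).
text = f"Working from home again this week, feeling {fill_mask.tokenizer.mask_token}."
for pred in fill_mask(text):
    print(f"{pred['token_str']!r}: {pred['score']:.3f}")
```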
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 1e-05
+ - train_batch_size: 16
+ - eval_batch_size: 16
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_steps: 1400
+ - training_steps: 2400000
+
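The hyperparameter list above maps fairly directly onto `transformers.TrainingArguments`. The sketch below is a reconstruction from that list (plus the 8,000-step evaluation cadence visible in the results table and the checkpoint directory named in trainer_state.json), not the original training script.

```python
from transformers import TrainingArguments

# Sketch reconstructed from the hyperparameter list above; assumptions noted inline.
training_args = TrainingArguments(
    output_dir="./model_tweets_2020_Q1_90",  # checkpoint path seen in trainer_state.json
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1400,
    max_steps=2_400_000,
    evaluation_strategy="steps",  # evaluation every 8000 steps, per the table below
    eval_steps=8_000,
)
```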
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:-----:|:-------:|:---------------:|
+ | No log | 0.16 | 8000 | 3.4495 |
+ | 3.5684 | 0.33 | 16000 | 3.4166 |
+ | 3.5684 | 0.49 | 24000 | 3.3847 |
+ | 3.3755 | 0.66 | 32000 | 3.3665 |
+ | 3.3755 | 0.82 | 40000 | 3.3654 |
+ | 3.3533 | 0.98 | 48000 | 3.3654 |
+ | 3.3533 | 1.15 | 56000 | 3.3328 |
+ | 3.3014 | 1.31 | 64000 | 3.3210 |
+ | 3.3014 | 1.48 | 72000 | 3.3492 |
+ | 3.2888 | 1.64 | 80000 | 3.3213 |
+ | 3.2888 | 1.8 | 88000 | 3.2708 |
+ | 3.2609 | 1.97 | 96000 | 3.2908 |
+ | 3.2609 | 2.13 | 104000 | 3.2767 |
+ | 3.2159 | 2.29 | 112000 | 3.2592 |
+ | 3.2159 | 2.46 | 120000 | 3.2411 |
+ | 3.2167 | 2.62 | 128000 | 3.2377 |
+ | 3.2167 | 2.79 | 136000 | 3.2485 |
+ | 3.199 | 2.95 | 144000 | 3.2609 |
+ | 3.199 | 3.11 | 152000 | 3.2553 |
+ | 3.1905 | 3.28 | 160000 | 3.2425 |
+ | 3.1905 | 3.44 | 168000 | 3.2422 |
+ | 3.1822 | 3.61 | 176000 | 3.2624 |
+ | 3.1822 | 3.77 | 184000 | 3.2507 |
+ | 3.1852 | 3.93 | 192000 | 3.2483 |
+ | 3.1852 | 4.1 | 200000 | 3.2514 |
+ | 3.1767 | 4.26 | 208000 | 3.2426 |
+ | 3.1767 | 4.43 | 216000 | 3.2348 |
+ | 3.1767 | 4.59 | 224000 | 3.2735 |
+ | 3.1767 | 4.75 | 232000 | 3.2472 |
+ | 3.1973 | 4.92 | 240000 | 3.2596 |
+ | 3.1973 | 5.08 | 248000 | 3.2606 |
+ | 3.1781 | 5.24 | 256000 | 3.2815 |
+ | 3.1781 | 5.41 | 264000 | 3.2734 |
+ | 3.1803 | 5.57 | 272000 | 3.2739 |
+ | 3.1803 | 5.74 | 280000 | 3.2712 |
+ | 3.1989 | 5.9 | 288000 | 3.2734 |
+ | 3.1989 | 6.06 | 296000 | 3.2939 |
+ | 3.1929 | 6.23 | 304000 | 3.2880 |
+ | 3.1929 | 6.39 | 312000 | 3.2894 |
+ | 3.2083 | 6.56 | 320000 | 3.3086 |
+ | 3.2083 | 6.72 | 328000 | 3.3067 |
+ | 3.2013 | 6.88 | 336000 | 3.2787 |
+ | 3.2013 | 7.05 | 344000 | 3.3153 |
+ | 3.2111 | 7.21 | 352000 | 3.3246 |
+ | 3.2111 | 7.38 | 360000 | 3.3323 |
+ | 3.2186 | 7.54 | 368000 | 3.2938 |
+ | 3.2186 | 7.7 | 376000 | 3.3500 |
+ | 3.2268 | 7.87 | 384000 | 3.3180 |
+ | 3.2268 | 8.03 | 392000 | 3.3171 |
+ | 3.233 | 8.2 | 400000 | 3.3462 |
+ | 3.233 | 8.36 | 408000 | 3.3413 |
+ | 3.2432 | 8.52 | 416000 | 3.3281 |
+ | 3.2432 | 8.69 | 424000 | 3.3420 |
+ | 3.2586 | 8.85 | 432000 | 3.3609 |
+ | 3.2586 | 9.01 | 440000 | 3.3527 |
+ | 3.2567 | 9.18 | 448000 | 3.3594 |
+ | 3.2567 | 9.34 | 456000 | 3.3497 |
+ | 3.2592 | 9.51 | 464000 | 3.3607 |
+ | 3.2592 | 9.67 | 472000 | 3.3840 |
+ | 3.2793 | 9.83 | 480000 | 3.3668 |
+ | 3.2793 | 10.0 | 488000 | 3.3609 |
+ | 3.257 | 10.16 | 496000 | 3.3682 |
+ | 3.257 | 10.33 | 504000 | 3.4006 |
+ | 3.2656 | 10.49 | 512000 | 3.3588 |
+ | 3.2656 | 10.65 | 520000 | 3.3799 |
+ | 3.2727 | 10.82 | 528000 | 3.3833 |
+ | 3.2727 | 10.98 | 536000 | 3.3566 |
+ | 3.2705 | 11.15 | 544000 | 3.3794 |
+ | 3.2705 | 11.31 | 552000 | 3.3838 |
+ | 3.2676 | 11.47 | 560000 | 3.3660 |
+ | 3.2676 | 11.64 | 568000 | 3.3938 |
+ | 3.258 | 11.8 | 576000 | 3.3661 |
+ | 3.258 | 11.97 | 584000 | 3.3490 |
+ | 3.2646 | 12.13 | 592000 | 3.3716 |
+ | 3.2646 | 12.29 | 600000 | 3.3877 |
+ | 3.2578 | 12.46 | 608000 | 3.3930 |
+ | 3.2578 | 12.62 | 616000 | 3.3921 |
+ | 3.2719 | 12.78 | 624000 | 3.3957 |
+ | 3.2719 | 12.95 | 632000 | 3.4196 |
+ | 3.2828 | 13.11 | 640000 | 3.4078 |
+ | 3.2828 | 13.28 | 648000 | 3.4203 |
+ | 3.2805 | 13.44 | 656000 | 3.3900 |
+ | 3.2805 | 13.6 | 664000 | 3.4038 |
+ | 3.2975 | 13.77 | 672000 | 3.4056 |
+ | 3.2975 | 13.93 | 680000 | 3.4284 |
+ | 3.2965 | 14.1 | 688000 | 3.4180 |
+ | 3.2965 | 14.26 | 696000 | 3.4196 |
+ | 3.3069 | 14.42 | 704000 | 3.4257 |
+ | 3.3069 | 14.59 | 712000 | 3.4299 |
+ | 3.3152 | 14.75 | 720000 | 3.4788 |
+ | 3.3152 | 14.92 | 728000 | 3.4425 |
+ | 3.3125 | 15.08 | 736000 | 3.4301 |
+ | 3.3125 | 15.24 | 744000 | 3.4441 |
+ | 3.3174 | 15.41 | 752000 | 3.4396 |
+ | 3.3174 | 15.57 | 760000 | 3.4639 |
+ | 3.3242 | 15.73 | 768000 | 3.4524 |
+ | 3.3242 | 15.9 | 776000 | 3.4560 |
+ | 3.3385 | 16.06 | 784000 | 3.4780 |
+ | 3.3385 | 16.23 | 792000 | 3.4774 |
+ | 3.3371 | 16.39 | 800000 | 3.4772 |
+ | 3.3371 | 16.55 | 808000 | 3.4955 |
+ | 3.3633 | 16.72 | 816000 | 3.4861 |
+ | 3.3633 | 16.88 | 824000 | 3.5063 |
+ | 3.3678 | 17.05 | 832000 | 3.5044 |
+ | 3.3678 | 17.21 | 840000 | 3.5202 |
+ | 3.3634 | 17.37 | 848000 | 3.4941 |
+ | 3.3634 | 17.54 | 856000 | 3.5223 |
+ | 3.3797 | 17.7 | 864000 | 3.5028 |
+ | 3.3797 | 17.87 | 872000 | 3.5264 |
+ | 3.3802 | 18.03 | 880000 | 3.5313 |
+ | 3.3802 | 18.19 | 888000 | 3.4963 |
+ | 3.357 | 18.36 | 896000 | 3.5171 |
+ | 3.357 | 18.52 | 904000 | 3.5307 |
+ | 3.3866 | 18.69 | 912000 | 3.5222 |
+ | 3.3866 | 18.85 | 920000 | 3.5319 |
+ | 3.3818 | 19.01 | 928000 | 3.5326 |
+ | 3.3818 | 19.18 | 936000 | 3.5116 |
+ | 3.3754 | 19.34 | 944000 | 3.5229 |
+ | 3.3754 | 19.5 | 952000 | 3.5383 |
+ | 3.3893 | 19.67 | 960000 | 3.5445 |
+ | 3.3893 | 19.83 | 968000 | 3.5231 |
+ | 3.3899 | 20.0 | 976000 | 3.5310 |
+ | 3.3899 | 20.16 | 984000 | 3.5329 |
+ | 3.3918 | 20.32 | 992000 | 3.5159 |
+ | 3.3918 | 20.49 | 1000000 | 3.5628 |
+ | 3.3786 | 20.65 | 1008000 | 3.5291 |
+ | 3.3786 | 20.82 | 1016000 | 3.5163 |
+ | 3.3862 | 20.98 | 1024000 | 3.5312 |
+ | 3.3862 | 21.14 | 1032000 | 3.5140 |
+ | 3.3855 | 21.31 | 1040000 | 3.5617 |
+ | 3.3855 | 21.47 | 1048000 | 3.5375 |
+ | 3.3872 | 21.64 | 1056000 | 3.5328 |
+ | 3.3872 | 21.8 | 1064000 | 3.5616 |
+ | 3.3931 | 21.96 | 1072000 | 3.5648 |
+ | 3.3931 | 22.13 | 1080000 | 3.5443 |
+ | 3.3708 | 22.29 | 1088000 | 3.5401 |
+ | 3.3708 | 22.45 | 1096000 | 3.5529 |
+ | 3.4099 | 22.62 | 1104000 | 3.5334 |
+ | 3.4099 | 22.78 | 1112000 | 3.5325 |
+ | 3.4027 | 22.95 | 1120000 | 3.5819 |
+ | 3.4027 | 23.11 | 1128000 | 3.5471 |
+ | 3.4035 | 23.27 | 1136000 | 3.5486 |
+ | 3.4035 | 23.44 | 1144000 | 3.5470 |
+ | 3.3964 | 23.6 | 1152000 | 3.5722 |
+ | 3.3964 | 23.77 | 1160000 | 3.5510 |
+ | 3.4115 | 23.93 | 1168000 | 3.5610 |
+ | 3.4115 | 24.09 | 1176000 | 3.5757 |
+ | 3.4173 | 24.26 | 1184000 | 3.5541 |
+ | 3.4173 | 24.42 | 1192000 | 3.5777 |
+ | 3.4169 | 24.59 | 1200000 | 3.5638 |
+ | 3.4169 | 24.75 | 1208000 | 3.5463 |
+ | 3.4031 | 24.91 | 1216000 | 3.5300 |
+ | 3.4031 | 25.08 | 1224000 | 3.5584 |
+ | 3.4094 | 25.24 | 1232000 | 3.5682 |
+ | 3.4094 | 25.41 | 1240000 | 3.5558 |
+ | 3.4116 | 25.57 | 1248000 | 3.5629 |
+ | 3.4116 | 25.73 | 1256000 | 3.5490 |
+ | 3.4199 | 25.9 | 1264000 | 3.5679 |
+ | 3.4199 | 26.06 | 1272000 | 3.5885 |
+ | 3.412 | 26.22 | 1280000 | 3.5579 |
+ | 3.412 | 26.39 | 1288000 | 3.5465 |
+ | 3.4123 | 26.55 | 1296000 | 3.5726 |
+ | 3.4123 | 26.72 | 1304000 | 3.5775 |
+ | 3.4132 | 26.88 | 1312000 | 3.5478 |
+ | 3.4132 | 27.04 | 1320000 | 3.5589 |
+ | 3.4161 | 27.21 | 1328000 | 3.5662 |
+ | 3.4161 | 27.37 | 1336000 | 3.5895 |
+ | 3.4097 | 27.54 | 1344000 | 3.5941 |
+ | 3.4097 | 27.7 | 1352000 | 3.5912 |
+ | 3.415 | 27.86 | 1360000 | 3.5658 |
+ | 3.415 | 28.03 | 1368000 | 3.5554 |
+ | 3.4193 | 28.19 | 1376000 | 3.5899 |
+ | 3.4193 | 28.36 | 1384000 | 3.5652 |
+ | 3.4136 | 28.52 | 1392000 | 3.5832 |
+ | 3.4136 | 28.68 | 1400000 | 3.5885 |
+ | 3.4294 | 28.85 | 1408000 | 3.5832 |
+ | 3.4294 | 29.01 | 1416000 | 3.6025 |
+ | 3.4243 | 29.17 | 1424000 | 3.6040 |
+ | 3.4243 | 29.34 | 1432000 | 3.5890 |
+ | 3.4427 | 29.5 | 1440000 | 3.5835 |
+ | 3.4427 | 29.67 | 1448000 | 3.6185 |
+ | 3.4293 | 29.83 | 1456000 | 3.6029 |
+ | 3.4293 | 29.99 | 1464000 | 3.6162 |
+ | 3.4363 | 30.16 | 1472000 | 3.6258 |
+ | 3.4363 | 30.32 | 1480000 | 3.6038 |
+ | 3.4532 | 30.49 | 1488000 | 3.6039 |
+ | 3.4532 | 30.65 | 1496000 | 3.6054 |
+ | 3.4401 | 30.81 | 1504000 | 3.6269 |
+ | 3.4401 | 30.98 | 1512000 | 3.6004 |
+ | 3.4491 | 31.14 | 1520000 | 3.6096 |
+ | 3.4491 | 31.31 | 1528000 | 3.6217 |
+ | 3.4438 | 31.47 | 1536000 | 3.6081 |
+ | 3.4438 | 31.63 | 1544000 | 3.6190 |
+ | 3.4337 | 31.8 | 1552000 | 3.6120 |
+ | 3.4337 | 31.96 | 1560000 | 3.5861 |
+ | 3.4475 | 32.13 | 1568000 | 3.6209 |
+ | 3.4475 | 32.29 | 1576000 | 3.6302 |
+ | 3.4406 | 32.45 | 1584000 | 3.6053 |
+ | 3.4406 | 32.62 | 1592000 | 3.5934 |
+ | 3.4392 | 32.78 | 1600000 | 3.5942 |
+ | 3.4392 | 32.94 | 1608000 | 3.6013 |
+ | 3.4514 | 33.11 | 1616000 | 3.6506 |
+ | 3.4514 | 33.27 | 1624000 | 3.6049 |
+ | 3.4406 | 33.44 | 1632000 | 3.6285 |
+ | 3.4406 | 33.6 | 1640000 | 3.6107 |
+ | 3.4522 | 33.76 | 1648000 | 3.6081 |
+ | 3.4522 | 33.93 | 1656000 | 3.6121 |
+ | 3.4592 | 34.09 | 1664000 | 3.6396 |
+ | 3.4592 | 34.26 | 1672000 | 3.6284 |
+ | 3.4587 | 34.42 | 1680000 | 3.6195 |
+ | 3.4587 | 34.58 | 1688000 | 3.6168 |
+ | 3.4589 | 34.75 | 1696000 | 3.6315 |
+ | 3.4589 | 34.91 | 1704000 | 3.6045 |
+ | 3.4703 | 35.08 | 1712000 | 3.6251 |
+ | 3.4703 | 35.24 | 1720000 | 3.6252 |
+ | 3.4565 | 35.4 | 1728000 | 3.6254 |
+ | 3.4565 | 35.57 | 1736000 | 3.6544 |
+ | 3.4634 | 35.73 | 1744000 | 3.6290 |
+ | 3.4634 | 35.9 | 1752000 | 3.6124 |
+ | 3.4625 | 36.06 | 1760000 | 3.6262 |
+ | 3.4625 | 36.22 | 1768000 | 3.6318 |
+ | 3.457 | 36.39 | 1776000 | 3.6408 |
+ | 3.457 | 36.55 | 1784000 | 3.6433 |
+ | 3.4618 | 36.71 | 1792000 | 3.6276 |
+ | 3.4618 | 36.88 | 1800000 | 3.6314 |
+ | 3.4611 | 37.04 | 1808000 | 3.6416 |
+ | 3.4611 | 37.21 | 1816000 | 3.6658 |
+ | 3.4651 | 37.37 | 1824000 | 3.6382 |
+ | 3.4651 | 37.53 | 1832000 | 3.6562 |
+ | 3.4625 | 37.7 | 1840000 | 3.6376 |
+ | 3.4625 | 37.86 | 1848000 | 3.6520 |
+ | 3.4561 | 38.03 | 1856000 | 3.6301 |
+ | 3.4561 | 38.19 | 1864000 | 3.6195 |
+ | 3.4655 | 38.35 | 1872000 | 3.6279 |
+ | 3.4655 | 38.52 | 1880000 | 3.6365 |
+ | 3.4637 | 38.68 | 1888000 | 3.6386 |
+ | 3.4637 | 38.85 | 1896000 | 3.6434 |
+ | 3.458 | 39.01 | 1904000 | 3.6519 |
+ | 3.458 | 39.17 | 1912000 | 3.6438 |
+ | 3.4523 | 39.34 | 1920000 | 3.6408 |
+ | 3.4523 | 39.5 | 1928000 | 3.6513 |
+ | 3.4743 | 39.66 | 1936000 | 3.6178 |
+ | 3.4743 | 39.83 | 1944000 | 3.6399 |
+ | 3.4626 | 39.99 | 1952000 | 3.6243 |
+ | 3.4626 | 40.16 | 1960000 | 3.6326 |
+ | 3.4692 | 40.32 | 1968000 | 3.6723 |
+ | 3.4692 | 40.48 | 1976000 | 3.6456 |
+ | 3.4765 | 40.65 | 1984000 | 3.6437 |
+ | 3.4765 | 40.81 | 1992000 | 3.6477 |
+ | 3.4747 | 40.98 | 2000000 | 3.6384 |
+ | 3.4747 | 41.14 | 2008000 | 3.6370 |
+ | 3.4683 | 41.3 | 2016000 | 3.6625 |
+ | 3.4683 | 41.47 | 2024000 | 3.6453 |
+ | 3.4599 | 41.63 | 2032000 | 3.6489 |
+ | 3.4599 | 41.8 | 2040000 | 3.6311 |
+ | 3.4713 | 41.96 | 2048000 | 3.6192 |
+ | 3.4713 | 42.12 | 2056000 | 3.6511 |
+ | 3.4677 | 42.29 | 2064000 | 3.6426 |
+ | 3.4677 | 42.45 | 2072000 | 3.6363 |
+ | 3.4689 | 42.62 | 2080000 | 3.6378 |
+ | 3.4689 | 42.78 | 2088000 | 3.6450 |
+ | 3.4598 | 42.94 | 2096000 | 3.6481 |
+ | 3.4598 | 43.11 | 2104000 | 3.6675 |
+ | 3.4487 | 43.27 | 2112000 | 3.6558 |
+ | 3.4487 | 43.43 | 2120000 | 3.6451 |
+ | 3.4555 | 43.6 | 2128000 | 3.6431 |
+ | 3.4555 | 43.76 | 2136000 | 3.6470 |
+ | 3.4727 | 43.93 | 2144000 | 3.6265 |
+ | 3.4727 | 44.09 | 2152000 | 3.6335 |
+ | 3.4626 | 44.25 | 2160000 | 3.6396 |
+ | 3.4626 | 44.42 | 2168000 | 3.6537 |
+ | 3.4724 | 44.58 | 2176000 | 3.6168 |
+ | 3.4724 | 44.75 | 2184000 | 3.6444 |
+ | 3.4545 | 44.91 | 2192000 | 3.6440 |
+ | 3.4545 | 45.07 | 2200000 | 3.6327 |
+ | 3.461 | 45.24 | 2208000 | 3.6363 |
+ | 3.461 | 45.4 | 2216000 | 3.6537 |
+ | 3.4702 | 45.57 | 2224000 | 3.6123 |
+ | 3.4702 | 45.73 | 2232000 | 3.6554 |
+ | 3.4565 | 45.89 | 2240000 | 3.6523 |
+ | 3.4565 | 46.06 | 2248000 | 3.6340 |
+ | 3.4517 | 46.22 | 2256000 | 3.6459 |
+ | 3.4517 | 46.38 | 2264000 | 3.6561 |
+ | 3.4631 | 46.55 | 2272000 | 3.6548 |
+ | 3.4631 | 46.71 | 2280000 | 3.6229 |
+ | 3.4518 | 46.88 | 2288000 | 3.6350 |
+ | 3.4518 | 47.04 | 2296000 | 3.6483 |
+ | 3.4592 | 47.2 | 2304000 | 3.6263 |
+ | 3.4592 | 47.37 | 2312000 | 3.6339 |
+ | 3.4569 | 47.53 | 2320000 | 3.6594 |
+ | 3.4569 | 47.7 | 2328000 | 3.6385 |
+ | 3.4524 | 47.86 | 2336000 | 3.6434 |
+ | 3.4524 | 48.02 | 2344000 | 3.6502 |
+ | 3.4644 | 48.19 | 2352000 | 3.6176 |
+ | 3.4644 | 48.35 | 2360000 | 3.6293 |
+ | 3.4586 | 48.52 | 2368000 | 3.6304 |
+ | 3.4586 | 48.68 | 2376000 | 3.6343 |
+ | 3.4439 | 48.84 | 2384000 | 3.6090 |
+ | 3.4439 | 49.01 | 2392000 | 3.6414 |
+ | 3.4474 | 49.17 | 2400000 | 3.6208 |
+
+
+ ### Framework versions
+
+ - Transformers 4.35.0.dev0
+ - Pytorch 2.0.1+cu117
+ - Datasets 2.14.5
+ - Tokenizers 0.14.0
added_tokens.json ADDED
@@ -0,0 +1,7 @@
+ {
+   "</s>": 2,
+   "<mask>": 50264,
+   "<pad>": 1,
+   "<s>": 0,
+   "<unk>": 3
+ }
all_results.json ADDED
@@ -0,0 +1,14 @@
+ {
+   "epoch": 49.17,
+   "eval_loss": 3.2573602199554443,
+   "eval_runtime": 41.9844,
+   "eval_samples": 41102,
+   "eval_samples_per_second": 978.983,
+   "eval_steps_per_second": 61.189,
+   "perplexity": 25.980862776689275,
+   "train_loss": 3.376089767252604,
+   "train_runtime": 158003.2062,
+   "train_samples": 780934,
+   "train_samples_per_second": 243.033,
+   "train_steps_per_second": 15.19
+ }
config.json ADDED
@@ -0,0 +1,28 @@
+ {
+   "_name_or_path": "cardiffnlp/twitter-roberta-base-2019-90m",
+   "architectures": [
+     "RobertaForMaskedLM"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "bos_token_id": 0,
+   "classifier_dropout": null,
+   "eos_token_id": 2,
+   "gradient_checkpointing": false,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "layer_norm_eps": 1e-05,
+   "max_position_embeddings": 514,
+   "model_type": "roberta",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 1,
+   "position_embedding_type": "absolute",
+   "torch_dtype": "float32",
+   "transformers_version": "4.35.0.dev0",
+   "type_vocab_size": 1,
+   "use_cache": true,
+   "vocab_size": 50265
+ }
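As a rough sanity check, the sketch below instantiates the architecture this config describes and counts its parameters; a roberta-base-sized masked LM (roughly 125M parameters) stored in float32 is consistent with the ~499 MB `pytorch_model.bin` added in this commit. The values are copied from the config above; only the check itself is new.

```python
from transformers import RobertaConfig, RobertaForMaskedLM

# Values copied from config.json above.
config = RobertaConfig(
    vocab_size=50265,
    hidden_size=768,
    num_hidden_layers=12,
    num_attention_heads=12,
    intermediate_size=3072,
    max_position_embeddings=514,
    type_vocab_size=1,
    layer_norm_eps=1e-5,
    pad_token_id=1,
    bos_token_id=0,
    eos_token_id=2,
)
model = RobertaForMaskedLM(config)

n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters, ~{n_params * 4 / 1e6:.0f} MB in float32")
```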
eval_results.json ADDED
@@ -0,0 +1,9 @@
+ {
+   "epoch": 49.17,
+   "eval_loss": 3.2573602199554443,
+   "eval_runtime": 41.9844,
+   "eval_samples": 41102,
+   "eval_samples_per_second": 978.983,
+   "eval_steps_per_second": 61.189,
+   "perplexity": 25.980862776689275
+ }
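The `perplexity` field is simply the exponential of `eval_loss`, which is easy to verify from the numbers above:

```python
import math

eval_loss = 3.2573602199554443   # from eval_results.json
print(math.exp(eval_loss))       # ≈ 25.98, matching the reported "perplexity"
```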
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f0a5a9280e3c9e6a4e56f6d367b5e1ef6f66b6ce35c7a0262a73d3638110800f
+ size 498859189
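This entry is a Git LFS pointer rather than the weights themselves: it records the SHA-256 digest and byte size of the real `pytorch_model.bin`. A downloaded copy can be checked against the pointer roughly as below; the repo id is again an assumption.

```python
import hashlib
from huggingface_hub import hf_hub_download

# Hypothetical repo id; the expected digest comes from the LFS pointer above.
path = hf_hub_download("DouglasPontes/2020-Q1-90p-filtered", "pytorch_model.bin")

sha256 = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        sha256.update(chunk)

expected = "f0a5a9280e3c9e6a4e56f6d367b5e1ef6f66b6ce35c7a0262a73d3638110800f"
print(sha256.hexdigest() == expected)
```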
special_tokens_map.json ADDED
@@ -0,0 +1,9 @@
+ {
+   "bos_token": "<s>",
+   "cls_token": "<s>",
+   "eos_token": "</s>",
+   "mask_token": "<mask>",
+   "pad_token": "<pad>",
+   "sep_token": "</s>",
+   "unk_token": "<unk>"
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,62 @@
+ {
+   "add_prefix_space": false,
+   "added_tokens_decoder": {
+     "0": {
+       "content": "<s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "1": {
+       "content": "<pad>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "2": {
+       "content": "</s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "3": {
+       "content": "<unk>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "50264": {
+       "content": "<mask>",
+       "lstrip": true,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "additional_special_tokens": [],
+   "bos_token": "<s>",
+   "clean_up_tokenization_spaces": true,
+   "cls_token": "<s>",
+   "eos_token": "</s>",
+   "errors": "replace",
+   "mask_token": "<mask>",
+   "max_length": 512,
+   "model_max_length": 512,
+   "pad_token": "<pad>",
+   "sep_token": "</s>",
+   "stride": 0,
+   "tokenizer_class": "RobertaTokenizer",
+   "trim_offsets": true,
+   "truncation_side": "right",
+   "truncation_strategy": "longest_first",
+   "unk_token": "<unk>"
+ }
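Together with `added_tokens.json` and `special_tokens_map.json`, this file describes a standard RoBERTa byte-level BPE tokenizer. A small sketch to confirm the special-token ids recorded above (repo id assumed as before):

```python
from transformers import AutoTokenizer

# Hypothetical repo id; expected ids come from added_tokens.json in this commit.
tok = AutoTokenizer.from_pretrained("DouglasPontes/2020-Q1-90p-filtered")

assert tok.bos_token_id == 0 and tok.pad_token_id == 1
assert tok.eos_token_id == 2 and tok.unk_token_id == 3
assert tok.mask_token_id == 50264
print(tok.model_max_length)  # 512, per tokenizer_config.json
```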
train_results.json ADDED
@@ -0,0 +1,8 @@
+ {
+   "epoch": 49.17,
+   "train_loss": 3.376089767252604,
+   "train_runtime": 158003.2062,
+   "train_samples": 780934,
+   "train_samples_per_second": 243.033,
+   "train_steps_per_second": 15.19
+ }
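These throughput figures are internally consistent: steps per second times the training runtime recovers the 2,400,000 optimizer steps, and multiplying by the batch size of 16 recovers the samples-per-second rate. A quick check:

```python
train_runtime = 158003.2062        # seconds, from train_results.json
print(15.19 * train_runtime)       # ≈ 2.40e6 steps (training_steps: 2400000)
print(15.19 * 16)                  # ≈ 243 samples/s (train_samples_per_second: 243.033)
```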
trainer_state.json ADDED
@@ -0,0 +1,3328 @@
1
+ {
2
+ "best_metric": 3.23770809173584,
3
+ "best_model_checkpoint": "./model_tweets_2020_Q1_90/checkpoint-128000",
4
+ "epoch": 49.171259398881354,
5
+ "eval_steps": 8000,
6
+ "global_step": 2400000,
7
+ "is_hyper_param_search": false,
8
+ "is_local_process_zero": true,
9
+ "is_world_process_zero": true,
10
+ "log_history": [
11
+ {
12
+ "epoch": 0.16,
13
+ "eval_loss": 3.4494731426239014,
14
+ "eval_runtime": 46.3964,
15
+ "eval_samples_per_second": 885.888,
16
+ "eval_steps_per_second": 55.371,
17
+ "step": 8000
18
+ },
19
+ {
20
+ "epoch": 0.33,
21
+ "learning_rate": 9.939131159843243e-06,
22
+ "loss": 3.5684,
23
+ "step": 16000
24
+ },
25
+ {
26
+ "epoch": 0.33,
27
+ "eval_loss": 3.416630744934082,
28
+ "eval_runtime": 46.3565,
29
+ "eval_samples_per_second": 886.65,
30
+ "eval_steps_per_second": 55.418,
31
+ "step": 16000
32
+ },
33
+ {
34
+ "epoch": 0.49,
35
+ "eval_loss": 3.3847219944000244,
36
+ "eval_runtime": 47.2297,
37
+ "eval_samples_per_second": 870.258,
38
+ "eval_steps_per_second": 54.394,
39
+ "step": 24000
40
+ },
41
+ {
42
+ "epoch": 0.66,
43
+ "learning_rate": 9.872425581589261e-06,
44
+ "loss": 3.3755,
45
+ "step": 32000
46
+ },
47
+ {
48
+ "epoch": 0.66,
49
+ "eval_loss": 3.3664660453796387,
50
+ "eval_runtime": 47.0495,
51
+ "eval_samples_per_second": 873.591,
52
+ "eval_steps_per_second": 54.602,
53
+ "step": 32000
54
+ },
55
+ {
56
+ "epoch": 0.82,
57
+ "eval_loss": 3.3654134273529053,
58
+ "eval_runtime": 46.3932,
59
+ "eval_samples_per_second": 885.949,
60
+ "eval_steps_per_second": 55.374,
61
+ "step": 40000
62
+ },
63
+ {
64
+ "epoch": 0.98,
65
+ "learning_rate": 9.80572000333528e-06,
66
+ "loss": 3.3533,
67
+ "step": 48000
68
+ },
69
+ {
70
+ "epoch": 0.98,
71
+ "eval_loss": 3.3654167652130127,
72
+ "eval_runtime": 46.5322,
73
+ "eval_samples_per_second": 883.301,
74
+ "eval_steps_per_second": 55.209,
75
+ "step": 48000
76
+ },
77
+ {
78
+ "epoch": 1.15,
79
+ "eval_loss": 3.332759380340576,
80
+ "eval_runtime": 46.4492,
81
+ "eval_samples_per_second": 884.88,
82
+ "eval_steps_per_second": 55.308,
83
+ "step": 56000
84
+ },
85
+ {
86
+ "epoch": 1.31,
87
+ "learning_rate": 9.739014425081299e-06,
88
+ "loss": 3.3014,
89
+ "step": 64000
90
+ },
91
+ {
92
+ "epoch": 1.31,
93
+ "eval_loss": 3.3209590911865234,
94
+ "eval_runtime": 45.8973,
95
+ "eval_samples_per_second": 895.521,
96
+ "eval_steps_per_second": 55.973,
97
+ "step": 64000
98
+ },
99
+ {
100
+ "epoch": 1.48,
101
+ "eval_loss": 3.3491690158843994,
102
+ "eval_runtime": 46.3252,
103
+ "eval_samples_per_second": 887.249,
104
+ "eval_steps_per_second": 55.456,
105
+ "step": 72000
106
+ },
107
+ {
108
+ "epoch": 1.64,
109
+ "learning_rate": 9.672308846827316e-06,
110
+ "loss": 3.2888,
111
+ "step": 80000
112
+ },
113
+ {
114
+ "epoch": 1.64,
115
+ "eval_loss": 3.3213465213775635,
116
+ "eval_runtime": 45.915,
117
+ "eval_samples_per_second": 895.177,
118
+ "eval_steps_per_second": 55.951,
119
+ "step": 80000
120
+ },
121
+ {
122
+ "epoch": 1.8,
123
+ "eval_loss": 3.2708065509796143,
124
+ "eval_runtime": 45.9723,
125
+ "eval_samples_per_second": 894.061,
126
+ "eval_steps_per_second": 55.882,
127
+ "step": 88000
128
+ },
129
+ {
130
+ "epoch": 1.97,
131
+ "learning_rate": 9.605603268573334e-06,
132
+ "loss": 3.2609,
133
+ "step": 96000
134
+ },
135
+ {
136
+ "epoch": 1.97,
137
+ "eval_loss": 3.290764808654785,
138
+ "eval_runtime": 46.6916,
139
+ "eval_samples_per_second": 880.287,
140
+ "eval_steps_per_second": 55.021,
141
+ "step": 96000
142
+ },
143
+ {
144
+ "epoch": 2.13,
145
+ "eval_loss": 3.2766778469085693,
146
+ "eval_runtime": 45.6527,
147
+ "eval_samples_per_second": 900.318,
148
+ "eval_steps_per_second": 56.273,
149
+ "step": 104000
150
+ },
151
+ {
152
+ "epoch": 2.29,
153
+ "learning_rate": 9.538897690319354e-06,
154
+ "loss": 3.2159,
155
+ "step": 112000
156
+ },
157
+ {
158
+ "epoch": 2.29,
159
+ "eval_loss": 3.259241819381714,
160
+ "eval_runtime": 45.9077,
161
+ "eval_samples_per_second": 895.319,
162
+ "eval_steps_per_second": 55.96,
163
+ "step": 112000
164
+ },
165
+ {
166
+ "epoch": 2.46,
167
+ "eval_loss": 3.2411258220672607,
168
+ "eval_runtime": 46.8974,
169
+ "eval_samples_per_second": 876.424,
170
+ "eval_steps_per_second": 54.779,
171
+ "step": 120000
172
+ },
173
+ {
174
+ "epoch": 2.62,
175
+ "learning_rate": 9.472192112065373e-06,
176
+ "loss": 3.2167,
177
+ "step": 128000
178
+ },
179
+ {
180
+ "epoch": 2.62,
181
+ "eval_loss": 3.23770809173584,
182
+ "eval_runtime": 46.0285,
183
+ "eval_samples_per_second": 892.969,
184
+ "eval_steps_per_second": 55.813,
185
+ "step": 128000
186
+ },
187
+ {
188
+ "epoch": 2.79,
189
+ "eval_loss": 3.2485291957855225,
190
+ "eval_runtime": 46.313,
191
+ "eval_samples_per_second": 887.483,
192
+ "eval_steps_per_second": 55.47,
193
+ "step": 136000
194
+ },
195
+ {
196
+ "epoch": 2.95,
197
+ "learning_rate": 9.405486533811392e-06,
198
+ "loss": 3.199,
199
+ "step": 144000
200
+ },
201
+ {
202
+ "epoch": 2.95,
203
+ "eval_loss": 3.2608513832092285,
204
+ "eval_runtime": 46.3737,
205
+ "eval_samples_per_second": 886.322,
206
+ "eval_steps_per_second": 55.398,
207
+ "step": 144000
208
+ },
209
+ {
210
+ "epoch": 3.11,
211
+ "eval_loss": 3.2552711963653564,
212
+ "eval_runtime": 45.7073,
213
+ "eval_samples_per_second": 899.243,
214
+ "eval_steps_per_second": 56.205,
215
+ "step": 152000
216
+ },
217
+ {
218
+ "epoch": 3.28,
219
+ "learning_rate": 9.338780955557409e-06,
220
+ "loss": 3.1905,
221
+ "step": 160000
222
+ },
223
+ {
224
+ "epoch": 3.28,
225
+ "eval_loss": 3.2425193786621094,
226
+ "eval_runtime": 46.3189,
227
+ "eval_samples_per_second": 887.37,
228
+ "eval_steps_per_second": 55.463,
229
+ "step": 160000
230
+ },
231
+ {
232
+ "epoch": 3.44,
233
+ "eval_loss": 3.2421696186065674,
234
+ "eval_runtime": 46.3489,
235
+ "eval_samples_per_second": 886.796,
236
+ "eval_steps_per_second": 55.427,
237
+ "step": 168000
238
+ },
239
+ {
240
+ "epoch": 3.61,
241
+ "learning_rate": 9.272075377303427e-06,
242
+ "loss": 3.1822,
243
+ "step": 176000
244
+ },
245
+ {
246
+ "epoch": 3.61,
247
+ "eval_loss": 3.262392997741699,
248
+ "eval_runtime": 46.6763,
249
+ "eval_samples_per_second": 880.575,
250
+ "eval_steps_per_second": 55.039,
251
+ "step": 176000
252
+ },
253
+ {
254
+ "epoch": 3.77,
255
+ "eval_loss": 3.2507119178771973,
256
+ "eval_runtime": 46.8277,
257
+ "eval_samples_per_second": 877.728,
258
+ "eval_steps_per_second": 54.861,
259
+ "step": 184000
260
+ },
261
+ {
262
+ "epoch": 3.93,
263
+ "learning_rate": 9.205369799049446e-06,
264
+ "loss": 3.1852,
265
+ "step": 192000
266
+ },
267
+ {
268
+ "epoch": 3.93,
269
+ "eval_loss": 3.2483315467834473,
270
+ "eval_runtime": 45.7607,
271
+ "eval_samples_per_second": 898.195,
272
+ "eval_steps_per_second": 56.14,
273
+ "step": 192000
274
+ },
275
+ {
276
+ "epoch": 4.1,
277
+ "eval_loss": 3.251424789428711,
278
+ "eval_runtime": 46.3642,
279
+ "eval_samples_per_second": 886.503,
280
+ "eval_steps_per_second": 55.409,
281
+ "step": 200000
282
+ },
283
+ {
284
+ "epoch": 4.26,
285
+ "learning_rate": 9.138664220795464e-06,
286
+ "loss": 3.1767,
287
+ "step": 208000
288
+ },
289
+ {
290
+ "epoch": 4.26,
291
+ "eval_loss": 3.242562770843506,
292
+ "eval_runtime": 46.886,
293
+ "eval_samples_per_second": 876.637,
294
+ "eval_steps_per_second": 54.792,
295
+ "step": 208000
296
+ },
297
+ {
298
+ "epoch": 4.43,
299
+ "eval_loss": 3.234778642654419,
300
+ "eval_runtime": 46.4949,
301
+ "eval_samples_per_second": 884.01,
302
+ "eval_steps_per_second": 55.253,
303
+ "step": 216000
304
+ },
305
+ {
306
+ "epoch": 4.59,
307
+ "learning_rate": 9.071958642541483e-06,
308
+ "loss": 3.1767,
309
+ "step": 224000
310
+ },
311
+ {
312
+ "epoch": 4.59,
313
+ "eval_loss": 3.2734625339508057,
314
+ "eval_runtime": 46.0486,
315
+ "eval_samples_per_second": 892.58,
316
+ "eval_steps_per_second": 55.789,
317
+ "step": 224000
318
+ },
319
+ {
320
+ "epoch": 4.75,
321
+ "eval_loss": 3.2471694946289062,
322
+ "eval_runtime": 46.5054,
323
+ "eval_samples_per_second": 883.811,
324
+ "eval_steps_per_second": 55.241,
325
+ "step": 232000
326
+ },
327
+ {
328
+ "epoch": 4.92,
329
+ "learning_rate": 9.005253064287502e-06,
330
+ "loss": 3.1973,
331
+ "step": 240000
332
+ },
333
+ {
334
+ "epoch": 4.92,
335
+ "eval_loss": 3.259644031524658,
336
+ "eval_runtime": 45.8405,
337
+ "eval_samples_per_second": 896.631,
338
+ "eval_steps_per_second": 56.042,
339
+ "step": 240000
340
+ },
341
+ {
342
+ "epoch": 5.08,
343
+ "eval_loss": 3.2605602741241455,
344
+ "eval_runtime": 45.4485,
345
+ "eval_samples_per_second": 904.365,
346
+ "eval_steps_per_second": 56.526,
347
+ "step": 248000
348
+ },
349
+ {
350
+ "epoch": 5.24,
351
+ "learning_rate": 8.93854748603352e-06,
352
+ "loss": 3.1781,
353
+ "step": 256000
354
+ },
355
+ {
356
+ "epoch": 5.24,
357
+ "eval_loss": 3.281527519226074,
358
+ "eval_runtime": 46.3336,
359
+ "eval_samples_per_second": 887.089,
360
+ "eval_steps_per_second": 55.446,
361
+ "step": 256000
362
+ },
363
+ {
364
+ "epoch": 5.41,
365
+ "eval_loss": 3.273421049118042,
366
+ "eval_runtime": 45.4558,
367
+ "eval_samples_per_second": 904.218,
368
+ "eval_steps_per_second": 56.516,
369
+ "step": 264000
370
+ },
371
+ {
372
+ "epoch": 5.57,
373
+ "learning_rate": 8.871841907779539e-06,
374
+ "loss": 3.1803,
375
+ "step": 272000
376
+ },
377
+ {
378
+ "epoch": 5.57,
379
+ "eval_loss": 3.2739477157592773,
380
+ "eval_runtime": 45.6455,
381
+ "eval_samples_per_second": 900.462,
382
+ "eval_steps_per_second": 56.282,
383
+ "step": 272000
384
+ },
385
+ {
386
+ "epoch": 5.74,
387
+ "eval_loss": 3.2712481021881104,
388
+ "eval_runtime": 46.5973,
389
+ "eval_samples_per_second": 882.068,
390
+ "eval_steps_per_second": 55.132,
391
+ "step": 280000
392
+ },
393
+ {
394
+ "epoch": 5.9,
395
+ "learning_rate": 8.805136329525557e-06,
396
+ "loss": 3.1989,
397
+ "step": 288000
398
+ },
399
+ {
400
+ "epoch": 5.9,
401
+ "eval_loss": 3.273439884185791,
402
+ "eval_runtime": 46.318,
403
+ "eval_samples_per_second": 887.387,
404
+ "eval_steps_per_second": 55.464,
405
+ "step": 288000
406
+ },
407
+ {
408
+ "epoch": 6.06,
409
+ "eval_loss": 3.293893814086914,
410
+ "eval_runtime": 45.8003,
411
+ "eval_samples_per_second": 897.418,
412
+ "eval_steps_per_second": 56.091,
413
+ "step": 296000
414
+ },
415
+ {
416
+ "epoch": 6.23,
417
+ "learning_rate": 8.738430751271576e-06,
418
+ "loss": 3.1929,
419
+ "step": 304000
420
+ },
421
+ {
422
+ "epoch": 6.23,
423
+ "eval_loss": 3.288043737411499,
424
+ "eval_runtime": 46.6462,
425
+ "eval_samples_per_second": 881.144,
426
+ "eval_steps_per_second": 55.074,
427
+ "step": 304000
428
+ },
429
+ {
430
+ "epoch": 6.39,
431
+ "eval_loss": 3.289358139038086,
432
+ "eval_runtime": 45.8512,
433
+ "eval_samples_per_second": 896.422,
434
+ "eval_steps_per_second": 56.029,
435
+ "step": 312000
436
+ },
437
+ {
438
+ "epoch": 6.56,
439
+ "learning_rate": 8.671725173017595e-06,
440
+ "loss": 3.2083,
441
+ "step": 320000
442
+ },
443
+ {
444
+ "epoch": 6.56,
445
+ "eval_loss": 3.308645725250244,
446
+ "eval_runtime": 46.2317,
447
+ "eval_samples_per_second": 889.043,
448
+ "eval_steps_per_second": 55.568,
449
+ "step": 320000
450
+ },
451
+ {
452
+ "epoch": 6.72,
453
+ "eval_loss": 3.3066623210906982,
454
+ "eval_runtime": 46.8669,
455
+ "eval_samples_per_second": 876.995,
456
+ "eval_steps_per_second": 54.815,
457
+ "step": 328000
458
+ },
459
+ {
460
+ "epoch": 6.88,
461
+ "learning_rate": 8.605019594763613e-06,
462
+ "loss": 3.2013,
463
+ "step": 336000
464
+ },
465
+ {
466
+ "epoch": 6.88,
467
+ "eval_loss": 3.278655529022217,
468
+ "eval_runtime": 45.904,
469
+ "eval_samples_per_second": 895.391,
470
+ "eval_steps_per_second": 55.965,
471
+ "step": 336000
472
+ },
473
+ {
474
+ "epoch": 7.05,
475
+ "eval_loss": 3.3152964115142822,
476
+ "eval_runtime": 46.5312,
477
+ "eval_samples_per_second": 883.322,
478
+ "eval_steps_per_second": 55.21,
479
+ "step": 344000
480
+ },
481
+ {
482
+ "epoch": 7.21,
483
+ "learning_rate": 8.538314016509632e-06,
484
+ "loss": 3.2111,
485
+ "step": 352000
486
+ },
487
+ {
488
+ "epoch": 7.21,
489
+ "eval_loss": 3.3246278762817383,
490
+ "eval_runtime": 46.7247,
491
+ "eval_samples_per_second": 879.664,
492
+ "eval_steps_per_second": 54.982,
493
+ "step": 352000
494
+ },
495
+ {
496
+ "epoch": 7.38,
497
+ "eval_loss": 3.3322579860687256,
498
+ "eval_runtime": 45.9989,
499
+ "eval_samples_per_second": 893.543,
500
+ "eval_steps_per_second": 55.849,
501
+ "step": 360000
502
+ },
503
+ {
504
+ "epoch": 7.54,
505
+ "learning_rate": 8.471608438255649e-06,
506
+ "loss": 3.2186,
507
+ "step": 368000
508
+ },
509
+ {
510
+ "epoch": 7.54,
511
+ "eval_loss": 3.2938337326049805,
512
+ "eval_runtime": 46.144,
513
+ "eval_samples_per_second": 890.734,
514
+ "eval_steps_per_second": 55.674,
515
+ "step": 368000
516
+ },
517
+ {
518
+ "epoch": 7.7,
519
+ "eval_loss": 3.3499817848205566,
520
+ "eval_runtime": 45.582,
521
+ "eval_samples_per_second": 901.717,
522
+ "eval_steps_per_second": 56.36,
523
+ "step": 376000
524
+ },
525
+ {
526
+ "epoch": 7.87,
527
+ "learning_rate": 8.404902860001667e-06,
528
+ "loss": 3.2268,
529
+ "step": 384000
530
+ },
531
+ {
532
+ "epoch": 7.87,
533
+ "eval_loss": 3.3179759979248047,
534
+ "eval_runtime": 45.2091,
535
+ "eval_samples_per_second": 909.153,
536
+ "eval_steps_per_second": 56.825,
537
+ "step": 384000
538
+ },
539
+ {
540
+ "epoch": 8.03,
541
+ "eval_loss": 3.3171069622039795,
542
+ "eval_runtime": 46.0196,
543
+ "eval_samples_per_second": 893.141,
544
+ "eval_steps_per_second": 55.824,
545
+ "step": 392000
546
+ },
547
+ {
548
+ "epoch": 8.2,
549
+ "learning_rate": 8.338197281747686e-06,
550
+ "loss": 3.233,
551
+ "step": 400000
552
+ },
553
+ {
554
+ "epoch": 8.2,
555
+ "eval_loss": 3.3461642265319824,
556
+ "eval_runtime": 45.5487,
557
+ "eval_samples_per_second": 902.375,
558
+ "eval_steps_per_second": 56.401,
559
+ "step": 400000
560
+ },
561
+ {
562
+ "epoch": 8.36,
563
+ "eval_loss": 3.341256618499756,
564
+ "eval_runtime": 45.4264,
565
+ "eval_samples_per_second": 904.804,
566
+ "eval_steps_per_second": 56.553,
567
+ "step": 408000
568
+ },
569
+ {
570
+ "epoch": 8.52,
571
+ "learning_rate": 8.271491703493705e-06,
572
+ "loss": 3.2432,
573
+ "step": 416000
574
+ },
575
+ {
576
+ "epoch": 8.52,
577
+ "eval_loss": 3.328122615814209,
578
+ "eval_runtime": 45.9787,
579
+ "eval_samples_per_second": 893.936,
580
+ "eval_steps_per_second": 55.874,
581
+ "step": 416000
582
+ },
583
+ {
584
+ "epoch": 8.69,
585
+ "eval_loss": 3.342041492462158,
586
+ "eval_runtime": 45.4274,
587
+ "eval_samples_per_second": 904.784,
588
+ "eval_steps_per_second": 56.552,
589
+ "step": 424000
590
+ },
591
+ {
592
+ "epoch": 8.85,
593
+ "learning_rate": 8.204786125239725e-06,
594
+ "loss": 3.2586,
595
+ "step": 432000
596
+ },
597
+ {
598
+ "epoch": 8.85,
599
+ "eval_loss": 3.3609066009521484,
600
+ "eval_runtime": 45.3913,
601
+ "eval_samples_per_second": 905.504,
602
+ "eval_steps_per_second": 56.597,
603
+ "step": 432000
604
+ },
605
+ {
606
+ "epoch": 9.01,
607
+ "eval_loss": 3.352691173553467,
608
+ "eval_runtime": 46.0515,
609
+ "eval_samples_per_second": 892.522,
610
+ "eval_steps_per_second": 55.785,
611
+ "step": 440000
612
+ },
613
+ {
614
+ "epoch": 9.18,
615
+ "learning_rate": 8.138080546985743e-06,
616
+ "loss": 3.2567,
617
+ "step": 448000
618
+ },
619
+ {
620
+ "epoch": 9.18,
621
+ "eval_loss": 3.359393358230591,
622
+ "eval_runtime": 45.57,
623
+ "eval_samples_per_second": 901.953,
624
+ "eval_steps_per_second": 56.375,
625
+ "step": 448000
626
+ },
627
+ {
628
+ "epoch": 9.34,
629
+ "eval_loss": 3.3497443199157715,
630
+ "eval_runtime": 45.4208,
631
+ "eval_samples_per_second": 904.915,
632
+ "eval_steps_per_second": 56.56,
633
+ "step": 456000
634
+ },
635
+ {
636
+ "epoch": 9.51,
637
+ "learning_rate": 8.07137496873176e-06,
638
+ "loss": 3.2592,
639
+ "step": 464000
640
+ },
641
+ {
642
+ "epoch": 9.51,
643
+ "eval_loss": 3.3606550693511963,
644
+ "eval_runtime": 46.15,
645
+ "eval_samples_per_second": 890.617,
646
+ "eval_steps_per_second": 55.666,
647
+ "step": 464000
648
+ },
649
+ {
650
+ "epoch": 9.67,
651
+ "eval_loss": 3.3839540481567383,
652
+ "eval_runtime": 45.5702,
653
+ "eval_samples_per_second": 901.95,
654
+ "eval_steps_per_second": 56.375,
655
+ "step": 472000
656
+ },
657
+ {
658
+ "epoch": 9.83,
659
+ "learning_rate": 8.004669390477779e-06,
660
+ "loss": 3.2793,
661
+ "step": 480000
662
+ },
663
+ {
664
+ "epoch": 9.83,
665
+ "eval_loss": 3.366785764694214,
666
+ "eval_runtime": 45.749,
667
+ "eval_samples_per_second": 898.424,
668
+ "eval_steps_per_second": 56.154,
669
+ "step": 480000
670
+ },
671
+ {
672
+ "epoch": 10.0,
673
+ "eval_loss": 3.3609416484832764,
674
+ "eval_runtime": 47.1383,
675
+ "eval_samples_per_second": 871.945,
676
+ "eval_steps_per_second": 54.499,
677
+ "step": 488000
678
+ },
679
+ {
680
+ "epoch": 10.16,
681
+ "learning_rate": 7.937963812223798e-06,
682
+ "loss": 3.257,
683
+ "step": 496000
684
+ },
685
+ {
686
+ "epoch": 10.16,
687
+ "eval_loss": 3.368229389190674,
688
+ "eval_runtime": 45.5778,
689
+ "eval_samples_per_second": 901.798,
690
+ "eval_steps_per_second": 56.365,
691
+ "step": 496000
692
+ },
693
+ {
694
+ "epoch": 10.33,
695
+ "eval_loss": 3.4005918502807617,
696
+ "eval_runtime": 46.5843,
697
+ "eval_samples_per_second": 882.314,
698
+ "eval_steps_per_second": 55.147,
699
+ "step": 504000
700
+ },
701
+ {
702
+ "epoch": 10.49,
703
+ "learning_rate": 7.871258233969816e-06,
704
+ "loss": 3.2656,
705
+ "step": 512000
706
+ },
707
+ {
708
+ "epoch": 10.49,
709
+ "eval_loss": 3.358835220336914,
710
+ "eval_runtime": 46.2545,
711
+ "eval_samples_per_second": 888.605,
712
+ "eval_steps_per_second": 55.541,
713
+ "step": 512000
714
+ },
715
+ {
716
+ "epoch": 10.65,
717
+ "eval_loss": 3.379861831665039,
718
+ "eval_runtime": 45.613,
719
+ "eval_samples_per_second": 901.103,
720
+ "eval_steps_per_second": 56.322,
721
+ "step": 520000
722
+ },
723
+ {
724
+ "epoch": 10.82,
725
+ "learning_rate": 7.804552655715835e-06,
726
+ "loss": 3.2727,
727
+ "step": 528000
728
+ },
729
+ {
730
+ "epoch": 10.82,
731
+ "eval_loss": 3.383315086364746,
732
+ "eval_runtime": 46.0041,
733
+ "eval_samples_per_second": 893.442,
734
+ "eval_steps_per_second": 55.843,
735
+ "step": 528000
736
+ },
737
+ {
738
+ "epoch": 10.98,
739
+ "eval_loss": 3.356590747833252,
740
+ "eval_runtime": 45.9202,
741
+ "eval_samples_per_second": 895.074,
742
+ "eval_steps_per_second": 55.945,
743
+ "step": 536000
744
+ },
745
+ {
746
+ "epoch": 11.15,
747
+ "learning_rate": 7.737847077461853e-06,
748
+ "loss": 3.2705,
749
+ "step": 544000
750
+ },
751
+ {
752
+ "epoch": 11.15,
753
+ "eval_loss": 3.3793959617614746,
754
+ "eval_runtime": 45.6075,
755
+ "eval_samples_per_second": 901.211,
756
+ "eval_steps_per_second": 56.328,
757
+ "step": 544000
758
+ },
759
+ {
760
+ "epoch": 11.31,
761
+ "eval_loss": 3.3838233947753906,
762
+ "eval_runtime": 46.1859,
763
+ "eval_samples_per_second": 889.925,
764
+ "eval_steps_per_second": 55.623,
765
+ "step": 552000
766
+ },
767
+ {
768
+ "epoch": 11.47,
769
+ "learning_rate": 7.671141499207872e-06,
770
+ "loss": 3.2676,
771
+ "step": 560000
772
+ },
773
+ {
774
+ "epoch": 11.47,
775
+ "eval_loss": 3.3659656047821045,
776
+ "eval_runtime": 45.7183,
777
+ "eval_samples_per_second": 899.027,
778
+ "eval_steps_per_second": 56.192,
779
+ "step": 560000
780
+ },
781
+ {
782
+ "epoch": 11.64,
783
+ "eval_loss": 3.3937699794769287,
784
+ "eval_runtime": 45.9326,
785
+ "eval_samples_per_second": 894.832,
786
+ "eval_steps_per_second": 55.93,
787
+ "step": 568000
788
+ },
789
+ {
790
+ "epoch": 11.8,
791
+ "learning_rate": 7.604435920953891e-06,
792
+ "loss": 3.258,
793
+ "step": 576000
794
+ },
795
+ {
796
+ "epoch": 11.8,
797
+ "eval_loss": 3.3661420345306396,
798
+ "eval_runtime": 46.4625,
799
+ "eval_samples_per_second": 884.627,
800
+ "eval_steps_per_second": 55.292,
801
+ "step": 576000
802
+ },
803
+ {
804
+ "epoch": 11.97,
805
+ "eval_loss": 3.3490447998046875,
806
+ "eval_runtime": 45.8318,
807
+ "eval_samples_per_second": 896.801,
808
+ "eval_steps_per_second": 56.053,
809
+ "step": 584000
810
+ },
811
+ {
812
+ "epoch": 12.13,
813
+ "learning_rate": 7.537730342699909e-06,
814
+ "loss": 3.2646,
815
+ "step": 592000
816
+ },
817
+ {
818
+ "epoch": 12.13,
819
+ "eval_loss": 3.3716230392456055,
820
+ "eval_runtime": 45.6734,
821
+ "eval_samples_per_second": 899.91,
822
+ "eval_steps_per_second": 56.247,
823
+ "step": 592000
824
+ },
825
+ {
826
+ "epoch": 12.29,
827
+ "eval_loss": 3.3877346515655518,
828
+ "eval_runtime": 46.2161,
829
+ "eval_samples_per_second": 889.344,
830
+ "eval_steps_per_second": 55.587,
831
+ "step": 600000
832
+ },
833
+ {
834
+ "epoch": 12.46,
835
+ "learning_rate": 7.471024764445928e-06,
836
+ "loss": 3.2578,
837
+ "step": 608000
838
+ },
839
+ {
840
+ "epoch": 12.46,
841
+ "eval_loss": 3.3930206298828125,
842
+ "eval_runtime": 45.3985,
843
+ "eval_samples_per_second": 905.361,
844
+ "eval_steps_per_second": 56.588,
845
+ "step": 608000
846
+ },
847
+ {
848
+ "epoch": 12.62,
849
+ "eval_loss": 3.392077922821045,
850
+ "eval_runtime": 45.1724,
851
+ "eval_samples_per_second": 909.893,
852
+ "eval_steps_per_second": 56.871,
853
+ "step": 616000
854
+ },
855
+ {
856
+ "epoch": 12.78,
857
+ "learning_rate": 7.4043191861919465e-06,
858
+ "loss": 3.2719,
859
+ "step": 624000
860
+ },
861
+ {
862
+ "epoch": 12.78,
863
+ "eval_loss": 3.395730495452881,
864
+ "eval_runtime": 45.8195,
865
+ "eval_samples_per_second": 897.042,
866
+ "eval_steps_per_second": 56.068,
867
+ "step": 624000
868
+ },
869
+ {
870
+ "epoch": 12.95,
871
+ "eval_loss": 3.4196434020996094,
872
+ "eval_runtime": 45.2614,
873
+ "eval_samples_per_second": 908.103,
874
+ "eval_steps_per_second": 56.759,
875
+ "step": 632000
876
+ },
877
+ {
878
+ "epoch": 13.11,
879
+ "learning_rate": 7.337613607937964e-06,
880
+ "loss": 3.2828,
881
+ "step": 640000
882
+ },
883
+ {
884
+ "epoch": 13.11,
885
+ "eval_loss": 3.4077515602111816,
886
+ "eval_runtime": 45.5674,
887
+ "eval_samples_per_second": 902.004,
888
+ "eval_steps_per_second": 56.378,
889
+ "step": 640000
890
+ },
891
+ {
892
+ "epoch": 13.28,
893
+ "eval_loss": 3.4202864170074463,
894
+ "eval_runtime": 46.3249,
895
+ "eval_samples_per_second": 887.255,
896
+ "eval_steps_per_second": 55.456,
897
+ "step": 648000
898
+ },
899
+ {
900
+ "epoch": 13.44,
901
+ "learning_rate": 7.270908029683983e-06,
902
+ "loss": 3.2805,
903
+ "step": 656000
904
+ },
905
+ {
906
+ "epoch": 13.44,
907
+ "eval_loss": 3.3899548053741455,
908
+ "eval_runtime": 46.1588,
909
+ "eval_samples_per_second": 890.448,
910
+ "eval_steps_per_second": 55.656,
911
+ "step": 656000
912
+ },
913
+ {
914
+ "epoch": 13.6,
915
+ "eval_loss": 3.4037835597991943,
916
+ "eval_runtime": 46.9454,
917
+ "eval_samples_per_second": 875.527,
918
+ "eval_steps_per_second": 54.723,
919
+ "step": 664000
920
+ },
921
+ {
922
+ "epoch": 13.77,
923
+ "learning_rate": 7.2042024514300015e-06,
924
+ "loss": 3.2975,
925
+ "step": 672000
926
+ },
927
+ {
928
+ "epoch": 13.77,
929
+ "eval_loss": 3.405585765838623,
930
+ "eval_runtime": 46.2706,
931
+ "eval_samples_per_second": 888.297,
932
+ "eval_steps_per_second": 55.521,
933
+ "step": 672000
934
+ },
935
+ {
936
+ "epoch": 13.93,
937
+ "eval_loss": 3.428373336791992,
938
+ "eval_runtime": 45.9889,
939
+ "eval_samples_per_second": 893.738,
940
+ "eval_steps_per_second": 55.861,
941
+ "step": 680000
942
+ },
943
+ {
944
+ "epoch": 14.1,
945
+ "learning_rate": 7.13749687317602e-06,
946
+ "loss": 3.2965,
947
+ "step": 688000
948
+ },
949
+ {
950
+ "epoch": 14.1,
951
+ "eval_loss": 3.41803240776062,
952
+ "eval_runtime": 46.9126,
953
+ "eval_samples_per_second": 876.14,
954
+ "eval_steps_per_second": 54.761,
955
+ "step": 688000
956
+ },
957
+ {
958
+ "epoch": 14.26,
959
+ "eval_loss": 3.419599771499634,
960
+ "eval_runtime": 46.1796,
961
+ "eval_samples_per_second": 890.047,
962
+ "eval_steps_per_second": 55.631,
963
+ "step": 696000
964
+ },
965
+ {
966
+ "epoch": 14.42,
967
+ "learning_rate": 7.070791294922038e-06,
968
+ "loss": 3.3069,
969
+ "step": 704000
970
+ },
971
+ {
972
+ "epoch": 14.42,
973
+ "eval_loss": 3.425711154937744,
974
+ "eval_runtime": 46.2298,
975
+ "eval_samples_per_second": 889.08,
976
+ "eval_steps_per_second": 55.57,
977
+ "step": 704000
978
+ },
979
+ {
980
+ "epoch": 14.59,
981
+ "eval_loss": 3.4299447536468506,
982
+ "eval_runtime": 46.768,
983
+ "eval_samples_per_second": 878.85,
984
+ "eval_steps_per_second": 54.931,
985
+ "step": 712000
986
+ },
987
+ {
988
+ "epoch": 14.75,
989
+ "learning_rate": 7.0040857166680564e-06,
990
+ "loss": 3.3152,
991
+ "step": 720000
992
+ },
993
+ {
994
+ "epoch": 14.75,
995
+ "eval_loss": 3.4787514209747314,
996
+ "eval_runtime": 46.0913,
997
+ "eval_samples_per_second": 891.752,
998
+ "eval_steps_per_second": 55.737,
999
+ "step": 720000
1000
+ },
1001
+ {
1002
+ "epoch": 14.92,
1003
+ "eval_loss": 3.4424662590026855,
1004
+ "eval_runtime": 46.3411,
1005
+ "eval_samples_per_second": 886.945,
1006
+ "eval_steps_per_second": 55.437,
1007
+ "step": 728000
1008
+ },
1009
+ {
1010
+ "epoch": 15.08,
1011
+ "learning_rate": 6.937380138414076e-06,
1012
+ "loss": 3.3125,
1013
+ "step": 736000
1014
+ },
1015
+ {
1016
+ "epoch": 15.08,
1017
+ "eval_loss": 3.430126667022705,
1018
+ "eval_runtime": 46.9882,
1019
+ "eval_samples_per_second": 874.73,
1020
+ "eval_steps_per_second": 54.673,
1021
+ "step": 736000
1022
+ },
1023
+ {
1024
+ "epoch": 15.24,
1025
+ "eval_loss": 3.4440979957580566,
1026
+ "eval_runtime": 46.1825,
1027
+ "eval_samples_per_second": 889.99,
1028
+ "eval_steps_per_second": 55.627,
1029
+ "step": 744000
1030
+ },
1031
+ {
1032
+ "epoch": 15.41,
1033
+ "learning_rate": 6.8706745601600945e-06,
1034
+ "loss": 3.3174,
1035
+ "step": 752000
1036
+ },
1037
+ {
1038
+ "epoch": 15.41,
1039
+ "eval_loss": 3.4396116733551025,
1040
+ "eval_runtime": 46.2686,
1041
+ "eval_samples_per_second": 888.334,
1042
+ "eval_steps_per_second": 55.524,
1043
+ "step": 752000
1044
+ },
1045
+ {
1046
+ "epoch": 15.57,
1047
+ "eval_loss": 3.463931083679199,
1048
+ "eval_runtime": 46.7798,
1049
+ "eval_samples_per_second": 878.627,
1050
+ "eval_steps_per_second": 54.917,
1051
+ "step": 760000
1052
+ },
1053
+ {
1054
+ "epoch": 15.73,
1055
+ "learning_rate": 6.803968981906113e-06,
1056
+ "loss": 3.3242,
1057
+ "step": 768000
1058
+ },
1059
+ {
1060
+ "epoch": 15.73,
1061
+ "eval_loss": 3.4523837566375732,
1062
+ "eval_runtime": 45.7867,
1063
+ "eval_samples_per_second": 897.685,
1064
+ "eval_steps_per_second": 56.108,
1065
+ "step": 768000
1066
+ },
1067
+ {
1068
+ "epoch": 15.9,
1069
+ "eval_loss": 3.455958366394043,
1070
+ "eval_runtime": 45.3124,
1071
+ "eval_samples_per_second": 907.08,
1072
+ "eval_steps_per_second": 56.695,
1073
+ "step": 776000
1074
+ },
1075
+ {
1076
+ "epoch": 16.06,
1077
+ "learning_rate": 6.737263403652131e-06,
1078
+ "loss": 3.3385,
1079
+ "step": 784000
1080
+ },
1081
+ {
1082
+ "epoch": 16.06,
1083
+ "eval_loss": 3.4779999256134033,
1084
+ "eval_runtime": 46.0072,
1085
+ "eval_samples_per_second": 893.383,
1086
+ "eval_steps_per_second": 55.839,
1087
+ "step": 784000
1088
+ },
1089
+ {
1090
+ "epoch": 16.23,
1091
+ "eval_loss": 3.4773714542388916,
1092
+ "eval_runtime": 45.131,
1093
+ "eval_samples_per_second": 910.727,
1094
+ "eval_steps_per_second": 56.923,
1095
+ "step": 792000
1096
+ },
1097
+ {
1098
+ "epoch": 16.39,
1099
+ "learning_rate": 6.6705578253981495e-06,
1100
+ "loss": 3.3371,
1101
+ "step": 800000
1102
+ },
1103
+ {
1104
+ "epoch": 16.39,
1105
+ "eval_loss": 3.47719669342041,
1106
+ "eval_runtime": 45.6308,
1107
+ "eval_samples_per_second": 900.751,
1108
+ "eval_steps_per_second": 56.3,
1109
+ "step": 800000
1110
+ },
1111
+ {
1112
+ "epoch": 16.55,
1113
+ "eval_loss": 3.4955241680145264,
1114
+ "eval_runtime": 46.0477,
1115
+ "eval_samples_per_second": 892.597,
1116
+ "eval_steps_per_second": 55.79,
1117
+ "step": 808000
1118
+ },
1119
+ {
1120
+ "epoch": 16.72,
1121
+ "learning_rate": 6.603852247144168e-06,
1122
+ "loss": 3.3633,
1123
+ "step": 816000
1124
+ },
1125
+ {
1126
+ "epoch": 16.72,
1127
+ "eval_loss": 3.486057996749878,
1128
+ "eval_runtime": 44.9231,
1129
+ "eval_samples_per_second": 914.941,
1130
+ "eval_steps_per_second": 57.187,
1131
+ "step": 816000
1132
+ },
1133
+ {
1134
+ "epoch": 16.88,
1135
+ "eval_loss": 3.506316661834717,
1136
+ "eval_runtime": 45.7078,
1137
+ "eval_samples_per_second": 899.234,
1138
+ "eval_steps_per_second": 56.205,
1139
+ "step": 824000
1140
+ },
1141
+ {
1142
+ "epoch": 17.05,
1143
+ "learning_rate": 6.537146668890187e-06,
1144
+ "loss": 3.3678,
1145
+ "step": 832000
1146
+ },
1147
+ {
1148
+ "epoch": 17.05,
1149
+ "eval_loss": 3.50439190864563,
1150
+ "eval_runtime": 45.0245,
1151
+ "eval_samples_per_second": 912.882,
1152
+ "eval_steps_per_second": 57.058,
1153
+ "step": 832000
1154
+ },
1155
+ {
1156
+ "epoch": 17.21,
1157
+ "eval_loss": 3.520247220993042,
1158
+ "eval_runtime": 45.2071,
1159
+ "eval_samples_per_second": 909.193,
1160
+ "eval_steps_per_second": 56.827,
1161
+ "step": 840000
1162
+ },
1163
+ {
1164
+ "epoch": 17.37,
1165
+ "learning_rate": 6.4704410906362044e-06,
1166
+ "loss": 3.3634,
1167
+ "step": 848000
1168
+ },
1169
+ {
1170
+ "epoch": 17.37,
1171
+ "eval_loss": 3.4941418170928955,
1172
+ "eval_runtime": 46.4208,
1173
+ "eval_samples_per_second": 885.423,
1174
+ "eval_steps_per_second": 55.342,
1175
+ "step": 848000
1176
+ },
1177
+ {
1178
+ "epoch": 17.54,
1179
+ "eval_loss": 3.522303819656372,
1180
+ "eval_runtime": 46.164,
1181
+ "eval_samples_per_second": 890.347,
1182
+ "eval_steps_per_second": 55.649,
1183
+ "step": 856000
1184
+ },
1185
+ {
1186
+ "epoch": 17.7,
1187
+ "learning_rate": 6.403735512382223e-06,
1188
+ "loss": 3.3797,
1189
+ "step": 864000
1190
+ },
1191
+ {
1192
+ "epoch": 17.7,
1193
+ "eval_loss": 3.502774715423584,
1194
+ "eval_runtime": 45.8285,
1195
+ "eval_samples_per_second": 896.865,
1196
+ "eval_steps_per_second": 56.057,
1197
+ "step": 864000
1198
+ },
1199
+ {
1200
+ "epoch": 17.87,
1201
+ "eval_loss": 3.526393175125122,
1202
+ "eval_runtime": 46.6422,
1203
+ "eval_samples_per_second": 881.219,
1204
+ "eval_steps_per_second": 55.079,
1205
+ "step": 872000
1206
+ },
1207
+ {
1208
+ "epoch": 18.03,
1209
+ "learning_rate": 6.337029934128242e-06,
1210
+ "loss": 3.3802,
1211
+ "step": 880000
1212
+ },
1213
+ {
1214
+ "epoch": 18.03,
1215
+ "eval_loss": 3.531257152557373,
1216
+ "eval_runtime": 46.217,
1217
+ "eval_samples_per_second": 889.327,
1218
+ "eval_steps_per_second": 55.586,
1219
+ "step": 880000
1220
+ },
1221
+ {
1222
+ "epoch": 18.19,
1223
+ "eval_loss": 3.496319055557251,
1224
+ "eval_runtime": 45.9803,
1225
+ "eval_samples_per_second": 893.904,
1226
+ "eval_steps_per_second": 55.872,
1227
+ "step": 888000
1228
+ },
1229
+ {
1230
+ "epoch": 18.36,
1231
+ "learning_rate": 6.270324355874261e-06,
1232
+ "loss": 3.357,
1233
+ "step": 896000
1234
+ },
1235
+ {
1236
+ "epoch": 18.36,
1237
+ "eval_loss": 3.5171141624450684,
1238
+ "eval_runtime": 47.1622,
1239
+ "eval_samples_per_second": 871.504,
1240
+ "eval_steps_per_second": 54.472,
1241
+ "step": 896000
1242
+ },
1243
+ {
1244
+ "epoch": 18.52,
1245
+ "eval_loss": 3.530701160430908,
1246
+ "eval_runtime": 46.113,
1247
+ "eval_samples_per_second": 891.332,
1248
+ "eval_steps_per_second": 55.711,
1249
+ "step": 904000
1250
+ },
1251
+ {
1252
+ "epoch": 18.69,
1253
+ "learning_rate": 6.20361877762028e-06,
1254
+ "loss": 3.3866,
1255
+ "step": 912000
1256
+ },
1257
+ {
1258
+ "epoch": 18.69,
1259
+ "eval_loss": 3.5221967697143555,
1260
+ "eval_runtime": 46.035,
1261
+ "eval_samples_per_second": 892.843,
1262
+ "eval_steps_per_second": 55.805,
1263
+ "step": 912000
1264
+ },
1265
+ {
1266
+ "epoch": 18.85,
1267
+ "eval_loss": 3.5319056510925293,
1268
+ "eval_runtime": 46.8446,
1269
+ "eval_samples_per_second": 877.412,
1270
+ "eval_steps_per_second": 54.841,
1271
+ "step": 920000
1272
+ },
1273
+ {
1274
+ "epoch": 19.01,
1275
+ "learning_rate": 6.1369131993662975e-06,
1276
+ "loss": 3.3818,
1277
+ "step": 928000
1278
+ },
1279
+ {
1280
+ "epoch": 19.01,
1281
+ "eval_loss": 3.532552480697632,
1282
+ "eval_runtime": 46.3901,
1283
+ "eval_samples_per_second": 886.007,
1284
+ "eval_steps_per_second": 55.378,
1285
+ "step": 928000
1286
+ },
1287
+ {
1288
+ "epoch": 19.18,
1289
+ "eval_loss": 3.5116307735443115,
1290
+ "eval_runtime": 45.2931,
1291
+ "eval_samples_per_second": 907.466,
1292
+ "eval_steps_per_second": 56.719,
1293
+ "step": 936000
1294
+ },
1295
+ {
1296
+ "epoch": 19.34,
1297
+ "learning_rate": 6.070207621112316e-06,
1298
+ "loss": 3.3754,
1299
+ "step": 944000
1300
+ },
1301
+ {
1302
+ "epoch": 19.34,
1303
+ "eval_loss": 3.5228991508483887,
1304
+ "eval_runtime": 47.0715,
1305
+ "eval_samples_per_second": 873.183,
1306
+ "eval_steps_per_second": 54.577,
1307
+ "step": 944000
1308
+ },
1309
+ {
1310
+ "epoch": 19.5,
1311
+ "eval_loss": 3.538318634033203,
1312
+ "eval_runtime": 45.9256,
1313
+ "eval_samples_per_second": 894.97,
1314
+ "eval_steps_per_second": 55.938,
1315
+ "step": 952000
1316
+ },
1317
+ {
1318
+ "epoch": 19.67,
1319
+ "learning_rate": 6.003502042858335e-06,
1320
+ "loss": 3.3893,
1321
+ "step": 960000
1322
+ },
1323
+ {
1324
+ "epoch": 19.67,
1325
+ "eval_loss": 3.544513463973999,
1326
+ "eval_runtime": 46.8245,
1327
+ "eval_samples_per_second": 877.788,
1328
+ "eval_steps_per_second": 54.864,
1329
+ "step": 960000
1330
+ },
1331
+ {
1332
+ "epoch": 19.83,
1333
+ "eval_loss": 3.5230634212493896,
1334
+ "eval_runtime": 47.3348,
1335
+ "eval_samples_per_second": 868.325,
1336
+ "eval_steps_per_second": 54.273,
1337
+ "step": 968000
1338
+ },
1339
+ {
1340
+ "epoch": 20.0,
1341
+ "learning_rate": 5.936796464604353e-06,
1342
+ "loss": 3.3899,
1343
+ "step": 976000
1344
+ },
1345
+ {
1346
+ "epoch": 20.0,
1347
+ "eval_loss": 3.531026840209961,
1348
+ "eval_runtime": 45.7886,
1349
+ "eval_samples_per_second": 897.647,
1350
+ "eval_steps_per_second": 56.106,
1351
+ "step": 976000
1352
+ },
1353
+ {
1354
+ "epoch": 20.16,
1355
+ "eval_loss": 3.53287935256958,
1356
+ "eval_runtime": 46.7771,
1357
+ "eval_samples_per_second": 878.677,
1358
+ "eval_steps_per_second": 54.92,
1359
+ "step": 984000
1360
+ },
1361
+ {
1362
+ "epoch": 20.32,
1363
+ "learning_rate": 5.870090886350371e-06,
1364
+ "loss": 3.3918,
1365
+ "step": 992000
1366
+ },
1367
+ {
1368
+ "epoch": 20.32,
1369
+ "eval_loss": 3.5158653259277344,
1370
+ "eval_runtime": 46.2173,
1371
+ "eval_samples_per_second": 889.32,
1372
+ "eval_steps_per_second": 55.585,
1373
+ "step": 992000
1374
+ },
1375
+ {
1376
+ "epoch": 20.49,
1377
+ "eval_loss": 3.562788486480713,
1378
+ "eval_runtime": 45.7474,
1379
+ "eval_samples_per_second": 898.456,
1380
+ "eval_steps_per_second": 56.156,
1381
+ "step": 1000000
1382
+ },
1383
+ {
1384
+ "epoch": 20.65,
1385
+ "learning_rate": 5.80338530809639e-06,
1386
+ "loss": 3.3786,
1387
+ "step": 1008000
1388
+ },
1389
+ {
1390
+ "epoch": 20.65,
1391
+ "eval_loss": 3.5290534496307373,
1392
+ "eval_runtime": 46.4581,
1393
+ "eval_samples_per_second": 884.711,
1394
+ "eval_steps_per_second": 55.297,
1395
+ "step": 1008000
1396
+ },
1397
+ {
1398
+ "epoch": 20.82,
1399
+ "eval_loss": 3.5163111686706543,
1400
+ "eval_runtime": 45.899,
1401
+ "eval_samples_per_second": 895.487,
1402
+ "eval_steps_per_second": 55.971,
1403
+ "step": 1016000
1404
+ },
1405
+ {
1406
+ "epoch": 20.98,
1407
+ "learning_rate": 5.736679729842408e-06,
1408
+ "loss": 3.3862,
1409
+ "step": 1024000
1410
+ },
1411
+ {
1412
+ "epoch": 20.98,
1413
+ "eval_loss": 3.531219959259033,
1414
+ "eval_runtime": 45.4959,
1415
+ "eval_samples_per_second": 903.423,
1416
+ "eval_steps_per_second": 56.467,
1417
+ "step": 1024000
1418
+ },
1419
+ {
1420
+ "epoch": 21.14,
1421
+ "eval_loss": 3.514033317565918,
1422
+ "eval_runtime": 46.6408,
1423
+ "eval_samples_per_second": 881.245,
1424
+ "eval_steps_per_second": 55.08,
1425
+ "step": 1032000
1426
+ },
1427
+ {
1428
+ "epoch": 21.31,
1429
+ "learning_rate": 5.669974151588427e-06,
1430
+ "loss": 3.3855,
1431
+ "step": 1040000
1432
+ },
1433
+ {
1434
+ "epoch": 21.31,
1435
+ "eval_loss": 3.5617153644561768,
1436
+ "eval_runtime": 45.7071,
1437
+ "eval_samples_per_second": 899.248,
1438
+ "eval_steps_per_second": 56.206,
1439
+ "step": 1040000
1440
+ },
1441
+ {
1442
+ "epoch": 21.47,
1443
+ "eval_loss": 3.5374927520751953,
1444
+ "eval_runtime": 45.668,
1445
+ "eval_samples_per_second": 900.018,
1446
+ "eval_steps_per_second": 56.254,
1447
+ "step": 1048000
1448
+ },
1449
+ {
1450
+ "epoch": 21.64,
1451
+ "learning_rate": 5.603268573334446e-06,
1452
+ "loss": 3.3872,
1453
+ "step": 1056000
1454
+ },
1455
+ {
1456
+ "epoch": 21.64,
1457
+ "eval_loss": 3.532823085784912,
1458
+ "eval_runtime": 46.5514,
1459
+ "eval_samples_per_second": 882.938,
1460
+ "eval_steps_per_second": 55.186,
1461
+ "step": 1056000
1462
+ },
1463
+ {
1464
+ "epoch": 21.8,
1465
+ "eval_loss": 3.561626434326172,
1466
+ "eval_runtime": 45.9586,
1467
+ "eval_samples_per_second": 894.327,
1468
+ "eval_steps_per_second": 55.898,
1469
+ "step": 1064000
1470
+ },
1471
+ {
1472
+ "epoch": 21.96,
1473
+ "learning_rate": 5.536562995080464e-06,
1474
+ "loss": 3.3931,
1475
+ "step": 1072000
1476
+ },
1477
+ {
1478
+ "epoch": 21.96,
1479
+ "eval_loss": 3.5647873878479004,
1480
+ "eval_runtime": 46.8936,
1481
+ "eval_samples_per_second": 876.495,
1482
+ "eval_steps_per_second": 54.784,
1483
+ "step": 1072000
1484
+ },
1485
+ {
1486
+ "epoch": 22.13,
1487
+ "eval_loss": 3.544335126876831,
1488
+ "eval_runtime": 46.3686,
1489
+ "eval_samples_per_second": 886.419,
1490
+ "eval_steps_per_second": 55.404,
1491
+ "step": 1080000
1492
+ },
1493
+ {
1494
+ "epoch": 22.29,
1495
+ "learning_rate": 5.469857416826483e-06,
1496
+ "loss": 3.3708,
1497
+ "step": 1088000
1498
+ },
1499
+ {
1500
+ "epoch": 22.29,
1501
+ "eval_loss": 3.5400941371917725,
1502
+ "eval_runtime": 45.8359,
1503
+ "eval_samples_per_second": 896.72,
1504
+ "eval_steps_per_second": 56.048,
1505
+ "step": 1088000
1506
+ },
1507
+ {
1508
+ "epoch": 22.45,
1509
+ "eval_loss": 3.55292010307312,
1510
+ "eval_runtime": 46.8082,
1511
+ "eval_samples_per_second": 878.095,
1512
+ "eval_steps_per_second": 54.884,
1513
+ "step": 1096000
1514
+ },
1515
+ {
1516
+ "epoch": 22.62,
1517
+ "learning_rate": 5.403151838572501e-06,
1518
+ "loss": 3.4099,
1519
+ "step": 1104000
1520
+ },
1521
+ {
1522
+ "epoch": 22.62,
1523
+ "eval_loss": 3.533414602279663,
1524
+ "eval_runtime": 46.1107,
1525
+ "eval_samples_per_second": 891.377,
1526
+ "eval_steps_per_second": 55.714,
1527
+ "step": 1104000
1528
+ },
1529
+ {
1530
+ "epoch": 22.78,
1531
+ "eval_loss": 3.5325212478637695,
1532
+ "eval_runtime": 46.1351,
1533
+ "eval_samples_per_second": 890.905,
1534
+ "eval_steps_per_second": 55.684,
1535
+ "step": 1112000
1536
+ },
1537
+ {
1538
+ "epoch": 22.95,
1539
+ "learning_rate": 5.33644626031852e-06,
1540
+ "loss": 3.4027,
1541
+ "step": 1120000
1542
+ },
1543
+ {
1544
+ "epoch": 22.95,
1545
+ "eval_loss": 3.5818660259246826,
1546
+ "eval_runtime": 46.7428,
1547
+ "eval_samples_per_second": 879.323,
1548
+ "eval_steps_per_second": 54.96,
1549
+ "step": 1120000
1550
+ },
1551
+ {
1552
+ "epoch": 23.11,
1553
+ "eval_loss": 3.5470829010009766,
1554
+ "eval_runtime": 46.1344,
1555
+ "eval_samples_per_second": 890.92,
1556
+ "eval_steps_per_second": 55.685,
1557
+ "step": 1128000
1558
+ },
1559
+ {
1560
+ "epoch": 23.27,
1561
+ "learning_rate": 5.269740682064538e-06,
1562
+ "loss": 3.4035,
1563
+ "step": 1136000
1564
+ },
1565
+ {
1566
+ "epoch": 23.27,
1567
+ "eval_loss": 3.548552989959717,
1568
+ "eval_runtime": 46.1071,
1569
+ "eval_samples_per_second": 891.446,
1570
+ "eval_steps_per_second": 55.718,
1571
+ "step": 1136000
1572
+ },
1573
+ {
1574
+ "epoch": 23.44,
1575
+ "eval_loss": 3.5470151901245117,
1576
+ "eval_runtime": 46.849,
1577
+ "eval_samples_per_second": 877.33,
1578
+ "eval_steps_per_second": 54.836,
1579
+ "step": 1144000
1580
+ },
1581
+ {
1582
+ "epoch": 23.6,
1583
+ "learning_rate": 5.203035103810556e-06,
1584
+ "loss": 3.3964,
1585
+ "step": 1152000
1586
+ },
1587
+ {
1588
+ "epoch": 23.6,
1589
+ "eval_loss": 3.572176694869995,
1590
+ "eval_runtime": 46.3661,
1591
+ "eval_samples_per_second": 886.467,
1592
+ "eval_steps_per_second": 55.407,
1593
+ "step": 1152000
1594
+ },
1595
+ {
1596
+ "epoch": 23.77,
1597
+ "eval_loss": 3.55098295211792,
1598
+ "eval_runtime": 46.1812,
1599
+ "eval_samples_per_second": 890.015,
1600
+ "eval_steps_per_second": 55.629,
1601
+ "step": 1160000
1602
+ },
1603
+ {
1604
+ "epoch": 23.93,
1605
+ "learning_rate": 5.136329525556575e-06,
1606
+ "loss": 3.4115,
1607
+ "step": 1168000
1608
+ },
1609
+ {
1610
+ "epoch": 23.93,
1611
+ "eval_loss": 3.561007499694824,
1612
+ "eval_runtime": 47.5429,
1613
+ "eval_samples_per_second": 864.525,
1614
+ "eval_steps_per_second": 54.035,
1615
+ "step": 1168000
1616
+ },
1617
+ {
1618
+ "epoch": 24.09,
1619
+ "eval_loss": 3.5757482051849365,
1620
+ "eval_runtime": 46.3962,
1621
+ "eval_samples_per_second": 885.891,
1622
+ "eval_steps_per_second": 55.371,
1623
+ "step": 1176000
1624
+ },
1625
+ {
1626
+ "epoch": 24.26,
1627
+ "learning_rate": 5.0696239473025935e-06,
1628
+ "loss": 3.4173,
1629
+ "step": 1184000
1630
+ },
1631
+ {
1632
+ "epoch": 24.26,
1633
+ "eval_loss": 3.554094076156616,
1634
+ "eval_runtime": 45.5708,
1635
+ "eval_samples_per_second": 901.936,
1636
+ "eval_steps_per_second": 56.374,
1637
+ "step": 1184000
1638
+ },
1639
+ {
1640
+ "epoch": 24.42,
1641
+ "eval_loss": 3.577660083770752,
1642
+ "eval_runtime": 47.0565,
1643
+ "eval_samples_per_second": 873.461,
1644
+ "eval_steps_per_second": 54.594,
1645
+ "step": 1192000
1646
+ },
1647
+ {
1648
+ "epoch": 24.59,
1649
+ "learning_rate": 5.002918369048611e-06,
1650
+ "loss": 3.4169,
1651
+ "step": 1200000
1652
+ },
1653
+ {
1654
+ "epoch": 24.59,
1655
+ "eval_loss": 3.5637948513031006,
1656
+ "eval_runtime": 47.0711,
1657
+ "eval_samples_per_second": 873.19,
1658
+ "eval_steps_per_second": 54.577,
1659
+ "step": 1200000
1660
+ },
1661
+ {
1662
+ "epoch": 24.75,
1663
+ "eval_loss": 3.5462896823883057,
1664
+ "eval_runtime": 46.8215,
1665
+ "eval_samples_per_second": 877.845,
1666
+ "eval_steps_per_second": 54.868,
1667
+ "step": 1208000
1668
+ },
1669
+ {
1670
+ "epoch": 24.91,
1671
+ "learning_rate": 4.936212790794631e-06,
1672
+ "loss": 3.4031,
1673
+ "step": 1216000
1674
+ },
1675
+ {
1676
+ "epoch": 24.91,
1677
+ "eval_loss": 3.5299670696258545,
1678
+ "eval_runtime": 46.9742,
1679
+ "eval_samples_per_second": 874.99,
1680
+ "eval_steps_per_second": 54.69,
1681
+ "step": 1216000
1682
+ },
1683
+ {
1684
+ "epoch": 25.08,
1685
+ "eval_loss": 3.558427333831787,
1686
+ "eval_runtime": 46.1322,
1687
+ "eval_samples_per_second": 890.961,
1688
+ "eval_steps_per_second": 55.688,
1689
+ "step": 1224000
1690
+ },
1691
+ {
1692
+ "epoch": 25.24,
1693
+ "learning_rate": 4.869507212540649e-06,
1694
+ "loss": 3.4094,
1695
+ "step": 1232000
1696
+ },
1697
+ {
1698
+ "epoch": 25.24,
1699
+ "eval_loss": 3.568174123764038,
1700
+ "eval_runtime": 46.3049,
1701
+ "eval_samples_per_second": 887.638,
1702
+ "eval_steps_per_second": 55.48,
1703
+ "step": 1232000
1704
+ },
1705
+ {
1706
+ "epoch": 25.41,
1707
+ "eval_loss": 3.555844783782959,
1708
+ "eval_runtime": 46.0676,
1709
+ "eval_samples_per_second": 892.211,
1710
+ "eval_steps_per_second": 55.766,
1711
+ "step": 1240000
1712
+ },
1713
+ {
1714
+ "epoch": 25.57,
1715
+ "learning_rate": 4.802801634286667e-06,
1716
+ "loss": 3.4116,
1717
+ "step": 1248000
1718
+ },
1719
+ {
1720
+ "epoch": 25.57,
1721
+ "eval_loss": 3.5629091262817383,
1722
+ "eval_runtime": 45.5765,
1723
+ "eval_samples_per_second": 901.825,
1724
+ "eval_steps_per_second": 56.367,
1725
+ "step": 1248000
1726
+ },
1727
+ {
1728
+ "epoch": 25.73,
1729
+ "eval_loss": 3.5490224361419678,
1730
+ "eval_runtime": 46.4409,
1731
+ "eval_samples_per_second": 885.039,
1732
+ "eval_steps_per_second": 55.318,
1733
+ "step": 1256000
1734
+ },
1735
+ {
1736
+ "epoch": 25.9,
1737
+ "learning_rate": 4.7360960560326865e-06,
1738
+ "loss": 3.4199,
1739
+ "step": 1264000
1740
+ },
1741
+ {
1742
+ "epoch": 25.9,
1743
+ "eval_loss": 3.567878484725952,
1744
+ "eval_runtime": 46.1595,
1745
+ "eval_samples_per_second": 890.434,
1746
+ "eval_steps_per_second": 55.655,
1747
+ "step": 1264000
1748
+ },
1749
+ {
1750
+ "epoch": 26.06,
1751
+ "eval_loss": 3.5885465145111084,
1752
+ "eval_runtime": 45.9316,
1753
+ "eval_samples_per_second": 894.853,
1754
+ "eval_steps_per_second": 55.931,
1755
+ "step": 1272000
1756
+ },
1757
+ {
1758
+ "epoch": 26.22,
1759
+ "learning_rate": 4.669390477778704e-06,
1760
+ "loss": 3.412,
1761
+ "step": 1280000
1762
+ },
1763
+ {
1764
+ "epoch": 26.22,
1765
+ "eval_loss": 3.5578629970550537,
1766
+ "eval_runtime": 46.4337,
1767
+ "eval_samples_per_second": 885.176,
1768
+ "eval_steps_per_second": 55.326,
1769
+ "step": 1280000
1770
+ },
1771
+ {
1772
+ "epoch": 26.39,
1773
+ "eval_loss": 3.5465352535247803,
1774
+ "eval_runtime": 45.7517,
1775
+ "eval_samples_per_second": 898.371,
1776
+ "eval_steps_per_second": 56.151,
1777
+ "step": 1288000
1778
+ },
1779
+ {
1780
+ "epoch": 26.55,
1781
+ "learning_rate": 4.602684899524723e-06,
1782
+ "loss": 3.4123,
1783
+ "step": 1296000
1784
+ },
1785
+ {
1786
+ "epoch": 26.55,
1787
+ "eval_loss": 3.572610855102539,
1788
+ "eval_runtime": 45.5426,
1789
+ "eval_samples_per_second": 902.496,
1790
+ "eval_steps_per_second": 56.409,
1791
+ "step": 1296000
1792
+ },
1793
+ {
1794
+ "epoch": 26.72,
1795
+ "eval_loss": 3.577484130859375,
1796
+ "eval_runtime": 46.4204,
1797
+ "eval_samples_per_second": 885.431,
1798
+ "eval_steps_per_second": 55.342,
1799
+ "step": 1304000
1800
+ },
1801
+ {
1802
+ "epoch": 26.88,
1803
+ "learning_rate": 4.5359793212707415e-06,
1804
+ "loss": 3.4132,
1805
+ "step": 1312000
1806
+ },
1807
+ {
1808
+ "epoch": 26.88,
1809
+ "eval_loss": 3.5477850437164307,
1810
+ "eval_runtime": 45.6512,
1811
+ "eval_samples_per_second": 900.348,
1812
+ "eval_steps_per_second": 56.275,
1813
+ "step": 1312000
1814
+ },
1815
+ {
1816
+ "epoch": 27.04,
1817
+ "eval_loss": 3.5588574409484863,
1818
+ "eval_runtime": 46.0446,
1819
+ "eval_samples_per_second": 892.657,
1820
+ "eval_steps_per_second": 55.794,
1821
+ "step": 1320000
1822
+ },
1823
+ {
1824
+ "epoch": 27.21,
1825
+ "learning_rate": 4.46927374301676e-06,
1826
+ "loss": 3.4161,
1827
+ "step": 1328000
1828
+ },
1829
+ {
1830
+ "epoch": 27.21,
1831
+ "eval_loss": 3.56620717048645,
1832
+ "eval_runtime": 46.4839,
1833
+ "eval_samples_per_second": 884.22,
1834
+ "eval_steps_per_second": 55.266,
1835
+ "step": 1328000
1836
+ },
1837
+ {
1838
+ "epoch": 27.37,
1839
+ "eval_loss": 3.589487075805664,
1840
+ "eval_runtime": 46.3966,
1841
+ "eval_samples_per_second": 885.884,
1842
+ "eval_steps_per_second": 55.37,
1843
+ "step": 1336000
1844
+ },
1845
+ {
1846
+ "epoch": 27.54,
1847
+ "learning_rate": 4.402568164762779e-06,
1848
+ "loss": 3.4097,
1849
+ "step": 1344000
1850
+ },
1851
+ {
1852
+ "epoch": 27.54,
1853
+ "eval_loss": 3.5940632820129395,
1854
+ "eval_runtime": 46.4364,
1855
+ "eval_samples_per_second": 885.125,
1856
+ "eval_steps_per_second": 55.323,
1857
+ "step": 1344000
1858
+ },
1859
+ {
1860
+ "epoch": 27.7,
1861
+ "eval_loss": 3.5912110805511475,
1862
+ "eval_runtime": 45.9687,
1863
+ "eval_samples_per_second": 894.131,
1864
+ "eval_steps_per_second": 55.886,
1865
+ "step": 1352000
1866
+ },
1867
+ {
1868
+ "epoch": 27.86,
1869
+ "learning_rate": 4.335862586508797e-06,
1870
+ "loss": 3.415,
1871
+ "step": 1360000
1872
+ },
1873
+ {
1874
+ "epoch": 27.86,
1875
+ "eval_loss": 3.565756320953369,
1876
+ "eval_runtime": 45.7621,
1877
+ "eval_samples_per_second": 898.168,
1878
+ "eval_steps_per_second": 56.138,
1879
+ "step": 1360000
1880
+ },
1881
+ {
1882
+ "epoch": 28.03,
1883
+ "eval_loss": 3.5553781986236572,
1884
+ "eval_runtime": 46.2903,
1885
+ "eval_samples_per_second": 887.919,
1886
+ "eval_steps_per_second": 55.498,
1887
+ "step": 1368000
1888
+ },
1889
+ {
1890
+ "epoch": 28.19,
1891
+ "learning_rate": 4.269157008254816e-06,
1892
+ "loss": 3.4193,
1893
+ "step": 1376000
1894
+ },
1895
+ {
1896
+ "epoch": 28.19,
1897
+ "eval_loss": 3.589851140975952,
1898
+ "eval_runtime": 45.8411,
1899
+ "eval_samples_per_second": 896.618,
1900
+ "eval_steps_per_second": 56.041,
1901
+ "step": 1376000
1902
+ },
1903
+ {
1904
+ "epoch": 28.36,
1905
+ "eval_loss": 3.5652260780334473,
1906
+ "eval_runtime": 45.5538,
1907
+ "eval_samples_per_second": 902.275,
1908
+ "eval_steps_per_second": 56.395,
1909
+ "step": 1384000
1910
+ },
1911
+ {
1912
+ "epoch": 28.52,
1913
+ "learning_rate": 4.202451430000834e-06,
1914
+ "loss": 3.4136,
1915
+ "step": 1392000
1916
+ },
1917
+ {
1918
+ "epoch": 28.52,
1919
+ "eval_loss": 3.5832390785217285,
1920
+ "eval_runtime": 46.575,
1921
+ "eval_samples_per_second": 882.491,
1922
+ "eval_steps_per_second": 55.158,
1923
+ "step": 1392000
1924
+ },
1925
+ {
1926
+ "epoch": 28.68,
1927
+ "eval_loss": 3.5885210037231445,
1928
+ "eval_runtime": 45.9659,
1929
+ "eval_samples_per_second": 894.184,
1930
+ "eval_steps_per_second": 55.889,
1931
+ "step": 1400000
1932
+ },
1933
+ {
1934
+ "epoch": 28.85,
1935
+ "learning_rate": 4.135745851746852e-06,
1936
+ "loss": 3.4294,
1937
+ "step": 1408000
1938
+ },
1939
+ {
1940
+ "epoch": 28.85,
1941
+ "eval_loss": 3.583249807357788,
1942
+ "eval_runtime": 45.7927,
1943
+ "eval_samples_per_second": 897.568,
1944
+ "eval_steps_per_second": 56.101,
1945
+ "step": 1408000
1946
+ },
1947
+ {
1948
+ "epoch": 29.01,
1949
+ "eval_loss": 3.6025209426879883,
1950
+ "eval_runtime": 46.362,
1951
+ "eval_samples_per_second": 886.546,
1952
+ "eval_steps_per_second": 55.412,
1953
+ "step": 1416000
1954
+ },
1955
+ {
1956
+ "epoch": 29.17,
1957
+ "learning_rate": 4.069040273492872e-06,
1958
+ "loss": 3.4243,
1959
+ "step": 1424000
1960
+ },
1961
+ {
1962
+ "epoch": 29.17,
1963
+ "eval_loss": 3.6040360927581787,
1964
+ "eval_runtime": 45.7855,
1965
+ "eval_samples_per_second": 897.708,
1966
+ "eval_steps_per_second": 56.11,
1967
+ "step": 1424000
1968
+ },
1969
+ {
1970
+ "epoch": 29.34,
1971
+ "eval_loss": 3.5890395641326904,
1972
+ "eval_runtime": 46.5109,
1973
+ "eval_samples_per_second": 883.707,
1974
+ "eval_steps_per_second": 55.234,
1975
+ "step": 1432000
1976
+ },
1977
+ {
1978
+ "epoch": 29.5,
1979
+ "learning_rate": 4.0023346952388895e-06,
1980
+ "loss": 3.4427,
1981
+ "step": 1440000
1982
+ },
1983
+ {
1984
+ "epoch": 29.5,
1985
+ "eval_loss": 3.58347749710083,
1986
+ "eval_runtime": 46.2896,
1987
+ "eval_samples_per_second": 887.931,
1988
+ "eval_steps_per_second": 55.498,
1989
+ "step": 1440000
1990
+ },
1991
+ {
1992
+ "epoch": 29.67,
1993
+ "eval_loss": 3.6185286045074463,
1994
+ "eval_runtime": 46.4189,
1995
+ "eval_samples_per_second": 885.459,
1996
+ "eval_steps_per_second": 55.344,
1997
+ "step": 1448000
1998
+ },
1999
+ {
2000
+ "epoch": 29.83,
2001
+ "learning_rate": 3.935629116984908e-06,
2002
+ "loss": 3.4293,
2003
+ "step": 1456000
2004
+ },
2005
+ {
2006
+ "epoch": 29.83,
2007
+ "eval_loss": 3.6028919219970703,
2008
+ "eval_runtime": 46.7251,
2009
+ "eval_samples_per_second": 879.656,
2010
+ "eval_steps_per_second": 54.981,
2011
+ "step": 1456000
2012
+ },
2013
+ {
2014
+ "epoch": 29.99,
2015
+ "eval_loss": 3.616161823272705,
2016
+ "eval_runtime": 45.7265,
2017
+ "eval_samples_per_second": 898.865,
2018
+ "eval_steps_per_second": 56.182,
2019
+ "step": 1464000
2020
+ },
2021
+ {
2022
+ "epoch": 30.16,
2023
+ "learning_rate": 3.868923538730927e-06,
2024
+ "loss": 3.4363,
2025
+ "step": 1472000
2026
+ },
2027
+ {
2028
+ "epoch": 30.16,
2029
+ "eval_loss": 3.6257941722869873,
2030
+ "eval_runtime": 45.6532,
2031
+ "eval_samples_per_second": 900.308,
2032
+ "eval_steps_per_second": 56.272,
2033
+ "step": 1472000
2034
+ },
2035
+ {
2036
+ "epoch": 30.32,
2037
+ "eval_loss": 3.6038014888763428,
2038
+ "eval_runtime": 46.717,
2039
+ "eval_samples_per_second": 879.808,
2040
+ "eval_steps_per_second": 54.991,
2041
+ "step": 1480000
2042
+ },
2043
+ {
2044
+ "epoch": 30.49,
2045
+ "learning_rate": 3.8022179604769453e-06,
2046
+ "loss": 3.4532,
2047
+ "step": 1488000
2048
+ },
2049
+ {
2050
+ "epoch": 30.49,
2051
+ "eval_loss": 3.6039483547210693,
2052
+ "eval_runtime": 45.742,
2053
+ "eval_samples_per_second": 898.562,
2054
+ "eval_steps_per_second": 56.163,
2055
+ "step": 1488000
2056
+ },
2057
+ {
2058
+ "epoch": 30.65,
2059
+ "eval_loss": 3.605367422103882,
2060
+ "eval_runtime": 45.7078,
2061
+ "eval_samples_per_second": 899.234,
2062
+ "eval_steps_per_second": 56.205,
2063
+ "step": 1496000
2064
+ },
2065
+ {
2066
+ "epoch": 30.81,
2067
+ "learning_rate": 3.735512382222964e-06,
2068
+ "loss": 3.4401,
2069
+ "step": 1504000
2070
+ },
2071
+ {
2072
+ "epoch": 30.81,
2073
+ "eval_loss": 3.6269376277923584,
2074
+ "eval_runtime": 46.6124,
2075
+ "eval_samples_per_second": 881.783,
2076
+ "eval_steps_per_second": 55.114,
2077
+ "step": 1504000
2078
+ },
2079
+ {
2080
+ "epoch": 30.98,
2081
+ "eval_loss": 3.600417137145996,
2082
+ "eval_runtime": 47.0146,
2083
+ "eval_samples_per_second": 874.239,
2084
+ "eval_steps_per_second": 54.643,
2085
+ "step": 1512000
2086
+ },
2087
+ {
2088
+ "epoch": 31.14,
2089
+ "learning_rate": 3.668806803968982e-06,
2090
+ "loss": 3.4491,
2091
+ "step": 1520000
2092
+ },
2093
+ {
2094
+ "epoch": 31.14,
2095
+ "eval_loss": 3.6095597743988037,
2096
+ "eval_runtime": 47.1653,
2097
+ "eval_samples_per_second": 871.446,
2098
+ "eval_steps_per_second": 54.468,
2099
+ "step": 1520000
2100
+ },
2101
+ {
2102
+ "epoch": 31.31,
2103
+ "eval_loss": 3.6216766834259033,
2104
+ "eval_runtime": 48.343,
2105
+ "eval_samples_per_second": 850.216,
2106
+ "eval_steps_per_second": 53.141,
2107
+ "step": 1528000
2108
+ },
2109
+ {
2110
+ "epoch": 31.47,
2111
+ "learning_rate": 3.6021012257150007e-06,
2112
+ "loss": 3.4438,
2113
+ "step": 1536000
2114
+ },
2115
+ {
2116
+ "epoch": 31.47,
2117
+ "eval_loss": 3.6081080436706543,
2118
+ "eval_runtime": 47.4804,
2119
+ "eval_samples_per_second": 865.663,
2120
+ "eval_steps_per_second": 54.107,
2121
+ "step": 1536000
2122
+ },
2123
+ {
2124
+ "epoch": 31.63,
2125
+ "eval_loss": 3.6190168857574463,
2126
+ "eval_runtime": 48.3587,
2127
+ "eval_samples_per_second": 849.941,
2128
+ "eval_steps_per_second": 53.124,
2129
+ "step": 1544000
2130
+ },
2131
+ {
2132
+ "epoch": 31.8,
2133
+ "learning_rate": 3.535395647461019e-06,
2134
+ "loss": 3.4337,
2135
+ "step": 1552000
2136
+ },
2137
+ {
2138
+ "epoch": 31.8,
2139
+ "eval_loss": 3.611992835998535,
2140
+ "eval_runtime": 47.5342,
2141
+ "eval_samples_per_second": 864.683,
2142
+ "eval_steps_per_second": 54.045,
2143
+ "step": 1552000
2144
+ },
2145
+ {
2146
+ "epoch": 31.96,
2147
+ "eval_loss": 3.586127996444702,
2148
+ "eval_runtime": 46.8726,
2149
+ "eval_samples_per_second": 876.888,
2150
+ "eval_steps_per_second": 54.808,
2151
+ "step": 1560000
2152
+ },
2153
+ {
2154
+ "epoch": 32.13,
2155
+ "learning_rate": 3.468690069207038e-06,
2156
+ "loss": 3.4475,
2157
+ "step": 1568000
2158
+ },
2159
+ {
2160
+ "epoch": 32.13,
2161
+ "eval_loss": 3.620932102203369,
2162
+ "eval_runtime": 48.2654,
2163
+ "eval_samples_per_second": 851.582,
2164
+ "eval_steps_per_second": 53.226,
2165
+ "step": 1568000
2166
+ },
2167
+ {
2168
+ "epoch": 32.29,
2169
+ "eval_loss": 3.6301937103271484,
2170
+ "eval_runtime": 47.2416,
2171
+ "eval_samples_per_second": 870.039,
2172
+ "eval_steps_per_second": 54.38,
2173
+ "step": 1576000
2174
+ },
2175
+ {
2176
+ "epoch": 32.45,
2177
+ "learning_rate": 3.4019844909530565e-06,
2178
+ "loss": 3.4406,
2179
+ "step": 1584000
2180
+ },
2181
+ {
2182
+ "epoch": 32.45,
2183
+ "eval_loss": 3.6052932739257812,
2184
+ "eval_runtime": 46.0861,
2185
+ "eval_samples_per_second": 891.852,
2186
+ "eval_steps_per_second": 55.743,
2187
+ "step": 1584000
2188
+ },
2189
+ {
2190
+ "epoch": 32.62,
2191
+ "eval_loss": 3.593369960784912,
2192
+ "eval_runtime": 49.6475,
2193
+ "eval_samples_per_second": 827.876,
2194
+ "eval_steps_per_second": 51.745,
2195
+ "step": 1592000
2196
+ },
2197
+ {
2198
+ "epoch": 32.78,
2199
+ "learning_rate": 3.3352789126990747e-06,
2200
+ "loss": 3.4392,
2201
+ "step": 1600000
2202
+ },
2203
+ {
2204
+ "epoch": 32.78,
2205
+ "eval_loss": 3.594203472137451,
2206
+ "eval_runtime": 47.8907,
2207
+ "eval_samples_per_second": 858.246,
2208
+ "eval_steps_per_second": 53.643,
2209
+ "step": 1600000
2210
+ },
2211
+ {
2212
+ "epoch": 32.94,
2213
+ "eval_loss": 3.601329803466797,
2214
+ "eval_runtime": 46.6549,
2215
+ "eval_samples_per_second": 880.98,
2216
+ "eval_steps_per_second": 55.064,
2217
+ "step": 1608000
2218
+ },
2219
+ {
2220
+ "epoch": 33.11,
2221
+ "learning_rate": 3.2685733344450933e-06,
2222
+ "loss": 3.4514,
2223
+ "step": 1616000
2224
+ },
2225
+ {
2226
+ "epoch": 33.11,
2227
+ "eval_loss": 3.6505630016326904,
2228
+ "eval_runtime": 47.3453,
2229
+ "eval_samples_per_second": 868.132,
2230
+ "eval_steps_per_second": 54.261,
2231
+ "step": 1616000
2232
+ },
2233
+ {
2234
+ "epoch": 33.27,
2235
+ "eval_loss": 3.604905128479004,
2236
+ "eval_runtime": 47.3478,
2237
+ "eval_samples_per_second": 868.087,
2238
+ "eval_steps_per_second": 54.258,
2239
+ "step": 1624000
2240
+ },
2241
+ {
2242
+ "epoch": 33.44,
2243
+ "learning_rate": 3.2018677561911115e-06,
2244
+ "loss": 3.4406,
2245
+ "step": 1632000
2246
+ },
2247
+ {
2248
+ "epoch": 33.44,
2249
+ "eval_loss": 3.6285159587860107,
2250
+ "eval_runtime": 45.2665,
2251
+ "eval_samples_per_second": 908.001,
2252
+ "eval_steps_per_second": 56.753,
2253
+ "step": 1632000
2254
+ },
2255
+ {
2256
+ "epoch": 33.6,
2257
+ "eval_loss": 3.6107122898101807,
2258
+ "eval_runtime": 47.0075,
2259
+ "eval_samples_per_second": 874.372,
2260
+ "eval_steps_per_second": 54.651,
2261
+ "step": 1640000
2262
+ },
2263
+ {
2264
+ "epoch": 33.76,
2265
+ "learning_rate": 3.1351621779371306e-06,
2266
+ "loss": 3.4522,
2267
+ "step": 1648000
2268
+ },
2269
+ {
2270
+ "epoch": 33.76,
2271
+ "eval_loss": 3.6080775260925293,
2272
+ "eval_runtime": 46.384,
2273
+ "eval_samples_per_second": 886.124,
2274
+ "eval_steps_per_second": 55.385,
2275
+ "step": 1648000
2276
+ },
2277
+ {
2278
+ "epoch": 33.93,
2279
+ "eval_loss": 3.6121394634246826,
2280
+ "eval_runtime": 47.5808,
2281
+ "eval_samples_per_second": 863.836,
2282
+ "eval_steps_per_second": 53.992,
2283
+ "step": 1656000
2284
+ },
2285
+ {
2286
+ "epoch": 34.09,
2287
+ "learning_rate": 3.0684565996831487e-06,
2288
+ "loss": 3.4592,
2289
+ "step": 1664000
2290
+ },
2291
+ {
2292
+ "epoch": 34.09,
2293
+ "eval_loss": 3.639568567276001,
2294
+ "eval_runtime": 47.4907,
2295
+ "eval_samples_per_second": 865.474,
2296
+ "eval_steps_per_second": 54.095,
2297
+ "step": 1664000
2298
+ },
2299
+ {
2300
+ "epoch": 34.26,
2301
+ "eval_loss": 3.628408432006836,
2302
+ "eval_runtime": 45.8805,
2303
+ "eval_samples_per_second": 895.849,
2304
+ "eval_steps_per_second": 55.993,
2305
+ "step": 1672000
2306
+ },
2307
+ {
2308
+ "epoch": 34.42,
2309
+ "learning_rate": 3.0017510214291673e-06,
2310
+ "loss": 3.4587,
2311
+ "step": 1680000
2312
+ },
2313
+ {
2314
+ "epoch": 34.42,
2315
+ "eval_loss": 3.619464635848999,
2316
+ "eval_runtime": 46.7813,
2317
+ "eval_samples_per_second": 878.599,
2318
+ "eval_steps_per_second": 54.915,
2319
+ "step": 1680000
2320
+ },
2321
+ {
2322
+ "epoch": 34.58,
2323
+ "eval_loss": 3.6168148517608643,
2324
+ "eval_runtime": 46.0408,
2325
+ "eval_samples_per_second": 892.731,
2326
+ "eval_steps_per_second": 55.798,
2327
+ "step": 1688000
2328
+ },
2329
+ {
2330
+ "epoch": 34.75,
2331
+ "learning_rate": 2.9350454431751855e-06,
2332
+ "loss": 3.4589,
2333
+ "step": 1696000
2334
+ },
2335
+ {
2336
+ "epoch": 34.75,
2337
+ "eval_loss": 3.631527900695801,
2338
+ "eval_runtime": 45.9831,
2339
+ "eval_samples_per_second": 893.85,
2340
+ "eval_steps_per_second": 55.868,
2341
+ "step": 1696000
2342
+ },
2343
+ {
2344
+ "epoch": 34.91,
2345
+ "eval_loss": 3.6044745445251465,
2346
+ "eval_runtime": 46.5293,
2347
+ "eval_samples_per_second": 883.356,
2348
+ "eval_steps_per_second": 55.212,
2349
+ "step": 1704000
2350
+ },
2351
+ {
2352
+ "epoch": 35.08,
2353
+ "learning_rate": 2.868339864921204e-06,
2354
+ "loss": 3.4703,
2355
+ "step": 1712000
2356
+ },
2357
+ {
2358
+ "epoch": 35.08,
2359
+ "eval_loss": 3.6251227855682373,
2360
+ "eval_runtime": 45.5912,
2361
+ "eval_samples_per_second": 901.533,
2362
+ "eval_steps_per_second": 56.349,
2363
+ "step": 1712000
2364
+ },
2365
+ {
2366
+ "epoch": 35.24,
2367
+ "eval_loss": 3.6251931190490723,
2368
+ "eval_runtime": 45.7404,
2369
+ "eval_samples_per_second": 898.593,
2370
+ "eval_steps_per_second": 56.165,
2371
+ "step": 1720000
2372
+ },
2373
+ {
2374
+ "epoch": 35.4,
2375
+ "learning_rate": 2.801634286667223e-06,
2376
+ "loss": 3.4565,
2377
+ "step": 1728000
2378
+ },
2379
+ {
2380
+ "epoch": 35.4,
2381
+ "eval_loss": 3.62538743019104,
2382
+ "eval_runtime": 46.4207,
2383
+ "eval_samples_per_second": 885.423,
2384
+ "eval_steps_per_second": 55.342,
2385
+ "step": 1728000
2386
+ },
2387
+ {
2388
+ "epoch": 35.57,
2389
+ "eval_loss": 3.6544113159179688,
2390
+ "eval_runtime": 45.7864,
2391
+ "eval_samples_per_second": 897.691,
2392
+ "eval_steps_per_second": 56.108,
2393
+ "step": 1736000
2394
+ },
2395
+ {
2396
+ "epoch": 35.73,
2397
+ "learning_rate": 2.7349287084132413e-06,
2398
+ "loss": 3.4634,
2399
+ "step": 1744000
2400
+ },
2401
+ {
2402
+ "epoch": 35.73,
2403
+ "eval_loss": 3.629049062728882,
2404
+ "eval_runtime": 46.556,
2405
+ "eval_samples_per_second": 882.85,
2406
+ "eval_steps_per_second": 55.181,
2407
+ "step": 1744000
2408
+ },
2409
+ {
2410
+ "epoch": 35.9,
2411
+ "eval_loss": 3.612429618835449,
2412
+ "eval_runtime": 46.5059,
2413
+ "eval_samples_per_second": 883.802,
2414
+ "eval_steps_per_second": 55.24,
2415
+ "step": 1752000
2416
+ },
2417
+ {
2418
+ "epoch": 36.06,
2419
+ "learning_rate": 2.66822313015926e-06,
2420
+ "loss": 3.4625,
2421
+ "step": 1760000
2422
+ },
2423
+ {
2424
+ "epoch": 36.06,
2425
+ "eval_loss": 3.6262378692626953,
2426
+ "eval_runtime": 45.8554,
2427
+ "eval_samples_per_second": 896.34,
2428
+ "eval_steps_per_second": 56.024,
2429
+ "step": 1760000
2430
+ },
2431
+ {
2432
+ "epoch": 36.22,
2433
+ "eval_loss": 3.6317975521087646,
2434
+ "eval_runtime": 46.7318,
2435
+ "eval_samples_per_second": 879.529,
2436
+ "eval_steps_per_second": 54.973,
2437
+ "step": 1768000
2438
+ },
2439
+ {
2440
+ "epoch": 36.39,
2441
+ "learning_rate": 2.601517551905278e-06,
2442
+ "loss": 3.457,
2443
+ "step": 1776000
2444
+ },
2445
+ {
2446
+ "epoch": 36.39,
2447
+ "eval_loss": 3.640812397003174,
2448
+ "eval_runtime": 45.9688,
2449
+ "eval_samples_per_second": 894.129,
2450
+ "eval_steps_per_second": 55.886,
2451
+ "step": 1776000
2452
+ },
2453
+ {
2454
+ "epoch": 36.55,
2455
+ "eval_loss": 3.6433026790618896,
2456
+ "eval_runtime": 45.8154,
2457
+ "eval_samples_per_second": 897.122,
2458
+ "eval_steps_per_second": 56.073,
2459
+ "step": 1784000
2460
+ },
2461
+ {
2462
+ "epoch": 36.71,
2463
+ "learning_rate": 2.5348119736512967e-06,
2464
+ "loss": 3.4618,
2465
+ "step": 1792000
2466
+ },
2467
+ {
2468
+ "epoch": 36.71,
2469
+ "eval_loss": 3.627612352371216,
2470
+ "eval_runtime": 46.6149,
2471
+ "eval_samples_per_second": 881.735,
2472
+ "eval_steps_per_second": 55.111,
2473
+ "step": 1792000
2474
+ },
2475
+ {
2476
+ "epoch": 36.88,
2477
+ "eval_loss": 3.631366014480591,
2478
+ "eval_runtime": 46.0925,
2479
+ "eval_samples_per_second": 891.729,
2480
+ "eval_steps_per_second": 55.736,
2481
+ "step": 1800000
2482
+ },
2483
+ {
2484
+ "epoch": 37.04,
2485
+ "learning_rate": 2.4681063953973154e-06,
2486
+ "loss": 3.4611,
2487
+ "step": 1808000
2488
+ },
2489
+ {
2490
+ "epoch": 37.04,
2491
+ "eval_loss": 3.6415860652923584,
2492
+ "eval_runtime": 46.287,
2493
+ "eval_samples_per_second": 887.982,
2494
+ "eval_steps_per_second": 55.502,
2495
+ "step": 1808000
2496
+ },
2497
+ {
2498
+ "epoch": 37.21,
2499
+ "eval_loss": 3.665800094604492,
2500
+ "eval_runtime": 46.839,
2501
+ "eval_samples_per_second": 877.517,
2502
+ "eval_steps_per_second": 54.847,
2503
+ "step": 1816000
2504
+ },
2505
+ {
2506
+ "epoch": 37.37,
2507
+ "learning_rate": 2.4014008171433335e-06,
2508
+ "loss": 3.4651,
2509
+ "step": 1824000
2510
+ },
2511
+ {
2512
+ "epoch": 37.37,
2513
+ "eval_loss": 3.638195037841797,
2514
+ "eval_runtime": 46.0815,
2515
+ "eval_samples_per_second": 891.942,
2516
+ "eval_steps_per_second": 55.749,
2517
+ "step": 1824000
2518
+ },
2519
+ {
2520
+ "epoch": 37.53,
2521
+ "eval_loss": 3.656243085861206,
2522
+ "eval_runtime": 45.3257,
2523
+ "eval_samples_per_second": 906.815,
2524
+ "eval_steps_per_second": 56.679,
2525
+ "step": 1832000
2526
+ },
2527
+ {
2528
+ "epoch": 37.7,
2529
+ "learning_rate": 2.334695238889352e-06,
2530
+ "loss": 3.4625,
2531
+ "step": 1840000
2532
+ },
2533
+ {
2534
+ "epoch": 37.7,
2535
+ "eval_loss": 3.6376214027404785,
2536
+ "eval_runtime": 47.1734,
2537
+ "eval_samples_per_second": 871.296,
2538
+ "eval_steps_per_second": 54.459,
2539
+ "step": 1840000
2540
+ },
2541
+ {
2542
+ "epoch": 37.86,
2543
+ "eval_loss": 3.651963710784912,
2544
+ "eval_runtime": 46.059,
2545
+ "eval_samples_per_second": 892.377,
2546
+ "eval_steps_per_second": 55.776,
2547
+ "step": 1848000
2548
+ },
2549
+ {
2550
+ "epoch": 38.03,
2551
+ "learning_rate": 2.2679896606353707e-06,
2552
+ "loss": 3.4561,
2553
+ "step": 1856000
2554
+ },
2555
+ {
2556
+ "epoch": 38.03,
2557
+ "eval_loss": 3.6300716400146484,
2558
+ "eval_runtime": 46.8158,
2559
+ "eval_samples_per_second": 877.951,
2560
+ "eval_steps_per_second": 54.875,
2561
+ "step": 1856000
2562
+ },
2563
+ {
2564
+ "epoch": 38.19,
2565
+ "eval_loss": 3.619462728500366,
2566
+ "eval_runtime": 45.8596,
2567
+ "eval_samples_per_second": 896.258,
2568
+ "eval_steps_per_second": 56.019,
2569
+ "step": 1864000
2570
+ },
2571
+ {
2572
+ "epoch": 38.35,
2573
+ "learning_rate": 2.2012840823813894e-06,
2574
+ "loss": 3.4655,
2575
+ "step": 1872000
2576
+ },
2577
+ {
2578
+ "epoch": 38.35,
2579
+ "eval_loss": 3.6279447078704834,
2580
+ "eval_runtime": 46.2215,
2581
+ "eval_samples_per_second": 889.241,
2582
+ "eval_steps_per_second": 55.58,
2583
+ "step": 1872000
2584
+ },
2585
+ {
2586
+ "epoch": 38.52,
2587
+ "eval_loss": 3.636460542678833,
2588
+ "eval_runtime": 46.7533,
2589
+ "eval_samples_per_second": 879.125,
2590
+ "eval_steps_per_second": 54.948,
2591
+ "step": 1880000
2592
+ },
2593
+ {
2594
+ "epoch": 38.68,
2595
+ "learning_rate": 2.134578504127408e-06,
2596
+ "loss": 3.4637,
2597
+ "step": 1888000
2598
+ },
2599
+ {
2600
+ "epoch": 38.68,
2601
+ "eval_loss": 3.638620138168335,
2602
+ "eval_runtime": 46.2177,
2603
+ "eval_samples_per_second": 889.313,
2604
+ "eval_steps_per_second": 55.585,
2605
+ "step": 1888000
2606
+ },
2607
+ {
2608
+ "epoch": 38.85,
2609
+ "eval_loss": 3.643373489379883,
2610
+ "eval_runtime": 45.9947,
2611
+ "eval_samples_per_second": 893.624,
2612
+ "eval_steps_per_second": 55.854,
2613
+ "step": 1896000
2614
+ },
2615
+ {
2616
+ "epoch": 39.01,
2617
+ "learning_rate": 2.067872925873426e-06,
2618
+ "loss": 3.458,
2619
+ "step": 1904000
2620
+ },
2621
+ {
2622
+ "epoch": 39.01,
2623
+ "eval_loss": 3.65189266204834,
2624
+ "eval_runtime": 46.7003,
2625
+ "eval_samples_per_second": 880.122,
2626
+ "eval_steps_per_second": 55.01,
2627
+ "step": 1904000
2628
+ },
2629
+ {
2630
+ "epoch": 39.17,
2631
+ "eval_loss": 3.6438076496124268,
2632
+ "eval_runtime": 46.3785,
2633
+ "eval_samples_per_second": 886.229,
2634
+ "eval_steps_per_second": 55.392,
2635
+ "step": 1912000
2636
+ },
2637
+ {
2638
+ "epoch": 39.34,
2639
+ "learning_rate": 2.0011673476194448e-06,
2640
+ "loss": 3.4523,
2641
+ "step": 1920000
2642
+ },
2643
+ {
2644
+ "epoch": 39.34,
2645
+ "eval_loss": 3.640777349472046,
2646
+ "eval_runtime": 46.701,
2647
+ "eval_samples_per_second": 880.109,
2648
+ "eval_steps_per_second": 55.01,
2649
+ "step": 1920000
2650
+ },
2651
+ {
2652
+ "epoch": 39.5,
2653
+ "eval_loss": 3.6513171195983887,
2654
+ "eval_runtime": 46.884,
2655
+ "eval_samples_per_second": 876.675,
2656
+ "eval_steps_per_second": 54.795,
2657
+ "step": 1928000
2658
+ },
2659
+ {
2660
+ "epoch": 39.66,
2661
+ "learning_rate": 1.9344617693654634e-06,
2662
+ "loss": 3.4743,
2663
+ "step": 1936000
2664
+ },
2665
+ {
2666
+ "epoch": 39.66,
2667
+ "eval_loss": 3.6177797317504883,
2668
+ "eval_runtime": 46.0686,
2669
+ "eval_samples_per_second": 892.192,
2670
+ "eval_steps_per_second": 55.765,
2671
+ "step": 1936000
2672
+ },
2673
+ {
2674
+ "epoch": 39.83,
2675
+ "eval_loss": 3.6398518085479736,
2676
+ "eval_runtime": 46.8575,
2677
+ "eval_samples_per_second": 877.171,
2678
+ "eval_steps_per_second": 54.826,
2679
+ "step": 1944000
2680
+ },
2681
+ {
2682
+ "epoch": 39.99,
2683
+ "learning_rate": 1.867756191111482e-06,
2684
+ "loss": 3.4626,
2685
+ "step": 1952000
2686
+ },
2687
+ {
2688
+ "epoch": 39.99,
2689
+ "eval_loss": 3.624283790588379,
2690
+ "eval_runtime": 46.1682,
2691
+ "eval_samples_per_second": 890.266,
2692
+ "eval_steps_per_second": 55.644,
2693
+ "step": 1952000
2694
+ },
2695
+ {
2696
+ "epoch": 40.16,
2697
+ "eval_loss": 3.6325714588165283,
2698
+ "eval_runtime": 45.9837,
2699
+ "eval_samples_per_second": 893.838,
2700
+ "eval_steps_per_second": 55.868,
2701
+ "step": 1960000
2702
+ },
2703
+ {
2704
+ "epoch": 40.32,
2705
+ "learning_rate": 1.8010506128575004e-06,
2706
+ "loss": 3.4692,
2707
+ "step": 1968000
2708
+ },
2709
+ {
2710
+ "epoch": 40.32,
2711
+ "eval_loss": 3.6723103523254395,
2712
+ "eval_runtime": 46.8787,
2713
+ "eval_samples_per_second": 876.773,
2714
+ "eval_steps_per_second": 54.801,
2715
+ "step": 1968000
2716
+ },
2717
+ {
2718
+ "epoch": 40.48,
2719
+ "eval_loss": 3.6456410884857178,
2720
+ "eval_runtime": 46.0442,
2721
+ "eval_samples_per_second": 892.664,
2722
+ "eval_steps_per_second": 55.794,
2723
+ "step": 1976000
2724
+ },
2725
+ {
2726
+ "epoch": 40.65,
2727
+ "learning_rate": 1.734345034603519e-06,
2728
+ "loss": 3.4765,
2729
+ "step": 1984000
2730
+ },
2731
+ {
2732
+ "epoch": 40.65,
2733
+ "eval_loss": 3.6437156200408936,
2734
+ "eval_runtime": 45.2826,
2735
+ "eval_samples_per_second": 907.678,
2736
+ "eval_steps_per_second": 56.733,
2737
+ "step": 1984000
2738
+ },
2739
+ {
2740
+ "epoch": 40.81,
2741
+ "eval_loss": 3.647704839706421,
2742
+ "eval_runtime": 46.8981,
2743
+ "eval_samples_per_second": 876.41,
2744
+ "eval_steps_per_second": 54.778,
2745
+ "step": 1992000
2746
+ },
2747
+ {
2748
+ "epoch": 40.98,
2749
+ "learning_rate": 1.6676394563495374e-06,
2750
+ "loss": 3.4747,
2751
+ "step": 2000000
2752
+ },
2753
+ {
2754
+ "epoch": 40.98,
2755
+ "eval_loss": 3.638388156890869,
2756
+ "eval_runtime": 46.0328,
2757
+ "eval_samples_per_second": 892.886,
2758
+ "eval_steps_per_second": 55.808,
2759
+ "step": 2000000
2760
+ },
2761
+ {
2762
+ "epoch": 41.14,
2763
+ "eval_loss": 3.6370368003845215,
2764
+ "eval_runtime": 46.7372,
2765
+ "eval_samples_per_second": 879.427,
2766
+ "eval_steps_per_second": 54.967,
2767
+ "step": 2008000
2768
+ },
2769
+ {
2770
+ "epoch": 41.3,
2771
+ "learning_rate": 1.6009338780955558e-06,
2772
+ "loss": 3.4683,
2773
+ "step": 2016000
2774
+ },
2775
+ {
2776
+ "epoch": 41.3,
2777
+ "eval_loss": 3.662468433380127,
2778
+ "eval_runtime": 46.61,
2779
+ "eval_samples_per_second": 881.828,
2780
+ "eval_steps_per_second": 55.117,
2781
+ "step": 2016000
2782
+ },
2783
+ {
2784
+ "epoch": 41.47,
2785
+ "eval_loss": 3.6453213691711426,
2786
+ "eval_runtime": 45.8611,
2787
+ "eval_samples_per_second": 896.229,
2788
+ "eval_steps_per_second": 56.017,
2789
+ "step": 2024000
2790
+ },
2791
+ {
2792
+ "epoch": 41.63,
2793
+ "learning_rate": 1.5342282998415744e-06,
2794
+ "loss": 3.4599,
2795
+ "step": 2032000
2796
+ },
2797
+ {
2798
+ "epoch": 41.63,
2799
+ "eval_loss": 3.64886212348938,
2800
+ "eval_runtime": 46.762,
2801
+ "eval_samples_per_second": 878.962,
2802
+ "eval_steps_per_second": 54.938,
2803
+ "step": 2032000
2804
+ },
2805
+ {
2806
+ "epoch": 41.8,
2807
+ "eval_loss": 3.6310884952545166,
2808
+ "eval_runtime": 46.4576,
2809
+ "eval_samples_per_second": 884.72,
2810
+ "eval_steps_per_second": 55.298,
2811
+ "step": 2040000
2812
+ },
2813
+ {
2814
+ "epoch": 41.96,
2815
+ "learning_rate": 1.4675227215875928e-06,
2816
+ "loss": 3.4713,
2817
+ "step": 2048000
2818
+ },
2819
+ {
2820
+ "epoch": 41.96,
2821
+ "eval_loss": 3.619154691696167,
2822
+ "eval_runtime": 45.9184,
2823
+ "eval_samples_per_second": 895.109,
2824
+ "eval_steps_per_second": 55.947,
2825
+ "step": 2048000
2826
+ },
2827
+ {
2828
+ "epoch": 42.12,
2829
+ "eval_loss": 3.651060104370117,
2830
+ "eval_runtime": 47.0032,
2831
+ "eval_samples_per_second": 874.451,
2832
+ "eval_steps_per_second": 54.656,
2833
+ "step": 2056000
2834
+ },
2835
+ {
2836
+ "epoch": 42.29,
2837
+ "learning_rate": 1.4008171433336116e-06,
2838
+ "loss": 3.4677,
2839
+ "step": 2064000
2840
+ },
2841
+ {
2842
+ "epoch": 42.29,
2843
+ "eval_loss": 3.6425869464874268,
2844
+ "eval_runtime": 46.3503,
2845
+ "eval_samples_per_second": 886.769,
2846
+ "eval_steps_per_second": 55.426,
2847
+ "step": 2064000
2848
+ },
2849
+ {
2850
+ "epoch": 42.45,
2851
+ "eval_loss": 3.6362836360931396,
2852
+ "eval_runtime": 46.2845,
2853
+ "eval_samples_per_second": 888.029,
2854
+ "eval_steps_per_second": 55.505,
2855
+ "step": 2072000
2856
+ },
2857
+ {
2858
+ "epoch": 42.62,
2859
+ "learning_rate": 1.33411156507963e-06,
2860
+ "loss": 3.4689,
2861
+ "step": 2080000
2862
+ },
2863
+ {
2864
+ "epoch": 42.62,
2865
+ "eval_loss": 3.6378438472747803,
2866
+ "eval_runtime": 47.0132,
2867
+ "eval_samples_per_second": 874.265,
2868
+ "eval_steps_per_second": 54.644,
2869
+ "step": 2080000
2870
+ },
2871
+ {
2872
+ "epoch": 42.78,
2873
+ "eval_loss": 3.6450445652008057,
2874
+ "eval_runtime": 46.1055,
2875
+ "eval_samples_per_second": 891.478,
2876
+ "eval_steps_per_second": 55.72,
2877
+ "step": 2088000
2878
+ },
2879
+ {
2880
+ "epoch": 42.94,
2881
+ "learning_rate": 1.2674059868256484e-06,
2882
+ "loss": 3.4598,
2883
+ "step": 2096000
2884
+ },
2885
+ {
2886
+ "epoch": 42.94,
2887
+ "eval_loss": 3.64805006980896,
2888
+ "eval_runtime": 46.8684,
2889
+ "eval_samples_per_second": 876.967,
2890
+ "eval_steps_per_second": 54.813,
2891
+ "step": 2096000
2892
+ },
2893
+ {
2894
+ "epoch": 43.11,
2895
+ "eval_loss": 3.6675028800964355,
2896
+ "eval_runtime": 46.4765,
2897
+ "eval_samples_per_second": 884.36,
2898
+ "eval_steps_per_second": 55.275,
2899
+ "step": 2104000
2900
+ },
2901
+ {
2902
+ "epoch": 43.27,
2903
+ "learning_rate": 1.2007004085716668e-06,
2904
+ "loss": 3.4487,
2905
+ "step": 2112000
2906
+ },
2907
+ {
2908
+ "epoch": 43.27,
2909
+ "eval_loss": 3.6557657718658447,
2910
+ "eval_runtime": 46.0356,
2911
+ "eval_samples_per_second": 892.83,
2912
+ "eval_steps_per_second": 55.805,
2913
+ "step": 2112000
2914
+ },
2915
+ {
2916
+ "epoch": 43.43,
2917
+ "eval_loss": 3.6451427936553955,
2918
+ "eval_runtime": 47.3121,
2919
+ "eval_samples_per_second": 868.741,
2920
+ "eval_steps_per_second": 54.299,
2921
+ "step": 2120000
2922
+ },
2923
+ {
2924
+ "epoch": 43.6,
2925
+ "learning_rate": 1.1339948303176854e-06,
2926
+ "loss": 3.4555,
2927
+ "step": 2128000
2928
+ },
2929
+ {
2930
+ "epoch": 43.6,
2931
+ "eval_loss": 3.643132448196411,
2932
+ "eval_runtime": 46.2499,
2933
+ "eval_samples_per_second": 888.694,
2934
+ "eval_steps_per_second": 55.546,
2935
+ "step": 2128000
2936
+ },
2937
+ {
2938
+ "epoch": 43.76,
2939
+ "eval_loss": 3.6470389366149902,
2940
+ "eval_runtime": 45.8331,
2941
+ "eval_samples_per_second": 896.776,
2942
+ "eval_steps_per_second": 56.051,
2943
+ "step": 2136000
2944
+ },
2945
+ {
2946
+ "epoch": 43.93,
2947
+ "learning_rate": 1.067289252063704e-06,
2948
+ "loss": 3.4727,
2949
+ "step": 2144000
2950
+ },
2951
+ {
2952
+ "epoch": 43.93,
2953
+ "eval_loss": 3.6265406608581543,
2954
+ "eval_runtime": 47.1162,
2955
+ "eval_samples_per_second": 872.353,
2956
+ "eval_steps_per_second": 54.525,
2957
+ "step": 2144000
2958
+ },
2959
+ {
2960
+ "epoch": 44.09,
2961
+ "eval_loss": 3.6335132122039795,
2962
+ "eval_runtime": 45.9499,
2963
+ "eval_samples_per_second": 894.497,
2964
+ "eval_steps_per_second": 55.909,
2965
+ "step": 2152000
2966
+ },
2967
+ {
2968
+ "epoch": 44.25,
2969
+ "learning_rate": 1.0005836738097224e-06,
2970
+ "loss": 3.4626,
2971
+ "step": 2160000
2972
+ },
2973
+ {
2974
+ "epoch": 44.25,
2975
+ "eval_loss": 3.639557123184204,
2976
+ "eval_runtime": 46.75,
2977
+ "eval_samples_per_second": 879.187,
2978
+ "eval_steps_per_second": 54.952,
2979
+ "step": 2160000
2980
+ },
2981
+ {
2982
+ "epoch": 44.42,
2983
+ "eval_loss": 3.653687000274658,
2984
+ "eval_runtime": 47.165,
2985
+ "eval_samples_per_second": 871.452,
2986
+ "eval_steps_per_second": 54.468,
2987
+ "step": 2168000
2988
+ },
2989
+ {
2990
+ "epoch": 44.58,
2991
+ "learning_rate": 9.33878095555741e-07,
2992
+ "loss": 3.4724,
2993
+ "step": 2176000
2994
+ },
2995
+ {
2996
+ "epoch": 44.58,
2997
+ "eval_loss": 3.61678409576416,
2998
+ "eval_runtime": 46.2585,
2999
+ "eval_samples_per_second": 888.528,
3000
+ "eval_steps_per_second": 55.536,
3001
+ "step": 2176000
3002
+ },
3003
+ {
3004
+ "epoch": 44.75,
3005
+ "eval_loss": 3.644352674484253,
3006
+ "eval_runtime": 47.0469,
3007
+ "eval_samples_per_second": 873.64,
3008
+ "eval_steps_per_second": 54.605,
3009
+ "step": 2184000
3010
+ },
3011
+ {
3012
+ "epoch": 44.91,
3013
+ "learning_rate": 8.671725173017595e-07,
3014
+ "loss": 3.4545,
3015
+ "step": 2192000
3016
+ },
3017
+ {
3018
+ "epoch": 44.91,
3019
+ "eval_loss": 3.6440114974975586,
3020
+ "eval_runtime": 46.2426,
3021
+ "eval_samples_per_second": 888.835,
3022
+ "eval_steps_per_second": 55.555,
3023
+ "step": 2192000
3024
+ },
3025
+ {
3026
+ "epoch": 45.07,
3027
+ "eval_loss": 3.6327061653137207,
3028
+ "eval_runtime": 46.09,
3029
+ "eval_samples_per_second": 891.776,
3030
+ "eval_steps_per_second": 55.739,
3031
+ "step": 2200000
3032
+ },
3033
+ {
3034
+ "epoch": 45.24,
3035
+ "learning_rate": 8.004669390477779e-07,
3036
+ "loss": 3.461,
3037
+ "step": 2208000
3038
+ },
3039
+ {
3040
+ "epoch": 45.24,
3041
+ "eval_loss": 3.6362533569335938,
3042
+ "eval_runtime": 47.1445,
3043
+ "eval_samples_per_second": 871.831,
3044
+ "eval_steps_per_second": 54.492,
3045
+ "step": 2208000
3046
+ },
3047
+ {
3048
+ "epoch": 45.4,
3049
+ "eval_loss": 3.653747081756592,
3050
+ "eval_runtime": 46.2235,
3051
+ "eval_samples_per_second": 889.202,
3052
+ "eval_steps_per_second": 55.578,
3053
+ "step": 2216000
3054
+ },
3055
+ {
3056
+ "epoch": 45.57,
3057
+ "learning_rate": 7.337613607937964e-07,
3058
+ "loss": 3.4702,
3059
+ "step": 2224000
3060
+ },
3061
+ {
3062
+ "epoch": 45.57,
3063
+ "eval_loss": 3.6123247146606445,
3064
+ "eval_runtime": 46.168,
3065
+ "eval_samples_per_second": 890.27,
3066
+ "eval_steps_per_second": 55.645,
3067
+ "step": 2224000
3068
+ },
3069
+ {
3070
+ "epoch": 45.73,
3071
+ "eval_loss": 3.6554455757141113,
3072
+ "eval_runtime": 47.1193,
3073
+ "eval_samples_per_second": 872.296,
3074
+ "eval_steps_per_second": 54.521,
3075
+ "step": 2232000
3076
+ },
3077
+ {
3078
+ "epoch": 45.89,
3079
+ "learning_rate": 6.67055782539815e-07,
3080
+ "loss": 3.4565,
3081
+ "step": 2240000
3082
+ },
3083
+ {
3084
+ "epoch": 45.89,
3085
+ "eval_loss": 3.6522979736328125,
3086
+ "eval_runtime": 46.1449,
3087
+ "eval_samples_per_second": 890.716,
3088
+ "eval_steps_per_second": 55.672,
3089
+ "step": 2240000
3090
+ },
3091
+ {
3092
+ "epoch": 46.06,
3093
+ "eval_loss": 3.6339659690856934,
3094
+ "eval_runtime": 47.2579,
3095
+ "eval_samples_per_second": 869.739,
3096
+ "eval_steps_per_second": 54.361,
3097
+ "step": 2248000
3098
+ },
3099
+ {
3100
+ "epoch": 46.22,
3101
+ "learning_rate": 6.003502042858334e-07,
3102
+ "loss": 3.4517,
3103
+ "step": 2256000
3104
+ },
3105
+ {
3106
+ "epoch": 46.22,
3107
+ "eval_loss": 3.6459498405456543,
3108
+ "eval_runtime": 46.9038,
3109
+ "eval_samples_per_second": 876.305,
3110
+ "eval_steps_per_second": 54.772,
3111
+ "step": 2256000
3112
+ },
3113
+ {
3114
+ "epoch": 46.38,
3115
+ "eval_loss": 3.656141996383667,
3116
+ "eval_runtime": 46.3654,
3117
+ "eval_samples_per_second": 886.48,
3118
+ "eval_steps_per_second": 55.408,
3119
+ "step": 2264000
3120
+ },
3121
+ {
3122
+ "epoch": 46.55,
3123
+ "learning_rate": 5.33644626031852e-07,
3124
+ "loss": 3.4631,
3125
+ "step": 2272000
3126
+ },
3127
+ {
3128
+ "epoch": 46.55,
3129
+ "eval_loss": 3.6547927856445312,
3130
+ "eval_runtime": 47.1154,
3131
+ "eval_samples_per_second": 872.368,
3132
+ "eval_steps_per_second": 54.526,
3133
+ "step": 2272000
3134
+ },
3135
+ {
3136
+ "epoch": 46.71,
3137
+ "eval_loss": 3.6228716373443604,
3138
+ "eval_runtime": 46.2908,
3139
+ "eval_samples_per_second": 887.908,
3140
+ "eval_steps_per_second": 55.497,
3141
+ "step": 2280000
3142
+ },
3143
+ {
3144
+ "epoch": 46.88,
3145
+ "learning_rate": 4.669390477778705e-07,
3146
+ "loss": 3.4518,
3147
+ "step": 2288000
3148
+ },
3149
+ {
3150
+ "epoch": 46.88,
3151
+ "eval_loss": 3.6350128650665283,
3152
+ "eval_runtime": 46.3584,
3153
+ "eval_samples_per_second": 886.613,
3154
+ "eval_steps_per_second": 55.416,
3155
+ "step": 2288000
3156
+ },
3157
+ {
3158
+ "epoch": 47.04,
3159
+ "eval_loss": 3.6483192443847656,
3160
+ "eval_runtime": 47.24,
3161
+ "eval_samples_per_second": 870.067,
3162
+ "eval_steps_per_second": 54.382,
3163
+ "step": 2296000
3164
+ },
3165
+ {
3166
+ "epoch": 47.2,
3167
+ "learning_rate": 4.0023346952388894e-07,
3168
+ "loss": 3.4592,
3169
+ "step": 2304000
3170
+ },
3171
+ {
3172
+ "epoch": 47.2,
3173
+ "eval_loss": 3.6263089179992676,
3174
+ "eval_runtime": 47.0185,
3175
+ "eval_samples_per_second": 874.166,
3176
+ "eval_steps_per_second": 54.638,
3177
+ "step": 2304000
3178
+ },
3179
+ {
3180
+ "epoch": 47.37,
3181
+ "eval_loss": 3.6339097023010254,
3182
+ "eval_runtime": 46.0199,
3183
+ "eval_samples_per_second": 893.135,
3184
+ "eval_steps_per_second": 55.824,
3185
+ "step": 2312000
3186
+ },
3187
+ {
3188
+ "epoch": 47.53,
3189
+ "learning_rate": 3.335278912699075e-07,
3190
+ "loss": 3.4569,
3191
+ "step": 2320000
3192
+ },
3193
+ {
3194
+ "epoch": 47.53,
3195
+ "eval_loss": 3.659444808959961,
3196
+ "eval_runtime": 47.1636,
3197
+ "eval_samples_per_second": 871.477,
3198
+ "eval_steps_per_second": 54.47,
3199
+ "step": 2320000
3200
+ },
3201
+ {
3202
+ "epoch": 47.7,
3203
+ "eval_loss": 3.638535737991333,
3204
+ "eval_runtime": 46.1693,
3205
+ "eval_samples_per_second": 890.246,
3206
+ "eval_steps_per_second": 55.643,
3207
+ "step": 2328000
3208
+ },
3209
+ {
3210
+ "epoch": 47.86,
3211
+ "learning_rate": 2.66822313015926e-07,
3212
+ "loss": 3.4524,
3213
+ "step": 2336000
3214
+ },
3215
+ {
3216
+ "epoch": 47.86,
3217
+ "eval_loss": 3.6434078216552734,
3218
+ "eval_runtime": 47.0318,
3219
+ "eval_samples_per_second": 873.919,
3220
+ "eval_steps_per_second": 54.623,
3221
+ "step": 2336000
3222
+ },
3223
+ {
3224
+ "epoch": 48.02,
3225
+ "eval_loss": 3.650230646133423,
3226
+ "eval_runtime": 46.5514,
3227
+ "eval_samples_per_second": 882.938,
3228
+ "eval_steps_per_second": 55.186,
3229
+ "step": 2344000
3230
+ },
3231
+ {
3232
+ "epoch": 48.19,
3233
+ "learning_rate": 2.0011673476194447e-07,
3234
+ "loss": 3.4644,
3235
+ "step": 2352000
3236
+ },
3237
+ {
3238
+ "epoch": 48.19,
3239
+ "eval_loss": 3.617619276046753,
3240
+ "eval_runtime": 46.2116,
3241
+ "eval_samples_per_second": 889.43,
3242
+ "eval_steps_per_second": 55.592,
3243
+ "step": 2352000
3244
+ },
3245
+ {
3246
+ "epoch": 48.35,
3247
+ "eval_loss": 3.6293184757232666,
3248
+ "eval_runtime": 47.399,
3249
+ "eval_samples_per_second": 867.15,
3250
+ "eval_steps_per_second": 54.199,
3251
+ "step": 2360000
3252
+ },
3253
+ {
3254
+ "epoch": 48.52,
3255
+ "learning_rate": 1.33411156507963e-07,
3256
+ "loss": 3.4586,
3257
+ "step": 2368000
3258
+ },
3259
+ {
3260
+ "epoch": 48.52,
3261
+ "eval_loss": 3.630380392074585,
3262
+ "eval_runtime": 46.3912,
3263
+ "eval_samples_per_second": 885.987,
3264
+ "eval_steps_per_second": 55.377,
3265
+ "step": 2368000
3266
+ },
3267
+ {
3268
+ "epoch": 48.68,
3269
+ "eval_loss": 3.6343326568603516,
3270
+ "eval_runtime": 46.2144,
3271
+ "eval_samples_per_second": 889.376,
3272
+ "eval_steps_per_second": 55.589,
3273
+ "step": 2376000
3274
+ },
+ {
+ "epoch": 48.84,
+ "learning_rate": 6.67055782539815e-08,
+ "loss": 3.4439,
+ "step": 2384000
+ },
+ {
+ "epoch": 48.84,
+ "eval_loss": 3.6090333461761475,
+ "eval_runtime": 47.3482,
+ "eval_samples_per_second": 868.08,
+ "eval_steps_per_second": 54.258,
+ "step": 2384000
+ },
+ {
+ "epoch": 49.01,
+ "eval_loss": 3.6414153575897217,
+ "eval_runtime": 46.5994,
+ "eval_samples_per_second": 882.029,
+ "eval_steps_per_second": 55.13,
+ "step": 2392000
+ },
+ {
+ "epoch": 49.17,
+ "learning_rate": 0.0,
+ "loss": 3.4474,
+ "step": 2400000
+ },
+ {
+ "epoch": 49.17,
+ "eval_loss": 3.620838165283203,
+ "eval_runtime": 46.9825,
+ "eval_samples_per_second": 874.835,
+ "eval_steps_per_second": 54.68,
+ "step": 2400000
+ },
+ {
+ "epoch": 49.17,
+ "step": 2400000,
+ "total_flos": 6.906141294629226e+17,
+ "train_loss": 3.376089767252604,
+ "train_runtime": 158003.2062,
+ "train_samples_per_second": 243.033,
+ "train_steps_per_second": 15.19
+ }
+ ],
+ "logging_steps": 16000,
+ "max_steps": 2400000,
+ "num_train_epochs": 50,
+ "save_steps": 32000,
+ "total_flos": 6.906141294629226e+17,
+ "trial_name": null,
+ "trial_params": null
+ }
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:015483c8d7a8f28e3cfc7685d3435adc6087ae5ef117bbe1f324d578dc554544
+ size 4219
vocab.json ADDED
The diff for this file is too large to render.