stefan-it committed on
Commit
9040ab4
1 Parent(s): fac3561

Upload ./training.log with huggingface_hub

Files changed (1)
  1. training.log +505 -0
training.log ADDED
@@ -0,0 +1,505 @@
1
+ 2023-10-24 22:31:59,069 ----------------------------------------------------------------------------------------------------
2
+ 2023-10-24 22:31:59,070 Model: "SequenceTagger(
3
+ (embeddings): TransformerWordEmbeddings(
4
+ (model): BertModel(
5
+ (embeddings): BertEmbeddings(
6
+ (word_embeddings): Embedding(64001, 768)
7
+ (position_embeddings): Embedding(512, 768)
8
+ (token_type_embeddings): Embedding(2, 768)
9
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
10
+ (dropout): Dropout(p=0.1, inplace=False)
11
+ )
12
+ (encoder): BertEncoder(
13
+ (layer): ModuleList(
14
+ (0): BertLayer(
15
+ (attention): BertAttention(
16
+ (self): BertSelfAttention(
17
+ (query): Linear(in_features=768, out_features=768, bias=True)
18
+ (key): Linear(in_features=768, out_features=768, bias=True)
19
+ (value): Linear(in_features=768, out_features=768, bias=True)
20
+ (dropout): Dropout(p=0.1, inplace=False)
21
+ )
22
+ (output): BertSelfOutput(
23
+ (dense): Linear(in_features=768, out_features=768, bias=True)
24
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
25
+ (dropout): Dropout(p=0.1, inplace=False)
26
+ )
27
+ )
28
+ (intermediate): BertIntermediate(
29
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
30
+ (intermediate_act_fn): GELUActivation()
31
+ )
32
+ (output): BertOutput(
33
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
34
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
35
+ (dropout): Dropout(p=0.1, inplace=False)
36
+ )
37
+ )
38
+ (1): BertLayer(
39
+ (attention): BertAttention(
40
+ (self): BertSelfAttention(
41
+ (query): Linear(in_features=768, out_features=768, bias=True)
42
+ (key): Linear(in_features=768, out_features=768, bias=True)
43
+ (value): Linear(in_features=768, out_features=768, bias=True)
44
+ (dropout): Dropout(p=0.1, inplace=False)
45
+ )
46
+ (output): BertSelfOutput(
47
+ (dense): Linear(in_features=768, out_features=768, bias=True)
48
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
49
+ (dropout): Dropout(p=0.1, inplace=False)
50
+ )
51
+ )
52
+ (intermediate): BertIntermediate(
53
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
54
+ (intermediate_act_fn): GELUActivation()
55
+ )
56
+ (output): BertOutput(
57
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
58
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
59
+ (dropout): Dropout(p=0.1, inplace=False)
60
+ )
61
+ )
62
+ (2): BertLayer(
63
+ (attention): BertAttention(
64
+ (self): BertSelfAttention(
65
+ (query): Linear(in_features=768, out_features=768, bias=True)
66
+ (key): Linear(in_features=768, out_features=768, bias=True)
67
+ (value): Linear(in_features=768, out_features=768, bias=True)
68
+ (dropout): Dropout(p=0.1, inplace=False)
69
+ )
70
+ (output): BertSelfOutput(
71
+ (dense): Linear(in_features=768, out_features=768, bias=True)
72
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
73
+ (dropout): Dropout(p=0.1, inplace=False)
74
+ )
75
+ )
76
+ (intermediate): BertIntermediate(
77
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
78
+ (intermediate_act_fn): GELUActivation()
79
+ )
80
+ (output): BertOutput(
81
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
82
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
83
+ (dropout): Dropout(p=0.1, inplace=False)
84
+ )
85
+ )
86
+ (3): BertLayer(
87
+ (attention): BertAttention(
88
+ (self): BertSelfAttention(
89
+ (query): Linear(in_features=768, out_features=768, bias=True)
90
+ (key): Linear(in_features=768, out_features=768, bias=True)
91
+ (value): Linear(in_features=768, out_features=768, bias=True)
92
+ (dropout): Dropout(p=0.1, inplace=False)
93
+ )
94
+ (output): BertSelfOutput(
95
+ (dense): Linear(in_features=768, out_features=768, bias=True)
96
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
97
+ (dropout): Dropout(p=0.1, inplace=False)
98
+ )
99
+ )
100
+ (intermediate): BertIntermediate(
101
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
102
+ (intermediate_act_fn): GELUActivation()
103
+ )
104
+ (output): BertOutput(
105
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
106
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
107
+ (dropout): Dropout(p=0.1, inplace=False)
108
+ )
109
+ )
110
+ (4): BertLayer(
111
+ (attention): BertAttention(
112
+ (self): BertSelfAttention(
113
+ (query): Linear(in_features=768, out_features=768, bias=True)
114
+ (key): Linear(in_features=768, out_features=768, bias=True)
115
+ (value): Linear(in_features=768, out_features=768, bias=True)
116
+ (dropout): Dropout(p=0.1, inplace=False)
117
+ )
118
+ (output): BertSelfOutput(
119
+ (dense): Linear(in_features=768, out_features=768, bias=True)
120
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
121
+ (dropout): Dropout(p=0.1, inplace=False)
122
+ )
123
+ )
124
+ (intermediate): BertIntermediate(
125
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
126
+ (intermediate_act_fn): GELUActivation()
127
+ )
128
+ (output): BertOutput(
129
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
130
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
131
+ (dropout): Dropout(p=0.1, inplace=False)
132
+ )
133
+ )
134
+ (5): BertLayer(
135
+ (attention): BertAttention(
136
+ (self): BertSelfAttention(
137
+ (query): Linear(in_features=768, out_features=768, bias=True)
138
+ (key): Linear(in_features=768, out_features=768, bias=True)
139
+ (value): Linear(in_features=768, out_features=768, bias=True)
140
+ (dropout): Dropout(p=0.1, inplace=False)
141
+ )
142
+ (output): BertSelfOutput(
143
+ (dense): Linear(in_features=768, out_features=768, bias=True)
144
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
145
+ (dropout): Dropout(p=0.1, inplace=False)
146
+ )
147
+ )
148
+ (intermediate): BertIntermediate(
149
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
150
+ (intermediate_act_fn): GELUActivation()
151
+ )
152
+ (output): BertOutput(
153
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
154
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
155
+ (dropout): Dropout(p=0.1, inplace=False)
156
+ )
157
+ )
158
+ (6): BertLayer(
159
+ (attention): BertAttention(
160
+ (self): BertSelfAttention(
161
+ (query): Linear(in_features=768, out_features=768, bias=True)
162
+ (key): Linear(in_features=768, out_features=768, bias=True)
163
+ (value): Linear(in_features=768, out_features=768, bias=True)
164
+ (dropout): Dropout(p=0.1, inplace=False)
165
+ )
166
+ (output): BertSelfOutput(
167
+ (dense): Linear(in_features=768, out_features=768, bias=True)
168
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
169
+ (dropout): Dropout(p=0.1, inplace=False)
170
+ )
171
+ )
172
+ (intermediate): BertIntermediate(
173
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
174
+ (intermediate_act_fn): GELUActivation()
175
+ )
176
+ (output): BertOutput(
177
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
178
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
179
+ (dropout): Dropout(p=0.1, inplace=False)
180
+ )
181
+ )
182
+ (7): BertLayer(
183
+ (attention): BertAttention(
184
+ (self): BertSelfAttention(
185
+ (query): Linear(in_features=768, out_features=768, bias=True)
186
+ (key): Linear(in_features=768, out_features=768, bias=True)
187
+ (value): Linear(in_features=768, out_features=768, bias=True)
188
+ (dropout): Dropout(p=0.1, inplace=False)
189
+ )
190
+ (output): BertSelfOutput(
191
+ (dense): Linear(in_features=768, out_features=768, bias=True)
192
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
193
+ (dropout): Dropout(p=0.1, inplace=False)
194
+ )
195
+ )
196
+ (intermediate): BertIntermediate(
197
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
198
+ (intermediate_act_fn): GELUActivation()
199
+ )
200
+ (output): BertOutput(
201
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
202
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
203
+ (dropout): Dropout(p=0.1, inplace=False)
204
+ )
205
+ )
206
+ (8): BertLayer(
207
+ (attention): BertAttention(
208
+ (self): BertSelfAttention(
209
+ (query): Linear(in_features=768, out_features=768, bias=True)
210
+ (key): Linear(in_features=768, out_features=768, bias=True)
211
+ (value): Linear(in_features=768, out_features=768, bias=True)
212
+ (dropout): Dropout(p=0.1, inplace=False)
213
+ )
214
+ (output): BertSelfOutput(
215
+ (dense): Linear(in_features=768, out_features=768, bias=True)
216
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
217
+ (dropout): Dropout(p=0.1, inplace=False)
218
+ )
219
+ )
220
+ (intermediate): BertIntermediate(
221
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
222
+ (intermediate_act_fn): GELUActivation()
223
+ )
224
+ (output): BertOutput(
225
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
226
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
227
+ (dropout): Dropout(p=0.1, inplace=False)
228
+ )
229
+ )
230
+ (9): BertLayer(
231
+ (attention): BertAttention(
232
+ (self): BertSelfAttention(
233
+ (query): Linear(in_features=768, out_features=768, bias=True)
234
+ (key): Linear(in_features=768, out_features=768, bias=True)
235
+ (value): Linear(in_features=768, out_features=768, bias=True)
236
+ (dropout): Dropout(p=0.1, inplace=False)
237
+ )
238
+ (output): BertSelfOutput(
239
+ (dense): Linear(in_features=768, out_features=768, bias=True)
240
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
241
+ (dropout): Dropout(p=0.1, inplace=False)
242
+ )
243
+ )
244
+ (intermediate): BertIntermediate(
245
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
246
+ (intermediate_act_fn): GELUActivation()
247
+ )
248
+ (output): BertOutput(
249
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
250
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
251
+ (dropout): Dropout(p=0.1, inplace=False)
252
+ )
253
+ )
254
+ (10): BertLayer(
255
+ (attention): BertAttention(
256
+ (self): BertSelfAttention(
257
+ (query): Linear(in_features=768, out_features=768, bias=True)
258
+ (key): Linear(in_features=768, out_features=768, bias=True)
259
+ (value): Linear(in_features=768, out_features=768, bias=True)
260
+ (dropout): Dropout(p=0.1, inplace=False)
261
+ )
262
+ (output): BertSelfOutput(
263
+ (dense): Linear(in_features=768, out_features=768, bias=True)
264
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
265
+ (dropout): Dropout(p=0.1, inplace=False)
266
+ )
267
+ )
268
+ (intermediate): BertIntermediate(
269
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
270
+ (intermediate_act_fn): GELUActivation()
271
+ )
272
+ (output): BertOutput(
273
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
274
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
275
+ (dropout): Dropout(p=0.1, inplace=False)
276
+ )
277
+ )
278
+ (11): BertLayer(
279
+ (attention): BertAttention(
280
+ (self): BertSelfAttention(
281
+ (query): Linear(in_features=768, out_features=768, bias=True)
282
+ (key): Linear(in_features=768, out_features=768, bias=True)
283
+ (value): Linear(in_features=768, out_features=768, bias=True)
284
+ (dropout): Dropout(p=0.1, inplace=False)
285
+ )
286
+ (output): BertSelfOutput(
287
+ (dense): Linear(in_features=768, out_features=768, bias=True)
288
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
289
+ (dropout): Dropout(p=0.1, inplace=False)
290
+ )
291
+ )
292
+ (intermediate): BertIntermediate(
293
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
294
+ (intermediate_act_fn): GELUActivation()
295
+ )
296
+ (output): BertOutput(
297
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
298
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
299
+ (dropout): Dropout(p=0.1, inplace=False)
300
+ )
301
+ )
302
+ )
303
+ )
304
+ (pooler): BertPooler(
305
+ (dense): Linear(in_features=768, out_features=768, bias=True)
306
+ (activation): Tanh()
307
+ )
308
+ )
309
+ )
310
+ (locked_dropout): LockedDropout(p=0.5)
311
+ (linear): Linear(in_features=768, out_features=13, bias=True)
312
+ (loss_function): CrossEntropyLoss()
313
+ )"
314
+ 2023-10-24 22:31:59,071 ----------------------------------------------------------------------------------------------------
315
+ 2023-10-24 22:31:59,071 MultiCorpus: 5777 train + 722 dev + 723 test sentences
316
+ - NER_ICDAR_EUROPEANA Corpus: 5777 train + 722 dev + 723 test sentences - /home/ubuntu/.flair/datasets/ner_icdar_europeana/nl
317
+ 2023-10-24 22:31:59,071 ----------------------------------------------------------------------------------------------------
318
+ 2023-10-24 22:31:59,071 Train: 5777 sentences
319
+ 2023-10-24 22:31:59,071 (train_with_dev=False, train_with_test=False)
320
+ 2023-10-24 22:31:59,071 ----------------------------------------------------------------------------------------------------
321
+ 2023-10-24 22:31:59,071 Training Params:
322
+ 2023-10-24 22:31:59,071 - learning_rate: "3e-05"
323
+ 2023-10-24 22:31:59,071 - mini_batch_size: "8"
324
+ 2023-10-24 22:31:59,071 - max_epochs: "10"
325
+ 2023-10-24 22:31:59,071 - shuffle: "True"
326
+ 2023-10-24 22:31:59,071 ----------------------------------------------------------------------------------------------------
327
+ 2023-10-24 22:31:59,071 Plugins:
328
+ 2023-10-24 22:31:59,071 - TensorboardLogger
329
+ 2023-10-24 22:31:59,072 - LinearScheduler | warmup_fraction: '0.1'
330
+ 2023-10-24 22:31:59,072 ----------------------------------------------------------------------------------------------------
331
+ 2023-10-24 22:31:59,072 Final evaluation on model from best epoch (best-model.pt)
332
+ 2023-10-24 22:31:59,072 - metric: "('micro avg', 'f1-score')"
333
+ 2023-10-24 22:31:59,072 ----------------------------------------------------------------------------------------------------
334
+ 2023-10-24 22:31:59,072 Computation:
335
+ 2023-10-24 22:31:59,072 - compute on device: cuda:0
336
+ 2023-10-24 22:31:59,072 - embedding storage: none
337
+ 2023-10-24 22:31:59,072 ----------------------------------------------------------------------------------------------------
338
+ 2023-10-24 22:31:59,072 Model training base path: "hmbench-icdar/nl-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1"
339
+ 2023-10-24 22:31:59,072 ----------------------------------------------------------------------------------------------------
340
+ 2023-10-24 22:31:59,072 ----------------------------------------------------------------------------------------------------
341
+ 2023-10-24 22:31:59,072 Logging anything other than scalars to TensorBoard is currently not supported.
342
+ 2023-10-24 22:32:07,563 epoch 1 - iter 72/723 - loss 2.31947311 - time (sec): 8.49 - samples/sec: 2083.66 - lr: 0.000003 - momentum: 0.000000
343
+ 2023-10-24 22:32:16,346 epoch 1 - iter 144/723 - loss 1.32909159 - time (sec): 17.27 - samples/sec: 2038.93 - lr: 0.000006 - momentum: 0.000000
344
+ 2023-10-24 22:32:25,292 epoch 1 - iter 216/723 - loss 0.94456255 - time (sec): 26.22 - samples/sec: 2064.96 - lr: 0.000009 - momentum: 0.000000
345
+ 2023-10-24 22:32:33,521 epoch 1 - iter 288/723 - loss 0.76663770 - time (sec): 34.45 - samples/sec: 2047.33 - lr: 0.000012 - momentum: 0.000000
346
+ 2023-10-24 22:32:41,645 epoch 1 - iter 360/723 - loss 0.64951811 - time (sec): 42.57 - samples/sec: 2047.01 - lr: 0.000015 - momentum: 0.000000
347
+ 2023-10-24 22:32:49,977 epoch 1 - iter 432/723 - loss 0.57340174 - time (sec): 50.90 - samples/sec: 2047.02 - lr: 0.000018 - momentum: 0.000000
348
+ 2023-10-24 22:32:58,323 epoch 1 - iter 504/723 - loss 0.51388230 - time (sec): 59.25 - samples/sec: 2039.16 - lr: 0.000021 - momentum: 0.000000
349
+ 2023-10-24 22:33:07,442 epoch 1 - iter 576/723 - loss 0.46626848 - time (sec): 68.37 - samples/sec: 2030.93 - lr: 0.000024 - momentum: 0.000000
350
+ 2023-10-24 22:33:16,119 epoch 1 - iter 648/723 - loss 0.42649097 - time (sec): 77.05 - samples/sec: 2038.95 - lr: 0.000027 - momentum: 0.000000
351
+ 2023-10-24 22:33:25,266 epoch 1 - iter 720/723 - loss 0.39365540 - time (sec): 86.19 - samples/sec: 2039.10 - lr: 0.000030 - momentum: 0.000000
352
+ 2023-10-24 22:33:25,516 ----------------------------------------------------------------------------------------------------
353
+ 2023-10-24 22:33:25,516 EPOCH 1 done: loss 0.3931 - lr: 0.000030
354
+ 2023-10-24 22:33:28,789 DEV : loss 0.13080401718616486 - f1-score (micro avg) 0.5705
355
+ 2023-10-24 22:33:28,801 saving best model
356
+ 2023-10-24 22:33:29,271 ----------------------------------------------------------------------------------------------------
357
+ 2023-10-24 22:33:37,630 epoch 2 - iter 72/723 - loss 0.11662981 - time (sec): 8.36 - samples/sec: 2039.80 - lr: 0.000030 - momentum: 0.000000
358
+ 2023-10-24 22:33:45,571 epoch 2 - iter 144/723 - loss 0.11114776 - time (sec): 16.30 - samples/sec: 2050.45 - lr: 0.000029 - momentum: 0.000000
359
+ 2023-10-24 22:33:53,911 epoch 2 - iter 216/723 - loss 0.10664717 - time (sec): 24.64 - samples/sec: 2054.36 - lr: 0.000029 - momentum: 0.000000
360
+ 2023-10-24 22:34:03,044 epoch 2 - iter 288/723 - loss 0.10255450 - time (sec): 33.77 - samples/sec: 2051.08 - lr: 0.000029 - momentum: 0.000000
361
+ 2023-10-24 22:34:12,351 epoch 2 - iter 360/723 - loss 0.09854358 - time (sec): 43.08 - samples/sec: 2054.75 - lr: 0.000028 - momentum: 0.000000
362
+ 2023-10-24 22:34:21,686 epoch 2 - iter 432/723 - loss 0.09691315 - time (sec): 52.41 - samples/sec: 2047.19 - lr: 0.000028 - momentum: 0.000000
363
+ 2023-10-24 22:34:30,078 epoch 2 - iter 504/723 - loss 0.09403782 - time (sec): 60.81 - samples/sec: 2046.78 - lr: 0.000028 - momentum: 0.000000
364
+ 2023-10-24 22:34:37,733 epoch 2 - iter 576/723 - loss 0.09689121 - time (sec): 68.46 - samples/sec: 2048.84 - lr: 0.000027 - momentum: 0.000000
365
+ 2023-10-24 22:34:46,175 epoch 2 - iter 648/723 - loss 0.09667821 - time (sec): 76.90 - samples/sec: 2048.92 - lr: 0.000027 - momentum: 0.000000
366
+ 2023-10-24 22:34:54,725 epoch 2 - iter 720/723 - loss 0.09635443 - time (sec): 85.45 - samples/sec: 2054.74 - lr: 0.000027 - momentum: 0.000000
367
+ 2023-10-24 22:34:54,971 ----------------------------------------------------------------------------------------------------
368
+ 2023-10-24 22:34:54,971 EPOCH 2 done: loss 0.0964 - lr: 0.000027
369
+ 2023-10-24 22:34:58,678 DEV : loss 0.07759504020214081 - f1-score (micro avg) 0.8195
370
+ 2023-10-24 22:34:58,690 saving best model
371
+ 2023-10-24 22:34:59,285 ----------------------------------------------------------------------------------------------------
372
+ 2023-10-24 22:35:07,943 epoch 3 - iter 72/723 - loss 0.06713425 - time (sec): 8.66 - samples/sec: 2019.48 - lr: 0.000026 - momentum: 0.000000
373
+ 2023-10-24 22:35:16,408 epoch 3 - iter 144/723 - loss 0.05848043 - time (sec): 17.12 - samples/sec: 2041.94 - lr: 0.000026 - momentum: 0.000000
374
+ 2023-10-24 22:35:24,682 epoch 3 - iter 216/723 - loss 0.06611272 - time (sec): 25.40 - samples/sec: 2057.98 - lr: 0.000026 - momentum: 0.000000
375
+ 2023-10-24 22:35:33,448 epoch 3 - iter 288/723 - loss 0.06373711 - time (sec): 34.16 - samples/sec: 2062.71 - lr: 0.000025 - momentum: 0.000000
376
+ 2023-10-24 22:35:42,258 epoch 3 - iter 360/723 - loss 0.06368511 - time (sec): 42.97 - samples/sec: 2052.83 - lr: 0.000025 - momentum: 0.000000
377
+ 2023-10-24 22:35:51,380 epoch 3 - iter 432/723 - loss 0.06405055 - time (sec): 52.09 - samples/sec: 2054.55 - lr: 0.000025 - momentum: 0.000000
378
+ 2023-10-24 22:35:59,698 epoch 3 - iter 504/723 - loss 0.06454130 - time (sec): 60.41 - samples/sec: 2043.44 - lr: 0.000024 - momentum: 0.000000
379
+ 2023-10-24 22:36:08,056 epoch 3 - iter 576/723 - loss 0.06344541 - time (sec): 68.77 - samples/sec: 2037.40 - lr: 0.000024 - momentum: 0.000000
380
+ 2023-10-24 22:36:16,742 epoch 3 - iter 648/723 - loss 0.06359618 - time (sec): 77.46 - samples/sec: 2037.63 - lr: 0.000024 - momentum: 0.000000
381
+ 2023-10-24 22:36:25,498 epoch 3 - iter 720/723 - loss 0.06294517 - time (sec): 86.21 - samples/sec: 2040.20 - lr: 0.000023 - momentum: 0.000000
382
+ 2023-10-24 22:36:25,702 ----------------------------------------------------------------------------------------------------
383
+ 2023-10-24 22:36:25,702 EPOCH 3 done: loss 0.0631 - lr: 0.000023
384
+ 2023-10-24 22:36:29,121 DEV : loss 0.06691966950893402 - f1-score (micro avg) 0.8335
385
+ 2023-10-24 22:36:29,133 saving best model
386
+ 2023-10-24 22:36:29,728 ----------------------------------------------------------------------------------------------------
387
+ 2023-10-24 22:36:38,347 epoch 4 - iter 72/723 - loss 0.04289293 - time (sec): 8.62 - samples/sec: 2030.37 - lr: 0.000023 - momentum: 0.000000
388
+ 2023-10-24 22:36:46,920 epoch 4 - iter 144/723 - loss 0.04393330 - time (sec): 17.19 - samples/sec: 2020.50 - lr: 0.000023 - momentum: 0.000000
389
+ 2023-10-24 22:36:54,721 epoch 4 - iter 216/723 - loss 0.04626933 - time (sec): 24.99 - samples/sec: 2029.60 - lr: 0.000022 - momentum: 0.000000
390
+ 2023-10-24 22:37:03,203 epoch 4 - iter 288/723 - loss 0.04679271 - time (sec): 33.47 - samples/sec: 2008.90 - lr: 0.000022 - momentum: 0.000000
391
+ 2023-10-24 22:37:12,171 epoch 4 - iter 360/723 - loss 0.04474950 - time (sec): 42.44 - samples/sec: 2022.65 - lr: 0.000022 - momentum: 0.000000
392
+ 2023-10-24 22:37:21,110 epoch 4 - iter 432/723 - loss 0.04661379 - time (sec): 51.38 - samples/sec: 2025.29 - lr: 0.000021 - momentum: 0.000000
393
+ 2023-10-24 22:37:30,179 epoch 4 - iter 504/723 - loss 0.04617695 - time (sec): 60.45 - samples/sec: 2026.61 - lr: 0.000021 - momentum: 0.000000
394
+ 2023-10-24 22:37:38,883 epoch 4 - iter 576/723 - loss 0.04515587 - time (sec): 69.15 - samples/sec: 2031.07 - lr: 0.000021 - momentum: 0.000000
395
+ 2023-10-24 22:37:47,642 epoch 4 - iter 648/723 - loss 0.04439814 - time (sec): 77.91 - samples/sec: 2028.48 - lr: 0.000020 - momentum: 0.000000
396
+ 2023-10-24 22:37:56,158 epoch 4 - iter 720/723 - loss 0.04362410 - time (sec): 86.43 - samples/sec: 2033.94 - lr: 0.000020 - momentum: 0.000000
397
+ 2023-10-24 22:37:56,386 ----------------------------------------------------------------------------------------------------
398
+ 2023-10-24 22:37:56,387 EPOCH 4 done: loss 0.0437 - lr: 0.000020
399
+ 2023-10-24 22:37:59,816 DEV : loss 0.09226194024085999 - f1-score (micro avg) 0.8141
400
+ 2023-10-24 22:37:59,828 ----------------------------------------------------------------------------------------------------
401
+ 2023-10-24 22:38:08,904 epoch 5 - iter 72/723 - loss 0.03491869 - time (sec): 9.08 - samples/sec: 2016.25 - lr: 0.000020 - momentum: 0.000000
402
+ 2023-10-24 22:38:18,027 epoch 5 - iter 144/723 - loss 0.03697398 - time (sec): 18.20 - samples/sec: 1966.32 - lr: 0.000019 - momentum: 0.000000
403
+ 2023-10-24 22:38:26,766 epoch 5 - iter 216/723 - loss 0.03299498 - time (sec): 26.94 - samples/sec: 1980.81 - lr: 0.000019 - momentum: 0.000000
404
+ 2023-10-24 22:38:36,258 epoch 5 - iter 288/723 - loss 0.03395920 - time (sec): 36.43 - samples/sec: 1983.77 - lr: 0.000019 - momentum: 0.000000
405
+ 2023-10-24 22:38:44,704 epoch 5 - iter 360/723 - loss 0.03351756 - time (sec): 44.88 - samples/sec: 1993.52 - lr: 0.000018 - momentum: 0.000000
406
+ 2023-10-24 22:38:53,420 epoch 5 - iter 432/723 - loss 0.03259007 - time (sec): 53.59 - samples/sec: 2008.90 - lr: 0.000018 - momentum: 0.000000
407
+ 2023-10-24 22:39:01,181 epoch 5 - iter 504/723 - loss 0.03251132 - time (sec): 61.35 - samples/sec: 2013.69 - lr: 0.000018 - momentum: 0.000000
408
+ 2023-10-24 22:39:09,786 epoch 5 - iter 576/723 - loss 0.03147036 - time (sec): 69.96 - samples/sec: 2022.20 - lr: 0.000017 - momentum: 0.000000
409
+ 2023-10-24 22:39:18,155 epoch 5 - iter 648/723 - loss 0.03152705 - time (sec): 78.33 - samples/sec: 2016.73 - lr: 0.000017 - momentum: 0.000000
410
+ 2023-10-24 22:39:26,576 epoch 5 - iter 720/723 - loss 0.03165355 - time (sec): 86.75 - samples/sec: 2022.49 - lr: 0.000017 - momentum: 0.000000
411
+ 2023-10-24 22:39:26,978 ----------------------------------------------------------------------------------------------------
412
+ 2023-10-24 22:39:26,979 EPOCH 5 done: loss 0.0317 - lr: 0.000017
413
+ 2023-10-24 22:39:30,691 DEV : loss 0.1143888533115387 - f1-score (micro avg) 0.8319
414
+ 2023-10-24 22:39:30,703 ----------------------------------------------------------------------------------------------------
415
+ 2023-10-24 22:39:39,467 epoch 6 - iter 72/723 - loss 0.01995720 - time (sec): 8.76 - samples/sec: 1955.67 - lr: 0.000016 - momentum: 0.000000
416
+ 2023-10-24 22:39:47,873 epoch 6 - iter 144/723 - loss 0.02213871 - time (sec): 17.17 - samples/sec: 2001.18 - lr: 0.000016 - momentum: 0.000000
417
+ 2023-10-24 22:39:57,180 epoch 6 - iter 216/723 - loss 0.02122386 - time (sec): 26.48 - samples/sec: 2013.81 - lr: 0.000016 - momentum: 0.000000
418
+ 2023-10-24 22:40:05,862 epoch 6 - iter 288/723 - loss 0.02103884 - time (sec): 35.16 - samples/sec: 1996.29 - lr: 0.000015 - momentum: 0.000000
419
+ 2023-10-24 22:40:14,288 epoch 6 - iter 360/723 - loss 0.02213591 - time (sec): 43.58 - samples/sec: 2004.08 - lr: 0.000015 - momentum: 0.000000
420
+ 2023-10-24 22:40:22,944 epoch 6 - iter 432/723 - loss 0.02306774 - time (sec): 52.24 - samples/sec: 2015.55 - lr: 0.000015 - momentum: 0.000000
421
+ 2023-10-24 22:40:31,393 epoch 6 - iter 504/723 - loss 0.02312191 - time (sec): 60.69 - samples/sec: 2032.38 - lr: 0.000014 - momentum: 0.000000
422
+ 2023-10-24 22:40:40,001 epoch 6 - iter 576/723 - loss 0.02377623 - time (sec): 69.30 - samples/sec: 2032.40 - lr: 0.000014 - momentum: 0.000000
423
+ 2023-10-24 22:40:48,317 epoch 6 - iter 648/723 - loss 0.02402287 - time (sec): 77.61 - samples/sec: 2042.80 - lr: 0.000014 - momentum: 0.000000
424
+ 2023-10-24 22:40:56,641 epoch 6 - iter 720/723 - loss 0.02443272 - time (sec): 85.94 - samples/sec: 2044.18 - lr: 0.000013 - momentum: 0.000000
425
+ 2023-10-24 22:40:56,909 ----------------------------------------------------------------------------------------------------
426
+ 2023-10-24 22:40:56,910 EPOCH 6 done: loss 0.0244 - lr: 0.000013
427
+ 2023-10-24 22:41:00,364 DEV : loss 0.12616097927093506 - f1-score (micro avg) 0.8405
428
+ 2023-10-24 22:41:00,376 saving best model
429
+ 2023-10-24 22:41:01,246 ----------------------------------------------------------------------------------------------------
430
+ 2023-10-24 22:41:09,652 epoch 7 - iter 72/723 - loss 0.01313625 - time (sec): 8.40 - samples/sec: 2128.95 - lr: 0.000013 - momentum: 0.000000
431
+ 2023-10-24 22:41:18,753 epoch 7 - iter 144/723 - loss 0.01797221 - time (sec): 17.51 - samples/sec: 2020.29 - lr: 0.000013 - momentum: 0.000000
432
+ 2023-10-24 22:41:27,131 epoch 7 - iter 216/723 - loss 0.01779934 - time (sec): 25.88 - samples/sec: 2033.59 - lr: 0.000012 - momentum: 0.000000
433
+ 2023-10-24 22:41:35,855 epoch 7 - iter 288/723 - loss 0.01742948 - time (sec): 34.61 - samples/sec: 2048.28 - lr: 0.000012 - momentum: 0.000000
434
+ 2023-10-24 22:41:44,958 epoch 7 - iter 360/723 - loss 0.01829595 - time (sec): 43.71 - samples/sec: 2038.76 - lr: 0.000012 - momentum: 0.000000
435
+ 2023-10-24 22:41:53,223 epoch 7 - iter 432/723 - loss 0.01860388 - time (sec): 51.98 - samples/sec: 2027.05 - lr: 0.000011 - momentum: 0.000000
436
+ 2023-10-24 22:42:01,586 epoch 7 - iter 504/723 - loss 0.01859120 - time (sec): 60.34 - samples/sec: 2027.45 - lr: 0.000011 - momentum: 0.000000
437
+ 2023-10-24 22:42:10,140 epoch 7 - iter 576/723 - loss 0.01846148 - time (sec): 68.89 - samples/sec: 2029.92 - lr: 0.000011 - momentum: 0.000000
438
+ 2023-10-24 22:42:18,997 epoch 7 - iter 648/723 - loss 0.01760306 - time (sec): 77.75 - samples/sec: 2032.59 - lr: 0.000010 - momentum: 0.000000
439
+ 2023-10-24 22:42:27,610 epoch 7 - iter 720/723 - loss 0.01743141 - time (sec): 86.36 - samples/sec: 2032.66 - lr: 0.000010 - momentum: 0.000000
440
+ 2023-10-24 22:42:27,974 ----------------------------------------------------------------------------------------------------
441
+ 2023-10-24 22:42:27,975 EPOCH 7 done: loss 0.0174 - lr: 0.000010
442
+ 2023-10-24 22:42:31,416 DEV : loss 0.1625511348247528 - f1-score (micro avg) 0.8273
443
+ 2023-10-24 22:42:31,428 ----------------------------------------------------------------------------------------------------
444
+ 2023-10-24 22:42:40,071 epoch 8 - iter 72/723 - loss 0.01217969 - time (sec): 8.64 - samples/sec: 2041.96 - lr: 0.000010 - momentum: 0.000000
445
+ 2023-10-24 22:42:49,190 epoch 8 - iter 144/723 - loss 0.01421228 - time (sec): 17.76 - samples/sec: 1996.70 - lr: 0.000009 - momentum: 0.000000
446
+ 2023-10-24 22:42:57,430 epoch 8 - iter 216/723 - loss 0.01295467 - time (sec): 26.00 - samples/sec: 2040.84 - lr: 0.000009 - momentum: 0.000000
447
+ 2023-10-24 22:43:06,802 epoch 8 - iter 288/723 - loss 0.01304675 - time (sec): 35.37 - samples/sec: 2069.04 - lr: 0.000009 - momentum: 0.000000
448
+ 2023-10-24 22:43:15,158 epoch 8 - iter 360/723 - loss 0.01147798 - time (sec): 43.73 - samples/sec: 2062.23 - lr: 0.000008 - momentum: 0.000000
449
+ 2023-10-24 22:43:23,633 epoch 8 - iter 432/723 - loss 0.01210736 - time (sec): 52.20 - samples/sec: 2063.22 - lr: 0.000008 - momentum: 0.000000
450
+ 2023-10-24 22:43:32,367 epoch 8 - iter 504/723 - loss 0.01300207 - time (sec): 60.94 - samples/sec: 2051.84 - lr: 0.000008 - momentum: 0.000000
451
+ 2023-10-24 22:43:40,103 epoch 8 - iter 576/723 - loss 0.01331943 - time (sec): 68.67 - samples/sec: 2042.03 - lr: 0.000007 - momentum: 0.000000
452
+ 2023-10-24 22:43:48,376 epoch 8 - iter 648/723 - loss 0.01306185 - time (sec): 76.95 - samples/sec: 2042.52 - lr: 0.000007 - momentum: 0.000000
453
+ 2023-10-24 22:43:57,171 epoch 8 - iter 720/723 - loss 0.01323653 - time (sec): 85.74 - samples/sec: 2046.87 - lr: 0.000007 - momentum: 0.000000
454
+ 2023-10-24 22:43:57,642 ----------------------------------------------------------------------------------------------------
455
+ 2023-10-24 22:43:57,642 EPOCH 8 done: loss 0.0132 - lr: 0.000007
456
+ 2023-10-24 22:44:01,377 DEV : loss 0.14701317250728607 - f1-score (micro avg) 0.8396
457
+ 2023-10-24 22:44:01,389 ----------------------------------------------------------------------------------------------------
458
+ 2023-10-24 22:44:10,338 epoch 9 - iter 72/723 - loss 0.00421793 - time (sec): 8.95 - samples/sec: 2094.09 - lr: 0.000006 - momentum: 0.000000
459
+ 2023-10-24 22:44:18,282 epoch 9 - iter 144/723 - loss 0.00746426 - time (sec): 16.89 - samples/sec: 2075.48 - lr: 0.000006 - momentum: 0.000000
460
+ 2023-10-24 22:44:27,384 epoch 9 - iter 216/723 - loss 0.00859736 - time (sec): 25.99 - samples/sec: 2060.21 - lr: 0.000006 - momentum: 0.000000
461
+ 2023-10-24 22:44:36,026 epoch 9 - iter 288/723 - loss 0.00966802 - time (sec): 34.64 - samples/sec: 2050.75 - lr: 0.000005 - momentum: 0.000000
462
+ 2023-10-24 22:44:44,734 epoch 9 - iter 360/723 - loss 0.00936446 - time (sec): 43.34 - samples/sec: 2037.71 - lr: 0.000005 - momentum: 0.000000
463
+ 2023-10-24 22:44:53,267 epoch 9 - iter 432/723 - loss 0.00876449 - time (sec): 51.88 - samples/sec: 2046.44 - lr: 0.000005 - momentum: 0.000000
464
+ 2023-10-24 22:45:01,957 epoch 9 - iter 504/723 - loss 0.00947271 - time (sec): 60.57 - samples/sec: 2046.52 - lr: 0.000004 - momentum: 0.000000
465
+ 2023-10-24 22:45:10,195 epoch 9 - iter 576/723 - loss 0.00911775 - time (sec): 68.81 - samples/sec: 2050.71 - lr: 0.000004 - momentum: 0.000000
466
+ 2023-10-24 22:45:18,773 epoch 9 - iter 648/723 - loss 0.00878954 - time (sec): 77.38 - samples/sec: 2047.93 - lr: 0.000004 - momentum: 0.000000
467
+ 2023-10-24 22:45:27,488 epoch 9 - iter 720/723 - loss 0.00902168 - time (sec): 86.10 - samples/sec: 2042.23 - lr: 0.000003 - momentum: 0.000000
468
+ 2023-10-24 22:45:27,703 ----------------------------------------------------------------------------------------------------
469
+ 2023-10-24 22:45:27,703 EPOCH 9 done: loss 0.0090 - lr: 0.000003
470
+ 2023-10-24 22:45:31,141 DEV : loss 0.16477558016777039 - f1-score (micro avg) 0.8348
471
+ 2023-10-24 22:45:31,153 ----------------------------------------------------------------------------------------------------
472
+ 2023-10-24 22:45:39,876 epoch 10 - iter 72/723 - loss 0.00532785 - time (sec): 8.72 - samples/sec: 2001.01 - lr: 0.000003 - momentum: 0.000000
473
+ 2023-10-24 22:45:48,365 epoch 10 - iter 144/723 - loss 0.00549018 - time (sec): 17.21 - samples/sec: 2063.17 - lr: 0.000003 - momentum: 0.000000
474
+ 2023-10-24 22:45:57,326 epoch 10 - iter 216/723 - loss 0.00530542 - time (sec): 26.17 - samples/sec: 2080.27 - lr: 0.000002 - momentum: 0.000000
475
+ 2023-10-24 22:46:06,688 epoch 10 - iter 288/723 - loss 0.00594591 - time (sec): 35.53 - samples/sec: 2048.79 - lr: 0.000002 - momentum: 0.000000
476
+ 2023-10-24 22:46:15,127 epoch 10 - iter 360/723 - loss 0.00637310 - time (sec): 43.97 - samples/sec: 2036.30 - lr: 0.000002 - momentum: 0.000000
477
+ 2023-10-24 22:46:24,052 epoch 10 - iter 432/723 - loss 0.00626112 - time (sec): 52.90 - samples/sec: 2018.61 - lr: 0.000001 - momentum: 0.000000
478
+ 2023-10-24 22:46:32,684 epoch 10 - iter 504/723 - loss 0.00700602 - time (sec): 61.53 - samples/sec: 2017.36 - lr: 0.000001 - momentum: 0.000000
479
+ 2023-10-24 22:46:41,008 epoch 10 - iter 576/723 - loss 0.00726040 - time (sec): 69.85 - samples/sec: 2026.17 - lr: 0.000001 - momentum: 0.000000
480
+ 2023-10-24 22:46:49,862 epoch 10 - iter 648/723 - loss 0.00745651 - time (sec): 78.71 - samples/sec: 2015.95 - lr: 0.000000 - momentum: 0.000000
481
+ 2023-10-24 22:46:58,135 epoch 10 - iter 720/723 - loss 0.00724114 - time (sec): 86.98 - samples/sec: 2021.35 - lr: 0.000000 - momentum: 0.000000
482
+ 2023-10-24 22:46:58,346 ----------------------------------------------------------------------------------------------------
483
+ 2023-10-24 22:46:58,347 EPOCH 10 done: loss 0.0072 - lr: 0.000000
484
+ 2023-10-24 22:47:01,783 DEV : loss 0.16929292678833008 - f1-score (micro avg) 0.8392
485
+ 2023-10-24 22:47:02,271 ----------------------------------------------------------------------------------------------------
486
+ 2023-10-24 22:47:02,272 Loading model from best epoch ...
487
+ 2023-10-24 22:47:04,037 SequenceTagger predicts: Dictionary with 13 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-PER, B-PER, E-PER, I-PER, S-ORG, B-ORG, E-ORG, I-ORG
488
+ 2023-10-24 22:47:07,593
489
+ Results:
490
+ - F-score (micro) 0.8156
491
+ - F-score (macro) 0.6995
492
+ - Accuracy 0.6985
493
+
494
+ By class:
495
+ precision recall f1-score support
496
+
497
+ PER 0.8537 0.8112 0.8319 482
498
+ LOC 0.8956 0.8057 0.8483 458
499
+ ORG 0.5610 0.3333 0.4182 69
500
+
501
+ micro avg 0.8595 0.7760 0.8156 1009
502
+ macro avg 0.7701 0.6501 0.6995 1009
503
+ weighted avg 0.8527 0.7760 0.8110 1009
504
+
505
+ 2023-10-24 22:47:07,593 ----------------------------------------------------------------------------------------------------
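
For reference, a run with the configuration recorded in this log (dbmdz/bert-base-historic-multilingual-64k-td-cased embeddings, last layer only, first-subtoken pooling, no CRF, batch size 8, learning rate 3e-05, 10 epochs, Dutch ICDAR-Europeana NER) could plausibly be launched with a short Flair script along the following lines. This is a minimal sketch reconstructed from the logged hyperparameters and base path, not the actual script behind this commit; in particular the NER_ICDAR_EUROPEANA loader and the exact keyword names are assumptions and may differ between Flair versions.

    from flair.datasets import NER_ICDAR_EUROPEANA
    from flair.embeddings import TransformerWordEmbeddings
    from flair.models import SequenceTagger
    from flair.trainers import ModelTrainer

    # Dutch split of the ICDAR-Europeana NER corpus
    # (5777 train / 722 dev / 723 test sentences in the log above).
    corpus = NER_ICDAR_EUROPEANA(language="nl")
    label_dict = corpus.make_label_dictionary(label_type="ner")

    # Transformer embeddings matching the base path: last layer,
    # first-subtoken pooling, fine-tuned end to end.
    embeddings = TransformerWordEmbeddings(
        model="dbmdz/bert-base-historic-multilingual-64k-td-cased",
        layers="-1",
        subtoken_pooling="first",
        fine_tune=True,
    )

    # Plain linear tag head without CRF or RNN, as in the printed
    # SequenceTagger (Linear: 768 -> 13 tags).
    tagger = SequenceTagger(
        hidden_size=256,
        embeddings=embeddings,
        tag_dictionary=label_dict,
        tag_type="ner",
        use_crf=False,
        use_rnn=False,
        reproject_embeddings=False,
    )

    # fine_tune() applies a linear scheduler with warmup (warmup_fraction 0.1
    # in the log) and shuffles the training data by default.
    trainer = ModelTrainer(tagger, corpus)
    trainer.fine_tune(
        "hmbench-icdar/nl-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1",
        learning_rate=3e-5,
        mini_batch_size=8,
        max_epochs=10,
    )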