stefan-it committed
Commit 0d8727f · 1 Parent(s): 31ce489

Upload ./training.log with huggingface_hub

Files changed (1)
  1. training.log +508 -0
training.log ADDED
@@ -0,0 +1,508 @@
+ 2023-10-23 19:08:24,656 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:08:24,657 Model: "SequenceTagger(
+   (embeddings): TransformerWordEmbeddings(
+     (model): BertModel(
+       (embeddings): BertEmbeddings(
+         (word_embeddings): Embedding(64001, 768)
+         (position_embeddings): Embedding(512, 768)
+         (token_type_embeddings): Embedding(2, 768)
+         (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+         (dropout): Dropout(p=0.1, inplace=False)
+       )
+       (encoder): BertEncoder(
+         (layer): ModuleList(
+           (0): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (1): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (2): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (3): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (4): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (5): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (6): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (7): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (8): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (9): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (10): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (11): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+         )
+       )
+       (pooler): BertPooler(
+         (dense): Linear(in_features=768, out_features=768, bias=True)
+         (activation): Tanh()
+       )
+     )
+   )
+   (locked_dropout): LockedDropout(p=0.5)
+   (linear): Linear(in_features=768, out_features=25, bias=True)
+   (loss_function): CrossEntropyLoss()
+ )"
+ 2023-10-23 19:08:24,658 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:08:24,658 MultiCorpus: 966 train + 219 dev + 204 test sentences
+  - NER_HIPE_2022 Corpus: 966 train + 219 dev + 204 test sentences - /home/ubuntu/.flair/datasets/ner_hipe_2022/v2.1/ajmc/fr/with_doc_seperator
+ 2023-10-23 19:08:24,658 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:08:24,658 Train: 966 sentences
+ 2023-10-23 19:08:24,658 (train_with_dev=False, train_with_test=False)
+ 2023-10-23 19:08:24,658 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:08:24,658 Training Params:
+ 2023-10-23 19:08:24,658  - learning_rate: "3e-05"
+ 2023-10-23 19:08:24,658  - mini_batch_size: "4"
+ 2023-10-23 19:08:24,658  - max_epochs: "10"
+ 2023-10-23 19:08:24,658  - shuffle: "True"
+ 2023-10-23 19:08:24,658 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:08:24,658 Plugins:
+ 2023-10-23 19:08:24,658  - TensorboardLogger
+ 2023-10-23 19:08:24,658  - LinearScheduler | warmup_fraction: '0.1'
+ 2023-10-23 19:08:24,658 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:08:24,658 Final evaluation on model from best epoch (best-model.pt)
+ 2023-10-23 19:08:24,658  - metric: "('micro avg', 'f1-score')"
+ 2023-10-23 19:08:24,658 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:08:24,658 Computation:
+ 2023-10-23 19:08:24,658  - compute on device: cuda:0
+ 2023-10-23 19:08:24,658  - embedding storage: none
+ 2023-10-23 19:08:24,658 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:08:24,658 Model training base path: "hmbench-ajmc/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1"
+ 2023-10-23 19:08:24,658 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:08:24,658 ----------------------------------------------------------------------------------------------------
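
For orientation, the configuration logged above maps roughly onto Flair's fine-tuning API as in the following sketch. This is a hypothetical reconstruction from the logged hyperparameters and the base-path naming (bs4, e10, lr3e-05, poolingfirst, layers-1, crfFalse), not the exact script behind this run:

from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Corpus matching the log: AJMC French split of HIPE-2022
# (966 train / 219 dev / 204 test; the logged dataset path ends
# in "with_doc_seperator", hence the document separators).
corpus = NER_HIPE_2022(dataset_name="ajmc", language="fr", add_document_separator=True)
label_dict = corpus.make_label_dictionary(label_type="ner")

# Embeddings inferred from the base path: last layer only ("layers-1"),
# first-subtoken pooling ("poolingfirst"), fine-tuned end to end.
embeddings = TransformerWordEmbeddings(
    "dbmdz/bert-base-historic-multilingual-64k-td-cased",
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
)

# Plain linear head on top of BERT, no CRF and no RNN
# (matches "crfFalse" and the Linear(768, 25) in the model dump).
tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,
    use_rnn=False,
    reproject_embeddings=False,
)

# fine_tune() defaults to AdamW with a linear schedule and 10% warmup,
# which matches the logged "LinearScheduler | warmup_fraction: '0.1'".
# (The TensorBoard logging plugin is omitted in this sketch.)
trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "hmbench-ajmc/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1",
    learning_rate=3e-5,
    mini_batch_size=4,
    max_epochs=10,
)
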
+ 2023-10-23 19:08:24,659 Logging anything other than scalars to TensorBoard is currently not supported.
+ 2023-10-23 19:08:27,148 epoch 1 - iter 24/242 - loss 3.67278832 - time (sec): 2.49 - samples/sec: 954.86 - lr: 0.000003 - momentum: 0.000000
+ 2023-10-23 19:08:28,624 epoch 1 - iter 48/242 - loss 2.95891825 - time (sec): 3.96 - samples/sec: 1163.54 - lr: 0.000006 - momentum: 0.000000
+ 2023-10-23 19:08:30,098 epoch 1 - iter 72/242 - loss 2.14067705 - time (sec): 5.44 - samples/sec: 1312.83 - lr: 0.000009 - momentum: 0.000000
+ 2023-10-23 19:08:31,609 epoch 1 - iter 96/242 - loss 1.70893243 - time (sec): 6.95 - samples/sec: 1448.77 - lr: 0.000012 - momentum: 0.000000
+ 2023-10-23 19:08:33,034 epoch 1 - iter 120/242 - loss 1.50416905 - time (sec): 8.37 - samples/sec: 1450.06 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-23 19:08:34,527 epoch 1 - iter 144/242 - loss 1.31453287 - time (sec): 9.87 - samples/sec: 1482.04 - lr: 0.000018 - momentum: 0.000000
+ 2023-10-23 19:08:36,008 epoch 1 - iter 168/242 - loss 1.18407796 - time (sec): 11.35 - samples/sec: 1493.53 - lr: 0.000021 - momentum: 0.000000
+ 2023-10-23 19:08:37,517 epoch 1 - iter 192/242 - loss 1.05368018 - time (sec): 12.86 - samples/sec: 1526.00 - lr: 0.000024 - momentum: 0.000000
+ 2023-10-23 19:08:39,008 epoch 1 - iter 216/242 - loss 0.96540438 - time (sec): 14.35 - samples/sec: 1538.74 - lr: 0.000027 - momentum: 0.000000
+ 2023-10-23 19:08:40,490 epoch 1 - iter 240/242 - loss 0.89132846 - time (sec): 15.83 - samples/sec: 1549.55 - lr: 0.000030 - momentum: 0.000000
+ 2023-10-23 19:08:40,609 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:08:40,610 EPOCH 1 done: loss 0.8843 - lr: 0.000030
+ 2023-10-23 19:08:41,342 DEV : loss 0.16898205876350403 - f1-score (micro avg) 0.688
+ 2023-10-23 19:08:41,346 saving best model
+ 2023-10-23 19:08:41,905 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:08:43,374 epoch 2 - iter 24/242 - loss 0.14223354 - time (sec): 1.47 - samples/sec: 1563.85 - lr: 0.000030 - momentum: 0.000000
+ 2023-10-23 19:08:44,896 epoch 2 - iter 48/242 - loss 0.12916219 - time (sec): 2.99 - samples/sec: 1604.35 - lr: 0.000029 - momentum: 0.000000
+ 2023-10-23 19:08:46,388 epoch 2 - iter 72/242 - loss 0.14431620 - time (sec): 4.48 - samples/sec: 1634.23 - lr: 0.000029 - momentum: 0.000000
+ 2023-10-23 19:08:47,846 epoch 2 - iter 96/242 - loss 0.15498269 - time (sec): 5.94 - samples/sec: 1581.27 - lr: 0.000029 - momentum: 0.000000
+ 2023-10-23 19:08:49,407 epoch 2 - iter 120/242 - loss 0.14958350 - time (sec): 7.50 - samples/sec: 1620.93 - lr: 0.000028 - momentum: 0.000000
+ 2023-10-23 19:08:50,852 epoch 2 - iter 144/242 - loss 0.15519379 - time (sec): 8.95 - samples/sec: 1598.16 - lr: 0.000028 - momentum: 0.000000
+ 2023-10-23 19:08:52,342 epoch 2 - iter 168/242 - loss 0.15662606 - time (sec): 10.44 - samples/sec: 1617.00 - lr: 0.000028 - momentum: 0.000000
+ 2023-10-23 19:08:53,905 epoch 2 - iter 192/242 - loss 0.15791561 - time (sec): 12.00 - samples/sec: 1637.53 - lr: 0.000027 - momentum: 0.000000
+ 2023-10-23 19:08:55,386 epoch 2 - iter 216/242 - loss 0.15536465 - time (sec): 13.48 - samples/sec: 1631.40 - lr: 0.000027 - momentum: 0.000000
+ 2023-10-23 19:08:56,903 epoch 2 - iter 240/242 - loss 0.15044906 - time (sec): 15.00 - samples/sec: 1640.89 - lr: 0.000027 - momentum: 0.000000
+ 2023-10-23 19:08:57,017 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:08:57,018 EPOCH 2 done: loss 0.1500 - lr: 0.000027
+ 2023-10-23 19:08:57,694 DEV : loss 0.11318839341402054 - f1-score (micro avg) 0.8138
+ 2023-10-23 19:08:57,698 saving best model
+ 2023-10-23 19:08:58,463 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:08:59,918 epoch 3 - iter 24/242 - loss 0.11898681 - time (sec): 1.45 - samples/sec: 1518.64 - lr: 0.000026 - momentum: 0.000000
+ 2023-10-23 19:09:01,406 epoch 3 - iter 48/242 - loss 0.11011505 - time (sec): 2.94 - samples/sec: 1580.20 - lr: 0.000026 - momentum: 0.000000
+ 2023-10-23 19:09:02,963 epoch 3 - iter 72/242 - loss 0.10956550 - time (sec): 4.50 - samples/sec: 1600.39 - lr: 0.000026 - momentum: 0.000000
+ 2023-10-23 19:09:04,484 epoch 3 - iter 96/242 - loss 0.10869933 - time (sec): 6.02 - samples/sec: 1596.21 - lr: 0.000025 - momentum: 0.000000
+ 2023-10-23 19:09:05,964 epoch 3 - iter 120/242 - loss 0.10589468 - time (sec): 7.50 - samples/sec: 1607.90 - lr: 0.000025 - momentum: 0.000000
+ 2023-10-23 19:09:07,466 epoch 3 - iter 144/242 - loss 0.09866798 - time (sec): 9.00 - samples/sec: 1622.63 - lr: 0.000025 - momentum: 0.000000
+ 2023-10-23 19:09:08,987 epoch 3 - iter 168/242 - loss 0.09126189 - time (sec): 10.52 - samples/sec: 1613.49 - lr: 0.000024 - momentum: 0.000000
+ 2023-10-23 19:09:10,510 epoch 3 - iter 192/242 - loss 0.09019547 - time (sec): 12.05 - samples/sec: 1638.56 - lr: 0.000024 - momentum: 0.000000
+ 2023-10-23 19:09:11,957 epoch 3 - iter 216/242 - loss 0.09440161 - time (sec): 13.49 - samples/sec: 1638.11 - lr: 0.000024 - momentum: 0.000000
+ 2023-10-23 19:09:13,487 epoch 3 - iter 240/242 - loss 0.09448301 - time (sec): 15.02 - samples/sec: 1636.64 - lr: 0.000023 - momentum: 0.000000
+ 2023-10-23 19:09:13,604 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:09:13,605 EPOCH 3 done: loss 0.0939 - lr: 0.000023
+ 2023-10-23 19:09:14,282 DEV : loss 0.12148646265268326 - f1-score (micro avg) 0.8361
+ 2023-10-23 19:09:14,286 saving best model
+ 2023-10-23 19:09:14,974 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:09:16,469 epoch 4 - iter 24/242 - loss 0.05232625 - time (sec): 1.49 - samples/sec: 1565.00 - lr: 0.000023 - momentum: 0.000000
+ 2023-10-23 19:09:18,007 epoch 4 - iter 48/242 - loss 0.06181919 - time (sec): 3.03 - samples/sec: 1610.17 - lr: 0.000023 - momentum: 0.000000
+ 2023-10-23 19:09:19,519 epoch 4 - iter 72/242 - loss 0.05681268 - time (sec): 4.54 - samples/sec: 1612.44 - lr: 0.000022 - momentum: 0.000000
+ 2023-10-23 19:09:21,011 epoch 4 - iter 96/242 - loss 0.05581204 - time (sec): 6.04 - samples/sec: 1634.76 - lr: 0.000022 - momentum: 0.000000
+ 2023-10-23 19:09:22,526 epoch 4 - iter 120/242 - loss 0.05450289 - time (sec): 7.55 - samples/sec: 1665.43 - lr: 0.000022 - momentum: 0.000000
+ 2023-10-23 19:09:24,062 epoch 4 - iter 144/242 - loss 0.05942248 - time (sec): 9.09 - samples/sec: 1676.96 - lr: 0.000021 - momentum: 0.000000
+ 2023-10-23 19:09:25,505 epoch 4 - iter 168/242 - loss 0.06236623 - time (sec): 10.53 - samples/sec: 1653.02 - lr: 0.000021 - momentum: 0.000000
+ 2023-10-23 19:09:26,997 epoch 4 - iter 192/242 - loss 0.06517078 - time (sec): 12.02 - samples/sec: 1645.21 - lr: 0.000021 - momentum: 0.000000
+ 2023-10-23 19:09:28,498 epoch 4 - iter 216/242 - loss 0.06688996 - time (sec): 13.52 - samples/sec: 1649.95 - lr: 0.000020 - momentum: 0.000000
+ 2023-10-23 19:09:29,976 epoch 4 - iter 240/242 - loss 0.06577576 - time (sec): 15.00 - samples/sec: 1644.74 - lr: 0.000020 - momentum: 0.000000
+ 2023-10-23 19:09:30,085 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:09:30,085 EPOCH 4 done: loss 0.0657 - lr: 0.000020
+ 2023-10-23 19:09:30,766 DEV : loss 0.14296689629554749 - f1-score (micro avg) 0.8371
+ 2023-10-23 19:09:30,769 saving best model
+ 2023-10-23 19:09:31,584 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:09:33,075 epoch 5 - iter 24/242 - loss 0.05084982 - time (sec): 1.49 - samples/sec: 1516.20 - lr: 0.000020 - momentum: 0.000000
+ 2023-10-23 19:09:34,558 epoch 5 - iter 48/242 - loss 0.04919071 - time (sec): 2.97 - samples/sec: 1568.57 - lr: 0.000019 - momentum: 0.000000
+ 2023-10-23 19:09:36,050 epoch 5 - iter 72/242 - loss 0.04639649 - time (sec): 4.47 - samples/sec: 1578.63 - lr: 0.000019 - momentum: 0.000000
+ 2023-10-23 19:09:37,557 epoch 5 - iter 96/242 - loss 0.05195384 - time (sec): 5.97 - samples/sec: 1612.02 - lr: 0.000019 - momentum: 0.000000
+ 2023-10-23 19:09:39,133 epoch 5 - iter 120/242 - loss 0.05347717 - time (sec): 7.55 - samples/sec: 1643.81 - lr: 0.000018 - momentum: 0.000000
+ 2023-10-23 19:09:40,632 epoch 5 - iter 144/242 - loss 0.05618684 - time (sec): 9.05 - samples/sec: 1635.29 - lr: 0.000018 - momentum: 0.000000
+ 2023-10-23 19:09:42,117 epoch 5 - iter 168/242 - loss 0.05282894 - time (sec): 10.53 - samples/sec: 1642.35 - lr: 0.000018 - momentum: 0.000000
+ 2023-10-23 19:09:43,592 epoch 5 - iter 192/242 - loss 0.05381162 - time (sec): 12.01 - samples/sec: 1633.23 - lr: 0.000017 - momentum: 0.000000
+ 2023-10-23 19:09:45,071 epoch 5 - iter 216/242 - loss 0.05106616 - time (sec): 13.49 - samples/sec: 1631.97 - lr: 0.000017 - momentum: 0.000000
+ 2023-10-23 19:09:46,563 epoch 5 - iter 240/242 - loss 0.04927862 - time (sec): 14.98 - samples/sec: 1635.94 - lr: 0.000017 - momentum: 0.000000
+ 2023-10-23 19:09:46,693 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:09:46,694 EPOCH 5 done: loss 0.0491 - lr: 0.000017
+ 2023-10-23 19:09:47,503 DEV : loss 0.1406407207250595 - f1-score (micro avg) 0.8586
+ 2023-10-23 19:09:47,507 saving best model
+ 2023-10-23 19:09:48,337 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:09:49,861 epoch 6 - iter 24/242 - loss 0.02763052 - time (sec): 1.52 - samples/sec: 1640.20 - lr: 0.000016 - momentum: 0.000000
+ 2023-10-23 19:09:51,367 epoch 6 - iter 48/242 - loss 0.03172378 - time (sec): 3.03 - samples/sec: 1638.57 - lr: 0.000016 - momentum: 0.000000
+ 2023-10-23 19:09:52,874 epoch 6 - iter 72/242 - loss 0.03088796 - time (sec): 4.54 - samples/sec: 1605.08 - lr: 0.000016 - momentum: 0.000000
+ 2023-10-23 19:09:54,322 epoch 6 - iter 96/242 - loss 0.03083092 - time (sec): 5.98 - samples/sec: 1593.30 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-23 19:09:55,885 epoch 6 - iter 120/242 - loss 0.03083829 - time (sec): 7.55 - samples/sec: 1631.99 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-23 19:09:57,389 epoch 6 - iter 144/242 - loss 0.03101966 - time (sec): 9.05 - samples/sec: 1630.36 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-23 19:09:58,912 epoch 6 - iter 168/242 - loss 0.03005750 - time (sec): 10.57 - samples/sec: 1612.84 - lr: 0.000014 - momentum: 0.000000
+ 2023-10-23 19:10:00,420 epoch 6 - iter 192/242 - loss 0.03093790 - time (sec): 12.08 - samples/sec: 1623.25 - lr: 0.000014 - momentum: 0.000000
+ 2023-10-23 19:10:01,945 epoch 6 - iter 216/242 - loss 0.03244273 - time (sec): 13.61 - samples/sec: 1614.91 - lr: 0.000014 - momentum: 0.000000
+ 2023-10-23 19:10:03,457 epoch 6 - iter 240/242 - loss 0.03281395 - time (sec): 15.12 - samples/sec: 1624.79 - lr: 0.000013 - momentum: 0.000000
+ 2023-10-23 19:10:03,569 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:10:03,570 EPOCH 6 done: loss 0.0330 - lr: 0.000013
+ 2023-10-23 19:10:04,253 DEV : loss 0.16732417047023773 - f1-score (micro avg) 0.8203
+ 2023-10-23 19:10:04,257 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:10:05,756 epoch 7 - iter 24/242 - loss 0.02010722 - time (sec): 1.50 - samples/sec: 1786.01 - lr: 0.000013 - momentum: 0.000000
+ 2023-10-23 19:10:07,260 epoch 7 - iter 48/242 - loss 0.01912838 - time (sec): 3.00 - samples/sec: 1692.72 - lr: 0.000013 - momentum: 0.000000
+ 2023-10-23 19:10:08,752 epoch 7 - iter 72/242 - loss 0.01980953 - time (sec): 4.49 - samples/sec: 1672.37 - lr: 0.000012 - momentum: 0.000000
+ 2023-10-23 19:10:10,269 epoch 7 - iter 96/242 - loss 0.01804005 - time (sec): 6.01 - samples/sec: 1668.74 - lr: 0.000012 - momentum: 0.000000
+ 2023-10-23 19:10:11,771 epoch 7 - iter 120/242 - loss 0.02131044 - time (sec): 7.51 - samples/sec: 1653.19 - lr: 0.000012 - momentum: 0.000000
+ 2023-10-23 19:10:13,334 epoch 7 - iter 144/242 - loss 0.02211216 - time (sec): 9.08 - samples/sec: 1632.28 - lr: 0.000011 - momentum: 0.000000
+ 2023-10-23 19:10:14,818 epoch 7 - iter 168/242 - loss 0.02084424 - time (sec): 10.56 - samples/sec: 1639.15 - lr: 0.000011 - momentum: 0.000000
+ 2023-10-23 19:10:16,335 epoch 7 - iter 192/242 - loss 0.02001712 - time (sec): 12.08 - samples/sec: 1637.00 - lr: 0.000011 - momentum: 0.000000
+ 2023-10-23 19:10:17,824 epoch 7 - iter 216/242 - loss 0.01982037 - time (sec): 13.57 - samples/sec: 1630.98 - lr: 0.000010 - momentum: 0.000000
+ 2023-10-23 19:10:19,343 epoch 7 - iter 240/242 - loss 0.02068453 - time (sec): 15.09 - samples/sec: 1628.08 - lr: 0.000010 - momentum: 0.000000
+ 2023-10-23 19:10:19,458 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:10:19,458 EPOCH 7 done: loss 0.0222 - lr: 0.000010
+ 2023-10-23 19:10:20,144 DEV : loss 0.1981634944677353 - f1-score (micro avg) 0.8117
+ 2023-10-23 19:10:20,148 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:10:21,661 epoch 8 - iter 24/242 - loss 0.01112978 - time (sec): 1.51 - samples/sec: 1643.20 - lr: 0.000010 - momentum: 0.000000
+ 2023-10-23 19:10:23,193 epoch 8 - iter 48/242 - loss 0.01314728 - time (sec): 3.04 - samples/sec: 1633.53 - lr: 0.000009 - momentum: 0.000000
+ 2023-10-23 19:10:24,698 epoch 8 - iter 72/242 - loss 0.01036883 - time (sec): 4.55 - samples/sec: 1613.16 - lr: 0.000009 - momentum: 0.000000
+ 2023-10-23 19:10:26,181 epoch 8 - iter 96/242 - loss 0.00962037 - time (sec): 6.03 - samples/sec: 1615.05 - lr: 0.000009 - momentum: 0.000000
+ 2023-10-23 19:10:27,664 epoch 8 - iter 120/242 - loss 0.00900148 - time (sec): 7.52 - samples/sec: 1590.82 - lr: 0.000008 - momentum: 0.000000
+ 2023-10-23 19:10:29,153 epoch 8 - iter 144/242 - loss 0.01176402 - time (sec): 9.00 - samples/sec: 1605.00 - lr: 0.000008 - momentum: 0.000000
+ 2023-10-23 19:10:30,671 epoch 8 - iter 168/242 - loss 0.01387140 - time (sec): 10.52 - samples/sec: 1614.93 - lr: 0.000008 - momentum: 0.000000
+ 2023-10-23 19:10:32,239 epoch 8 - iter 192/242 - loss 0.01286732 - time (sec): 12.09 - samples/sec: 1636.86 - lr: 0.000007 - momentum: 0.000000
+ 2023-10-23 19:10:33,735 epoch 8 - iter 216/242 - loss 0.01188376 - time (sec): 13.59 - samples/sec: 1637.46 - lr: 0.000007 - momentum: 0.000000
+ 2023-10-23 19:10:35,237 epoch 8 - iter 240/242 - loss 0.01274869 - time (sec): 15.09 - samples/sec: 1622.43 - lr: 0.000007 - momentum: 0.000000
+ 2023-10-23 19:10:35,361 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:10:35,361 EPOCH 8 done: loss 0.0126 - lr: 0.000007
+ 2023-10-23 19:10:36,056 DEV : loss 0.1954435408115387 - f1-score (micro avg) 0.836
+ 2023-10-23 19:10:36,060 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:10:37,530 epoch 9 - iter 24/242 - loss 0.03445182 - time (sec): 1.47 - samples/sec: 1639.08 - lr: 0.000006 - momentum: 0.000000
+ 2023-10-23 19:10:39,032 epoch 9 - iter 48/242 - loss 0.01826025 - time (sec): 2.97 - samples/sec: 1653.82 - lr: 0.000006 - momentum: 0.000000
+ 2023-10-23 19:10:40,575 epoch 9 - iter 72/242 - loss 0.01223116 - time (sec): 4.51 - samples/sec: 1638.54 - lr: 0.000006 - momentum: 0.000000
+ 2023-10-23 19:10:42,117 epoch 9 - iter 96/242 - loss 0.01231785 - time (sec): 6.06 - samples/sec: 1665.45 - lr: 0.000005 - momentum: 0.000000
+ 2023-10-23 19:10:43,614 epoch 9 - iter 120/242 - loss 0.01167177 - time (sec): 7.55 - samples/sec: 1649.32 - lr: 0.000005 - momentum: 0.000000
+ 2023-10-23 19:10:45,156 epoch 9 - iter 144/242 - loss 0.01014355 - time (sec): 9.10 - samples/sec: 1658.01 - lr: 0.000005 - momentum: 0.000000
+ 2023-10-23 19:10:46,651 epoch 9 - iter 168/242 - loss 0.00971928 - time (sec): 10.59 - samples/sec: 1646.35 - lr: 0.000004 - momentum: 0.000000
+ 2023-10-23 19:10:48,168 epoch 9 - iter 192/242 - loss 0.01008407 - time (sec): 12.11 - samples/sec: 1646.11 - lr: 0.000004 - momentum: 0.000000
+ 2023-10-23 19:10:49,679 epoch 9 - iter 216/242 - loss 0.01053302 - time (sec): 13.62 - samples/sec: 1622.77 - lr: 0.000004 - momentum: 0.000000
+ 2023-10-23 19:10:51,195 epoch 9 - iter 240/242 - loss 0.00948029 - time (sec): 15.13 - samples/sec: 1628.46 - lr: 0.000003 - momentum: 0.000000
+ 2023-10-23 19:10:51,307 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:10:51,307 EPOCH 9 done: loss 0.0094 - lr: 0.000003
+ 2023-10-23 19:10:51,996 DEV : loss 0.20104923844337463 - f1-score (micro avg) 0.8418
+ 2023-10-23 19:10:51,999 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:10:53,575 epoch 10 - iter 24/242 - loss 0.01608682 - time (sec): 1.57 - samples/sec: 1660.56 - lr: 0.000003 - momentum: 0.000000
+ 2023-10-23 19:10:55,138 epoch 10 - iter 48/242 - loss 0.01053632 - time (sec): 3.14 - samples/sec: 1642.25 - lr: 0.000003 - momentum: 0.000000
+ 2023-10-23 19:10:56,641 epoch 10 - iter 72/242 - loss 0.00703278 - time (sec): 4.64 - samples/sec: 1685.53 - lr: 0.000002 - momentum: 0.000000
+ 2023-10-23 19:10:58,153 epoch 10 - iter 96/242 - loss 0.00645811 - time (sec): 6.15 - samples/sec: 1682.84 - lr: 0.000002 - momentum: 0.000000
+ 2023-10-23 19:10:59,681 epoch 10 - iter 120/242 - loss 0.00615817 - time (sec): 7.68 - samples/sec: 1673.66 - lr: 0.000002 - momentum: 0.000000
+ 2023-10-23 19:11:01,209 epoch 10 - iter 144/242 - loss 0.00528460 - time (sec): 9.21 - samples/sec: 1654.88 - lr: 0.000001 - momentum: 0.000000
+ 2023-10-23 19:11:02,673 epoch 10 - iter 168/242 - loss 0.00538711 - time (sec): 10.67 - samples/sec: 1631.70 - lr: 0.000001 - momentum: 0.000000
+ 2023-10-23 19:11:04,163 epoch 10 - iter 192/242 - loss 0.00517269 - time (sec): 12.16 - samples/sec: 1631.04 - lr: 0.000001 - momentum: 0.000000
+ 2023-10-23 19:11:05,668 epoch 10 - iter 216/242 - loss 0.00576643 - time (sec): 13.67 - samples/sec: 1634.52 - lr: 0.000000 - momentum: 0.000000
+ 2023-10-23 19:11:07,154 epoch 10 - iter 240/242 - loss 0.00579968 - time (sec): 15.15 - samples/sec: 1624.90 - lr: 0.000000 - momentum: 0.000000
+ 2023-10-23 19:11:07,263 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:11:07,263 EPOCH 10 done: loss 0.0058 - lr: 0.000000
+ 2023-10-23 19:11:07,954 DEV : loss 0.207326278090477 - f1-score (micro avg) 0.8464
+ 2023-10-23 19:11:08,620 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:11:08,621 Loading model from best epoch ...
+ 2023-10-23 19:11:10,583 SequenceTagger predicts: Dictionary with 25 tags: O, S-scope, B-scope, E-scope, I-scope, S-pers, B-pers, E-pers, I-pers, S-work, B-work, E-work, I-work, S-loc, B-loc, E-loc, I-loc, S-object, B-object, E-object, I-object, S-date, B-date, E-date, I-date
+ 2023-10-23 19:11:11,448
+ Results:
+ - F-score (micro) 0.8157
+ - F-score (macro) 0.5867
+ - Accuracy 0.7116
+ 
+ By class:
+               precision    recall  f1-score   support
+ 
+         pers     0.8592    0.8777    0.8683       139
+        scope     0.8085    0.8837    0.8444       129
+         work     0.6739    0.7750    0.7209        80
+          loc     1.0000    0.3333    0.5000         9
+         date     0.0000    0.0000    0.0000         3
+ 
+    micro avg     0.7963    0.8361    0.8157       360
+    macro avg     0.6683    0.5740    0.5867       360
+ weighted avg     0.7962    0.8361    0.8106       360
+ 
+ 2023-10-23 19:11:11,448 ----------------------------------------------------------------------------------------------------
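
Once training finishes, the best-model.pt checkpoint referenced in the log can be loaded back with Flair for tagging. A minimal usage sketch, assuming the logged base path and an invented French example sentence:

from flair.data import Sentence
from flair.models import SequenceTagger

# Load the checkpoint written during training (path from the logged base path).
tagger = SequenceTagger.load(
    "hmbench-ajmc/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1/best-model.pt"
)

# Tag a hypothetical French sentence and print the predicted entity spans
# (tag set from the log: pers, scope, work, loc, object, date).
sentence = Sentence("Homère décrit la colère d'Achille dans l'Iliade.")
tagger.predict(sentence)
for span in sentence.get_spans("ner"):
    print(span)
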