Token Classification
Flair
PyTorch
Spanish
sequence-tagger-model
File size: 39,136 Bytes
fb1f685
2024-04-29 19:13:06,967 ----------------------------------------------------------------------------------------------------
2024-04-29 19:13:06,968 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): XLMRobertaModel(
      (embeddings): XLMRobertaEmbeddings(
        (word_embeddings): Embedding(250003, 1024)
        (position_embeddings): Embedding(514, 1024, padding_idx=1)
        (token_type_embeddings): Embedding(1, 1024)
        (LayerNorm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): XLMRobertaEncoder(
        (layer): ModuleList(
          (0-23): 24 x XLMRobertaLayer(
            (attention): XLMRobertaAttention(
              (self): XLMRobertaSelfAttention(
                (query): Linear(in_features=1024, out_features=1024, bias=True)
                (key): Linear(in_features=1024, out_features=1024, bias=True)
                (value): Linear(in_features=1024, out_features=1024, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): XLMRobertaSelfOutput(
                (dense): Linear(in_features=1024, out_features=1024, bias=True)
                (LayerNorm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): XLMRobertaIntermediate(
              (dense): Linear(in_features=1024, out_features=4096, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): XLMRobertaOutput(
              (dense): Linear(in_features=4096, out_features=1024, bias=True)
              (LayerNorm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): XLMRobertaPooler(
        (dense): Linear(in_features=1024, out_features=1024, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=1024, out_features=25, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2024-04-29 19:13:06,968 ----------------------------------------------------------------------------------------------------
2024-04-29 19:13:06,968 Corpus: "Corpus: 5301 train + 589 dev + 654 test sentences"
2024-04-29 19:13:06,968 ----------------------------------------------------------------------------------------------------
2024-04-29 19:13:06,968 Parameters:
2024-04-29 19:13:06,968  - learning_rate: "0.000005"
2024-04-29 19:13:06,968  - mini_batch_size: "4"
2024-04-29 19:13:06,968  - patience: "3"
2024-04-29 19:13:06,968  - anneal_factor: "0.5"
2024-04-29 19:13:06,968  - max_epochs: "20"
2024-04-29 19:13:06,968  - shuffle: "True"
2024-04-29 19:13:06,968  - train_with_dev: "False"
2024-04-29 19:13:06,968  - batch_growth_annealing: "False"
2024-04-29 19:13:06,968 ----------------------------------------------------------------------------------------------------
2024-04-29 19:13:06,968 Model training base path: "resources/taggers/ner-spanish-large-np-finetune"
2024-04-29 19:13:06,968 ----------------------------------------------------------------------------------------------------
2024-04-29 19:13:06,968 Device: cuda:0
2024-04-29 19:13:06,969 ----------------------------------------------------------------------------------------------------
2024-04-29 19:13:06,969 Embeddings storage mode: none
2024-04-29 19:13:06,969 ----------------------------------------------------------------------------------------------------
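The setup printed above (xlm-roberta-large embeddings, a linear tag head with neither RNN nor CRF, learning rate 0.000005, mini-batch size 4, 20 epochs) matches Flair's transformer fine-tuning recipe. The sketch below shows how such a run could be launched; it is not the repository's actual training script: the corpus location, column format and label type are placeholders, and the trainer call is inferred from the printed hyper-parameters and the linear learning-rate decay visible in the epoch logs that follow.

    from flair.datasets import ColumnCorpus
    from flair.embeddings import TransformerWordEmbeddings
    from flair.models import SequenceTagger
    from flair.trainers import ModelTrainer

    # hypothetical CoNLL-style corpus; the log only reports 5301 train / 589 dev / 654 test sentences
    corpus = ColumnCorpus("data/", column_format={0: "text", 1: "ner"})
    tag_dictionary = corpus.make_label_dictionary(label_type="ner")

    # 24-layer XLM-RoBERTa-large encoder, fine-tuned end to end (see the module dump above)
    embeddings = TransformerWordEmbeddings("xlm-roberta-large", fine_tune=True)

    # no RNN and no CRF, in line with the printed stack:
    # locked dropout (p=0.5) + a linear layer mapping 1024 features onto the 25 tags,
    # trained with plain CrossEntropyLoss
    tagger = SequenceTagger(
        hidden_size=256,
        embeddings=embeddings,
        tag_dictionary=tag_dictionary,
        tag_type="ner",
        use_rnn=False,
        use_crf=False,
        reproject_embeddings=False,
    )

    trainer = ModelTrainer(tagger, corpus)
    trainer.fine_tune(
        "resources/taggers/ner-spanish-large-np-finetune",
        learning_rate=5e-6,
        mini_batch_size=4,
        max_epochs=20,
    )
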
2024-04-29 19:13:30,396 epoch 1 - iter 132/1326 - loss 0.62261396 - time (sec): 23.43 - samples/sec: 1442.78 - lr: 0.000005
2024-04-29 19:13:52,923 epoch 1 - iter 264/1326 - loss 0.38574243 - time (sec): 45.95 - samples/sec: 1428.48 - lr: 0.000005
2024-04-29 19:14:13,281 epoch 1 - iter 396/1326 - loss 0.33621740 - time (sec): 66.31 - samples/sec: 1214.90 - lr: 0.000005
2024-04-29 19:14:33,173 epoch 1 - iter 528/1326 - loss 0.27934057 - time (sec): 86.20 - samples/sec: 1133.17 - lr: 0.000005
2024-04-29 19:14:54,027 epoch 1 - iter 660/1326 - loss 0.28667008 - time (sec): 107.06 - samples/sec: 1088.07 - lr: 0.000005
2024-04-29 19:15:13,282 epoch 1 - iter 792/1326 - loss 0.26728950 - time (sec): 126.31 - samples/sec: 1005.56 - lr: 0.000005
2024-04-29 19:15:32,488 epoch 1 - iter 924/1326 - loss 0.24258707 - time (sec): 145.52 - samples/sec: 961.73 - lr: 0.000005
2024-04-29 19:15:52,452 epoch 1 - iter 1056/1326 - loss 0.22206141 - time (sec): 165.48 - samples/sec: 923.87 - lr: 0.000005
2024-04-29 19:16:12,585 epoch 1 - iter 1188/1326 - loss 0.21078620 - time (sec): 185.62 - samples/sec: 898.59 - lr: 0.000005
2024-04-29 19:16:35,102 epoch 1 - iter 1320/1326 - loss 0.18962778 - time (sec): 208.13 - samples/sec: 944.88 - lr: 0.000005
2024-04-29 19:16:36,057 ----------------------------------------------------------------------------------------------------
2024-04-29 19:16:36,057 EPOCH 1 done: loss 0.1903 - lr 0.000005
2024-04-29 19:16:41,931 Evaluating as a multi-label problem: False
2024-04-29 19:16:41,938 DEV : loss 0.08347803354263306 - f1-score (micro avg)  0.2415
2024-04-29 19:16:41,946 saving best model
2024-04-29 19:16:43,696 ----------------------------------------------------------------------------------------------------
2024-04-29 19:17:04,299 epoch 2 - iter 132/1326 - loss 0.05433737 - time (sec): 20.60 - samples/sec: 939.30 - lr: 0.000005
2024-04-29 19:17:25,134 epoch 2 - iter 264/1326 - loss 0.07221647 - time (sec): 41.44 - samples/sec: 959.97 - lr: 0.000005
2024-04-29 19:17:45,581 epoch 2 - iter 396/1326 - loss 0.06800126 - time (sec): 61.88 - samples/sec: 926.79 - lr: 0.000005
2024-04-29 19:18:05,931 epoch 2 - iter 528/1326 - loss 0.06976185 - time (sec): 82.24 - samples/sec: 908.61 - lr: 0.000005
2024-04-29 19:18:26,703 epoch 2 - iter 660/1326 - loss 0.07144914 - time (sec): 103.01 - samples/sec: 916.36 - lr: 0.000005
2024-04-29 19:18:47,551 epoch 2 - iter 792/1326 - loss 0.07129850 - time (sec): 123.86 - samples/sec: 921.43 - lr: 0.000005
2024-04-29 19:19:08,831 epoch 2 - iter 924/1326 - loss 0.07364315 - time (sec): 145.13 - samples/sec: 937.72 - lr: 0.000005
2024-04-29 19:19:30,404 epoch 2 - iter 1056/1326 - loss 0.07429358 - time (sec): 166.71 - samples/sec: 947.01 - lr: 0.000005
2024-04-29 19:19:51,382 epoch 2 - iter 1188/1326 - loss 0.07472375 - time (sec): 187.69 - samples/sec: 951.54 - lr: 0.000005
2024-04-29 19:20:11,837 epoch 2 - iter 1320/1326 - loss 0.07525889 - time (sec): 208.14 - samples/sec: 945.83 - lr: 0.000005
2024-04-29 19:20:12,667 ----------------------------------------------------------------------------------------------------
2024-04-29 19:20:12,667 EPOCH 2 done: loss 0.0751 - lr 0.000005
2024-04-29 19:20:19,289 Evaluating as a multi-label problem: False
2024-04-29 19:20:19,296 DEV : loss 0.06594807654619217 - f1-score (micro avg)  0.6776
2024-04-29 19:20:19,305 saving best model
2024-04-29 19:20:21,153 ----------------------------------------------------------------------------------------------------
2024-04-29 19:20:42,076 epoch 3 - iter 132/1326 - loss 0.03075654 - time (sec): 20.92 - samples/sec: 1045.61 - lr: 0.000005
2024-04-29 19:21:03,363 epoch 3 - iter 264/1326 - loss 0.06844082 - time (sec): 42.21 - samples/sec: 1076.19 - lr: 0.000005
2024-04-29 19:21:24,222 epoch 3 - iter 396/1326 - loss 0.06805718 - time (sec): 63.07 - samples/sec: 1049.91 - lr: 0.000005
2024-04-29 19:21:44,517 epoch 3 - iter 528/1326 - loss 0.06164637 - time (sec): 83.36 - samples/sec: 980.80 - lr: 0.000005
2024-04-29 19:22:05,198 epoch 3 - iter 660/1326 - loss 0.05812095 - time (sec): 104.04 - samples/sec: 971.12 - lr: 0.000005
2024-04-29 19:22:25,741 epoch 3 - iter 792/1326 - loss 0.05654579 - time (sec): 124.59 - samples/sec: 951.60 - lr: 0.000005
2024-04-29 19:22:46,750 epoch 3 - iter 924/1326 - loss 0.05279483 - time (sec): 145.60 - samples/sec: 952.78 - lr: 0.000005
2024-04-29 19:23:07,347 epoch 3 - iter 1056/1326 - loss 0.05517769 - time (sec): 166.19 - samples/sec: 948.83 - lr: 0.000005
2024-04-29 19:23:28,146 epoch 3 - iter 1188/1326 - loss 0.05270269 - time (sec): 186.99 - samples/sec: 942.41 - lr: 0.000005
2024-04-29 19:23:49,190 epoch 3 - iter 1320/1326 - loss 0.05130536 - time (sec): 208.04 - samples/sec: 943.77 - lr: 0.000005
2024-04-29 19:23:50,082 ----------------------------------------------------------------------------------------------------
2024-04-29 19:23:50,082 EPOCH 3 done: loss 0.0511 - lr 0.000005
2024-04-29 19:23:56,695 Evaluating as a multi-label problem: False
2024-04-29 19:23:56,702 DEV : loss 0.06074140965938568 - f1-score (micro avg)  0.6805
2024-04-29 19:23:56,711 saving best model
2024-04-29 19:23:58,467 ----------------------------------------------------------------------------------------------------
2024-04-29 19:24:19,456 epoch 4 - iter 132/1326 - loss 0.05354733 - time (sec): 20.99 - samples/sec: 965.04 - lr: 0.000005
2024-04-29 19:24:40,634 epoch 4 - iter 264/1326 - loss 0.05477016 - time (sec): 42.17 - samples/sec: 987.83 - lr: 0.000005
2024-04-29 19:25:02,087 epoch 4 - iter 396/1326 - loss 0.04696782 - time (sec): 63.62 - samples/sec: 1021.52 - lr: 0.000005
2024-04-29 19:25:22,809 epoch 4 - iter 528/1326 - loss 0.04301686 - time (sec): 84.34 - samples/sec: 989.27 - lr: 0.000005
2024-04-29 19:25:43,866 epoch 4 - iter 660/1326 - loss 0.04884093 - time (sec): 105.40 - samples/sec: 969.26 - lr: 0.000005
2024-04-29 19:26:04,584 epoch 4 - iter 792/1326 - loss 0.04698956 - time (sec): 126.12 - samples/sec: 952.53 - lr: 0.000005
2024-04-29 19:26:25,832 epoch 4 - iter 924/1326 - loss 0.05039226 - time (sec): 147.36 - samples/sec: 964.02 - lr: 0.000005
2024-04-29 19:26:46,770 epoch 4 - iter 1056/1326 - loss 0.05071681 - time (sec): 168.30 - samples/sec: 958.30 - lr: 0.000005
2024-04-29 19:27:07,471 epoch 4 - iter 1188/1326 - loss 0.05135564 - time (sec): 189.00 - samples/sec: 954.33 - lr: 0.000005
2024-04-29 19:27:27,862 epoch 4 - iter 1320/1326 - loss 0.04986070 - time (sec): 209.40 - samples/sec: 940.94 - lr: 0.000005
2024-04-29 19:27:28,674 ----------------------------------------------------------------------------------------------------
2024-04-29 19:27:28,674 EPOCH 4 done: loss 0.0498 - lr 0.000005
2024-04-29 19:27:35,422 Evaluating as a multi-label problem: False
2024-04-29 19:27:35,430 DEV : loss 0.06857836991548538 - f1-score (micro avg)  0.7354
2024-04-29 19:27:35,440 saving best model
2024-04-29 19:27:37,201 ----------------------------------------------------------------------------------------------------
2024-04-29 19:27:57,750 epoch 5 - iter 132/1326 - loss 0.05922121 - time (sec): 20.55 - samples/sec: 900.56 - lr: 0.000004
2024-04-29 19:28:18,733 epoch 5 - iter 264/1326 - loss 0.04787888 - time (sec): 41.53 - samples/sec: 920.18 - lr: 0.000004
2024-04-29 19:28:39,884 epoch 5 - iter 396/1326 - loss 0.04606074 - time (sec): 62.68 - samples/sec: 933.19 - lr: 0.000004
2024-04-29 19:29:00,559 epoch 5 - iter 528/1326 - loss 0.04040456 - time (sec): 83.36 - samples/sec: 939.85 - lr: 0.000004
2024-04-29 19:29:21,358 epoch 5 - iter 660/1326 - loss 0.03768252 - time (sec): 104.16 - samples/sec: 939.20 - lr: 0.000004
2024-04-29 19:29:42,554 epoch 5 - iter 792/1326 - loss 0.03721055 - time (sec): 125.35 - samples/sec: 954.40 - lr: 0.000004
2024-04-29 19:30:03,515 epoch 5 - iter 924/1326 - loss 0.04173413 - time (sec): 146.31 - samples/sec: 952.84 - lr: 0.000004
2024-04-29 19:30:24,180 epoch 5 - iter 1056/1326 - loss 0.04019307 - time (sec): 166.98 - samples/sec: 945.63 - lr: 0.000004
2024-04-29 19:30:44,955 epoch 5 - iter 1188/1326 - loss 0.04065091 - time (sec): 187.75 - samples/sec: 945.36 - lr: 0.000004
2024-04-29 19:31:05,582 epoch 5 - iter 1320/1326 - loss 0.03986708 - time (sec): 208.38 - samples/sec: 944.48 - lr: 0.000004
2024-04-29 19:31:06,418 ----------------------------------------------------------------------------------------------------
2024-04-29 19:31:06,418 EPOCH 5 done: loss 0.0397 - lr 0.000004
2024-04-29 19:31:13,174 Evaluating as a multi-label problem: False
2024-04-29 19:31:13,181 DEV : loss 0.08110673725605011 - f1-score (micro avg)  0.7018
2024-04-29 19:31:13,191 ----------------------------------------------------------------------------------------------------
2024-04-29 19:31:34,400 epoch 6 - iter 132/1326 - loss 0.03815522 - time (sec): 21.21 - samples/sec: 956.70 - lr: 0.000004
2024-04-29 19:31:55,259 epoch 6 - iter 264/1326 - loss 0.02812200 - time (sec): 42.07 - samples/sec: 942.47 - lr: 0.000004
2024-04-29 19:32:16,061 epoch 6 - iter 396/1326 - loss 0.02891512 - time (sec): 62.87 - samples/sec: 943.72 - lr: 0.000004
2024-04-29 19:32:36,977 epoch 6 - iter 528/1326 - loss 0.03180526 - time (sec): 83.79 - samples/sec: 958.40 - lr: 0.000004
2024-04-29 19:32:57,437 epoch 6 - iter 660/1326 - loss 0.03231067 - time (sec): 104.25 - samples/sec: 933.33 - lr: 0.000004
2024-04-29 19:33:18,594 epoch 6 - iter 792/1326 - loss 0.03713967 - time (sec): 125.40 - samples/sec: 948.30 - lr: 0.000004
2024-04-29 19:33:39,684 epoch 6 - iter 924/1326 - loss 0.04015671 - time (sec): 146.49 - samples/sec: 954.01 - lr: 0.000004
2024-04-29 19:34:00,351 epoch 6 - iter 1056/1326 - loss 0.03794536 - time (sec): 167.16 - samples/sec: 955.37 - lr: 0.000004
2024-04-29 19:34:21,179 epoch 6 - iter 1188/1326 - loss 0.03738144 - time (sec): 187.99 - samples/sec: 950.05 - lr: 0.000004
2024-04-29 19:34:41,735 epoch 6 - iter 1320/1326 - loss 0.03564541 - time (sec): 208.54 - samples/sec: 942.71 - lr: 0.000004
2024-04-29 19:34:42,592 ----------------------------------------------------------------------------------------------------
2024-04-29 19:34:42,592 EPOCH 6 done: loss 0.0355 - lr 0.000004
2024-04-29 19:34:49,634 Evaluating as a multi-label problem: False
2024-04-29 19:34:49,640 DEV : loss 0.07366479188203812 - f1-score (micro avg)  0.7319
2024-04-29 19:34:49,648 ----------------------------------------------------------------------------------------------------
2024-04-29 19:35:10,664 epoch 7 - iter 132/1326 - loss 0.02986449 - time (sec): 21.02 - samples/sec: 1005.59 - lr: 0.000004
2024-04-29 19:35:31,301 epoch 7 - iter 264/1326 - loss 0.02579215 - time (sec): 41.65 - samples/sec: 979.37 - lr: 0.000004
2024-04-29 19:35:51,752 epoch 7 - iter 396/1326 - loss 0.02557348 - time (sec): 62.10 - samples/sec: 929.89 - lr: 0.000004
2024-04-29 19:36:12,434 epoch 7 - iter 528/1326 - loss 0.02216509 - time (sec): 82.79 - samples/sec: 937.33 - lr: 0.000004
2024-04-29 19:36:33,493 epoch 7 - iter 660/1326 - loss 0.03440919 - time (sec): 103.84 - samples/sec: 940.35 - lr: 0.000004
2024-04-29 19:36:54,140 epoch 7 - iter 792/1326 - loss 0.03607959 - time (sec): 124.49 - samples/sec: 937.29 - lr: 0.000004
2024-04-29 19:37:14,771 epoch 7 - iter 924/1326 - loss 0.03383704 - time (sec): 145.12 - samples/sec: 934.65 - lr: 0.000004
2024-04-29 19:37:36,142 epoch 7 - iter 1056/1326 - loss 0.03426834 - time (sec): 166.49 - samples/sec: 949.38 - lr: 0.000004
2024-04-29 19:37:57,002 epoch 7 - iter 1188/1326 - loss 0.03352186 - time (sec): 187.35 - samples/sec: 943.93 - lr: 0.000004
2024-04-29 19:38:17,959 epoch 7 - iter 1320/1326 - loss 0.03484361 - time (sec): 208.31 - samples/sec: 945.40 - lr: 0.000004
2024-04-29 19:38:18,781 ----------------------------------------------------------------------------------------------------
2024-04-29 19:38:18,781 EPOCH 7 done: loss 0.0347 - lr 0.000004
2024-04-29 19:38:25,463 Evaluating as a multi-label problem: False
2024-04-29 19:38:25,471 DEV : loss 0.07581108063459396 - f1-score (micro avg)  0.6724
2024-04-29 19:38:25,479 ----------------------------------------------------------------------------------------------------
2024-04-29 19:38:47,172 epoch 8 - iter 132/1326 - loss 0.01665357 - time (sec): 21.69 - samples/sec: 1085.75 - lr: 0.000004
2024-04-29 19:39:08,019 epoch 8 - iter 264/1326 - loss 0.01646746 - time (sec): 42.54 - samples/sec: 1005.05 - lr: 0.000004
2024-04-29 19:39:28,795 epoch 8 - iter 396/1326 - loss 0.02015931 - time (sec): 63.32 - samples/sec: 966.81 - lr: 0.000004
2024-04-29 19:39:49,482 epoch 8 - iter 528/1326 - loss 0.01847810 - time (sec): 84.00 - samples/sec: 950.00 - lr: 0.000003
2024-04-29 19:40:10,218 epoch 8 - iter 660/1326 - loss 0.02597538 - time (sec): 104.74 - samples/sec: 938.07 - lr: 0.000003
2024-04-29 19:40:30,984 epoch 8 - iter 792/1326 - loss 0.03006068 - time (sec): 125.51 - samples/sec: 939.37 - lr: 0.000003
2024-04-29 19:40:51,804 epoch 8 - iter 924/1326 - loss 0.03293034 - time (sec): 146.33 - samples/sec: 934.87 - lr: 0.000003
2024-04-29 19:41:12,625 epoch 8 - iter 1056/1326 - loss 0.03359083 - time (sec): 167.15 - samples/sec: 933.82 - lr: 0.000003
2024-04-29 19:41:33,314 epoch 8 - iter 1188/1326 - loss 0.03258698 - time (sec): 187.84 - samples/sec: 935.06 - lr: 0.000003
2024-04-29 19:41:54,067 epoch 8 - iter 1320/1326 - loss 0.03148140 - time (sec): 208.59 - samples/sec: 935.60 - lr: 0.000003
2024-04-29 19:41:55,132 ----------------------------------------------------------------------------------------------------
2024-04-29 19:41:55,133 EPOCH 8 done: loss 0.0311 - lr 0.000003
2024-04-29 19:42:01,776 Evaluating as a multi-label problem: False
2024-04-29 19:42:01,783 DEV : loss 0.07848106324672699 - f1-score (micro avg)  0.7319
2024-04-29 19:42:01,793 ----------------------------------------------------------------------------------------------------
2024-04-29 19:42:23,088 epoch 9 - iter 132/1326 - loss 0.05663547 - time (sec): 21.30 - samples/sec: 957.54 - lr: 0.000003
2024-04-29 19:42:43,801 epoch 9 - iter 264/1326 - loss 0.03800695 - time (sec): 42.01 - samples/sec: 912.51 - lr: 0.000003
2024-04-29 19:43:04,820 epoch 9 - iter 396/1326 - loss 0.04253517 - time (sec): 63.03 - samples/sec: 935.29 - lr: 0.000003
2024-04-29 19:43:26,182 epoch 9 - iter 528/1326 - loss 0.03301648 - time (sec): 84.39 - samples/sec: 979.99 - lr: 0.000003
2024-04-29 19:43:46,865 epoch 9 - iter 660/1326 - loss 0.03207756 - time (sec): 105.07 - samples/sec: 970.86 - lr: 0.000003
2024-04-29 19:44:07,837 epoch 9 - iter 792/1326 - loss 0.02954204 - time (sec): 126.04 - samples/sec: 968.05 - lr: 0.000003
2024-04-29 19:44:28,325 epoch 9 - iter 924/1326 - loss 0.02863646 - time (sec): 146.53 - samples/sec: 957.30 - lr: 0.000003
2024-04-29 19:44:49,246 epoch 9 - iter 1056/1326 - loss 0.02794484 - time (sec): 167.45 - samples/sec: 953.28 - lr: 0.000003
2024-04-29 19:45:09,934 epoch 9 - iter 1188/1326 - loss 0.02632601 - time (sec): 188.14 - samples/sec: 957.31 - lr: 0.000003
2024-04-29 19:45:30,508 epoch 9 - iter 1320/1326 - loss 0.02520696 - time (sec): 208.72 - samples/sec: 944.49 - lr: 0.000003
2024-04-29 19:45:31,338 ----------------------------------------------------------------------------------------------------
2024-04-29 19:45:31,338 EPOCH 9 done: loss 0.0251 - lr 0.000003
2024-04-29 19:45:38,067 Evaluating as a multi-label problem: False
2024-04-29 19:45:38,074 DEV : loss 0.08854210376739502 - f1-score (micro avg)  0.7296
2024-04-29 19:45:38,082 ----------------------------------------------------------------------------------------------------
2024-04-29 19:45:59,093 epoch 10 - iter 132/1326 - loss 0.01233456 - time (sec): 21.01 - samples/sec: 955.33 - lr: 0.000003
2024-04-29 19:46:19,871 epoch 10 - iter 264/1326 - loss 0.01442453 - time (sec): 41.79 - samples/sec: 952.73 - lr: 0.000003
2024-04-29 19:46:40,768 epoch 10 - iter 396/1326 - loss 0.02226487 - time (sec): 62.69 - samples/sec: 949.03 - lr: 0.000003
2024-04-29 19:47:01,240 epoch 10 - iter 528/1326 - loss 0.02393851 - time (sec): 83.16 - samples/sec: 934.21 - lr: 0.000003
2024-04-29 19:47:22,284 epoch 10 - iter 660/1326 - loss 0.02377848 - time (sec): 104.20 - samples/sec: 950.07 - lr: 0.000003
2024-04-29 19:47:42,834 epoch 10 - iter 792/1326 - loss 0.02245760 - time (sec): 124.75 - samples/sec: 938.46 - lr: 0.000003
2024-04-29 19:48:04,028 epoch 10 - iter 924/1326 - loss 0.02302608 - time (sec): 145.95 - samples/sec: 946.93 - lr: 0.000003
2024-04-29 19:48:24,992 epoch 10 - iter 1056/1326 - loss 0.02390901 - time (sec): 166.91 - samples/sec: 950.20 - lr: 0.000003
2024-04-29 19:48:45,625 epoch 10 - iter 1188/1326 - loss 0.02196178 - time (sec): 187.54 - samples/sec: 947.51 - lr: 0.000003
2024-04-29 19:49:06,386 epoch 10 - iter 1320/1326 - loss 0.02391247 - time (sec): 208.30 - samples/sec: 941.85 - lr: 0.000003
2024-04-29 19:49:07,302 ----------------------------------------------------------------------------------------------------
2024-04-29 19:49:07,302 EPOCH 10 done: loss 0.0237 - lr 0.000003
2024-04-29 19:49:14,037 Evaluating as a multi-label problem: False
2024-04-29 19:49:14,044 DEV : loss 0.06483861804008484 - f1-score (micro avg)  0.6888
2024-04-29 19:49:14,054 ----------------------------------------------------------------------------------------------------
2024-04-29 19:49:35,184 epoch 11 - iter 132/1326 - loss 0.01376016 - time (sec): 21.13 - samples/sec: 954.01 - lr: 0.000002
2024-04-29 19:49:56,040 epoch 11 - iter 264/1326 - loss 0.00968912 - time (sec): 41.99 - samples/sec: 983.60 - lr: 0.000002
2024-04-29 19:50:17,256 epoch 11 - iter 396/1326 - loss 0.01934988 - time (sec): 63.20 - samples/sec: 982.16 - lr: 0.000002
2024-04-29 19:50:37,928 epoch 11 - iter 528/1326 - loss 0.02163005 - time (sec): 83.87 - samples/sec: 960.39 - lr: 0.000002
2024-04-29 19:50:58,697 epoch 11 - iter 660/1326 - loss 0.02394657 - time (sec): 104.64 - samples/sec: 960.85 - lr: 0.000002
2024-04-29 19:51:19,441 epoch 11 - iter 792/1326 - loss 0.02269684 - time (sec): 125.39 - samples/sec: 961.89 - lr: 0.000002
2024-04-29 19:51:40,444 epoch 11 - iter 924/1326 - loss 0.02139990 - time (sec): 146.39 - samples/sec: 962.55 - lr: 0.000002
2024-04-29 19:52:00,980 epoch 11 - iter 1056/1326 - loss 0.02027515 - time (sec): 166.93 - samples/sec: 952.95 - lr: 0.000002
2024-04-29 19:52:21,518 epoch 11 - iter 1188/1326 - loss 0.02049033 - time (sec): 187.46 - samples/sec: 947.71 - lr: 0.000002
2024-04-29 19:52:42,468 epoch 11 - iter 1320/1326 - loss 0.02230984 - time (sec): 208.41 - samples/sec: 946.46 - lr: 0.000002
2024-04-29 19:52:43,279 ----------------------------------------------------------------------------------------------------
2024-04-29 19:52:43,279 EPOCH 11 done: loss 0.0223 - lr 0.000002
2024-04-29 19:52:50,027 Evaluating as a multi-label problem: False
2024-04-29 19:52:50,034 DEV : loss 0.06557412445545197 - f1-score (micro avg)  0.7149
2024-04-29 19:52:50,043 ----------------------------------------------------------------------------------------------------
2024-04-29 19:53:11,582 epoch 12 - iter 132/1326 - loss 0.01024615 - time (sec): 21.54 - samples/sec: 1071.15 - lr: 0.000002
2024-04-29 19:53:32,395 epoch 12 - iter 264/1326 - loss 0.00787421 - time (sec): 42.35 - samples/sec: 1031.69 - lr: 0.000002
2024-04-29 19:53:52,905 epoch 12 - iter 396/1326 - loss 0.01420230 - time (sec): 62.86 - samples/sec: 977.17 - lr: 0.000002
2024-04-29 19:54:13,373 epoch 12 - iter 528/1326 - loss 0.01885577 - time (sec): 83.33 - samples/sec: 946.77 - lr: 0.000002
2024-04-29 19:54:33,970 epoch 12 - iter 660/1326 - loss 0.01639885 - time (sec): 103.93 - samples/sec: 939.51 - lr: 0.000002
2024-04-29 19:54:54,465 epoch 12 - iter 792/1326 - loss 0.01795589 - time (sec): 124.42 - samples/sec: 926.71 - lr: 0.000002
2024-04-29 19:55:15,425 epoch 12 - iter 924/1326 - loss 0.01955026 - time (sec): 145.38 - samples/sec: 932.15 - lr: 0.000002
2024-04-29 19:55:36,226 epoch 12 - iter 1056/1326 - loss 0.01918242 - time (sec): 166.18 - samples/sec: 937.12 - lr: 0.000002
2024-04-29 19:55:57,215 epoch 12 - iter 1188/1326 - loss 0.02175725 - time (sec): 187.17 - samples/sec: 940.08 - lr: 0.000002
2024-04-29 19:56:18,316 epoch 12 - iter 1320/1326 - loss 0.02169361 - time (sec): 208.27 - samples/sec: 945.08 - lr: 0.000002
2024-04-29 19:56:19,169 ----------------------------------------------------------------------------------------------------
2024-04-29 19:56:19,169 EPOCH 12 done: loss 0.0216 - lr 0.000002
2024-04-29 19:56:25,901 Evaluating as a multi-label problem: False
2024-04-29 19:56:25,908 DEV : loss 0.06691710650920868 - f1-score (micro avg)  0.7311
2024-04-29 19:56:25,917 ----------------------------------------------------------------------------------------------------
2024-04-29 19:56:46,852 epoch 13 - iter 132/1326 - loss 0.03298837 - time (sec): 20.93 - samples/sec: 897.53 - lr: 0.000002
2024-04-29 19:57:07,720 epoch 13 - iter 264/1326 - loss 0.02150903 - time (sec): 41.80 - samples/sec: 951.62 - lr: 0.000002
2024-04-29 19:57:28,830 epoch 13 - iter 396/1326 - loss 0.02136301 - time (sec): 62.91 - samples/sec: 965.03 - lr: 0.000002
2024-04-29 19:57:49,321 epoch 13 - iter 528/1326 - loss 0.01898453 - time (sec): 83.40 - samples/sec: 945.01 - lr: 0.000002
2024-04-29 19:58:09,888 epoch 13 - iter 660/1326 - loss 0.01879336 - time (sec): 103.97 - samples/sec: 935.56 - lr: 0.000002
2024-04-29 19:58:30,626 epoch 13 - iter 792/1326 - loss 0.01660965 - time (sec): 124.71 - samples/sec: 939.56 - lr: 0.000002
2024-04-29 19:58:50,952 epoch 13 - iter 924/1326 - loss 0.01499323 - time (sec): 145.03 - samples/sec: 927.70 - lr: 0.000001
2024-04-29 19:59:12,003 epoch 13 - iter 1056/1326 - loss 0.01833583 - time (sec): 166.09 - samples/sec: 931.71 - lr: 0.000001
2024-04-29 19:59:32,777 epoch 13 - iter 1188/1326 - loss 0.01712627 - time (sec): 186.86 - samples/sec: 940.76 - lr: 0.000001
2024-04-29 19:59:53,925 epoch 13 - iter 1320/1326 - loss 0.01715871 - time (sec): 208.01 - samples/sec: 947.16 - lr: 0.000001
2024-04-29 19:59:54,752 ----------------------------------------------------------------------------------------------------
2024-04-29 19:59:54,752 EPOCH 13 done: loss 0.0171 - lr 0.000001
2024-04-29 20:00:01,490 Evaluating as a multi-label problem: False
2024-04-29 20:00:01,498 DEV : loss 0.06450295448303223 - f1-score (micro avg)  0.75
2024-04-29 20:00:01,507 saving best model
2024-04-29 20:00:03,658 ----------------------------------------------------------------------------------------------------
2024-04-29 20:00:24,702 epoch 14 - iter 132/1326 - loss 0.03889764 - time (sec): 21.04 - samples/sec: 941.38 - lr: 0.000001
2024-04-29 20:00:45,459 epoch 14 - iter 264/1326 - loss 0.02605007 - time (sec): 41.80 - samples/sec: 966.52 - lr: 0.000001
2024-04-29 20:01:06,329 epoch 14 - iter 396/1326 - loss 0.01987798 - time (sec): 62.67 - samples/sec: 978.59 - lr: 0.000001
2024-04-29 20:01:27,084 epoch 14 - iter 528/1326 - loss 0.01886847 - time (sec): 83.43 - samples/sec: 974.50 - lr: 0.000001
2024-04-29 20:01:47,683 epoch 14 - iter 660/1326 - loss 0.01798242 - time (sec): 104.02 - samples/sec: 955.07 - lr: 0.000001
2024-04-29 20:02:08,157 epoch 14 - iter 792/1326 - loss 0.01593590 - time (sec): 124.50 - samples/sec: 943.49 - lr: 0.000001
2024-04-29 20:02:29,259 epoch 14 - iter 924/1326 - loss 0.01623625 - time (sec): 145.60 - samples/sec: 947.55 - lr: 0.000001
2024-04-29 20:02:49,779 epoch 14 - iter 1056/1326 - loss 0.01708562 - time (sec): 166.12 - samples/sec: 936.69 - lr: 0.000001
2024-04-29 20:03:10,855 epoch 14 - iter 1188/1326 - loss 0.01556387 - time (sec): 187.20 - samples/sec: 949.03 - lr: 0.000001
2024-04-29 20:03:31,683 epoch 14 - iter 1320/1326 - loss 0.01533842 - time (sec): 208.02 - samples/sec: 947.01 - lr: 0.000001
2024-04-29 20:03:32,470 ----------------------------------------------------------------------------------------------------
2024-04-29 20:03:32,470 EPOCH 14 done: loss 0.0153 - lr 0.000001
2024-04-29 20:03:39,240 Evaluating as a multi-label problem: False
2024-04-29 20:03:39,247 DEV : loss 0.0911756381392479 - f1-score (micro avg)  0.7288
2024-04-29 20:03:39,257 ----------------------------------------------------------------------------------------------------
2024-04-29 20:04:00,018 epoch 15 - iter 132/1326 - loss 0.01237652 - time (sec): 20.76 - samples/sec: 878.16 - lr: 0.000001
2024-04-29 20:04:20,615 epoch 15 - iter 264/1326 - loss 0.01436397 - time (sec): 41.36 - samples/sec: 879.70 - lr: 0.000001
2024-04-29 20:04:41,840 epoch 15 - iter 396/1326 - loss 0.01188224 - time (sec): 62.58 - samples/sec: 935.60 - lr: 0.000001
2024-04-29 20:05:02,449 epoch 15 - iter 528/1326 - loss 0.01191348 - time (sec): 83.19 - samples/sec: 931.41 - lr: 0.000001
2024-04-29 20:05:23,576 epoch 15 - iter 660/1326 - loss 0.01318250 - time (sec): 104.32 - samples/sec: 936.74 - lr: 0.000001
2024-04-29 20:05:44,259 epoch 15 - iter 792/1326 - loss 0.01610301 - time (sec): 125.00 - samples/sec: 935.86 - lr: 0.000001
2024-04-29 20:06:05,148 epoch 15 - iter 924/1326 - loss 0.01402320 - time (sec): 145.89 - samples/sec: 935.57 - lr: 0.000001
2024-04-29 20:06:26,080 epoch 15 - iter 1056/1326 - loss 0.01456286 - time (sec): 166.82 - samples/sec: 943.62 - lr: 0.000001
2024-04-29 20:06:46,684 epoch 15 - iter 1188/1326 - loss 0.01366503 - time (sec): 187.43 - samples/sec: 941.11 - lr: 0.000001
2024-04-29 20:07:07,514 epoch 15 - iter 1320/1326 - loss 0.01271998 - time (sec): 208.26 - samples/sec: 946.56 - lr: 0.000001
2024-04-29 20:07:08,330 ----------------------------------------------------------------------------------------------------
2024-04-29 20:07:08,330 EPOCH 15 done: loss 0.0127 - lr 0.000001
2024-04-29 20:07:15,376 Evaluating as a multi-label problem: False
2024-04-29 20:07:15,383 DEV : loss 0.0763971135020256 - f1-score (micro avg)  0.7424
2024-04-29 20:07:15,392 ----------------------------------------------------------------------------------------------------
2024-04-29 20:07:35,973 epoch 16 - iter 132/1326 - loss 0.00604141 - time (sec): 20.58 - samples/sec: 925.87 - lr: 0.000001
2024-04-29 20:07:56,877 epoch 16 - iter 264/1326 - loss 0.00962106 - time (sec): 41.48 - samples/sec: 944.64 - lr: 0.000001
2024-04-29 20:08:17,697 epoch 16 - iter 396/1326 - loss 0.00897610 - time (sec): 62.30 - samples/sec: 966.51 - lr: 0.000001
2024-04-29 20:08:38,452 epoch 16 - iter 528/1326 - loss 0.00930250 - time (sec): 83.06 - samples/sec: 969.77 - lr: 0.000001
2024-04-29 20:08:58,968 epoch 16 - iter 660/1326 - loss 0.01240910 - time (sec): 103.58 - samples/sec: 948.32 - lr: 0.000001
2024-04-29 20:09:19,865 epoch 16 - iter 792/1326 - loss 0.01194240 - time (sec): 124.47 - samples/sec: 955.31 - lr: 0.000001
2024-04-29 20:09:40,936 epoch 16 - iter 924/1326 - loss 0.01136229 - time (sec): 145.54 - samples/sec: 954.94 - lr: 0.000001
2024-04-29 20:10:01,211 epoch 16 - iter 1056/1326 - loss 0.01241511 - time (sec): 165.82 - samples/sec: 941.01 - lr: 0.000001
2024-04-29 20:10:22,278 epoch 16 - iter 1188/1326 - loss 0.01237126 - time (sec): 186.89 - samples/sec: 943.29 - lr: 0.000001
2024-04-29 20:10:43,285 epoch 16 - iter 1320/1326 - loss 0.01312447 - time (sec): 207.89 - samples/sec: 945.12 - lr: 0.000000
2024-04-29 20:10:44,167 ----------------------------------------------------------------------------------------------------
2024-04-29 20:10:44,168 EPOCH 16 done: loss 0.0131 - lr 0.000000
2024-04-29 20:10:51,196 Evaluating as a multi-label problem: False
2024-04-29 20:10:51,203 DEV : loss 0.07494457066059113 - f1-score (micro avg)  0.7532
2024-04-29 20:10:51,212 saving best model
2024-04-29 20:10:53,114 ----------------------------------------------------------------------------------------------------
2024-04-29 20:11:14,140 epoch 17 - iter 132/1326 - loss 0.01086358 - time (sec): 21.03 - samples/sec: 984.72 - lr: 0.000000
2024-04-29 20:11:34,547 epoch 17 - iter 264/1326 - loss 0.00784167 - time (sec): 41.43 - samples/sec: 909.28 - lr: 0.000000
2024-04-29 20:11:55,214 epoch 17 - iter 396/1326 - loss 0.00810142 - time (sec): 62.10 - samples/sec: 903.48 - lr: 0.000000
2024-04-29 20:12:16,277 epoch 17 - iter 528/1326 - loss 0.01276904 - time (sec): 83.16 - samples/sec: 959.18 - lr: 0.000000
2024-04-29 20:12:37,331 epoch 17 - iter 660/1326 - loss 0.01460244 - time (sec): 104.22 - samples/sec: 968.65 - lr: 0.000000
2024-04-29 20:12:58,109 epoch 17 - iter 792/1326 - loss 0.01585436 - time (sec): 124.99 - samples/sec: 959.39 - lr: 0.000000
2024-04-29 20:13:18,704 epoch 17 - iter 924/1326 - loss 0.01492635 - time (sec): 145.59 - samples/sec: 951.40 - lr: 0.000000
2024-04-29 20:13:39,820 epoch 17 - iter 1056/1326 - loss 0.01390514 - time (sec): 166.71 - samples/sec: 959.19 - lr: 0.000000
2024-04-29 20:14:00,123 epoch 17 - iter 1188/1326 - loss 0.01331555 - time (sec): 187.01 - samples/sec: 942.26 - lr: 0.000000
2024-04-29 20:14:21,036 epoch 17 - iter 1320/1326 - loss 0.01345542 - time (sec): 207.92 - samples/sec: 947.22 - lr: 0.000000
2024-04-29 20:14:21,870 ----------------------------------------------------------------------------------------------------
2024-04-29 20:14:21,870 EPOCH 17 done: loss 0.0135 - lr 0.000000
2024-04-29 20:14:28,607 Evaluating as a multi-label problem: False
2024-04-29 20:14:28,614 DEV : loss 0.07131695002317429 - f1-score (micro avg)  0.7265
2024-04-29 20:14:28,624 ----------------------------------------------------------------------------------------------------
2024-04-29 20:14:49,351 epoch 18 - iter 132/1326 - loss 0.00639355 - time (sec): 20.73 - samples/sec: 843.00 - lr: 0.000000
2024-04-29 20:15:10,862 epoch 18 - iter 264/1326 - loss 0.00675824 - time (sec): 42.24 - samples/sec: 972.00 - lr: 0.000000
2024-04-29 20:15:31,579 epoch 18 - iter 396/1326 - loss 0.00743064 - time (sec): 62.95 - samples/sec: 969.54 - lr: 0.000000
2024-04-29 20:15:52,280 epoch 18 - iter 528/1326 - loss 0.00891165 - time (sec): 83.66 - samples/sec: 964.18 - lr: 0.000000
2024-04-29 20:16:13,313 epoch 18 - iter 660/1326 - loss 0.01136151 - time (sec): 104.69 - samples/sec: 974.25 - lr: 0.000000
2024-04-29 20:16:34,312 epoch 18 - iter 792/1326 - loss 0.01088078 - time (sec): 125.69 - samples/sec: 974.73 - lr: 0.000000
2024-04-29 20:16:54,598 epoch 18 - iter 924/1326 - loss 0.01056691 - time (sec): 145.97 - samples/sec: 955.73 - lr: 0.000000
2024-04-29 20:17:15,485 epoch 18 - iter 1056/1326 - loss 0.01338623 - time (sec): 166.86 - samples/sec: 954.02 - lr: 0.000000
2024-04-29 20:17:36,152 epoch 18 - iter 1188/1326 - loss 0.01294660 - time (sec): 187.53 - samples/sec: 946.42 - lr: 0.000000
2024-04-29 20:17:56,803 epoch 18 - iter 1320/1326 - loss 0.01185896 - time (sec): 208.18 - samples/sec: 943.11 - lr: 0.000000
2024-04-29 20:17:57,706 ----------------------------------------------------------------------------------------------------
2024-04-29 20:17:57,707 EPOCH 18 done: loss 0.0118 - lr 0.000000
2024-04-29 20:18:04,460 Evaluating as a multi-label problem: False
2024-04-29 20:18:04,467 DEV : loss 0.06895702332258224 - f1-score (micro avg)  0.7143
2024-04-29 20:18:04,477 ----------------------------------------------------------------------------------------------------
2024-04-29 20:18:25,241 epoch 19 - iter 132/1326 - loss 0.00779178 - time (sec): 20.76 - samples/sec: 859.34 - lr: 0.000000
2024-04-29 20:18:45,931 epoch 19 - iter 264/1326 - loss 0.00883730 - time (sec): 41.45 - samples/sec: 903.23 - lr: 0.000000
2024-04-29 20:19:06,584 epoch 19 - iter 396/1326 - loss 0.00865701 - time (sec): 62.11 - samples/sec: 929.04 - lr: 0.000000
2024-04-29 20:19:27,143 epoch 19 - iter 528/1326 - loss 0.00931012 - time (sec): 82.67 - samples/sec: 933.53 - lr: 0.000000
2024-04-29 20:19:47,967 epoch 19 - iter 660/1326 - loss 0.00893505 - time (sec): 103.49 - samples/sec: 941.05 - lr: 0.000000
2024-04-29 20:20:09,276 epoch 19 - iter 792/1326 - loss 0.00983372 - time (sec): 124.80 - samples/sec: 965.06 - lr: 0.000000
2024-04-29 20:20:29,681 epoch 19 - iter 924/1326 - loss 0.01071250 - time (sec): 145.20 - samples/sec: 946.81 - lr: 0.000000
2024-04-29 20:20:50,714 epoch 19 - iter 1056/1326 - loss 0.01008226 - time (sec): 166.24 - samples/sec: 945.81 - lr: 0.000000
2024-04-29 20:21:11,577 epoch 19 - iter 1188/1326 - loss 0.01218936 - time (sec): 187.10 - samples/sec: 948.77 - lr: 0.000000
2024-04-29 20:21:32,191 epoch 19 - iter 1320/1326 - loss 0.01151174 - time (sec): 207.71 - samples/sec: 946.17 - lr: 0.000000
2024-04-29 20:21:33,090 ----------------------------------------------------------------------------------------------------
2024-04-29 20:21:33,090 EPOCH 19 done: loss 0.0115 - lr 0.000000
2024-04-29 20:21:39,713 Evaluating as a multi-label problem: False
2024-04-29 20:21:39,722 DEV : loss 0.06978413462638855 - f1-score (micro avg)  0.7296
2024-04-29 20:21:39,734 ----------------------------------------------------------------------------------------------------
2024-04-29 20:22:00,944 epoch 20 - iter 132/1326 - loss 0.00412526 - time (sec): 21.21 - samples/sec: 947.92 - lr: 0.000000
2024-04-29 20:22:21,834 epoch 20 - iter 264/1326 - loss 0.00391955 - time (sec): 42.10 - samples/sec: 947.36 - lr: 0.000000
2024-04-29 20:22:42,606 epoch 20 - iter 396/1326 - loss 0.00617386 - time (sec): 62.87 - samples/sec: 941.76 - lr: 0.000000
2024-04-29 20:23:02,937 epoch 20 - iter 528/1326 - loss 0.00598707 - time (sec): 83.20 - samples/sec: 924.06 - lr: 0.000000
2024-04-29 20:23:23,893 epoch 20 - iter 660/1326 - loss 0.00815138 - time (sec): 104.16 - samples/sec: 928.58 - lr: 0.000000
2024-04-29 20:23:45,135 epoch 20 - iter 792/1326 - loss 0.00815129 - time (sec): 125.40 - samples/sec: 958.62 - lr: 0.000000
2024-04-29 20:24:05,808 epoch 20 - iter 924/1326 - loss 0.00848857 - time (sec): 146.07 - samples/sec: 951.78 - lr: 0.000000
2024-04-29 20:24:26,900 epoch 20 - iter 1056/1326 - loss 0.00786540 - time (sec): 167.17 - samples/sec: 960.67 - lr: 0.000000
2024-04-29 20:24:47,236 epoch 20 - iter 1188/1326 - loss 0.00819986 - time (sec): 187.50 - samples/sec: 949.58 - lr: 0.000000
2024-04-29 20:25:07,654 epoch 20 - iter 1320/1326 - loss 0.01085452 - time (sec): 207.92 - samples/sec: 944.93 - lr: 0.000000
2024-04-29 20:25:08,614 ----------------------------------------------------------------------------------------------------
2024-04-29 20:25:08,614 EPOCH 20 done: loss 0.0108 - lr 0.000000
2024-04-29 20:25:15,217 Evaluating as a multi-label problem: False
2024-04-29 20:25:15,224 DEV : loss 0.06992758810520172 - f1-score (micro avg)  0.7296
2024-04-29 20:25:16,992 ----------------------------------------------------------------------------------------------------
2024-04-29 20:25:47,102 SequenceTagger predicts: Dictionary with 25 tags: O, S-ORG, B-ORG, E-ORG, I-ORG, S-LOC, B-LOC, E-LOC, I-LOC, S-PER, B-PER, E-PER, I-PER, S-MISC, B-MISC, E-MISC, I-MISC, S-UTE, B-UTE, E-UTE, I-UTE, S-SINGLE_COMPANY, B-SINGLE_COMPANY, E-SINGLE_COMPANY, I-SINGLE_COMPANY
2024-04-29 20:25:54,356 Evaluating as a multi-label problem: False
2024-04-29 20:25:54,363 0.7039	0.7868	0.7431	0.5944
2024-04-29 20:25:54,363 
Results:
- F-score (micro) 0.7431
- F-score (macro) 0.7429
- Accuracy 0.5944

By class:
                precision    recall  f1-score   support

           UTE     0.7568    0.7887    0.7724        71
SINGLE_COMPANY     0.6538    0.7846    0.7133        65

     micro avg     0.7039    0.7868    0.7431       136
     macro avg     0.7053    0.7867    0.7429       136
  weighted avg     0.7076    0.7868    0.7442       136

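As a quick sanity check, the summary numbers above follow directly from the per-class table: the micro F-score is the harmonic mean of the micro-averaged precision and recall, and the macro F-score is the unweighted mean of the two class F-scores. A short illustration in Python:

    # micro F1 from the micro-averaged precision/recall reported above
    p_micro, r_micro = 0.7039, 0.7868
    print(round(2 * p_micro * r_micro / (p_micro + r_micro), 4))  # 0.7431

    # macro F1 as the unweighted mean of the UTE and SINGLE_COMPANY F-scores
    f1_ute, f1_single_company = 0.7724, 0.7133
    print((f1_ute + f1_single_company) / 2)  # ~0.74285, i.e. the reported 0.7429 after rounding
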
2024-04-29 20:25:54,363 ----------------------------------------------------------------------------------------------------
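
To use the tagger, the best checkpoint written during training can be loaded with Flair and applied to plain sentences. A minimal usage sketch: the local path is the training base path from this log (substitute the published model identifier if loading it from the Hub instead), and the example sentence is made up.

    from flair.data import Sentence
    from flair.models import SequenceTagger

    # best-model.pt is the checkpoint saved at the "saving best model" steps above
    tagger = SequenceTagger.load("resources/taggers/ner-spanish-large-np-finetune/best-model.pt")

    sentence = Sentence("La UTE Acciona-Ferrovial construirá la nueva autovía.")  # invented example
    tagger.predict(sentence)

    for entity in sentence.get_spans("ner"):
        print(entity)  # span text plus its predicted label (e.g. UTE or SINGLE_COMPANY) and score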