2023-10-20 00:19:28,193 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:28,193 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 128)
        (position_embeddings): Embedding(512, 128)
        (token_type_embeddings): Embedding(2, 128)
        (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-1): 2 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=128, out_features=128, bias=True)
                (key): Linear(in_features=128, out_features=128, bias=True)
                (value): Linear(in_features=128, out_features=128, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=128, out_features=128, bias=True)
                (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=128, out_features=512, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=512, out_features=128, bias=True)
              (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=128, out_features=128, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=128, out_features=17, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-20 00:19:28,193 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:28,193 MultiCorpus: 1085 train + 148 dev + 364 test sentences
 - NER_HIPE_2022 Corpus: 1085 train + 148 dev + 364 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/newseye/sv/with_doc_seperator
2023-10-20 00:19:28,193 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:28,193 Train:  1085 sentences
2023-10-20 00:19:28,194         (train_with_dev=False, train_with_test=False)
2023-10-20 00:19:28,194 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:28,194 Training Params:
2023-10-20 00:19:28,194  - learning_rate: "5e-05"
2023-10-20 00:19:28,194  - mini_batch_size: "8"
2023-10-20 00:19:28,194  - max_epochs: "10"
2023-10-20 00:19:28,194  - shuffle: "True"
2023-10-20 00:19:28,194 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:28,194 Plugins:
2023-10-20 00:19:28,194  - TensorboardLogger
2023-10-20 00:19:28,194  - LinearScheduler | warmup_fraction: '0.1'
2023-10-20 00:19:28,194 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:28,194 Final evaluation on model from best epoch (best-model.pt)
2023-10-20 00:19:28,194  - metric: "('micro avg', 'f1-score')"
2023-10-20 00:19:28,194 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:28,194 Computation:
2023-10-20 00:19:28,194  - compute on device: cuda:0
2023-10-20 00:19:28,194  - embedding storage: none
2023-10-20 00:19:28,194 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:28,194 Model training base path: "hmbench-newseye/sv-dbmdz/bert-tiny-historic-multilingual-cased-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3"
2023-10-20 00:19:28,194 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:28,194 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:28,194 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-20 00:19:28,543 epoch 1 - iter 13/136 - loss 3.19543078 - time (sec): 0.35 - samples/sec: 15469.24 - lr: 0.000004 - momentum: 0.000000
2023-10-20 00:19:28,948 epoch 1 - iter 26/136 - loss 3.22038822 - time (sec): 0.75 - samples/sec: 14165.33 - lr: 0.000009 - momentum: 0.000000
2023-10-20 00:19:29,293 epoch 1 - iter 39/136 - loss 3.19607271 - time (sec): 1.10 - samples/sec: 13517.65 - lr: 0.000014 - momentum: 0.000000
2023-10-20 00:19:29,657 epoch 1 - iter 52/136 - loss 3.04295524 - time (sec): 1.46 - samples/sec: 13726.22 - lr: 0.000019 - momentum: 0.000000
2023-10-20 00:19:30,009 epoch 1 - iter 65/136 - loss 2.92618227 - time (sec): 1.81 - samples/sec: 13779.48 - lr: 0.000024 - momentum: 0.000000
2023-10-20 00:19:30,360 epoch 1 - iter 78/136 - loss 2.80205878 - time (sec): 2.16 - samples/sec: 13685.90 - lr: 0.000028 - momentum: 0.000000
2023-10-20 00:19:30,743 epoch 1 - iter 91/136 - loss 2.59000035 - time (sec): 2.55 - samples/sec: 13756.84 - lr: 0.000033 - momentum: 0.000000
2023-10-20 00:19:31,105 epoch 1 - iter 104/136 - loss 2.40125823 - time (sec): 2.91 - samples/sec: 13836.15 - lr: 0.000038 - momentum: 0.000000
2023-10-20 00:19:31,472 epoch 1 - iter 117/136 - loss 2.22245030 - time (sec): 3.28 - samples/sec: 13768.69 - lr: 0.000043 - momentum: 0.000000
2023-10-20 00:19:31,828 epoch 1 - iter 130/136 - loss 2.09152747 - time (sec): 3.63 - samples/sec: 13646.48 - lr: 0.000047 - momentum: 0.000000
2023-10-20 00:19:31,981 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:31,981 EPOCH 1 done: loss 2.0204 - lr: 0.000047
2023-10-20 00:19:32,413 DEV : loss 0.48634469509124756 - f1-score (micro avg)  0.0
2023-10-20 00:19:32,417 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:32,782 epoch 2 - iter 13/136 - loss 0.61667082 - time (sec): 0.36 - samples/sec: 14605.12 - lr: 0.000050 - momentum: 0.000000
2023-10-20 00:19:33,118 epoch 2 - iter 26/136 - loss 0.59353872 - time (sec): 0.70 - samples/sec: 13857.27 - lr: 0.000049 - momentum: 0.000000
2023-10-20 00:19:33,469 epoch 2 - iter 39/136 - loss 0.59011983 - time (sec): 1.05 - samples/sec: 14077.07 - lr: 0.000048 - momentum: 0.000000
2023-10-20 00:19:33,825 epoch 2 - iter 52/136 - loss 0.60851898 - time (sec): 1.41 - samples/sec: 13582.25 - lr: 0.000048 - momentum: 0.000000
2023-10-20 00:19:34,159 epoch 2 - iter 65/136 - loss 0.61187084 - time (sec): 1.74 - samples/sec: 13912.05 - lr: 0.000047 - momentum: 0.000000
2023-10-20 00:19:34,497 epoch 2 - iter 78/136 - loss 0.60541885 - time (sec): 2.08 - samples/sec: 13825.03 - lr: 0.000047 - momentum: 0.000000
2023-10-20 00:19:34,868 epoch 2 - iter 91/136 - loss 0.59156458 - time (sec): 2.45 - samples/sec: 14217.21 - lr: 0.000046 - momentum: 0.000000
2023-10-20 00:19:35,206 epoch 2 - iter 104/136 - loss 0.58873765 - time (sec): 2.79 - samples/sec: 14234.13 - lr: 0.000046 - momentum: 0.000000
2023-10-20 00:19:35,569 epoch 2 - iter 117/136 - loss 0.57805974 - time (sec): 3.15 - samples/sec: 14271.10 - lr: 0.000045 - momentum: 0.000000
2023-10-20 00:19:35,934 epoch 2 - iter 130/136 - loss 0.57644076 - time (sec): 3.52 - samples/sec: 14234.98 - lr: 0.000045 - momentum: 0.000000
2023-10-20 00:19:36,083 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:36,083 EPOCH 2 done: loss 0.5772 - lr: 0.000045
2023-10-20 00:19:36,837 DEV : loss 0.3948337435722351 - f1-score (micro avg)  0.0075
2023-10-20 00:19:36,841 saving best model
2023-10-20 00:19:36,868 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:37,225 epoch 3 - iter 13/136 - loss 0.44779209 - time (sec): 0.36 - samples/sec: 16057.93 - lr: 0.000044 - momentum: 0.000000
2023-10-20 00:19:37,754 epoch 3 - iter 26/136 - loss 0.50576744 - time (sec): 0.88 - samples/sec: 12586.75 - lr: 0.000043 - momentum: 0.000000
2023-10-20 00:19:38,113 epoch 3 - iter 39/136 - loss 0.50317613 - time (sec): 1.24 - samples/sec: 12503.45 - lr: 0.000043 - momentum: 0.000000
2023-10-20 00:19:38,443 epoch 3 - iter 52/136 - loss 0.51708175 - time (sec): 1.57 - samples/sec: 12602.70 - lr: 0.000042 - momentum: 0.000000
2023-10-20 00:19:38,789 epoch 3 - iter 65/136 - loss 0.50451654 - time (sec): 1.92 - samples/sec: 12958.78 - lr: 0.000042 - momentum: 0.000000
2023-10-20 00:19:39,138 epoch 3 - iter 78/136 - loss 0.50040298 - time (sec): 2.27 - samples/sec: 13045.81 - lr: 0.000041 - momentum: 0.000000
2023-10-20 00:19:39,497 epoch 3 - iter 91/136 - loss 0.47992802 - time (sec): 2.63 - samples/sec: 13327.75 - lr: 0.000041 - momentum: 0.000000
2023-10-20 00:19:39,855 epoch 3 - iter 104/136 - loss 0.46795860 - time (sec): 2.99 - samples/sec: 13396.76 - lr: 0.000040 - momentum: 0.000000
2023-10-20 00:19:40,200 epoch 3 - iter 117/136 - loss 0.45778857 - time (sec): 3.33 - samples/sec: 13517.21 - lr: 0.000040 - momentum: 0.000000
2023-10-20 00:19:40,549 epoch 3 - iter 130/136 - loss 0.45961641 - time (sec): 3.68 - samples/sec: 13694.11 - lr: 0.000039 - momentum: 0.000000
2023-10-20 00:19:40,702 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:40,702 EPOCH 3 done: loss 0.4671 - lr: 0.000039
2023-10-20 00:19:41,462 DEV : loss 0.3406325578689575 - f1-score (micro avg)  0.0404
2023-10-20 00:19:41,465 saving best model
2023-10-20 00:19:41,501 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:41,853 epoch 4 - iter 13/136 - loss 0.48746090 - time (sec): 0.35 - samples/sec: 14872.86 - lr: 0.000038 - momentum: 0.000000
2023-10-20 00:19:42,204 epoch 4 - iter 26/136 - loss 0.44588645 - time (sec): 0.70 - samples/sec: 13722.59 - lr: 0.000038 - momentum: 0.000000
2023-10-20 00:19:42,561 epoch 4 - iter 39/136 - loss 0.42506908 - time (sec): 1.06 - samples/sec: 14534.35 - lr: 0.000037 - momentum: 0.000000
2023-10-20 00:19:42,880 epoch 4 - iter 52/136 - loss 0.44755287 - time (sec): 1.38 - samples/sec: 14231.25 - lr: 0.000037 - momentum: 0.000000
2023-10-20 00:19:43,226 epoch 4 - iter 65/136 - loss 0.43784752 - time (sec): 1.72 - samples/sec: 14269.97 - lr: 0.000036 - momentum: 0.000000
2023-10-20 00:19:43,565 epoch 4 - iter 78/136 - loss 0.43894956 - time (sec): 2.06 - samples/sec: 14082.79 - lr: 0.000036 - momentum: 0.000000
2023-10-20 00:19:43,922 epoch 4 - iter 91/136 - loss 0.42988161 - time (sec): 2.42 - samples/sec: 14071.62 - lr: 0.000035 - momentum: 0.000000
2023-10-20 00:19:44,290 epoch 4 - iter 104/136 - loss 0.44230228 - time (sec): 2.79 - samples/sec: 14248.51 - lr: 0.000035 - momentum: 0.000000
2023-10-20 00:19:44,637 epoch 4 - iter 117/136 - loss 0.42429817 - time (sec): 3.14 - samples/sec: 14318.88 - lr: 0.000034 - momentum: 0.000000
2023-10-20 00:19:45,006 epoch 4 - iter 130/136 - loss 0.42504051 - time (sec): 3.50 - samples/sec: 14173.93 - lr: 0.000034 - momentum: 0.000000
2023-10-20 00:19:45,183 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:45,184 EPOCH 4 done: loss 0.4245 - lr: 0.000034
2023-10-20 00:19:45,947 DEV : loss 0.31668657064437866 - f1-score (micro avg)  0.0994
2023-10-20 00:19:45,951 saving best model
2023-10-20 00:19:45,981 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:46,331 epoch 5 - iter 13/136 - loss 0.34120642 - time (sec): 0.35 - samples/sec: 16994.89 - lr: 0.000033 - momentum: 0.000000
2023-10-20 00:19:46,633 epoch 5 - iter 26/136 - loss 0.39190825 - time (sec): 0.65 - samples/sec: 16537.05 - lr: 0.000032 - momentum: 0.000000
2023-10-20 00:19:46,936 epoch 5 - iter 39/136 - loss 0.39795350 - time (sec): 0.95 - samples/sec: 16391.65 - lr: 0.000032 - momentum: 0.000000
2023-10-20 00:19:47,233 epoch 5 - iter 52/136 - loss 0.39454601 - time (sec): 1.25 - samples/sec: 16038.41 - lr: 0.000031 - momentum: 0.000000
2023-10-20 00:19:47,585 epoch 5 - iter 65/136 - loss 0.41093864 - time (sec): 1.60 - samples/sec: 15228.22 - lr: 0.000031 - momentum: 0.000000
2023-10-20 00:19:47,971 epoch 5 - iter 78/136 - loss 0.39943659 - time (sec): 1.99 - samples/sec: 14784.83 - lr: 0.000030 - momentum: 0.000000
2023-10-20 00:19:48,317 epoch 5 - iter 91/136 - loss 0.39752169 - time (sec): 2.33 - samples/sec: 14730.01 - lr: 0.000030 - momentum: 0.000000
2023-10-20 00:19:48,680 epoch 5 - iter 104/136 - loss 0.39856873 - time (sec): 2.70 - samples/sec: 14614.41 - lr: 0.000029 - momentum: 0.000000
2023-10-20 00:19:49,209 epoch 5 - iter 117/136 - loss 0.38988851 - time (sec): 3.23 - samples/sec: 14035.96 - lr: 0.000029 - momentum: 0.000000
2023-10-20 00:19:49,553 epoch 5 - iter 130/136 - loss 0.39838319 - time (sec): 3.57 - samples/sec: 13921.89 - lr: 0.000028 - momentum: 0.000000
2023-10-20 00:19:49,706 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:49,707 EPOCH 5 done: loss 0.3952 - lr: 0.000028
2023-10-20 00:19:50,491 DEV : loss 0.29026198387145996 - f1-score (micro avg)  0.225
2023-10-20 00:19:50,496 saving best model
2023-10-20 00:19:50,526 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:50,904 epoch 6 - iter 13/136 - loss 0.40501629 - time (sec): 0.38 - samples/sec: 15503.09 - lr: 0.000027 - momentum: 0.000000
2023-10-20 00:19:51,254 epoch 6 - iter 26/136 - loss 0.39227644 - time (sec): 0.73 - samples/sec: 14839.73 - lr: 0.000027 - momentum: 0.000000
2023-10-20 00:19:51,591 epoch 6 - iter 39/136 - loss 0.39543997 - time (sec): 1.06 - samples/sec: 14629.58 - lr: 0.000026 - momentum: 0.000000
2023-10-20 00:19:51,928 epoch 6 - iter 52/136 - loss 0.39707843 - time (sec): 1.40 - samples/sec: 14486.55 - lr: 0.000026 - momentum: 0.000000
2023-10-20 00:19:52,236 epoch 6 - iter 65/136 - loss 0.39641930 - time (sec): 1.71 - samples/sec: 14418.55 - lr: 0.000025 - momentum: 0.000000
2023-10-20 00:19:52,597 epoch 6 - iter 78/136 - loss 0.39710370 - time (sec): 2.07 - samples/sec: 14391.09 - lr: 0.000025 - momentum: 0.000000
2023-10-20 00:19:52,968 epoch 6 - iter 91/136 - loss 0.38469036 - time (sec): 2.44 - samples/sec: 14359.92 - lr: 0.000024 - momentum: 0.000000
2023-10-20 00:19:53,342 epoch 6 - iter 104/136 - loss 0.38388361 - time (sec): 2.82 - samples/sec: 14455.59 - lr: 0.000024 - momentum: 0.000000
2023-10-20 00:19:53,702 epoch 6 - iter 117/136 - loss 0.37635338 - time (sec): 3.18 - samples/sec: 14368.77 - lr: 0.000023 - momentum: 0.000000
2023-10-20 00:19:54,052 epoch 6 - iter 130/136 - loss 0.37725190 - time (sec): 3.53 - samples/sec: 14122.05 - lr: 0.000023 - momentum: 0.000000
2023-10-20 00:19:54,209 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:54,209 EPOCH 6 done: loss 0.3757 - lr: 0.000023
2023-10-20 00:19:54,975 DEV : loss 0.2947162091732025 - f1-score (micro avg)  0.2638
2023-10-20 00:19:54,980 saving best model
2023-10-20 00:19:55,012 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:55,361 epoch 7 - iter 13/136 - loss 0.38648801 - time (sec): 0.35 - samples/sec: 14180.39 - lr: 0.000022 - momentum: 0.000000
2023-10-20 00:19:55,741 epoch 7 - iter 26/136 - loss 0.36262574 - time (sec): 0.73 - samples/sec: 15490.68 - lr: 0.000021 - momentum: 0.000000
2023-10-20 00:19:56,106 epoch 7 - iter 39/136 - loss 0.35983565 - time (sec): 1.09 - samples/sec: 14776.19 - lr: 0.000021 - momentum: 0.000000
2023-10-20 00:19:56,472 epoch 7 - iter 52/136 - loss 0.35684906 - time (sec): 1.46 - samples/sec: 14790.86 - lr: 0.000020 - momentum: 0.000000
2023-10-20 00:19:56,817 epoch 7 - iter 65/136 - loss 0.36008654 - time (sec): 1.80 - samples/sec: 14641.85 - lr: 0.000020 - momentum: 0.000000
2023-10-20 00:19:57,175 epoch 7 - iter 78/136 - loss 0.35821688 - time (sec): 2.16 - samples/sec: 14347.23 - lr: 0.000019 - momentum: 0.000000
2023-10-20 00:19:57,524 epoch 7 - iter 91/136 - loss 0.35574197 - time (sec): 2.51 - samples/sec: 14122.79 - lr: 0.000019 - momentum: 0.000000
2023-10-20 00:19:57,884 epoch 7 - iter 104/136 - loss 0.35669877 - time (sec): 2.87 - samples/sec: 13991.03 - lr: 0.000018 - momentum: 0.000000
2023-10-20 00:19:58,242 epoch 7 - iter 117/136 - loss 0.36030247 - time (sec): 3.23 - samples/sec: 13918.38 - lr: 0.000018 - momentum: 0.000000
2023-10-20 00:19:58,593 epoch 7 - iter 130/136 - loss 0.35760863 - time (sec): 3.58 - samples/sec: 13836.56 - lr: 0.000017 - momentum: 0.000000
2023-10-20 00:19:58,742 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:58,743 EPOCH 7 done: loss 0.3553 - lr: 0.000017
2023-10-20 00:19:59,506 DEV : loss 0.27363094687461853 - f1-score (micro avg)  0.3691
2023-10-20 00:19:59,510 saving best model
2023-10-20 00:19:59,540 ----------------------------------------------------------------------------------------------------
2023-10-20 00:19:59,908 epoch 8 - iter 13/136 - loss 0.32011516 - time (sec): 0.37 - samples/sec: 13845.71 - lr: 0.000016 - momentum: 0.000000
2023-10-20 00:20:00,253 epoch 8 - iter 26/136 - loss 0.35476183 - time (sec): 0.71 - samples/sec: 12491.86 - lr: 0.000016 - momentum: 0.000000
2023-10-20 00:20:00,620 epoch 8 - iter 39/136 - loss 0.34749382 - time (sec): 1.08 - samples/sec: 12912.86 - lr: 0.000015 - momentum: 0.000000
2023-10-20 00:20:01,133 epoch 8 - iter 52/136 - loss 0.34460647 - time (sec): 1.59 - samples/sec: 11803.96 - lr: 0.000015 - momentum: 0.000000
2023-10-20 00:20:01,488 epoch 8 - iter 65/136 - loss 0.35309931 - time (sec): 1.95 - samples/sec: 12351.00 - lr: 0.000014 - momentum: 0.000000
2023-10-20 00:20:01,826 epoch 8 - iter 78/136 - loss 0.34468370 - time (sec): 2.29 - samples/sec: 12819.03 - lr: 0.000014 - momentum: 0.000000
2023-10-20 00:20:02,164 epoch 8 - iter 91/136 - loss 0.34633273 - time (sec): 2.62 - samples/sec: 13073.10 - lr: 0.000013 - momentum: 0.000000
2023-10-20 00:20:02,522 epoch 8 - iter 104/136 - loss 0.34688716 - time (sec): 2.98 - samples/sec: 13150.29 - lr: 0.000013 - momentum: 0.000000
2023-10-20 00:20:02,867 epoch 8 - iter 117/136 - loss 0.35957973 - time (sec): 3.33 - samples/sec: 13158.03 - lr: 0.000012 - momentum: 0.000000
2023-10-20 00:20:03,234 epoch 8 - iter 130/136 - loss 0.34947771 - time (sec): 3.69 - samples/sec: 13480.20 - lr: 0.000012 - momentum: 0.000000
2023-10-20 00:20:03,400 ----------------------------------------------------------------------------------------------------
2023-10-20 00:20:03,400 EPOCH 8 done: loss 0.3439 - lr: 0.000012
2023-10-20 00:20:04,170 DEV : loss 0.2809211313724518 - f1-score (micro avg)  0.3295
2023-10-20 00:20:04,173 ----------------------------------------------------------------------------------------------------
2023-10-20 00:20:04,502 epoch 9 - iter 13/136 - loss 0.28581067 - time (sec): 0.33 - samples/sec: 12511.89 - lr: 0.000011 - momentum: 0.000000
2023-10-20 00:20:04,859 epoch 9 - iter 26/136 - loss 0.31168531 - time (sec): 0.68 - samples/sec: 12900.20 - lr: 0.000010 - momentum: 0.000000
2023-10-20 00:20:05,210 epoch 9 - iter 39/136 - loss 0.31437904 - time (sec): 1.04 - samples/sec: 12554.30 - lr: 0.000010 - momentum: 0.000000
2023-10-20 00:20:05,579 epoch 9 - iter 52/136 - loss 0.33259729 - time (sec): 1.41 - samples/sec: 13770.64 - lr: 0.000009 - momentum: 0.000000
2023-10-20 00:20:05,941 epoch 9 - iter 65/136 - loss 0.34489532 - time (sec): 1.77 - samples/sec: 13841.07 - lr: 0.000009 - momentum: 0.000000
2023-10-20 00:20:06,290 epoch 9 - iter 78/136 - loss 0.34040873 - time (sec): 2.12 - samples/sec: 13635.95 - lr: 0.000008 - momentum: 0.000000
2023-10-20 00:20:06,671 epoch 9 - iter 91/136 - loss 0.33903400 - time (sec): 2.50 - samples/sec: 13604.65 - lr: 0.000008 - momentum: 0.000000
2023-10-20 00:20:07,066 epoch 9 - iter 104/136 - loss 0.33902197 - time (sec): 2.89 - samples/sec: 13807.93 - lr: 0.000007 - momentum: 0.000000
2023-10-20 00:20:07,436 epoch 9 - iter 117/136 - loss 0.33750559 - time (sec): 3.26 - samples/sec: 13641.81 - lr: 0.000007 - momentum: 0.000000
2023-10-20 00:20:07,821 epoch 9 - iter 130/136 - loss 0.33087811 - time (sec): 3.65 - samples/sec: 13492.13 - lr: 0.000006 - momentum: 0.000000
2023-10-20 00:20:08,000 ----------------------------------------------------------------------------------------------------
2023-10-20 00:20:08,000 EPOCH 9 done: loss 0.3309 - lr: 0.000006
2023-10-20 00:20:08,765 DEV : loss 0.2698618173599243 - f1-score (micro avg)  0.3812
2023-10-20 00:20:08,769 saving best model
2023-10-20 00:20:08,808 ----------------------------------------------------------------------------------------------------
2023-10-20 00:20:09,183 epoch 10 - iter 13/136 - loss 0.24667172 - time (sec): 0.37 - samples/sec: 13137.87 - lr: 0.000005 - momentum: 0.000000
2023-10-20 00:20:09,530 epoch 10 - iter 26/136 - loss 0.30568450 - time (sec): 0.72 - samples/sec: 11802.58 - lr: 0.000005 - momentum: 0.000000
2023-10-20 00:20:09,918 epoch 10 - iter 39/136 - loss 0.32803758 - time (sec): 1.11 - samples/sec: 12642.75 - lr: 0.000004 - momentum: 0.000000
2023-10-20 00:20:10,286 epoch 10 - iter 52/136 - loss 0.34218904 - time (sec): 1.48 - samples/sec: 12879.64 - lr: 0.000004 - momentum: 0.000000
2023-10-20 00:20:10,669 epoch 10 - iter 65/136 - loss 0.32639689 - time (sec): 1.86 - samples/sec: 13308.76 - lr: 0.000003 - momentum: 0.000000
2023-10-20 00:20:11,036 epoch 10 - iter 78/136 - loss 0.31420879 - time (sec): 2.23 - samples/sec: 13580.91 - lr: 0.000003 - momentum: 0.000000
2023-10-20 00:20:11,425 epoch 10 - iter 91/136 - loss 0.31330042 - time (sec): 2.62 - samples/sec: 13561.88 - lr: 0.000002 - momentum: 0.000000
2023-10-20 00:20:11,772 epoch 10 - iter 104/136 - loss 0.31188849 - time (sec): 2.96 - samples/sec: 13483.98 - lr: 0.000002 - momentum: 0.000000
2023-10-20 00:20:12,114 epoch 10 - iter 117/136 - loss 0.31942500 - time (sec): 3.31 - samples/sec: 13642.34 - lr: 0.000001 - momentum: 0.000000
2023-10-20 00:20:12,470 epoch 10 - iter 130/136 - loss 0.32810721 - time (sec): 3.66 - samples/sec: 13667.04 - lr: 0.000000 - momentum: 0.000000
2023-10-20 00:20:12,629 ----------------------------------------------------------------------------------------------------
2023-10-20 00:20:12,629 EPOCH 10 done: loss 0.3262 - lr: 0.000000
2023-10-20 00:20:13,544 DEV : loss 0.27006223797798157 - f1-score (micro avg)  0.3856
2023-10-20 00:20:13,548 saving best model
2023-10-20 00:20:13,606 ----------------------------------------------------------------------------------------------------
2023-10-20 00:20:13,606 Loading model from best epoch ...
2023-10-20 00:20:13,675 SequenceTagger predicts: Dictionary with 17 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-PER, B-PER, E-PER, I-PER, S-HumanProd, B-HumanProd, E-HumanProd, I-HumanProd, S-ORG, B-ORG, E-ORG, I-ORG
2023-10-20 00:20:14,476
Results:
- F-score (micro) 0.3363
- F-score (macro) 0.1773
- Accuracy 0.2117

By class:
              precision    recall  f1-score   support

         LOC     0.5020    0.3974    0.4436       312
         PER     0.2336    0.3077    0.2656       208
         ORG     0.0000    0.0000    0.0000        55
   HumanProd     0.0000    0.0000    0.0000        22

   micro avg     0.3608    0.3149    0.3363       597
   macro avg     0.1839    0.1763    0.1773       597
weighted avg     0.3437    0.3149    0.3244       597

2023-10-20 00:20:14,476 ----------------------------------------------------------------------------------------------------
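The `lr:` column in the per-iteration lines follows the `LinearScheduler` with `warmup_fraction: '0.1'`: the learning rate ramps linearly from 0 to the peak of 5e-05 over the first 10% of the 10 × 136 = 1360 optimizer steps, then decays linearly back to 0. A minimal sketch of that schedule, assuming this step accounting (the function is illustrative, not Flair's internal implementation):

```python
def linear_warmup_lr(step, total_steps=1360, peak_lr=5e-5, warmup_fraction=0.1):
    """Linear warmup to peak_lr, then linear decay to 0 (sketch of the schedule above)."""
    warmup_steps = int(total_steps * warmup_fraction)  # 136 steps here
    if step < warmup_steps:
        return peak_lr * step / warmup_steps           # warmup ramp
    # linear decay over the remaining 90% of steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

# Near the end of epoch 1 (step 130) this gives ~0.000048, consistent with the
# logged "epoch 1 - iter 130/136 ... lr: 0.000047"; by the last steps of
# epoch 10 it has decayed to ~0, matching "lr: 0.000000".
```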
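The summary scores in the final table are consistent with each other: the micro F-score of 0.3363 is the harmonic mean of the pooled precision (0.3608) and recall (0.3149) over all 597 gold spans, while the macro F-score of 0.1773 is the unweighted mean of the four per-class F1 values, dragged down by the zero scores on ORG and HumanProd. A quick sanity check of that arithmetic:

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Micro avg row of the table: harmonic mean of micro precision and recall.
micro_f1 = f1(0.3608, 0.3149)                       # rounds to 0.3363

# Macro avg: unweighted mean of the per-class f1-score column.
macro_f1 = (0.4436 + 0.2656 + 0.0 + 0.0) / 4        # rounds to 0.1773
```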