2023-10-18 23:15:16,265 ----------------------------------------------------------------------------------------------------
2023-10-18 23:15:16,266 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 128)
        (position_embeddings): Embedding(512, 128)
        (token_type_embeddings): Embedding(2, 128)
        (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-1): 2 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=128, out_features=128, bias=True)
                (key): Linear(in_features=128, out_features=128, bias=True)
                (value): Linear(in_features=128, out_features=128, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=128, out_features=128, bias=True)
                (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=128, out_features=512, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=512, out_features=128, bias=True)
              (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=128, out_features=128, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=128, out_features=13, bias=True)
  (loss_function): CrossEntropyLoss()
)"
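The module summary above fixes every layer shape, so the model size can be recovered by hand; the result (roughly 4.6 M parameters) is why this backbone is called "bert-tiny". A quick sketch in plain Python, with all numbers taken only from the shapes printed above (bias terms included):

```python
# Parameter counts implied by the module summary above.
def linear(n_in, n_out):
    # A Linear layer with bias has n_in * n_out weights plus n_out biases.
    return n_in * n_out + n_out

embeddings = 32001 * 128 + 512 * 128 + 2 * 128   # word + position + token-type
embeddings += 2 * 128                            # LayerNorm weight + bias

attention = 3 * linear(128, 128)                 # query, key, value
attention += linear(128, 128) + 2 * 128          # self-output dense + LayerNorm
ffn = linear(128, 512) + linear(512, 128) + 2 * 128  # intermediate + output + LayerNorm
encoder = 2 * (attention + ffn)                  # (0-1): 2 x BertLayer

pooler = linear(128, 128)
head = linear(128, 13)                           # tagger projection to 13 tags

total = embeddings + encoder + pooler + head
print(total)  # 4576909, i.e. about 4.6 M parameters
```

Note how the 32001-row word-embedding table alone accounts for roughly 90% of the parameters; the two transformer layers are comparatively small.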
2023-10-18 23:15:16,266 ----------------------------------------------------------------------------------------------------
2023-10-18 23:15:16,266 MultiCorpus: 5777 train + 722 dev + 723 test sentences
- NER_ICDAR_EUROPEANA Corpus: 5777 train + 722 dev + 723 test sentences - /root/.flair/datasets/ner_icdar_europeana/nl
2023-10-18 23:15:16,266 ----------------------------------------------------------------------------------------------------
2023-10-18 23:15:16,266 Train: 5777 sentences
2023-10-18 23:15:16,266 (train_with_dev=False, train_with_test=False)
2023-10-18 23:15:16,266 ----------------------------------------------------------------------------------------------------
2023-10-18 23:15:16,266 Training Params:
2023-10-18 23:15:16,266 - learning_rate: "5e-05"
2023-10-18 23:15:16,266 - mini_batch_size: "8"
2023-10-18 23:15:16,266 - max_epochs: "10"
2023-10-18 23:15:16,266 - shuffle: "True"
2023-10-18 23:15:16,266 ----------------------------------------------------------------------------------------------------
2023-10-18 23:15:16,266 Plugins:
2023-10-18 23:15:16,266 - TensorboardLogger
2023-10-18 23:15:16,266 - LinearScheduler | warmup_fraction: '0.1'
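The LinearScheduler plugin warms the learning rate up over the first 10% of all mini-batches and then decays it linearly to zero, which matches the lr column in the log below: it peaks at 5e-05 at the end of epoch 1 and reaches zero at the end of epoch 10. A minimal sketch of that schedule (the exact off-by-one step convention inside Flair's implementation may differ):

```python
def linear_schedule(step, total_steps, peak_lr=5e-5, warmup_fraction=0.1):
    """Linear warmup to peak_lr, then linear decay to zero."""
    warmup_steps = int(total_steps * warmup_fraction)
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

total_steps = 723 * 10  # 723 mini-batches per epoch, 10 epochs
```

With these numbers the warmup phase is exactly the 723 batches of epoch 1, consistent with the lr values logged during that epoch.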
2023-10-18 23:15:16,266 ----------------------------------------------------------------------------------------------------
2023-10-18 23:15:16,266 Final evaluation on model from best epoch (best-model.pt)
2023-10-18 23:15:16,266 - metric: "('micro avg', 'f1-score')"
2023-10-18 23:15:16,267 ----------------------------------------------------------------------------------------------------
2023-10-18 23:15:16,267 Computation:
2023-10-18 23:15:16,267 - compute on device: cuda:0
2023-10-18 23:15:16,267 - embedding storage: none
2023-10-18 23:15:16,267 ----------------------------------------------------------------------------------------------------
2023-10-18 23:15:16,267 Model training base path: "hmbench-icdar/nl-dbmdz/bert-tiny-historic-multilingual-cased-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5"
2023-10-18 23:15:16,267 ----------------------------------------------------------------------------------------------------
2023-10-18 23:15:16,267 ----------------------------------------------------------------------------------------------------
2023-10-18 23:15:16,267 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-18 23:15:18,125 epoch 1 - iter 72/723 - loss 2.40908332 - time (sec): 1.86 - samples/sec: 9509.35 - lr: 0.000005 - momentum: 0.000000
2023-10-18 23:15:19,840 epoch 1 - iter 144/723 - loss 2.18163332 - time (sec): 3.57 - samples/sec: 9437.85 - lr: 0.000010 - momentum: 0.000000
2023-10-18 23:15:21,665 epoch 1 - iter 216/723 - loss 1.79762825 - time (sec): 5.40 - samples/sec: 9717.50 - lr: 0.000015 - momentum: 0.000000
2023-10-18 23:15:23,570 epoch 1 - iter 288/723 - loss 1.47395731 - time (sec): 7.30 - samples/sec: 9671.73 - lr: 0.000020 - momentum: 0.000000
2023-10-18 23:15:25,411 epoch 1 - iter 360/723 - loss 1.24759954 - time (sec): 9.14 - samples/sec: 9756.94 - lr: 0.000025 - momentum: 0.000000
2023-10-18 23:15:27,208 epoch 1 - iter 432/723 - loss 1.11186962 - time (sec): 10.94 - samples/sec: 9610.13 - lr: 0.000030 - momentum: 0.000000
2023-10-18 23:15:28,981 epoch 1 - iter 504/723 - loss 0.99525058 - time (sec): 12.71 - samples/sec: 9687.61 - lr: 0.000035 - momentum: 0.000000
2023-10-18 23:15:30,762 epoch 1 - iter 576/723 - loss 0.89980120 - time (sec): 14.50 - samples/sec: 9757.34 - lr: 0.000040 - momentum: 0.000000
2023-10-18 23:15:32,561 epoch 1 - iter 648/723 - loss 0.83006529 - time (sec): 16.29 - samples/sec: 9756.67 - lr: 0.000045 - momentum: 0.000000
2023-10-18 23:15:34,301 epoch 1 - iter 720/723 - loss 0.77620201 - time (sec): 18.03 - samples/sec: 9734.29 - lr: 0.000050 - momentum: 0.000000
2023-10-18 23:15:34,369 ----------------------------------------------------------------------------------------------------
2023-10-18 23:15:34,369 EPOCH 1 done: loss 0.7741 - lr: 0.000050
2023-10-18 23:15:35,657 DEV : loss 0.2758811414241791 - f1-score (micro avg) 0.0082
2023-10-18 23:15:35,671 saving best model
2023-10-18 23:15:35,701 ----------------------------------------------------------------------------------------------------
2023-10-18 23:15:37,598 epoch 2 - iter 72/723 - loss 0.25857317 - time (sec): 1.90 - samples/sec: 9119.75 - lr: 0.000049 - momentum: 0.000000
2023-10-18 23:15:39,477 epoch 2 - iter 144/723 - loss 0.23888287 - time (sec): 3.78 - samples/sec: 9439.22 - lr: 0.000049 - momentum: 0.000000
2023-10-18 23:15:41,200 epoch 2 - iter 216/723 - loss 0.23446606 - time (sec): 5.50 - samples/sec: 9413.80 - lr: 0.000048 - momentum: 0.000000
2023-10-18 23:15:42,964 epoch 2 - iter 288/723 - loss 0.22142250 - time (sec): 7.26 - samples/sec: 9657.27 - lr: 0.000048 - momentum: 0.000000
2023-10-18 23:15:44,709 epoch 2 - iter 360/723 - loss 0.21787192 - time (sec): 9.01 - samples/sec: 9648.55 - lr: 0.000047 - momentum: 0.000000
2023-10-18 23:15:46,447 epoch 2 - iter 432/723 - loss 0.21454913 - time (sec): 10.75 - samples/sec: 9633.59 - lr: 0.000047 - momentum: 0.000000
2023-10-18 23:15:48,218 epoch 2 - iter 504/723 - loss 0.21228027 - time (sec): 12.52 - samples/sec: 9686.73 - lr: 0.000046 - momentum: 0.000000
2023-10-18 23:15:50,039 epoch 2 - iter 576/723 - loss 0.21139498 - time (sec): 14.34 - samples/sec: 9809.15 - lr: 0.000046 - momentum: 0.000000
2023-10-18 23:15:51,821 epoch 2 - iter 648/723 - loss 0.20851881 - time (sec): 16.12 - samples/sec: 9824.09 - lr: 0.000045 - momentum: 0.000000
2023-10-18 23:15:53,645 epoch 2 - iter 720/723 - loss 0.20557912 - time (sec): 17.94 - samples/sec: 9786.99 - lr: 0.000044 - momentum: 0.000000
2023-10-18 23:15:53,709 ----------------------------------------------------------------------------------------------------
2023-10-18 23:15:53,710 EPOCH 2 done: loss 0.2053 - lr: 0.000044
2023-10-18 23:15:55,808 DEV : loss 0.2405446618795395 - f1-score (micro avg) 0.1862
2023-10-18 23:15:55,822 saving best model
2023-10-18 23:15:55,858 ----------------------------------------------------------------------------------------------------
2023-10-18 23:15:57,614 epoch 3 - iter 72/723 - loss 0.17701188 - time (sec): 1.76 - samples/sec: 10573.26 - lr: 0.000044 - momentum: 0.000000
2023-10-18 23:15:59,375 epoch 3 - iter 144/723 - loss 0.18046644 - time (sec): 3.52 - samples/sec: 10214.55 - lr: 0.000043 - momentum: 0.000000
2023-10-18 23:16:01,273 epoch 3 - iter 216/723 - loss 0.17924447 - time (sec): 5.41 - samples/sec: 10016.01 - lr: 0.000043 - momentum: 0.000000
2023-10-18 23:16:03,056 epoch 3 - iter 288/723 - loss 0.17534354 - time (sec): 7.20 - samples/sec: 9979.60 - lr: 0.000042 - momentum: 0.000000
2023-10-18 23:16:04,795 epoch 3 - iter 360/723 - loss 0.17549661 - time (sec): 8.94 - samples/sec: 9890.73 - lr: 0.000042 - momentum: 0.000000
2023-10-18 23:16:06,577 epoch 3 - iter 432/723 - loss 0.17806596 - time (sec): 10.72 - samples/sec: 9864.63 - lr: 0.000041 - momentum: 0.000000
2023-10-18 23:16:08,395 epoch 3 - iter 504/723 - loss 0.17649226 - time (sec): 12.54 - samples/sec: 9759.78 - lr: 0.000041 - momentum: 0.000000
2023-10-18 23:16:10,276 epoch 3 - iter 576/723 - loss 0.17718147 - time (sec): 14.42 - samples/sec: 9766.82 - lr: 0.000040 - momentum: 0.000000
2023-10-18 23:16:12,017 epoch 3 - iter 648/723 - loss 0.17435274 - time (sec): 16.16 - samples/sec: 9788.62 - lr: 0.000039 - momentum: 0.000000
2023-10-18 23:16:13,753 epoch 3 - iter 720/723 - loss 0.17453446 - time (sec): 17.89 - samples/sec: 9819.40 - lr: 0.000039 - momentum: 0.000000
2023-10-18 23:16:13,814 ----------------------------------------------------------------------------------------------------
2023-10-18 23:16:13,814 EPOCH 3 done: loss 0.1744 - lr: 0.000039
2023-10-18 23:16:15,580 DEV : loss 0.19726891815662384 - f1-score (micro avg) 0.4418
2023-10-18 23:16:15,595 saving best model
2023-10-18 23:16:15,630 ----------------------------------------------------------------------------------------------------
2023-10-18 23:16:17,465 epoch 4 - iter 72/723 - loss 0.19970285 - time (sec): 1.83 - samples/sec: 9313.18 - lr: 0.000038 - momentum: 0.000000
2023-10-18 23:16:19,232 epoch 4 - iter 144/723 - loss 0.16333378 - time (sec): 3.60 - samples/sec: 9634.99 - lr: 0.000038 - momentum: 0.000000
2023-10-18 23:16:21,027 epoch 4 - iter 216/723 - loss 0.16709855 - time (sec): 5.40 - samples/sec: 9499.46 - lr: 0.000037 - momentum: 0.000000
2023-10-18 23:16:22,919 epoch 4 - iter 288/723 - loss 0.16278167 - time (sec): 7.29 - samples/sec: 9586.69 - lr: 0.000037 - momentum: 0.000000
2023-10-18 23:16:24,700 epoch 4 - iter 360/723 - loss 0.16462834 - time (sec): 9.07 - samples/sec: 9779.30 - lr: 0.000036 - momentum: 0.000000
2023-10-18 23:16:26,498 epoch 4 - iter 432/723 - loss 0.16054808 - time (sec): 10.87 - samples/sec: 9870.03 - lr: 0.000036 - momentum: 0.000000
2023-10-18 23:16:28,289 epoch 4 - iter 504/723 - loss 0.15875214 - time (sec): 12.66 - samples/sec: 9784.37 - lr: 0.000035 - momentum: 0.000000
2023-10-18 23:16:30,002 epoch 4 - iter 576/723 - loss 0.15882572 - time (sec): 14.37 - samples/sec: 9832.90 - lr: 0.000034 - momentum: 0.000000
2023-10-18 23:16:32,220 epoch 4 - iter 648/723 - loss 0.15908778 - time (sec): 16.59 - samples/sec: 9607.35 - lr: 0.000034 - momentum: 0.000000
2023-10-18 23:16:33,961 epoch 4 - iter 720/723 - loss 0.15739102 - time (sec): 18.33 - samples/sec: 9590.24 - lr: 0.000033 - momentum: 0.000000
2023-10-18 23:16:34,026 ----------------------------------------------------------------------------------------------------
2023-10-18 23:16:34,026 EPOCH 4 done: loss 0.1573 - lr: 0.000033
2023-10-18 23:16:35,788 DEV : loss 0.18398496508598328 - f1-score (micro avg) 0.4749
2023-10-18 23:16:35,803 saving best model
2023-10-18 23:16:35,838 ----------------------------------------------------------------------------------------------------
2023-10-18 23:16:37,539 epoch 5 - iter 72/723 - loss 0.16602905 - time (sec): 1.70 - samples/sec: 9888.68 - lr: 0.000033 - momentum: 0.000000
2023-10-18 23:16:39,324 epoch 5 - iter 144/723 - loss 0.15855908 - time (sec): 3.49 - samples/sec: 9666.77 - lr: 0.000032 - momentum: 0.000000
2023-10-18 23:16:41,069 epoch 5 - iter 216/723 - loss 0.15565325 - time (sec): 5.23 - samples/sec: 9589.97 - lr: 0.000032 - momentum: 0.000000
2023-10-18 23:16:42,855 epoch 5 - iter 288/723 - loss 0.15137039 - time (sec): 7.02 - samples/sec: 9638.72 - lr: 0.000031 - momentum: 0.000000
2023-10-18 23:16:44,673 epoch 5 - iter 360/723 - loss 0.15306731 - time (sec): 8.84 - samples/sec: 9834.34 - lr: 0.000031 - momentum: 0.000000
2023-10-18 23:16:46,518 epoch 5 - iter 432/723 - loss 0.14937650 - time (sec): 10.68 - samples/sec: 9838.86 - lr: 0.000030 - momentum: 0.000000
2023-10-18 23:16:48,257 epoch 5 - iter 504/723 - loss 0.14562850 - time (sec): 12.42 - samples/sec: 9845.97 - lr: 0.000029 - momentum: 0.000000
2023-10-18 23:16:49,944 epoch 5 - iter 576/723 - loss 0.14633530 - time (sec): 14.11 - samples/sec: 9882.96 - lr: 0.000029 - momentum: 0.000000
2023-10-18 23:16:51,751 epoch 5 - iter 648/723 - loss 0.14812330 - time (sec): 15.91 - samples/sec: 9847.09 - lr: 0.000028 - momentum: 0.000000
2023-10-18 23:16:53,692 epoch 5 - iter 720/723 - loss 0.14451958 - time (sec): 17.85 - samples/sec: 9841.50 - lr: 0.000028 - momentum: 0.000000
2023-10-18 23:16:53,748 ----------------------------------------------------------------------------------------------------
2023-10-18 23:16:53,748 EPOCH 5 done: loss 0.1448 - lr: 0.000028
2023-10-18 23:16:55,530 DEV : loss 0.1783648133277893 - f1-score (micro avg) 0.4941
2023-10-18 23:16:55,544 saving best model
2023-10-18 23:16:55,580 ----------------------------------------------------------------------------------------------------
2023-10-18 23:16:57,339 epoch 6 - iter 72/723 - loss 0.12412838 - time (sec): 1.76 - samples/sec: 10153.63 - lr: 0.000027 - momentum: 0.000000
2023-10-18 23:16:59,153 epoch 6 - iter 144/723 - loss 0.12845739 - time (sec): 3.57 - samples/sec: 10111.64 - lr: 0.000027 - momentum: 0.000000
2023-10-18 23:17:01,000 epoch 6 - iter 216/723 - loss 0.13059131 - time (sec): 5.42 - samples/sec: 10079.33 - lr: 0.000026 - momentum: 0.000000
2023-10-18 23:17:02,729 epoch 6 - iter 288/723 - loss 0.13011154 - time (sec): 7.15 - samples/sec: 10062.07 - lr: 0.000026 - momentum: 0.000000
2023-10-18 23:17:04,837 epoch 6 - iter 360/723 - loss 0.13108817 - time (sec): 9.26 - samples/sec: 9646.45 - lr: 0.000025 - momentum: 0.000000
2023-10-18 23:17:06,702 epoch 6 - iter 432/723 - loss 0.13199062 - time (sec): 11.12 - samples/sec: 9684.01 - lr: 0.000024 - momentum: 0.000000
2023-10-18 23:17:08,534 epoch 6 - iter 504/723 - loss 0.13635868 - time (sec): 12.95 - samples/sec: 9661.56 - lr: 0.000024 - momentum: 0.000000
2023-10-18 23:17:10,365 epoch 6 - iter 576/723 - loss 0.13682865 - time (sec): 14.78 - samples/sec: 9566.59 - lr: 0.000023 - momentum: 0.000000
2023-10-18 23:17:12,189 epoch 6 - iter 648/723 - loss 0.13499612 - time (sec): 16.61 - samples/sec: 9533.05 - lr: 0.000023 - momentum: 0.000000
2023-10-18 23:17:13,891 epoch 6 - iter 720/723 - loss 0.13337880 - time (sec): 18.31 - samples/sec: 9592.34 - lr: 0.000022 - momentum: 0.000000
2023-10-18 23:17:13,955 ----------------------------------------------------------------------------------------------------
2023-10-18 23:17:13,955 EPOCH 6 done: loss 0.1334 - lr: 0.000022
2023-10-18 23:17:15,731 DEV : loss 0.18174949288368225 - f1-score (micro avg) 0.4952
2023-10-18 23:17:15,745 saving best model
2023-10-18 23:17:15,781 ----------------------------------------------------------------------------------------------------
2023-10-18 23:17:17,561 epoch 7 - iter 72/723 - loss 0.11870149 - time (sec): 1.78 - samples/sec: 9897.22 - lr: 0.000022 - momentum: 0.000000
2023-10-18 23:17:19,339 epoch 7 - iter 144/723 - loss 0.12495624 - time (sec): 3.56 - samples/sec: 9460.86 - lr: 0.000021 - momentum: 0.000000
2023-10-18 23:17:21,174 epoch 7 - iter 216/723 - loss 0.12415400 - time (sec): 5.39 - samples/sec: 9681.31 - lr: 0.000021 - momentum: 0.000000
2023-10-18 23:17:23,015 epoch 7 - iter 288/723 - loss 0.12335361 - time (sec): 7.23 - samples/sec: 9529.01 - lr: 0.000020 - momentum: 0.000000
2023-10-18 23:17:24,834 epoch 7 - iter 360/723 - loss 0.12487733 - time (sec): 9.05 - samples/sec: 9600.12 - lr: 0.000019 - momentum: 0.000000
2023-10-18 23:17:26,642 epoch 7 - iter 432/723 - loss 0.12677615 - time (sec): 10.86 - samples/sec: 9665.79 - lr: 0.000019 - momentum: 0.000000
2023-10-18 23:17:28,481 epoch 7 - iter 504/723 - loss 0.12846292 - time (sec): 12.70 - samples/sec: 9621.48 - lr: 0.000018 - momentum: 0.000000
2023-10-18 23:17:30,414 epoch 7 - iter 576/723 - loss 0.12813188 - time (sec): 14.63 - samples/sec: 9660.86 - lr: 0.000018 - momentum: 0.000000
2023-10-18 23:17:32,221 epoch 7 - iter 648/723 - loss 0.12928188 - time (sec): 16.44 - samples/sec: 9623.10 - lr: 0.000017 - momentum: 0.000000
2023-10-18 23:17:33,963 epoch 7 - iter 720/723 - loss 0.12811456 - time (sec): 18.18 - samples/sec: 9668.44 - lr: 0.000017 - momentum: 0.000000
2023-10-18 23:17:34,024 ----------------------------------------------------------------------------------------------------
2023-10-18 23:17:34,024 EPOCH 7 done: loss 0.1281 - lr: 0.000017
2023-10-18 23:17:35,808 DEV : loss 0.1717141568660736 - f1-score (micro avg) 0.5346
2023-10-18 23:17:35,823 saving best model
2023-10-18 23:17:35,858 ----------------------------------------------------------------------------------------------------
2023-10-18 23:17:38,073 epoch 8 - iter 72/723 - loss 0.12299219 - time (sec): 2.21 - samples/sec: 8057.14 - lr: 0.000016 - momentum: 0.000000
2023-10-18 23:17:39,830 epoch 8 - iter 144/723 - loss 0.11754273 - time (sec): 3.97 - samples/sec: 8740.64 - lr: 0.000016 - momentum: 0.000000
2023-10-18 23:17:41,690 epoch 8 - iter 216/723 - loss 0.11975555 - time (sec): 5.83 - samples/sec: 9103.15 - lr: 0.000015 - momentum: 0.000000
2023-10-18 23:17:43,614 epoch 8 - iter 288/723 - loss 0.11991985 - time (sec): 7.75 - samples/sec: 9207.08 - lr: 0.000014 - momentum: 0.000000
2023-10-18 23:17:45,415 epoch 8 - iter 360/723 - loss 0.12012028 - time (sec): 9.56 - samples/sec: 9293.87 - lr: 0.000014 - momentum: 0.000000
2023-10-18 23:17:47,181 epoch 8 - iter 432/723 - loss 0.12323901 - time (sec): 11.32 - samples/sec: 9411.71 - lr: 0.000013 - momentum: 0.000000
2023-10-18 23:17:48,997 epoch 8 - iter 504/723 - loss 0.12390650 - time (sec): 13.14 - samples/sec: 9515.34 - lr: 0.000013 - momentum: 0.000000
2023-10-18 23:17:50,731 epoch 8 - iter 576/723 - loss 0.12261161 - time (sec): 14.87 - samples/sec: 9531.16 - lr: 0.000012 - momentum: 0.000000
2023-10-18 23:17:52,540 epoch 8 - iter 648/723 - loss 0.12309668 - time (sec): 16.68 - samples/sec: 9526.96 - lr: 0.000012 - momentum: 0.000000
2023-10-18 23:17:54,314 epoch 8 - iter 720/723 - loss 0.12212658 - time (sec): 18.46 - samples/sec: 9518.50 - lr: 0.000011 - momentum: 0.000000
2023-10-18 23:17:54,372 ----------------------------------------------------------------------------------------------------
2023-10-18 23:17:54,372 EPOCH 8 done: loss 0.1221 - lr: 0.000011
2023-10-18 23:17:56,154 DEV : loss 0.17939499020576477 - f1-score (micro avg) 0.5277
2023-10-18 23:17:56,168 ----------------------------------------------------------------------------------------------------
2023-10-18 23:17:57,932 epoch 9 - iter 72/723 - loss 0.10482754 - time (sec): 1.76 - samples/sec: 10587.13 - lr: 0.000011 - momentum: 0.000000
2023-10-18 23:17:59,777 epoch 9 - iter 144/723 - loss 0.11941581 - time (sec): 3.61 - samples/sec: 10252.77 - lr: 0.000010 - momentum: 0.000000
2023-10-18 23:18:01,519 epoch 9 - iter 216/723 - loss 0.11428472 - time (sec): 5.35 - samples/sec: 10270.88 - lr: 0.000009 - momentum: 0.000000
2023-10-18 23:18:03,294 epoch 9 - iter 288/723 - loss 0.11765545 - time (sec): 7.13 - samples/sec: 10286.21 - lr: 0.000009 - momentum: 0.000000
2023-10-18 23:18:05,075 epoch 9 - iter 360/723 - loss 0.12157267 - time (sec): 8.91 - samples/sec: 10049.91 - lr: 0.000008 - momentum: 0.000000
2023-10-18 23:18:06,761 epoch 9 - iter 432/723 - loss 0.11936636 - time (sec): 10.59 - samples/sec: 10058.03 - lr: 0.000008 - momentum: 0.000000
2023-10-18 23:18:08,276 epoch 9 - iter 504/723 - loss 0.11873180 - time (sec): 12.11 - samples/sec: 10211.95 - lr: 0.000007 - momentum: 0.000000
2023-10-18 23:18:09,884 epoch 9 - iter 576/723 - loss 0.11689872 - time (sec): 13.71 - samples/sec: 10369.22 - lr: 0.000007 - momentum: 0.000000
2023-10-18 23:18:11,685 epoch 9 - iter 648/723 - loss 0.11799321 - time (sec): 15.52 - samples/sec: 10271.81 - lr: 0.000006 - momentum: 0.000000
2023-10-18 23:18:13,436 epoch 9 - iter 720/723 - loss 0.11948734 - time (sec): 17.27 - samples/sec: 10181.80 - lr: 0.000006 - momentum: 0.000000
2023-10-18 23:18:13,495 ----------------------------------------------------------------------------------------------------
2023-10-18 23:18:13,495 EPOCH 9 done: loss 0.1197 - lr: 0.000006
2023-10-18 23:18:15,639 DEV : loss 0.17294184863567352 - f1-score (micro avg) 0.5292
2023-10-18 23:18:15,655 ----------------------------------------------------------------------------------------------------
2023-10-18 23:18:17,542 epoch 10 - iter 72/723 - loss 0.12361351 - time (sec): 1.89 - samples/sec: 9438.66 - lr: 0.000005 - momentum: 0.000000
2023-10-18 23:18:19,410 epoch 10 - iter 144/723 - loss 0.11590017 - time (sec): 3.76 - samples/sec: 9446.32 - lr: 0.000004 - momentum: 0.000000
2023-10-18 23:18:21,248 epoch 10 - iter 216/723 - loss 0.11549287 - time (sec): 5.59 - samples/sec: 9439.46 - lr: 0.000004 - momentum: 0.000000
2023-10-18 23:18:23,070 epoch 10 - iter 288/723 - loss 0.11632240 - time (sec): 7.41 - samples/sec: 9349.45 - lr: 0.000003 - momentum: 0.000000
2023-10-18 23:18:24,972 epoch 10 - iter 360/723 - loss 0.12476098 - time (sec): 9.32 - samples/sec: 9447.80 - lr: 0.000003 - momentum: 0.000000
2023-10-18 23:18:26,863 epoch 10 - iter 432/723 - loss 0.12139655 - time (sec): 11.21 - samples/sec: 9515.38 - lr: 0.000002 - momentum: 0.000000
2023-10-18 23:18:28,705 epoch 10 - iter 504/723 - loss 0.12203358 - time (sec): 13.05 - samples/sec: 9588.11 - lr: 0.000002 - momentum: 0.000000
2023-10-18 23:18:30,467 epoch 10 - iter 576/723 - loss 0.12185105 - time (sec): 14.81 - samples/sec: 9508.37 - lr: 0.000001 - momentum: 0.000000
2023-10-18 23:18:32,221 epoch 10 - iter 648/723 - loss 0.12023157 - time (sec): 16.57 - samples/sec: 9540.08 - lr: 0.000001 - momentum: 0.000000
2023-10-18 23:18:34,016 epoch 10 - iter 720/723 - loss 0.11895430 - time (sec): 18.36 - samples/sec: 9573.31 - lr: 0.000000 - momentum: 0.000000
2023-10-18 23:18:34,073 ----------------------------------------------------------------------------------------------------
2023-10-18 23:18:34,073 EPOCH 10 done: loss 0.1190 - lr: 0.000000
2023-10-18 23:18:35,874 DEV : loss 0.1752191036939621 - f1-score (micro avg) 0.5318
2023-10-18 23:18:35,921 ----------------------------------------------------------------------------------------------------
2023-10-18 23:18:35,921 Loading model from best epoch ...
2023-10-18 23:18:36,002 SequenceTagger predicts: Dictionary with 13 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-PER, B-PER, E-PER, I-PER, S-ORG, B-ORG, E-ORG, I-ORG
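The 13 tags form a BIOES scheme (Single, Begin, Inside, End, plus O) over the three entity types LOC, PER and ORG. A minimal sketch of how such a tag sequence decodes into entity spans; Flair's own decoder is more tolerant of malformed sequences, while this greedy version simply drops them:

```python
def decode_bioes(tags):
    """Greedily decode BIOES tags into (start, end_exclusive, label) spans."""
    spans, start, label = [], None, None
    for i, tag in enumerate(tags):
        if tag == "O":
            start, label = None, None
            continue
        prefix, lab = tag.split("-", 1)
        if prefix == "S":                 # single-token entity
            spans.append((i, i + 1, lab))
            start, label = None, None
        elif prefix == "B":               # entity start
            start, label = i, lab
        elif prefix == "E" and start is not None and lab == label:
            spans.append((start, i + 1, lab))  # entity end: emit span
            start, label = None, None
        elif prefix == "I" and lab == label:
            continue                      # entity continues
        else:                             # malformed sequence: reset
            start, label = None, None
    return spans
```

For example, `["S-LOC", "O", "B-PER", "I-PER", "E-PER"]` decodes to a one-token LOC span and a three-token PER span.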
2023-10-18 23:18:37,353
Results:
- F-score (micro) 0.5439
- F-score (macro) 0.3773
- Accuracy 0.3802
By class:
              precision    recall  f1-score   support

         PER     0.5046    0.4564    0.4793       482
         LOC     0.6705    0.6354    0.6525       458
         ORG     0.0000    0.0000    0.0000        69

   micro avg     0.5874    0.5064    0.5439      1009
   macro avg     0.3917    0.3639    0.3773      1009
weighted avg     0.5454    0.5064    0.5251      1009
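The three averages in the table follow directly from the per-class rows: the macro average weights every class equally (which is why the empty ORG predictions pull it down to 0.3773), the weighted average weights by support, and micro-F1 is the harmonic mean of the micro-averaged precision and recall. A quick check in plain Python using the numbers above:

```python
# Per-class f1-scores and supports, copied from the table above
f1 = {"PER": 0.4793, "LOC": 0.6525, "ORG": 0.0000}
support = {"PER": 482, "LOC": 458, "ORG": 69}

macro_f1 = sum(f1.values()) / len(f1)                                      # 0.3773
weighted_f1 = sum(f1[c] * support[c] for c in f1) / sum(support.values())  # 0.5251

p, r = 0.5874, 0.5064           # micro-averaged precision and recall
micro_f1 = 2 * p * r / (p + r)  # 0.5439
```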
2023-10-18 23:18:37,353 ----------------------------------------------------------------------------------------------------