flair-icdar-nl / training.log
2023-10-25 03:05:38,651 ----------------------------------------------------------------------------------------------------
2023-10-25 03:05:38,652 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(64001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=13, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-25 03:05:38,652 ----------------------------------------------------------------------------------------------------
2023-10-25 03:05:38,652 MultiCorpus: 5777 train + 722 dev + 723 test sentences
- NER_ICDAR_EUROPEANA Corpus: 5777 train + 722 dev + 723 test sentences - /home/ubuntu/.flair/datasets/ner_icdar_europeana/nl
2023-10-25 03:05:38,652 ----------------------------------------------------------------------------------------------------
2023-10-25 03:05:38,652 Train: 5777 sentences
2023-10-25 03:05:38,652 (train_with_dev=False, train_with_test=False)
2023-10-25 03:05:38,652 ----------------------------------------------------------------------------------------------------
2023-10-25 03:05:38,652 Training Params:
2023-10-25 03:05:38,652 - learning_rate: "3e-05"
2023-10-25 03:05:38,652 - mini_batch_size: "8"
2023-10-25 03:05:38,652 - max_epochs: "10"
2023-10-25 03:05:38,652 - shuffle: "True"
2023-10-25 03:05:38,652 ----------------------------------------------------------------------------------------------------
2023-10-25 03:05:38,652 Plugins:
2023-10-25 03:05:38,652 - TensorboardLogger
2023-10-25 03:05:38,652 - LinearScheduler | warmup_fraction: '0.1'
2023-10-25 03:05:38,652 ----------------------------------------------------------------------------------------------------
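The learning-rate column in the iteration lines below follows the LinearScheduler plugin: a linear warmup over the first 10% of all steps (warmup_fraction 0.1) up to the peak of 3e-05, then linear decay to zero. A minimal sketch of that schedule, assuming 723 steps per epoch over 10 epochs as in this run (this is an illustration of the shape, not Flair's actual implementation):

```python
# Linear warmup/decay schedule implied by the logged lr values.
peak_lr = 3e-5
steps_per_epoch, epochs = 723, 10
total = steps_per_epoch * epochs        # 7230 optimizer steps
warmup = int(0.1 * total)               # 723 warmup steps (warmup_fraction=0.1)

def lr_at(step: int) -> float:
    if step < warmup:
        # linear warmup from 0 to peak_lr
        return peak_lr * step / warmup
    # linear decay from peak_lr down to 0 at the final step
    return peak_lr * (total - step) / (total - warmup)

# matches the log: epoch 1 iter 72 -> ~0.000003, epoch 2 iter 144 -> ~0.000029
print(lr_at(72), lr_at(steps_per_epoch + 144))
```

This reproduces the ramp from 0.000003 to 0.000030 across epoch 1 and the slow decay visible in later epochs.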
2023-10-25 03:05:38,652 Final evaluation on model from best epoch (best-model.pt)
2023-10-25 03:05:38,652 - metric: "('micro avg', 'f1-score')"
2023-10-25 03:05:38,652 ----------------------------------------------------------------------------------------------------
2023-10-25 03:05:38,652 Computation:
2023-10-25 03:05:38,653 - compute on device: cuda:0
2023-10-25 03:05:38,653 - embedding storage: none
2023-10-25 03:05:38,653 ----------------------------------------------------------------------------------------------------
2023-10-25 03:05:38,653 Model training base path: "hmbench-icdar/nl-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5"
2023-10-25 03:05:38,653 ----------------------------------------------------------------------------------------------------
2023-10-25 03:05:38,653 ----------------------------------------------------------------------------------------------------
2023-10-25 03:05:38,653 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-25 03:05:47,361 epoch 1 - iter 72/723 - loss 1.99617329 - time (sec): 8.71 - samples/sec: 1924.42 - lr: 0.000003 - momentum: 0.000000
2023-10-25 03:05:55,310 epoch 1 - iter 144/723 - loss 1.16280577 - time (sec): 16.66 - samples/sec: 1978.56 - lr: 0.000006 - momentum: 0.000000
2023-10-25 03:06:03,657 epoch 1 - iter 216/723 - loss 0.84060964 - time (sec): 25.00 - samples/sec: 2007.83 - lr: 0.000009 - momentum: 0.000000
2023-10-25 03:06:12,528 epoch 1 - iter 288/723 - loss 0.66558228 - time (sec): 33.87 - samples/sec: 2021.42 - lr: 0.000012 - momentum: 0.000000
2023-10-25 03:06:20,701 epoch 1 - iter 360/723 - loss 0.56766766 - time (sec): 42.05 - samples/sec: 2024.88 - lr: 0.000015 - momentum: 0.000000
2023-10-25 03:06:29,450 epoch 1 - iter 432/723 - loss 0.49389313 - time (sec): 50.80 - samples/sec: 2042.64 - lr: 0.000018 - momentum: 0.000000
2023-10-25 03:06:38,106 epoch 1 - iter 504/723 - loss 0.44520564 - time (sec): 59.45 - samples/sec: 2046.47 - lr: 0.000021 - momentum: 0.000000
2023-10-25 03:06:46,545 epoch 1 - iter 576/723 - loss 0.40819214 - time (sec): 67.89 - samples/sec: 2045.50 - lr: 0.000024 - momentum: 0.000000
2023-10-25 03:06:55,306 epoch 1 - iter 648/723 - loss 0.37477357 - time (sec): 76.65 - samples/sec: 2054.01 - lr: 0.000027 - momentum: 0.000000
2023-10-25 03:07:04,267 epoch 1 - iter 720/723 - loss 0.35059174 - time (sec): 85.61 - samples/sec: 2050.77 - lr: 0.000030 - momentum: 0.000000
2023-10-25 03:07:04,591 ----------------------------------------------------------------------------------------------------
2023-10-25 03:07:04,591 EPOCH 1 done: loss 0.3498 - lr: 0.000030
2023-10-25 03:07:07,856 DEV : loss 0.12723077833652496 - f1-score (micro avg) 0.605
2023-10-25 03:07:07,867 saving best model
2023-10-25 03:07:08,331 ----------------------------------------------------------------------------------------------------
2023-10-25 03:07:16,780 epoch 2 - iter 72/723 - loss 0.12378288 - time (sec): 8.45 - samples/sec: 2025.34 - lr: 0.000030 - momentum: 0.000000
2023-10-25 03:07:25,019 epoch 2 - iter 144/723 - loss 0.10929827 - time (sec): 16.69 - samples/sec: 2057.29 - lr: 0.000029 - momentum: 0.000000
2023-10-25 03:07:33,376 epoch 2 - iter 216/723 - loss 0.10475478 - time (sec): 25.04 - samples/sec: 2064.54 - lr: 0.000029 - momentum: 0.000000
2023-10-25 03:07:41,993 epoch 2 - iter 288/723 - loss 0.10331239 - time (sec): 33.66 - samples/sec: 2056.30 - lr: 0.000029 - momentum: 0.000000
2023-10-25 03:07:50,632 epoch 2 - iter 360/723 - loss 0.09975351 - time (sec): 42.30 - samples/sec: 2045.34 - lr: 0.000028 - momentum: 0.000000
2023-10-25 03:07:59,130 epoch 2 - iter 432/723 - loss 0.09772803 - time (sec): 50.80 - samples/sec: 2043.69 - lr: 0.000028 - momentum: 0.000000
2023-10-25 03:08:07,642 epoch 2 - iter 504/723 - loss 0.09812027 - time (sec): 59.31 - samples/sec: 2041.26 - lr: 0.000028 - momentum: 0.000000
2023-10-25 03:08:16,048 epoch 2 - iter 576/723 - loss 0.09695368 - time (sec): 67.72 - samples/sec: 2044.92 - lr: 0.000027 - momentum: 0.000000
2023-10-25 03:08:25,214 epoch 2 - iter 648/723 - loss 0.09757941 - time (sec): 76.88 - samples/sec: 2041.28 - lr: 0.000027 - momentum: 0.000000
2023-10-25 03:08:34,573 epoch 2 - iter 720/723 - loss 0.09551261 - time (sec): 86.24 - samples/sec: 2036.63 - lr: 0.000027 - momentum: 0.000000
2023-10-25 03:08:34,936 ----------------------------------------------------------------------------------------------------
2023-10-25 03:08:34,936 EPOCH 2 done: loss 0.0956 - lr: 0.000027
2023-10-25 03:08:38,643 DEV : loss 0.08566790819168091 - f1-score (micro avg) 0.7745
2023-10-25 03:08:38,655 saving best model
2023-10-25 03:08:39,247 ----------------------------------------------------------------------------------------------------
2023-10-25 03:08:47,889 epoch 3 - iter 72/723 - loss 0.05916132 - time (sec): 8.64 - samples/sec: 1981.91 - lr: 0.000026 - momentum: 0.000000
2023-10-25 03:08:56,374 epoch 3 - iter 144/723 - loss 0.06201111 - time (sec): 17.13 - samples/sec: 2019.26 - lr: 0.000026 - momentum: 0.000000
2023-10-25 03:09:05,644 epoch 3 - iter 216/723 - loss 0.05888965 - time (sec): 26.40 - samples/sec: 2030.27 - lr: 0.000026 - momentum: 0.000000
2023-10-25 03:09:14,319 epoch 3 - iter 288/723 - loss 0.05851997 - time (sec): 35.07 - samples/sec: 2041.19 - lr: 0.000025 - momentum: 0.000000
2023-10-25 03:09:22,895 epoch 3 - iter 360/723 - loss 0.05764096 - time (sec): 43.65 - samples/sec: 2054.60 - lr: 0.000025 - momentum: 0.000000
2023-10-25 03:09:31,205 epoch 3 - iter 432/723 - loss 0.06016911 - time (sec): 51.96 - samples/sec: 2050.28 - lr: 0.000025 - momentum: 0.000000
2023-10-25 03:09:39,400 epoch 3 - iter 504/723 - loss 0.05918448 - time (sec): 60.15 - samples/sec: 2049.80 - lr: 0.000024 - momentum: 0.000000
2023-10-25 03:09:47,713 epoch 3 - iter 576/723 - loss 0.05969279 - time (sec): 68.46 - samples/sec: 2053.93 - lr: 0.000024 - momentum: 0.000000
2023-10-25 03:09:55,945 epoch 3 - iter 648/723 - loss 0.06014119 - time (sec): 76.70 - samples/sec: 2057.74 - lr: 0.000024 - momentum: 0.000000
2023-10-25 03:10:04,767 epoch 3 - iter 720/723 - loss 0.05977622 - time (sec): 85.52 - samples/sec: 2051.90 - lr: 0.000023 - momentum: 0.000000
2023-10-25 03:10:05,178 ----------------------------------------------------------------------------------------------------
2023-10-25 03:10:05,178 EPOCH 3 done: loss 0.0596 - lr: 0.000023
2023-10-25 03:10:08,906 DEV : loss 0.09248801320791245 - f1-score (micro avg) 0.8169
2023-10-25 03:10:08,918 saving best model
2023-10-25 03:10:09,504 ----------------------------------------------------------------------------------------------------
2023-10-25 03:10:17,921 epoch 4 - iter 72/723 - loss 0.03098176 - time (sec): 8.42 - samples/sec: 1998.29 - lr: 0.000023 - momentum: 0.000000
2023-10-25 03:10:26,602 epoch 4 - iter 144/723 - loss 0.03944408 - time (sec): 17.10 - samples/sec: 2035.54 - lr: 0.000023 - momentum: 0.000000
2023-10-25 03:10:35,744 epoch 4 - iter 216/723 - loss 0.04237264 - time (sec): 26.24 - samples/sec: 2032.29 - lr: 0.000022 - momentum: 0.000000
2023-10-25 03:10:45,044 epoch 4 - iter 288/723 - loss 0.04386620 - time (sec): 35.54 - samples/sec: 1999.13 - lr: 0.000022 - momentum: 0.000000
2023-10-25 03:10:53,659 epoch 4 - iter 360/723 - loss 0.04538680 - time (sec): 44.15 - samples/sec: 2010.14 - lr: 0.000022 - momentum: 0.000000
2023-10-25 03:11:01,969 epoch 4 - iter 432/723 - loss 0.04433392 - time (sec): 52.46 - samples/sec: 2009.08 - lr: 0.000021 - momentum: 0.000000
2023-10-25 03:11:10,884 epoch 4 - iter 504/723 - loss 0.04255925 - time (sec): 61.38 - samples/sec: 2023.36 - lr: 0.000021 - momentum: 0.000000
2023-10-25 03:11:19,328 epoch 4 - iter 576/723 - loss 0.04165806 - time (sec): 69.82 - samples/sec: 2025.62 - lr: 0.000021 - momentum: 0.000000
2023-10-25 03:11:27,196 epoch 4 - iter 648/723 - loss 0.04211832 - time (sec): 77.69 - samples/sec: 2031.28 - lr: 0.000020 - momentum: 0.000000
2023-10-25 03:11:35,625 epoch 4 - iter 720/723 - loss 0.04183929 - time (sec): 86.12 - samples/sec: 2042.24 - lr: 0.000020 - momentum: 0.000000
2023-10-25 03:11:35,879 ----------------------------------------------------------------------------------------------------
2023-10-25 03:11:35,880 EPOCH 4 done: loss 0.0418 - lr: 0.000020
2023-10-25 03:11:39,311 DEV : loss 0.10374626517295837 - f1-score (micro avg) 0.7998
2023-10-25 03:11:39,323 ----------------------------------------------------------------------------------------------------
2023-10-25 03:11:47,653 epoch 5 - iter 72/723 - loss 0.03199178 - time (sec): 8.33 - samples/sec: 2125.79 - lr: 0.000020 - momentum: 0.000000
2023-10-25 03:11:56,611 epoch 5 - iter 144/723 - loss 0.03213636 - time (sec): 17.29 - samples/sec: 2049.33 - lr: 0.000019 - momentum: 0.000000
2023-10-25 03:12:04,995 epoch 5 - iter 216/723 - loss 0.03115061 - time (sec): 25.67 - samples/sec: 2057.92 - lr: 0.000019 - momentum: 0.000000
2023-10-25 03:12:13,192 epoch 5 - iter 288/723 - loss 0.03097977 - time (sec): 33.87 - samples/sec: 2042.62 - lr: 0.000019 - momentum: 0.000000
2023-10-25 03:12:21,459 epoch 5 - iter 360/723 - loss 0.02861777 - time (sec): 42.14 - samples/sec: 2046.34 - lr: 0.000018 - momentum: 0.000000
2023-10-25 03:12:30,963 epoch 5 - iter 432/723 - loss 0.02807181 - time (sec): 51.64 - samples/sec: 2026.17 - lr: 0.000018 - momentum: 0.000000
2023-10-25 03:12:39,302 epoch 5 - iter 504/723 - loss 0.02949394 - time (sec): 59.98 - samples/sec: 2025.30 - lr: 0.000018 - momentum: 0.000000
2023-10-25 03:12:48,150 epoch 5 - iter 576/723 - loss 0.03091767 - time (sec): 68.83 - samples/sec: 2028.32 - lr: 0.000017 - momentum: 0.000000
2023-10-25 03:12:57,221 epoch 5 - iter 648/723 - loss 0.03041035 - time (sec): 77.90 - samples/sec: 2032.00 - lr: 0.000017 - momentum: 0.000000
2023-10-25 03:13:05,882 epoch 5 - iter 720/723 - loss 0.03003918 - time (sec): 86.56 - samples/sec: 2030.99 - lr: 0.000017 - momentum: 0.000000
2023-10-25 03:13:06,115 ----------------------------------------------------------------------------------------------------
2023-10-25 03:13:06,116 EPOCH 5 done: loss 0.0300 - lr: 0.000017
2023-10-25 03:13:09,854 DEV : loss 0.10400616377592087 - f1-score (micro avg) 0.8329
2023-10-25 03:13:09,866 saving best model
2023-10-25 03:13:10,453 ----------------------------------------------------------------------------------------------------
2023-10-25 03:13:19,082 epoch 6 - iter 72/723 - loss 0.02189838 - time (sec): 8.63 - samples/sec: 2088.50 - lr: 0.000016 - momentum: 0.000000
2023-10-25 03:13:26,772 epoch 6 - iter 144/723 - loss 0.02550434 - time (sec): 16.32 - samples/sec: 2089.07 - lr: 0.000016 - momentum: 0.000000
2023-10-25 03:13:35,285 epoch 6 - iter 216/723 - loss 0.02613427 - time (sec): 24.83 - samples/sec: 2096.33 - lr: 0.000016 - momentum: 0.000000
2023-10-25 03:13:44,917 epoch 6 - iter 288/723 - loss 0.02482502 - time (sec): 34.46 - samples/sec: 2066.30 - lr: 0.000015 - momentum: 0.000000
2023-10-25 03:13:53,581 epoch 6 - iter 360/723 - loss 0.02424077 - time (sec): 43.13 - samples/sec: 2064.26 - lr: 0.000015 - momentum: 0.000000
2023-10-25 03:14:02,495 epoch 6 - iter 432/723 - loss 0.02422193 - time (sec): 52.04 - samples/sec: 2054.09 - lr: 0.000015 - momentum: 0.000000
2023-10-25 03:14:11,533 epoch 6 - iter 504/723 - loss 0.02433358 - time (sec): 61.08 - samples/sec: 2039.82 - lr: 0.000014 - momentum: 0.000000
2023-10-25 03:14:19,993 epoch 6 - iter 576/723 - loss 0.02425909 - time (sec): 69.54 - samples/sec: 2039.67 - lr: 0.000014 - momentum: 0.000000
2023-10-25 03:14:27,983 epoch 6 - iter 648/723 - loss 0.02400181 - time (sec): 77.53 - samples/sec: 2042.61 - lr: 0.000014 - momentum: 0.000000
2023-10-25 03:14:37,020 epoch 6 - iter 720/723 - loss 0.02315851 - time (sec): 86.57 - samples/sec: 2030.27 - lr: 0.000013 - momentum: 0.000000
2023-10-25 03:14:37,321 ----------------------------------------------------------------------------------------------------
2023-10-25 03:14:37,322 EPOCH 6 done: loss 0.0233 - lr: 0.000013
2023-10-25 03:14:40,761 DEV : loss 0.1466233879327774 - f1-score (micro avg) 0.8271
2023-10-25 03:14:40,773 ----------------------------------------------------------------------------------------------------
2023-10-25 03:14:49,901 epoch 7 - iter 72/723 - loss 0.01604370 - time (sec): 9.13 - samples/sec: 1997.10 - lr: 0.000013 - momentum: 0.000000
2023-10-25 03:14:58,957 epoch 7 - iter 144/723 - loss 0.01674537 - time (sec): 18.18 - samples/sec: 2010.15 - lr: 0.000013 - momentum: 0.000000
2023-10-25 03:15:07,312 epoch 7 - iter 216/723 - loss 0.01646182 - time (sec): 26.54 - samples/sec: 2011.93 - lr: 0.000012 - momentum: 0.000000
2023-10-25 03:15:16,127 epoch 7 - iter 288/723 - loss 0.01562860 - time (sec): 35.35 - samples/sec: 2010.60 - lr: 0.000012 - momentum: 0.000000
2023-10-25 03:15:24,723 epoch 7 - iter 360/723 - loss 0.01653965 - time (sec): 43.95 - samples/sec: 2014.53 - lr: 0.000012 - momentum: 0.000000
2023-10-25 03:15:33,248 epoch 7 - iter 432/723 - loss 0.01654614 - time (sec): 52.47 - samples/sec: 2019.73 - lr: 0.000011 - momentum: 0.000000
2023-10-25 03:15:41,936 epoch 7 - iter 504/723 - loss 0.01772708 - time (sec): 61.16 - samples/sec: 2012.83 - lr: 0.000011 - momentum: 0.000000
2023-10-25 03:15:51,042 epoch 7 - iter 576/723 - loss 0.01751254 - time (sec): 70.27 - samples/sec: 2014.86 - lr: 0.000011 - momentum: 0.000000
2023-10-25 03:15:59,505 epoch 7 - iter 648/723 - loss 0.01722334 - time (sec): 78.73 - samples/sec: 2017.94 - lr: 0.000010 - momentum: 0.000000
2023-10-25 03:16:07,618 epoch 7 - iter 720/723 - loss 0.01748532 - time (sec): 86.84 - samples/sec: 2020.86 - lr: 0.000010 - momentum: 0.000000
2023-10-25 03:16:08,082 ----------------------------------------------------------------------------------------------------
2023-10-25 03:16:08,082 EPOCH 7 done: loss 0.0175 - lr: 0.000010
2023-10-25 03:16:11,526 DEV : loss 0.14099310338497162 - f1-score (micro avg) 0.8444
2023-10-25 03:16:11,538 saving best model
2023-10-25 03:16:12,132 ----------------------------------------------------------------------------------------------------
2023-10-25 03:16:20,471 epoch 8 - iter 72/723 - loss 0.02389593 - time (sec): 8.34 - samples/sec: 2029.40 - lr: 0.000010 - momentum: 0.000000
2023-10-25 03:16:28,527 epoch 8 - iter 144/723 - loss 0.01519053 - time (sec): 16.39 - samples/sec: 2070.04 - lr: 0.000009 - momentum: 0.000000
2023-10-25 03:16:36,685 epoch 8 - iter 216/723 - loss 0.01528692 - time (sec): 24.55 - samples/sec: 2055.74 - lr: 0.000009 - momentum: 0.000000
2023-10-25 03:16:45,368 epoch 8 - iter 288/723 - loss 0.01454248 - time (sec): 33.23 - samples/sec: 2030.79 - lr: 0.000009 - momentum: 0.000000
2023-10-25 03:16:53,990 epoch 8 - iter 360/723 - loss 0.01392144 - time (sec): 41.86 - samples/sec: 2028.37 - lr: 0.000008 - momentum: 0.000000
2023-10-25 03:17:02,707 epoch 8 - iter 432/723 - loss 0.01333604 - time (sec): 50.57 - samples/sec: 2028.92 - lr: 0.000008 - momentum: 0.000000
2023-10-25 03:17:11,757 epoch 8 - iter 504/723 - loss 0.01255700 - time (sec): 59.62 - samples/sec: 2012.47 - lr: 0.000008 - momentum: 0.000000
2023-10-25 03:17:20,313 epoch 8 - iter 576/723 - loss 0.01296982 - time (sec): 68.18 - samples/sec: 2017.55 - lr: 0.000007 - momentum: 0.000000
2023-10-25 03:17:28,983 epoch 8 - iter 648/723 - loss 0.01386353 - time (sec): 76.85 - samples/sec: 2033.29 - lr: 0.000007 - momentum: 0.000000
2023-10-25 03:17:38,555 epoch 8 - iter 720/723 - loss 0.01336908 - time (sec): 86.42 - samples/sec: 2032.31 - lr: 0.000007 - momentum: 0.000000
2023-10-25 03:17:38,804 ----------------------------------------------------------------------------------------------------
2023-10-25 03:17:38,805 EPOCH 8 done: loss 0.0134 - lr: 0.000007
2023-10-25 03:17:42,543 DEV : loss 0.16321606934070587 - f1-score (micro avg) 0.8331
2023-10-25 03:17:42,555 ----------------------------------------------------------------------------------------------------
2023-10-25 03:17:52,154 epoch 9 - iter 72/723 - loss 0.00928599 - time (sec): 9.60 - samples/sec: 1891.08 - lr: 0.000006 - momentum: 0.000000
2023-10-25 03:18:00,763 epoch 9 - iter 144/723 - loss 0.01114101 - time (sec): 18.21 - samples/sec: 1981.42 - lr: 0.000006 - momentum: 0.000000
2023-10-25 03:18:09,687 epoch 9 - iter 216/723 - loss 0.00982378 - time (sec): 27.13 - samples/sec: 2013.82 - lr: 0.000006 - momentum: 0.000000
2023-10-25 03:18:18,372 epoch 9 - iter 288/723 - loss 0.00914376 - time (sec): 35.82 - samples/sec: 2021.99 - lr: 0.000005 - momentum: 0.000000
2023-10-25 03:18:26,374 epoch 9 - iter 360/723 - loss 0.00876306 - time (sec): 43.82 - samples/sec: 2027.74 - lr: 0.000005 - momentum: 0.000000
2023-10-25 03:18:34,609 epoch 9 - iter 432/723 - loss 0.00823411 - time (sec): 52.05 - samples/sec: 2024.16 - lr: 0.000005 - momentum: 0.000000
2023-10-25 03:18:43,316 epoch 9 - iter 504/723 - loss 0.00818760 - time (sec): 60.76 - samples/sec: 2033.51 - lr: 0.000004 - momentum: 0.000000
2023-10-25 03:18:51,923 epoch 9 - iter 576/723 - loss 0.00821037 - time (sec): 69.37 - samples/sec: 2034.49 - lr: 0.000004 - momentum: 0.000000
2023-10-25 03:19:00,772 epoch 9 - iter 648/723 - loss 0.00872746 - time (sec): 78.22 - samples/sec: 2030.67 - lr: 0.000004 - momentum: 0.000000
2023-10-25 03:19:09,106 epoch 9 - iter 720/723 - loss 0.00885136 - time (sec): 86.55 - samples/sec: 2030.87 - lr: 0.000003 - momentum: 0.000000
2023-10-25 03:19:09,383 ----------------------------------------------------------------------------------------------------
2023-10-25 03:19:09,383 EPOCH 9 done: loss 0.0088 - lr: 0.000003
2023-10-25 03:19:13,170 DEV : loss 0.18107134103775024 - f1-score (micro avg) 0.829
2023-10-25 03:19:13,182 ----------------------------------------------------------------------------------------------------
2023-10-25 03:19:22,203 epoch 10 - iter 72/723 - loss 0.00646861 - time (sec): 9.02 - samples/sec: 1972.88 - lr: 0.000003 - momentum: 0.000000
2023-10-25 03:19:31,133 epoch 10 - iter 144/723 - loss 0.00638719 - time (sec): 17.95 - samples/sec: 1994.98 - lr: 0.000003 - momentum: 0.000000
2023-10-25 03:19:39,625 epoch 10 - iter 216/723 - loss 0.00729037 - time (sec): 26.44 - samples/sec: 1986.12 - lr: 0.000002 - momentum: 0.000000
2023-10-25 03:19:48,001 epoch 10 - iter 288/723 - loss 0.00646852 - time (sec): 34.82 - samples/sec: 1995.23 - lr: 0.000002 - momentum: 0.000000
2023-10-25 03:19:56,662 epoch 10 - iter 360/723 - loss 0.00687157 - time (sec): 43.48 - samples/sec: 1991.42 - lr: 0.000002 - momentum: 0.000000
2023-10-25 03:20:05,230 epoch 10 - iter 432/723 - loss 0.00636256 - time (sec): 52.05 - samples/sec: 2003.13 - lr: 0.000001 - momentum: 0.000000
2023-10-25 03:20:13,979 epoch 10 - iter 504/723 - loss 0.00637499 - time (sec): 60.80 - samples/sec: 1997.17 - lr: 0.000001 - momentum: 0.000000
2023-10-25 03:20:22,735 epoch 10 - iter 576/723 - loss 0.00672918 - time (sec): 69.55 - samples/sec: 1987.42 - lr: 0.000001 - momentum: 0.000000
2023-10-25 03:20:31,736 epoch 10 - iter 648/723 - loss 0.00714825 - time (sec): 78.55 - samples/sec: 1993.22 - lr: 0.000000 - momentum: 0.000000
2023-10-25 03:20:40,664 epoch 10 - iter 720/723 - loss 0.00697419 - time (sec): 87.48 - samples/sec: 2006.56 - lr: 0.000000 - momentum: 0.000000
2023-10-25 03:20:40,944 ----------------------------------------------------------------------------------------------------
2023-10-25 03:20:40,944 EPOCH 10 done: loss 0.0070 - lr: 0.000000
2023-10-25 03:20:44,389 DEV : loss 0.17600025236606598 - f1-score (micro avg) 0.8312
2023-10-25 03:20:44,876 ----------------------------------------------------------------------------------------------------
2023-10-25 03:20:44,876 Loading model from best epoch ...
2023-10-25 03:20:46,545 SequenceTagger predicts: Dictionary with 13 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-PER, B-PER, E-PER, I-PER, S-ORG, B-ORG, E-ORG, I-ORG
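The 13-tag dictionary above is the BIOES tagging scheme over the three entity types in this corpus (LOC, PER, ORG) plus the outside tag. A short sketch generating exactly that tag set:

```python
# BIOES scheme: S = single-token entity, B/I/E = begin/inside/end of a
# multi-token entity, O = outside any entity. 4 prefixes x 3 types + O = 13.
types = ["LOC", "PER", "ORG"]
tags = ["O"] + [f"{p}-{t}" for t in types for p in ("S", "B", "E", "I")]
print(len(tags), tags)  # 13 tags, in the same order as the log line
```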
2023-10-25 03:20:50,083
Results:
- F-score (micro) 0.8055
- F-score (macro) 0.6854
- Accuracy 0.6868
By class:
              precision    recall  f1-score   support

         PER     0.8254    0.8237    0.8245       482
         LOC     0.9010    0.7948    0.8445       458
         ORG     0.4364    0.3478    0.3871        69

   micro avg     0.8351    0.7780    0.8055      1009
   macro avg     0.7209    0.6554    0.6854      1009
weighted avg     0.8331    0.7780    0.8037      1009
2023-10-25 03:20:50,084 ----------------------------------------------------------------------------------------------------
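The aggregate rows of the table can be reproduced from the per-class rows. The sketch below back-derives integer true-positive and predicted-entity counts from the reported precision/recall/support values (these counts are an assumption, inferred from the table, not taken from the log) and recomputes the micro, macro, and weighted F-scores:

```python
# Sanity-check the aggregate scores from the per-class results table.
# TP and predicted counts are back-derived from precision/recall/support
# (assumed, not logged): e.g. PER recall 0.8237 * 482 gold spans -> 397 TP.
classes = {
    #        TP, predicted, support (gold spans)
    "PER": (397, 481, 482),
    "LOC": (364, 404, 458),
    "ORG": ( 24,  55,  69),
}

def f1(p: float, r: float) -> float:
    return 2 * p * r / (p + r)

# micro avg: pool counts over all classes, then compute P/R/F1 once
tp   = sum(c[0] for c in classes.values())
pred = sum(c[1] for c in classes.values())
gold = sum(c[2] for c in classes.values())
micro_f1 = f1(tp / pred, tp / gold)

# macro avg: unweighted mean of per-class F1 (small ORG class drags it down)
per_class_f1 = [f1(tp_ / pred_, tp_ / gold_)
                for tp_, pred_, gold_ in classes.values()]
macro_f1 = sum(per_class_f1) / len(per_class_f1)

# weighted avg: per-class F1 weighted by support
weighted_f1 = sum(f * c[2] for f, c in zip(per_class_f1, classes.values())) / gold

print(round(micro_f1, 4), round(macro_f1, 4), round(weighted_f1, 4))
# -> 0.8055 0.6854 0.8037, matching the table
```

The gap between micro (0.8055) and macro (0.6854) F-score comes almost entirely from the weak ORG class (F1 0.3871 on only 69 gold spans), which counts for a full third of the macro average but little of the micro average.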