2023-10-23 21:32:27,709 ----------------------------------------------------------------------------------------------------
2023-10-23 21:32:27,709 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(64001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=21, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-23 21:32:27,710 ----------------------------------------------------------------------------------------------------
2023-10-23 21:32:27,710 MultiCorpus: 3575 train + 1235 dev + 1266 test sentences
- NER_HIPE_2022 Corpus: 3575 train + 1235 dev + 1266 test sentences - /home/ubuntu/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/de/with_doc_seperator
2023-10-23 21:32:27,710 ----------------------------------------------------------------------------------------------------
2023-10-23 21:32:27,710 Train: 3575 sentences
2023-10-23 21:32:27,710 (train_with_dev=False, train_with_test=False)
2023-10-23 21:32:27,710 ----------------------------------------------------------------------------------------------------
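For reproducibility, here is a minimal Python sketch of how this corpus and model could be assembled with Flair. The training script itself is not part of this log, so the dataset loader arguments, the model ID (taken from the training base path below), and the options decoded from the run name (poolingfirst, layers-1, crfFalse) are assumptions.

    # Sketch only: reconstructs the setup implied by this log.
    from flair.datasets import NER_HIPE_2022
    from flair.embeddings import TransformerWordEmbeddings
    from flair.models import SequenceTagger

    # HIPE-2020 German split of the HIPE-2022 shared-task data (assumed loader args).
    corpus = NER_HIPE_2022(dataset_name="hipe2020", language="de")

    label_type = "ner"
    label_dict = corpus.make_label_dictionary(label_type=label_type)  # 21 BIOES tags

    embeddings = TransformerWordEmbeddings(
        model="dbmdz/bert-base-historic-multilingual-64k-td-cased",
        layers="-1",               # "layers-1" in the run name: last layer only
        subtoken_pooling="first",  # "poolingfirst" in the run name
        fine_tune=True,
    )

    tagger = SequenceTagger(
        hidden_size=256,           # ignored with use_rnn=False
        embeddings=embeddings,
        tag_dictionary=label_dict,
        tag_type=label_type,
        use_crf=False,             # "crfFalse" in the run name
        use_rnn=False,             # the printed model has a single Linear head
    )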
2023-10-23 21:32:27,710 Training Params:
2023-10-23 21:32:27,710 - learning_rate: "5e-05"
2023-10-23 21:32:27,710 - mini_batch_size: "4"
2023-10-23 21:32:27,710 - max_epochs: "10"
2023-10-23 21:32:27,710 - shuffle: "True"
2023-10-23 21:32:27,710 ----------------------------------------------------------------------------------------------------
2023-10-23 21:32:27,710 Plugins:
2023-10-23 21:32:27,710 - TensorboardLogger
2023-10-23 21:32:27,710 - LinearScheduler | warmup_fraction: '0.1'
2023-10-23 21:32:27,710 ----------------------------------------------------------------------------------------------------
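The parameters above match Flair's fine-tuning entry point, which by default attaches a linear learning-rate schedule with warmup (the LinearScheduler plugin logged above). A hedged sketch, reusing the tagger and corpus from the previous sketch:

    # Sketch: fine_tune uses AdamW plus a linear schedule with warmup
    # (warmup_fraction 0.1 above); exact plugin wiring is an assumption.
    from flair.trainers import ModelTrainer

    trainer = ModelTrainer(tagger, corpus)
    trainer.fine_tune(
        "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3",
        learning_rate=5e-05,
        mini_batch_size=4,
        max_epochs=10,
    )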
2023-10-23 21:32:27,710 Final evaluation on model from best epoch (best-model.pt)
2023-10-23 21:32:27,710 - metric: "('micro avg', 'f1-score')"
2023-10-23 21:32:27,710 ----------------------------------------------------------------------------------------------------
2023-10-23 21:32:27,710 Computation:
2023-10-23 21:32:27,710 - compute on device: cuda:0
2023-10-23 21:32:27,710 - embedding storage: none
2023-10-23 21:32:27,710 ----------------------------------------------------------------------------------------------------
2023-10-23 21:32:27,710 Model training base path: "hmbench-hipe2020/de-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3"
2023-10-23 21:32:27,710 ----------------------------------------------------------------------------------------------------
2023-10-23 21:32:27,710 ----------------------------------------------------------------------------------------------------
2023-10-23 21:32:27,710 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-23 21:32:33,214 epoch 1 - iter 89/894 - loss 1.85954162 - time (sec): 5.50 - samples/sec: 1522.26 - lr: 0.000005 - momentum: 0.000000
2023-10-23 21:32:39,007 epoch 1 - iter 178/894 - loss 1.13196503 - time (sec): 11.30 - samples/sec: 1545.75 - lr: 0.000010 - momentum: 0.000000
2023-10-23 21:32:44,612 epoch 1 - iter 267/894 - loss 0.87668233 - time (sec): 16.90 - samples/sec: 1554.74 - lr: 0.000015 - momentum: 0.000000
2023-10-23 21:32:50,192 epoch 1 - iter 356/894 - loss 0.72904744 - time (sec): 22.48 - samples/sec: 1556.98 - lr: 0.000020 - momentum: 0.000000
2023-10-23 21:32:56,003 epoch 1 - iter 445/894 - loss 0.63917825 - time (sec): 28.29 - samples/sec: 1561.58 - lr: 0.000025 - momentum: 0.000000
2023-10-23 21:33:01,626 epoch 1 - iter 534/894 - loss 0.57835257 - time (sec): 33.91 - samples/sec: 1553.54 - lr: 0.000030 - momentum: 0.000000
2023-10-23 21:33:07,308 epoch 1 - iter 623/894 - loss 0.52843333 - time (sec): 39.60 - samples/sec: 1544.71 - lr: 0.000035 - momentum: 0.000000
2023-10-23 21:33:12,815 epoch 1 - iter 712/894 - loss 0.48783348 - time (sec): 45.10 - samples/sec: 1544.53 - lr: 0.000040 - momentum: 0.000000
2023-10-23 21:33:18,426 epoch 1 - iter 801/894 - loss 0.46013170 - time (sec): 50.71 - samples/sec: 1538.23 - lr: 0.000045 - momentum: 0.000000
2023-10-23 21:33:24,155 epoch 1 - iter 890/894 - loss 0.43655607 - time (sec): 56.44 - samples/sec: 1525.91 - lr: 0.000050 - momentum: 0.000000
2023-10-23 21:33:24,403 ----------------------------------------------------------------------------------------------------
2023-10-23 21:33:24,404 EPOCH 1 done: loss 0.4350 - lr: 0.000050
2023-10-23 21:33:29,212 DEV : loss 0.17315027117729187 - f1-score (micro avg) 0.6335
2023-10-23 21:33:29,232 saving best model
2023-10-23 21:33:29,709 ----------------------------------------------------------------------------------------------------
2023-10-23 21:33:35,488 epoch 2 - iter 89/894 - loss 0.18260608 - time (sec): 5.78 - samples/sec: 1641.90 - lr: 0.000049 - momentum: 0.000000
2023-10-23 21:33:41,016 epoch 2 - iter 178/894 - loss 0.18828042 - time (sec): 11.31 - samples/sec: 1555.49 - lr: 0.000049 - momentum: 0.000000
2023-10-23 21:33:46,825 epoch 2 - iter 267/894 - loss 0.17496831 - time (sec): 17.12 - samples/sec: 1570.89 - lr: 0.000048 - momentum: 0.000000
2023-10-23 21:33:52,428 epoch 2 - iter 356/894 - loss 0.16664256 - time (sec): 22.72 - samples/sec: 1556.50 - lr: 0.000048 - momentum: 0.000000
2023-10-23 21:33:58,002 epoch 2 - iter 445/894 - loss 0.17404722 - time (sec): 28.29 - samples/sec: 1552.95 - lr: 0.000047 - momentum: 0.000000
2023-10-23 21:34:03,652 epoch 2 - iter 534/894 - loss 0.17916757 - time (sec): 33.94 - samples/sec: 1541.66 - lr: 0.000047 - momentum: 0.000000
2023-10-23 21:34:09,150 epoch 2 - iter 623/894 - loss 0.17434124 - time (sec): 39.44 - samples/sec: 1538.39 - lr: 0.000046 - momentum: 0.000000
2023-10-23 21:34:14,649 epoch 2 - iter 712/894 - loss 0.17000015 - time (sec): 44.94 - samples/sec: 1529.38 - lr: 0.000046 - momentum: 0.000000
2023-10-23 21:34:20,560 epoch 2 - iter 801/894 - loss 0.16586317 - time (sec): 50.85 - samples/sec: 1538.35 - lr: 0.000045 - momentum: 0.000000
2023-10-23 21:34:26,123 epoch 2 - iter 890/894 - loss 0.16468057 - time (sec): 56.41 - samples/sec: 1526.87 - lr: 0.000044 - momentum: 0.000000
2023-10-23 21:34:26,376 ----------------------------------------------------------------------------------------------------
2023-10-23 21:34:26,376 EPOCH 2 done: loss 0.1643 - lr: 0.000044
2023-10-23 21:34:32,851 DEV : loss 0.23010773956775665 - f1-score (micro avg) 0.6613
2023-10-23 21:34:32,871 saving best model
2023-10-23 21:34:33,473 ----------------------------------------------------------------------------------------------------
2023-10-23 21:34:39,532 epoch 3 - iter 89/894 - loss 0.07179713 - time (sec): 6.06 - samples/sec: 1737.21 - lr: 0.000044 - momentum: 0.000000
2023-10-23 21:34:45,194 epoch 3 - iter 178/894 - loss 0.08496786 - time (sec): 11.72 - samples/sec: 1633.46 - lr: 0.000043 - momentum: 0.000000
2023-10-23 21:34:50,794 epoch 3 - iter 267/894 - loss 0.09730028 - time (sec): 17.32 - samples/sec: 1589.49 - lr: 0.000043 - momentum: 0.000000
2023-10-23 21:34:56,397 epoch 3 - iter 356/894 - loss 0.09225857 - time (sec): 22.92 - samples/sec: 1557.32 - lr: 0.000042 - momentum: 0.000000
2023-10-23 21:35:02,115 epoch 3 - iter 445/894 - loss 0.09329019 - time (sec): 28.64 - samples/sec: 1544.18 - lr: 0.000042 - momentum: 0.000000
2023-10-23 21:35:07,724 epoch 3 - iter 534/894 - loss 0.09372400 - time (sec): 34.25 - samples/sec: 1544.04 - lr: 0.000041 - momentum: 0.000000
2023-10-23 21:35:13,385 epoch 3 - iter 623/894 - loss 0.09448497 - time (sec): 39.91 - samples/sec: 1554.01 - lr: 0.000041 - momentum: 0.000000
2023-10-23 21:35:18,794 epoch 3 - iter 712/894 - loss 0.09826920 - time (sec): 45.32 - samples/sec: 1528.70 - lr: 0.000040 - momentum: 0.000000
2023-10-23 21:35:24,473 epoch 3 - iter 801/894 - loss 0.10299681 - time (sec): 51.00 - samples/sec: 1523.56 - lr: 0.000039 - momentum: 0.000000
2023-10-23 21:35:30,086 epoch 3 - iter 890/894 - loss 0.10044136 - time (sec): 56.61 - samples/sec: 1524.66 - lr: 0.000039 - momentum: 0.000000
2023-10-23 21:35:30,318 ----------------------------------------------------------------------------------------------------
2023-10-23 21:35:30,318 EPOCH 3 done: loss 0.1002 - lr: 0.000039
2023-10-23 21:35:36,796 DEV : loss 0.20172163844108582 - f1-score (micro avg) 0.7141
2023-10-23 21:35:36,817 saving best model
2023-10-23 21:35:37,380 ----------------------------------------------------------------------------------------------------
2023-10-23 21:35:42,943 epoch 4 - iter 89/894 - loss 0.06516763 - time (sec): 5.56 - samples/sec: 1522.26 - lr: 0.000038 - momentum: 0.000000
2023-10-23 21:35:48,508 epoch 4 - iter 178/894 - loss 0.09442131 - time (sec): 11.13 - samples/sec: 1529.96 - lr: 0.000038 - momentum: 0.000000
2023-10-23 21:35:54,262 epoch 4 - iter 267/894 - loss 0.09252073 - time (sec): 16.88 - samples/sec: 1554.23 - lr: 0.000037 - momentum: 0.000000
2023-10-23 21:35:59,933 epoch 4 - iter 356/894 - loss 0.09139230 - time (sec): 22.55 - samples/sec: 1535.00 - lr: 0.000037 - momentum: 0.000000
2023-10-23 21:36:05,830 epoch 4 - iter 445/894 - loss 0.08513860 - time (sec): 28.45 - samples/sec: 1555.33 - lr: 0.000036 - momentum: 0.000000
2023-10-23 21:36:11,396 epoch 4 - iter 534/894 - loss 0.08819751 - time (sec): 34.02 - samples/sec: 1539.93 - lr: 0.000036 - momentum: 0.000000
2023-10-23 21:36:16,949 epoch 4 - iter 623/894 - loss 0.08464294 - time (sec): 39.57 - samples/sec: 1529.17 - lr: 0.000035 - momentum: 0.000000
2023-10-23 21:36:22,421 epoch 4 - iter 712/894 - loss 0.08580279 - time (sec): 45.04 - samples/sec: 1513.48 - lr: 0.000034 - momentum: 0.000000
2023-10-23 21:36:28,192 epoch 4 - iter 801/894 - loss 0.08417899 - time (sec): 50.81 - samples/sec: 1514.32 - lr: 0.000034 - momentum: 0.000000
2023-10-23 21:36:33,997 epoch 4 - iter 890/894 - loss 0.08189881 - time (sec): 56.62 - samples/sec: 1523.86 - lr: 0.000033 - momentum: 0.000000
2023-10-23 21:36:34,228 ----------------------------------------------------------------------------------------------------
2023-10-23 21:36:34,229 EPOCH 4 done: loss 0.0822 - lr: 0.000033
2023-10-23 21:36:40,747 DEV : loss 0.21917399764060974 - f1-score (micro avg) 0.731
2023-10-23 21:36:40,767 saving best model
2023-10-23 21:36:41,370 ----------------------------------------------------------------------------------------------------
2023-10-23 21:36:47,016 epoch 5 - iter 89/894 - loss 0.05884907 - time (sec): 5.65 - samples/sec: 1538.74 - lr: 0.000033 - momentum: 0.000000
2023-10-23 21:36:52,997 epoch 5 - iter 178/894 - loss 0.05271971 - time (sec): 11.63 - samples/sec: 1624.06 - lr: 0.000032 - momentum: 0.000000
2023-10-23 21:36:58,571 epoch 5 - iter 267/894 - loss 0.04804702 - time (sec): 17.20 - samples/sec: 1576.56 - lr: 0.000032 - momentum: 0.000000
2023-10-23 21:37:04,330 epoch 5 - iter 356/894 - loss 0.04906642 - time (sec): 22.96 - samples/sec: 1573.54 - lr: 0.000031 - momentum: 0.000000
2023-10-23 21:37:09,913 epoch 5 - iter 445/894 - loss 0.04994664 - time (sec): 28.54 - samples/sec: 1560.73 - lr: 0.000031 - momentum: 0.000000
2023-10-23 21:37:15,589 epoch 5 - iter 534/894 - loss 0.04889172 - time (sec): 34.22 - samples/sec: 1550.95 - lr: 0.000030 - momentum: 0.000000
2023-10-23 21:37:21,353 epoch 5 - iter 623/894 - loss 0.04744692 - time (sec): 39.98 - samples/sec: 1544.89 - lr: 0.000029 - momentum: 0.000000
2023-10-23 21:37:26,900 epoch 5 - iter 712/894 - loss 0.05021909 - time (sec): 45.53 - samples/sec: 1528.40 - lr: 0.000029 - momentum: 0.000000
2023-10-23 21:37:32,661 epoch 5 - iter 801/894 - loss 0.05064183 - time (sec): 51.29 - samples/sec: 1527.71 - lr: 0.000028 - momentum: 0.000000
2023-10-23 21:37:38,105 epoch 5 - iter 890/894 - loss 0.04987512 - time (sec): 56.73 - samples/sec: 1519.23 - lr: 0.000028 - momentum: 0.000000
2023-10-23 21:37:38,350 ----------------------------------------------------------------------------------------------------
2023-10-23 21:37:38,351 EPOCH 5 done: loss 0.0498 - lr: 0.000028
2023-10-23 21:37:44,863 DEV : loss 0.20383352041244507 - f1-score (micro avg) 0.7381
2023-10-23 21:37:44,883 saving best model
2023-10-23 21:37:45,479 ----------------------------------------------------------------------------------------------------
2023-10-23 21:37:51,408 epoch 6 - iter 89/894 - loss 0.01904247 - time (sec): 5.93 - samples/sec: 1589.36 - lr: 0.000027 - momentum: 0.000000
2023-10-23 21:37:56,793 epoch 6 - iter 178/894 - loss 0.02897594 - time (sec): 11.31 - samples/sec: 1492.41 - lr: 0.000027 - momentum: 0.000000
2023-10-23 21:38:02,457 epoch 6 - iter 267/894 - loss 0.03701067 - time (sec): 16.98 - samples/sec: 1514.52 - lr: 0.000026 - momentum: 0.000000
2023-10-23 21:38:08,497 epoch 6 - iter 356/894 - loss 0.03468888 - time (sec): 23.02 - samples/sec: 1526.30 - lr: 0.000026 - momentum: 0.000000
2023-10-23 21:38:14,169 epoch 6 - iter 445/894 - loss 0.03392340 - time (sec): 28.69 - samples/sec: 1518.26 - lr: 0.000025 - momentum: 0.000000
2023-10-23 21:38:19,734 epoch 6 - iter 534/894 - loss 0.03288088 - time (sec): 34.25 - samples/sec: 1514.24 - lr: 0.000024 - momentum: 0.000000
2023-10-23 21:38:25,229 epoch 6 - iter 623/894 - loss 0.03392332 - time (sec): 39.75 - samples/sec: 1504.18 - lr: 0.000024 - momentum: 0.000000
2023-10-23 21:38:30,742 epoch 6 - iter 712/894 - loss 0.03501878 - time (sec): 45.26 - samples/sec: 1509.80 - lr: 0.000023 - momentum: 0.000000
2023-10-23 21:38:36,343 epoch 6 - iter 801/894 - loss 0.03644499 - time (sec): 50.86 - samples/sec: 1520.56 - lr: 0.000023 - momentum: 0.000000
2023-10-23 21:38:42,110 epoch 6 - iter 890/894 - loss 0.03540901 - time (sec): 56.63 - samples/sec: 1521.86 - lr: 0.000022 - momentum: 0.000000
2023-10-23 21:38:42,357 ----------------------------------------------------------------------------------------------------
2023-10-23 21:38:42,357 EPOCH 6 done: loss 0.0355 - lr: 0.000022
2023-10-23 21:38:48,853 DEV : loss 0.253505676984787 - f1-score (micro avg) 0.7499
2023-10-23 21:38:48,874 saving best model
2023-10-23 21:38:49,472 ----------------------------------------------------------------------------------------------------
2023-10-23 21:38:55,299 epoch 7 - iter 89/894 - loss 0.01657475 - time (sec): 5.83 - samples/sec: 1585.69 - lr: 0.000022 - momentum: 0.000000
2023-10-23 21:39:01,391 epoch 7 - iter 178/894 - loss 0.02316210 - time (sec): 11.92 - samples/sec: 1592.58 - lr: 0.000021 - momentum: 0.000000
2023-10-23 21:39:06,969 epoch 7 - iter 267/894 - loss 0.02304160 - time (sec): 17.50 - samples/sec: 1576.26 - lr: 0.000021 - momentum: 0.000000
2023-10-23 21:39:12,479 epoch 7 - iter 356/894 - loss 0.02067861 - time (sec): 23.01 - samples/sec: 1539.39 - lr: 0.000020 - momentum: 0.000000
2023-10-23 21:39:18,127 epoch 7 - iter 445/894 - loss 0.02307564 - time (sec): 28.65 - samples/sec: 1519.76 - lr: 0.000019 - momentum: 0.000000
2023-10-23 21:39:23,764 epoch 7 - iter 534/894 - loss 0.02313820 - time (sec): 34.29 - samples/sec: 1520.77 - lr: 0.000019 - momentum: 0.000000
2023-10-23 21:39:29,376 epoch 7 - iter 623/894 - loss 0.02266132 - time (sec): 39.90 - samples/sec: 1527.61 - lr: 0.000018 - momentum: 0.000000
2023-10-23 21:39:35,012 epoch 7 - iter 712/894 - loss 0.02291381 - time (sec): 45.54 - samples/sec: 1521.63 - lr: 0.000018 - momentum: 0.000000
2023-10-23 21:39:40,592 epoch 7 - iter 801/894 - loss 0.02355819 - time (sec): 51.12 - samples/sec: 1520.94 - lr: 0.000017 - momentum: 0.000000
2023-10-23 21:39:46,210 epoch 7 - iter 890/894 - loss 0.02376008 - time (sec): 56.74 - samples/sec: 1518.40 - lr: 0.000017 - momentum: 0.000000
2023-10-23 21:39:46,480 ----------------------------------------------------------------------------------------------------
2023-10-23 21:39:46,481 EPOCH 7 done: loss 0.0242 - lr: 0.000017
2023-10-23 21:39:52,974 DEV : loss 0.24752555787563324 - f1-score (micro avg) 0.7515
2023-10-23 21:39:52,994 saving best model
2023-10-23 21:39:53,588 ----------------------------------------------------------------------------------------------------
2023-10-23 21:39:59,363 epoch 8 - iter 89/894 - loss 0.00797625 - time (sec): 5.77 - samples/sec: 1505.38 - lr: 0.000016 - momentum: 0.000000
2023-10-23 21:40:04,885 epoch 8 - iter 178/894 - loss 0.01179722 - time (sec): 11.30 - samples/sec: 1492.29 - lr: 0.000016 - momentum: 0.000000
2023-10-23 21:40:10,569 epoch 8 - iter 267/894 - loss 0.01080412 - time (sec): 16.98 - samples/sec: 1498.19 - lr: 0.000015 - momentum: 0.000000
2023-10-23 21:40:16,067 epoch 8 - iter 356/894 - loss 0.01373510 - time (sec): 22.48 - samples/sec: 1484.42 - lr: 0.000014 - momentum: 0.000000
2023-10-23 21:40:21,773 epoch 8 - iter 445/894 - loss 0.01537574 - time (sec): 28.18 - samples/sec: 1484.17 - lr: 0.000014 - momentum: 0.000000
2023-10-23 21:40:27,350 epoch 8 - iter 534/894 - loss 0.01333527 - time (sec): 33.76 - samples/sec: 1491.14 - lr: 0.000013 - momentum: 0.000000
2023-10-23 21:40:33,254 epoch 8 - iter 623/894 - loss 0.01300049 - time (sec): 39.67 - samples/sec: 1508.45 - lr: 0.000013 - momentum: 0.000000
2023-10-23 21:40:38,826 epoch 8 - iter 712/894 - loss 0.01338439 - time (sec): 45.24 - samples/sec: 1500.02 - lr: 0.000012 - momentum: 0.000000
2023-10-23 21:40:44,528 epoch 8 - iter 801/894 - loss 0.01251899 - time (sec): 50.94 - samples/sec: 1513.88 - lr: 0.000012 - momentum: 0.000000
2023-10-23 21:40:50,299 epoch 8 - iter 890/894 - loss 0.01191349 - time (sec): 56.71 - samples/sec: 1519.86 - lr: 0.000011 - momentum: 0.000000
2023-10-23 21:40:50,543 ----------------------------------------------------------------------------------------------------
2023-10-23 21:40:50,544 EPOCH 8 done: loss 0.0120 - lr: 0.000011
2023-10-23 21:40:57,027 DEV : loss 0.2516254484653473 - f1-score (micro avg) 0.7672
2023-10-23 21:40:57,048 saving best model
2023-10-23 21:40:57,641 ----------------------------------------------------------------------------------------------------
2023-10-23 21:41:03,127 epoch 9 - iter 89/894 - loss 0.00565499 - time (sec): 5.48 - samples/sec: 1435.26 - lr: 0.000011 - momentum: 0.000000
2023-10-23 21:41:09,113 epoch 9 - iter 178/894 - loss 0.00775794 - time (sec): 11.47 - samples/sec: 1558.87 - lr: 0.000010 - momentum: 0.000000
2023-10-23 21:41:14,835 epoch 9 - iter 267/894 - loss 0.00877010 - time (sec): 17.19 - samples/sec: 1553.70 - lr: 0.000009 - momentum: 0.000000
2023-10-23 21:41:20,509 epoch 9 - iter 356/894 - loss 0.00860547 - time (sec): 22.87 - samples/sec: 1548.72 - lr: 0.000009 - momentum: 0.000000
2023-10-23 21:41:26,256 epoch 9 - iter 445/894 - loss 0.00877409 - time (sec): 28.61 - samples/sec: 1545.27 - lr: 0.000008 - momentum: 0.000000
2023-10-23 21:41:32,176 epoch 9 - iter 534/894 - loss 0.00833655 - time (sec): 34.53 - samples/sec: 1545.37 - lr: 0.000008 - momentum: 0.000000
2023-10-23 21:41:37,747 epoch 9 - iter 623/894 - loss 0.00755503 - time (sec): 40.10 - samples/sec: 1539.71 - lr: 0.000007 - momentum: 0.000000
2023-10-23 21:41:43,212 epoch 9 - iter 712/894 - loss 0.00861964 - time (sec): 45.57 - samples/sec: 1525.39 - lr: 0.000007 - momentum: 0.000000
2023-10-23 21:41:48,681 epoch 9 - iter 801/894 - loss 0.00796957 - time (sec): 51.04 - samples/sec: 1513.43 - lr: 0.000006 - momentum: 0.000000
2023-10-23 21:41:54,306 epoch 9 - iter 890/894 - loss 0.00787224 - time (sec): 56.66 - samples/sec: 1519.07 - lr: 0.000006 - momentum: 0.000000
2023-10-23 21:41:54,556 ----------------------------------------------------------------------------------------------------
2023-10-23 21:41:54,556 EPOCH 9 done: loss 0.0079 - lr: 0.000006
2023-10-23 21:42:00,770 DEV : loss 0.25944018363952637 - f1-score (micro avg) 0.7679
2023-10-23 21:42:00,791 saving best model
2023-10-23 21:42:01,367 ----------------------------------------------------------------------------------------------------
2023-10-23 21:42:07,589 epoch 10 - iter 89/894 - loss 0.01315473 - time (sec): 6.22 - samples/sec: 1495.99 - lr: 0.000005 - momentum: 0.000000
2023-10-23 21:42:13,189 epoch 10 - iter 178/894 - loss 0.00864657 - time (sec): 11.82 - samples/sec: 1501.56 - lr: 0.000004 - momentum: 0.000000
2023-10-23 21:42:19,148 epoch 10 - iter 267/894 - loss 0.00690061 - time (sec): 17.78 - samples/sec: 1552.87 - lr: 0.000004 - momentum: 0.000000
2023-10-23 21:42:24,658 epoch 10 - iter 356/894 - loss 0.00654629 - time (sec): 23.29 - samples/sec: 1534.16 - lr: 0.000003 - momentum: 0.000000
2023-10-23 21:42:30,236 epoch 10 - iter 445/894 - loss 0.00595186 - time (sec): 28.87 - samples/sec: 1523.07 - lr: 0.000003 - momentum: 0.000000
2023-10-23 21:42:35,875 epoch 10 - iter 534/894 - loss 0.00673341 - time (sec): 34.51 - samples/sec: 1531.08 - lr: 0.000002 - momentum: 0.000000
2023-10-23 21:42:41,401 epoch 10 - iter 623/894 - loss 0.00642803 - time (sec): 40.03 - samples/sec: 1523.41 - lr: 0.000002 - momentum: 0.000000
2023-10-23 21:42:47,195 epoch 10 - iter 712/894 - loss 0.00581504 - time (sec): 45.83 - samples/sec: 1533.00 - lr: 0.000001 - momentum: 0.000000
2023-10-23 21:42:52,647 epoch 10 - iter 801/894 - loss 0.00544909 - time (sec): 51.28 - samples/sec: 1516.60 - lr: 0.000001 - momentum: 0.000000
2023-10-23 21:42:58,299 epoch 10 - iter 890/894 - loss 0.00516166 - time (sec): 56.93 - samples/sec: 1515.13 - lr: 0.000000 - momentum: 0.000000
2023-10-23 21:42:58,532 ----------------------------------------------------------------------------------------------------
2023-10-23 21:42:58,532 EPOCH 10 done: loss 0.0053 - lr: 0.000000
2023-10-23 21:43:04,758 DEV : loss 0.2663462460041046 - f1-score (micro avg) 0.7723
2023-10-23 21:43:04,778 saving best model
2023-10-23 21:43:05,930 ----------------------------------------------------------------------------------------------------
2023-10-23 21:43:05,931 Loading model from best epoch ...
2023-10-23 21:43:07,654 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-prod, B-prod, E-prod, I-prod, S-time, B-time, E-time, I-time
2023-10-23 21:43:12,469
Results:
- F-score (micro) 0.7542
- F-score (macro) 0.6651
- Accuracy 0.6209
By class:
              precision    recall  f1-score   support

         loc     0.8209    0.8691    0.8443       596
        pers     0.7000    0.7568    0.7273       333
         org     0.5364    0.4470    0.4876       132
        prod     0.6327    0.4697    0.5391        66
        time     0.7200    0.7347    0.7273        49

   micro avg     0.7467    0.7619    0.7542      1176
   macro avg     0.6820    0.6554    0.6651      1176
weighted avg     0.7400    0.7619    0.7491      1176
2023-10-23 21:43:12,469 ----------------------------------------------------------------------------------------------------
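After training, the checkpoint from the best epoch can be reloaded for inference roughly as follows (a sketch; best-model.pt is saved under the training base path above, and the example sentence is invented):

    # Sketch: reload the best checkpoint from this run and tag a sentence.
    from flair.data import Sentence
    from flair.models import SequenceTagger

    tagger = SequenceTagger.load("best-model.pt")
    sentence = Sentence("Die Zeitung erschien 1898 in Wien .")
    tagger.predict(sentence)
    for span in sentence.get_spans("ner"):
        print(span)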