2023-10-24 11:33:33,117 ----------------------------------------------------------------------------------------------------
2023-10-24 11:33:33,118 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(64001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=21, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-24 11:33:33,118 ----------------------------------------------------------------------------------------------------
2023-10-24 11:33:33,118 MultiCorpus: 5901 train + 1287 dev + 1505 test sentences
- NER_HIPE_2022 Corpus: 5901 train + 1287 dev + 1505 test sentences - /home/ubuntu/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/fr/with_doc_seperator
2023-10-24 11:33:33,118 ----------------------------------------------------------------------------------------------------
2023-10-24 11:33:33,118 Train: 5901 sentences
2023-10-24 11:33:33,118 (train_with_dev=False, train_with_test=False)
2023-10-24 11:33:33,118 ----------------------------------------------------------------------------------------------------
2023-10-24 11:33:33,118 Training Params:
2023-10-24 11:33:33,118 - learning_rate: "3e-05"
2023-10-24 11:33:33,118 - mini_batch_size: "4"
2023-10-24 11:33:33,118 - max_epochs: "10"
2023-10-24 11:33:33,118 - shuffle: "True"
2023-10-24 11:33:33,118 ----------------------------------------------------------------------------------------------------
2023-10-24 11:33:33,118 Plugins:
2023-10-24 11:33:33,118 - TensorboardLogger
2023-10-24 11:33:33,118 - LinearScheduler | warmup_fraction: '0.1'
2023-10-24 11:33:33,118 ----------------------------------------------------------------------------------------------------
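
The Training Params and Plugins blocks above map onto Flair's fine-tuning entry point. A hedged sketch, assuming Flair >= 0.13, where ModelTrainer.fine_tune drives the linear warmup/decay scheduler and accepts trainer plugins; `tagger` and `corpus` are from the sketch after the model summary, and the TensorboardLogger import path is an assumption:

from flair.trainers import ModelTrainer
from flair.trainers.plugins import TensorboardLogger

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4",
    learning_rate=3e-5,
    mini_batch_size=4,
    max_epochs=10,
    shuffle=True,
    warmup_fraction=0.1,  # produces the "LinearScheduler | warmup_fraction: '0.1'" plugin above
    main_evaluation_metric=("micro avg", "f1-score"),
    plugins=[TensorboardLogger()],
)
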
2023-10-24 11:33:33,118 Final evaluation on model from best epoch (best-model.pt)
2023-10-24 11:33:33,119 - metric: "('micro avg', 'f1-score')"
2023-10-24 11:33:33,119 ----------------------------------------------------------------------------------------------------
2023-10-24 11:33:33,119 Computation:
2023-10-24 11:33:33,119 - compute on device: cuda:0
2023-10-24 11:33:33,119 - embedding storage: none
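
For reference, the device line corresponds to Flair's module-level device setting; a minimal sketch:

import torch
import flair

# Flair moves batches to the global `flair.device`; "embedding storage: none" corresponds
# to embeddings_storage_mode="none" in the trainer call (embeddings recomputed each pass).
flair.device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
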
2023-10-24 11:33:33,119 ----------------------------------------------------------------------------------------------------
2023-10-24 11:33:33,119 Model training base path: "hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4"
2023-10-24 11:33:33,119 ----------------------------------------------------------------------------------------------------
2023-10-24 11:33:33,119 ----------------------------------------------------------------------------------------------------
2023-10-24 11:33:33,119 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-24 11:33:42,246 epoch 1 - iter 147/1476 - loss 2.19536875 - time (sec): 9.13 - samples/sec: 1703.07 - lr: 0.000003 - momentum: 0.000000
2023-10-24 11:33:52,168 epoch 1 - iter 294/1476 - loss 1.30454022 - time (sec): 19.05 - samples/sec: 1773.95 - lr: 0.000006 - momentum: 0.000000
2023-10-24 11:34:01,785 epoch 1 - iter 441/1476 - loss 1.00249830 - time (sec): 28.67 - samples/sec: 1775.42 - lr: 0.000009 - momentum: 0.000000
2023-10-24 11:34:11,288 epoch 1 - iter 588/1476 - loss 0.83438704 - time (sec): 38.17 - samples/sec: 1754.02 - lr: 0.000012 - momentum: 0.000000
2023-10-24 11:34:20,503 epoch 1 - iter 735/1476 - loss 0.72982303 - time (sec): 47.38 - samples/sec: 1741.88 - lr: 0.000015 - momentum: 0.000000
2023-10-24 11:34:30,629 epoch 1 - iter 882/1476 - loss 0.63143162 - time (sec): 57.51 - samples/sec: 1765.73 - lr: 0.000018 - momentum: 0.000000
2023-10-24 11:34:40,266 epoch 1 - iter 1029/1476 - loss 0.56910312 - time (sec): 67.15 - samples/sec: 1768.78 - lr: 0.000021 - momentum: 0.000000
2023-10-24 11:34:49,663 epoch 1 - iter 1176/1476 - loss 0.52180038 - time (sec): 76.54 - samples/sec: 1762.83 - lr: 0.000024 - momentum: 0.000000
2023-10-24 11:34:59,193 epoch 1 - iter 1323/1476 - loss 0.48616883 - time (sec): 86.07 - samples/sec: 1744.55 - lr: 0.000027 - momentum: 0.000000
2023-10-24 11:35:08,590 epoch 1 - iter 1470/1476 - loss 0.45721393 - time (sec): 95.47 - samples/sec: 1738.79 - lr: 0.000030 - momentum: 0.000000
2023-10-24 11:35:08,928 ----------------------------------------------------------------------------------------------------
2023-10-24 11:35:08,928 EPOCH 1 done: loss 0.4565 - lr: 0.000030
2023-10-24 11:35:14,879 DEV : loss 0.12407723069190979 - f1-score (micro avg) 0.7495
2023-10-24 11:35:14,900 saving best model
2023-10-24 11:35:15,459 ----------------------------------------------------------------------------------------------------
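
The lr column in epoch 1 rises linearly from ~3e-6 to the peak 3e-5 and then decays over the remaining epochs: with warmup_fraction 0.1 and 1476 iterations per epoch over 10 epochs, warmup covers exactly the first epoch. A small sketch of that schedule, assuming the simple piecewise-linear form (the exact scheduler implementation may differ in rounding):

def linear_lr(step, peak=3e-5, total_steps=14760, warmup_fraction=0.1):
    # 1476 warmup steps = 10% of 10 epochs x 1476 iterations, i.e. exactly epoch 1
    warmup_steps = int(total_steps * warmup_fraction)
    if step < warmup_steps:
        return peak * step / warmup_steps  # linear warmup
    return peak * (total_steps - step) / (total_steps - warmup_steps)  # linear decay to 0

print(f"{linear_lr(147):.6f}")   # 0.000003 -- matches the first logged lr of epoch 1
print(f"{linear_lr(1476):.6f}")  # 0.000030 -- peak reached at the end of epoch 1
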
2023-10-24 11:35:24,516 epoch 2 - iter 147/1476 - loss 0.13332891 - time (sec): 9.06 - samples/sec: 1667.07 - lr: 0.000030 - momentum: 0.000000
2023-10-24 11:35:34,173 epoch 2 - iter 294/1476 - loss 0.14011117 - time (sec): 18.71 - samples/sec: 1675.78 - lr: 0.000029 - momentum: 0.000000
2023-10-24 11:35:43,927 epoch 2 - iter 441/1476 - loss 0.13181640 - time (sec): 28.47 - samples/sec: 1697.25 - lr: 0.000029 - momentum: 0.000000
2023-10-24 11:35:53,440 epoch 2 - iter 588/1476 - loss 0.12886116 - time (sec): 37.98 - samples/sec: 1707.08 - lr: 0.000029 - momentum: 0.000000
2023-10-24 11:36:03,482 epoch 2 - iter 735/1476 - loss 0.12767160 - time (sec): 48.02 - samples/sec: 1727.57 - lr: 0.000028 - momentum: 0.000000
2023-10-24 11:36:13,643 epoch 2 - iter 882/1476 - loss 0.12873025 - time (sec): 58.18 - samples/sec: 1750.18 - lr: 0.000028 - momentum: 0.000000
2023-10-24 11:36:23,066 epoch 2 - iter 1029/1476 - loss 0.12421138 - time (sec): 67.61 - samples/sec: 1754.15 - lr: 0.000028 - momentum: 0.000000
2023-10-24 11:36:32,370 epoch 2 - iter 1176/1476 - loss 0.12289273 - time (sec): 76.91 - samples/sec: 1735.37 - lr: 0.000027 - momentum: 0.000000
2023-10-24 11:36:41,846 epoch 2 - iter 1323/1476 - loss 0.12330188 - time (sec): 86.39 - samples/sec: 1733.94 - lr: 0.000027 - momentum: 0.000000
2023-10-24 11:36:51,235 epoch 2 - iter 1470/1476 - loss 0.12313860 - time (sec): 95.78 - samples/sec: 1731.47 - lr: 0.000027 - momentum: 0.000000
2023-10-24 11:36:51,587 ----------------------------------------------------------------------------------------------------
2023-10-24 11:36:51,587 EPOCH 2 done: loss 0.1233 - lr: 0.000027
2023-10-24 11:37:00,093 DEV : loss 0.1295129507780075 - f1-score (micro avg) 0.7805
2023-10-24 11:37:00,114 saving best model
2023-10-24 11:37:00,822 ----------------------------------------------------------------------------------------------------
2023-10-24 11:37:10,905 epoch 3 - iter 147/1476 - loss 0.07444805 - time (sec): 10.08 - samples/sec: 1832.61 - lr: 0.000026 - momentum: 0.000000
2023-10-24 11:37:20,538 epoch 3 - iter 294/1476 - loss 0.07028273 - time (sec): 19.72 - samples/sec: 1768.17 - lr: 0.000026 - momentum: 0.000000
2023-10-24 11:37:30,165 epoch 3 - iter 441/1476 - loss 0.07005359 - time (sec): 29.34 - samples/sec: 1756.79 - lr: 0.000026 - momentum: 0.000000
2023-10-24 11:37:40,127 epoch 3 - iter 588/1476 - loss 0.07284502 - time (sec): 39.30 - samples/sec: 1792.76 - lr: 0.000025 - momentum: 0.000000
2023-10-24 11:37:49,293 epoch 3 - iter 735/1476 - loss 0.07496556 - time (sec): 48.47 - samples/sec: 1772.37 - lr: 0.000025 - momentum: 0.000000
2023-10-24 11:37:58,598 epoch 3 - iter 882/1476 - loss 0.07439841 - time (sec): 57.77 - samples/sec: 1760.35 - lr: 0.000025 - momentum: 0.000000
2023-10-24 11:38:08,091 epoch 3 - iter 1029/1476 - loss 0.07434422 - time (sec): 67.27 - samples/sec: 1752.65 - lr: 0.000024 - momentum: 0.000000
2023-10-24 11:38:17,959 epoch 3 - iter 1176/1476 - loss 0.07475674 - time (sec): 77.14 - samples/sec: 1749.95 - lr: 0.000024 - momentum: 0.000000
2023-10-24 11:38:27,481 epoch 3 - iter 1323/1476 - loss 0.07544812 - time (sec): 86.66 - samples/sec: 1731.60 - lr: 0.000024 - momentum: 0.000000
2023-10-24 11:38:37,000 epoch 3 - iter 1470/1476 - loss 0.07534427 - time (sec): 96.18 - samples/sec: 1724.93 - lr: 0.000023 - momentum: 0.000000
2023-10-24 11:38:37,342 ----------------------------------------------------------------------------------------------------
2023-10-24 11:38:37,342 EPOCH 3 done: loss 0.0751 - lr: 0.000023
2023-10-24 11:38:45,917 DEV : loss 0.15013425052165985 - f1-score (micro avg) 0.8109
2023-10-24 11:38:45,938 saving best model
2023-10-24 11:38:46,610 ----------------------------------------------------------------------------------------------------
2023-10-24 11:38:56,162 epoch 4 - iter 147/1476 - loss 0.06185939 - time (sec): 9.55 - samples/sec: 1718.42 - lr: 0.000023 - momentum: 0.000000
2023-10-24 11:39:05,672 epoch 4 - iter 294/1476 - loss 0.05870855 - time (sec): 19.06 - samples/sec: 1747.27 - lr: 0.000023 - momentum: 0.000000
2023-10-24 11:39:15,400 epoch 4 - iter 441/1476 - loss 0.05814235 - time (sec): 28.79 - samples/sec: 1755.24 - lr: 0.000022 - momentum: 0.000000
2023-10-24 11:39:25,523 epoch 4 - iter 588/1476 - loss 0.05681694 - time (sec): 38.91 - samples/sec: 1781.77 - lr: 0.000022 - momentum: 0.000000
2023-10-24 11:39:34,946 epoch 4 - iter 735/1476 - loss 0.05738224 - time (sec): 48.34 - samples/sec: 1761.48 - lr: 0.000022 - momentum: 0.000000
2023-10-24 11:39:44,387 epoch 4 - iter 882/1476 - loss 0.05743691 - time (sec): 57.78 - samples/sec: 1752.88 - lr: 0.000021 - momentum: 0.000000
2023-10-24 11:39:54,099 epoch 4 - iter 1029/1476 - loss 0.05689872 - time (sec): 67.49 - samples/sec: 1750.19 - lr: 0.000021 - momentum: 0.000000
2023-10-24 11:40:03,519 epoch 4 - iter 1176/1476 - loss 0.05774149 - time (sec): 76.91 - samples/sec: 1742.28 - lr: 0.000021 - momentum: 0.000000
2023-10-24 11:40:12,635 epoch 4 - iter 1323/1476 - loss 0.05709225 - time (sec): 86.02 - samples/sec: 1736.14 - lr: 0.000020 - momentum: 0.000000
2023-10-24 11:40:22,388 epoch 4 - iter 1470/1476 - loss 0.05698675 - time (sec): 95.78 - samples/sec: 1731.88 - lr: 0.000020 - momentum: 0.000000
2023-10-24 11:40:22,740 ----------------------------------------------------------------------------------------------------
2023-10-24 11:40:22,740 EPOCH 4 done: loss 0.0568 - lr: 0.000020
2023-10-24 11:40:31,295 DEV : loss 0.18162360787391663 - f1-score (micro avg) 0.8217
2023-10-24 11:40:31,316 saving best model
2023-10-24 11:40:32,019 ----------------------------------------------------------------------------------------------------
2023-10-24 11:40:41,069 epoch 5 - iter 147/1476 - loss 0.02214466 - time (sec): 9.05 - samples/sec: 1600.70 - lr: 0.000020 - momentum: 0.000000
2023-10-24 11:40:50,694 epoch 5 - iter 294/1476 - loss 0.03873008 - time (sec): 18.67 - samples/sec: 1669.50 - lr: 0.000019 - momentum: 0.000000
2023-10-24 11:41:00,332 epoch 5 - iter 441/1476 - loss 0.04486905 - time (sec): 28.31 - samples/sec: 1697.77 - lr: 0.000019 - momentum: 0.000000
2023-10-24 11:41:10,485 epoch 5 - iter 588/1476 - loss 0.04328846 - time (sec): 38.47 - samples/sec: 1732.73 - lr: 0.000019 - momentum: 0.000000
2023-10-24 11:41:20,075 epoch 5 - iter 735/1476 - loss 0.03927674 - time (sec): 48.06 - samples/sec: 1729.27 - lr: 0.000018 - momentum: 0.000000
2023-10-24 11:41:29,733 epoch 5 - iter 882/1476 - loss 0.03772238 - time (sec): 57.71 - samples/sec: 1724.97 - lr: 0.000018 - momentum: 0.000000
2023-10-24 11:41:39,031 epoch 5 - iter 1029/1476 - loss 0.03912415 - time (sec): 67.01 - samples/sec: 1718.50 - lr: 0.000018 - momentum: 0.000000
2023-10-24 11:41:48,430 epoch 5 - iter 1176/1476 - loss 0.03900961 - time (sec): 76.41 - samples/sec: 1719.13 - lr: 0.000017 - momentum: 0.000000
2023-10-24 11:41:57,994 epoch 5 - iter 1323/1476 - loss 0.03943131 - time (sec): 85.97 - samples/sec: 1727.94 - lr: 0.000017 - momentum: 0.000000
2023-10-24 11:42:07,675 epoch 5 - iter 1470/1476 - loss 0.03973549 - time (sec): 95.66 - samples/sec: 1732.63 - lr: 0.000017 - momentum: 0.000000
2023-10-24 11:42:08,053 ----------------------------------------------------------------------------------------------------
2023-10-24 11:42:08,053 EPOCH 5 done: loss 0.0396 - lr: 0.000017
2023-10-24 11:42:16,603 DEV : loss 0.18701893091201782 - f1-score (micro avg) 0.8223
2023-10-24 11:42:16,625 saving best model
2023-10-24 11:42:17,335 ----------------------------------------------------------------------------------------------------
2023-10-24 11:42:26,431 epoch 6 - iter 147/1476 - loss 0.02804029 - time (sec): 9.10 - samples/sec: 1668.22 - lr: 0.000016 - momentum: 0.000000
2023-10-24 11:42:35,842 epoch 6 - iter 294/1476 - loss 0.03043188 - time (sec): 18.51 - samples/sec: 1705.54 - lr: 0.000016 - momentum: 0.000000
2023-10-24 11:42:45,688 epoch 6 - iter 441/1476 - loss 0.02872066 - time (sec): 28.35 - samples/sec: 1717.11 - lr: 0.000016 - momentum: 0.000000
2023-10-24 11:42:55,608 epoch 6 - iter 588/1476 - loss 0.02985256 - time (sec): 38.27 - samples/sec: 1749.52 - lr: 0.000015 - momentum: 0.000000
2023-10-24 11:43:04,920 epoch 6 - iter 735/1476 - loss 0.03195604 - time (sec): 47.58 - samples/sec: 1727.10 - lr: 0.000015 - momentum: 0.000000
2023-10-24 11:43:14,749 epoch 6 - iter 882/1476 - loss 0.03203786 - time (sec): 57.41 - samples/sec: 1743.61 - lr: 0.000015 - momentum: 0.000000
2023-10-24 11:43:23,979 epoch 6 - iter 1029/1476 - loss 0.03051764 - time (sec): 66.64 - samples/sec: 1730.71 - lr: 0.000014 - momentum: 0.000000
2023-10-24 11:43:33,811 epoch 6 - iter 1176/1476 - loss 0.03005521 - time (sec): 76.48 - samples/sec: 1738.38 - lr: 0.000014 - momentum: 0.000000
2023-10-24 11:43:43,375 epoch 6 - iter 1323/1476 - loss 0.02868545 - time (sec): 86.04 - samples/sec: 1734.68 - lr: 0.000014 - momentum: 0.000000
2023-10-24 11:43:52,946 epoch 6 - iter 1470/1476 - loss 0.02869632 - time (sec): 95.61 - samples/sec: 1736.22 - lr: 0.000013 - momentum: 0.000000
2023-10-24 11:43:53,294 ----------------------------------------------------------------------------------------------------
2023-10-24 11:43:53,294 EPOCH 6 done: loss 0.0286 - lr: 0.000013
2023-10-24 11:44:01,846 DEV : loss 0.18682819604873657 - f1-score (micro avg) 0.8323
2023-10-24 11:44:01,867 saving best model
2023-10-24 11:44:02,600 ----------------------------------------------------------------------------------------------------
2023-10-24 11:44:12,156 epoch 7 - iter 147/1476 - loss 0.02464445 - time (sec): 9.55 - samples/sec: 1681.06 - lr: 0.000013 - momentum: 0.000000
2023-10-24 11:44:21,557 epoch 7 - iter 294/1476 - loss 0.01641864 - time (sec): 18.96 - samples/sec: 1711.58 - lr: 0.000013 - momentum: 0.000000
2023-10-24 11:44:31,099 epoch 7 - iter 441/1476 - loss 0.01616771 - time (sec): 28.50 - samples/sec: 1705.93 - lr: 0.000012 - momentum: 0.000000
2023-10-24 11:44:41,530 epoch 7 - iter 588/1476 - loss 0.01938895 - time (sec): 38.93 - samples/sec: 1762.53 - lr: 0.000012 - momentum: 0.000000
2023-10-24 11:44:50,752 epoch 7 - iter 735/1476 - loss 0.02029554 - time (sec): 48.15 - samples/sec: 1739.57 - lr: 0.000012 - momentum: 0.000000
2023-10-24 11:45:00,625 epoch 7 - iter 882/1476 - loss 0.02059997 - time (sec): 58.02 - samples/sec: 1760.12 - lr: 0.000011 - momentum: 0.000000
2023-10-24 11:45:10,132 epoch 7 - iter 1029/1476 - loss 0.01976046 - time (sec): 67.53 - samples/sec: 1759.00 - lr: 0.000011 - momentum: 0.000000
2023-10-24 11:45:19,388 epoch 7 - iter 1176/1476 - loss 0.01986334 - time (sec): 76.79 - samples/sec: 1745.44 - lr: 0.000011 - momentum: 0.000000
2023-10-24 11:45:29,182 epoch 7 - iter 1323/1476 - loss 0.01944864 - time (sec): 86.58 - samples/sec: 1747.75 - lr: 0.000010 - momentum: 0.000000
2023-10-24 11:45:38,320 epoch 7 - iter 1470/1476 - loss 0.02028984 - time (sec): 95.72 - samples/sec: 1732.06 - lr: 0.000010 - momentum: 0.000000
2023-10-24 11:45:38,675 ----------------------------------------------------------------------------------------------------
2023-10-24 11:45:38,676 EPOCH 7 done: loss 0.0202 - lr: 0.000010
2023-10-24 11:45:47,196 DEV : loss 0.20797935128211975 - f1-score (micro avg) 0.8111
2023-10-24 11:45:47,218 ----------------------------------------------------------------------------------------------------
2023-10-24 11:45:56,470 epoch 8 - iter 147/1476 - loss 0.01332305 - time (sec): 9.25 - samples/sec: 1639.48 - lr: 0.000010 - momentum: 0.000000
2023-10-24 11:46:05,956 epoch 8 - iter 294/1476 - loss 0.01434066 - time (sec): 18.74 - samples/sec: 1702.54 - lr: 0.000009 - momentum: 0.000000
2023-10-24 11:46:15,522 epoch 8 - iter 441/1476 - loss 0.01188966 - time (sec): 28.30 - samples/sec: 1695.47 - lr: 0.000009 - momentum: 0.000000
2023-10-24 11:46:25,247 epoch 8 - iter 588/1476 - loss 0.01123738 - time (sec): 38.03 - samples/sec: 1710.90 - lr: 0.000009 - momentum: 0.000000
2023-10-24 11:46:34,377 epoch 8 - iter 735/1476 - loss 0.01212194 - time (sec): 47.16 - samples/sec: 1705.38 - lr: 0.000008 - momentum: 0.000000
2023-10-24 11:46:43,826 epoch 8 - iter 882/1476 - loss 0.01402160 - time (sec): 56.61 - samples/sec: 1707.35 - lr: 0.000008 - momentum: 0.000000
2023-10-24 11:46:53,196 epoch 8 - iter 1029/1476 - loss 0.01278898 - time (sec): 65.98 - samples/sec: 1705.09 - lr: 0.000008 - momentum: 0.000000
2023-10-24 11:47:02,661 epoch 8 - iter 1176/1476 - loss 0.01370436 - time (sec): 75.44 - samples/sec: 1702.36 - lr: 0.000007 - momentum: 0.000000
2023-10-24 11:47:13,240 epoch 8 - iter 1323/1476 - loss 0.01414650 - time (sec): 86.02 - samples/sec: 1736.26 - lr: 0.000007 - momentum: 0.000000
2023-10-24 11:47:22,917 epoch 8 - iter 1470/1476 - loss 0.01344871 - time (sec): 95.70 - samples/sec: 1733.25 - lr: 0.000007 - momentum: 0.000000
2023-10-24 11:47:23,283 ----------------------------------------------------------------------------------------------------
2023-10-24 11:47:23,284 EPOCH 8 done: loss 0.0134 - lr: 0.000007
2023-10-24 11:47:31,840 DEV : loss 0.20348380506038666 - f1-score (micro avg) 0.8382
2023-10-24 11:47:31,861 saving best model
2023-10-24 11:47:32,563 ----------------------------------------------------------------------------------------------------
2023-10-24 11:47:41,827 epoch 9 - iter 147/1476 - loss 0.01514449 - time (sec): 9.26 - samples/sec: 1719.99 - lr: 0.000006 - momentum: 0.000000
2023-10-24 11:47:51,196 epoch 9 - iter 294/1476 - loss 0.01349928 - time (sec): 18.63 - samples/sec: 1685.34 - lr: 0.000006 - momentum: 0.000000
2023-10-24 11:48:00,922 epoch 9 - iter 441/1476 - loss 0.01272195 - time (sec): 28.36 - samples/sec: 1719.51 - lr: 0.000006 - momentum: 0.000000
2023-10-24 11:48:10,722 epoch 9 - iter 588/1476 - loss 0.01109428 - time (sec): 38.16 - samples/sec: 1701.76 - lr: 0.000005 - momentum: 0.000000
2023-10-24 11:48:20,429 epoch 9 - iter 735/1476 - loss 0.01137696 - time (sec): 47.87 - samples/sec: 1718.45 - lr: 0.000005 - momentum: 0.000000
2023-10-24 11:48:29,798 epoch 9 - iter 882/1476 - loss 0.01053126 - time (sec): 57.23 - samples/sec: 1710.30 - lr: 0.000005 - momentum: 0.000000
2023-10-24 11:48:39,266 epoch 9 - iter 1029/1476 - loss 0.00958536 - time (sec): 66.70 - samples/sec: 1705.70 - lr: 0.000004 - momentum: 0.000000
2023-10-24 11:48:48,628 epoch 9 - iter 1176/1476 - loss 0.00860303 - time (sec): 76.06 - samples/sec: 1705.96 - lr: 0.000004 - momentum: 0.000000
2023-10-24 11:48:57,868 epoch 9 - iter 1323/1476 - loss 0.00911407 - time (sec): 85.30 - samples/sec: 1704.38 - lr: 0.000004 - momentum: 0.000000
2023-10-24 11:49:08,559 epoch 9 - iter 1470/1476 - loss 0.00933873 - time (sec): 96.00 - samples/sec: 1727.89 - lr: 0.000003 - momentum: 0.000000
2023-10-24 11:49:08,913 ----------------------------------------------------------------------------------------------------
2023-10-24 11:49:08,913 EPOCH 9 done: loss 0.0094 - lr: 0.000003
2023-10-24 11:49:17,452 DEV : loss 0.22658559679985046 - f1-score (micro avg) 0.833
2023-10-24 11:49:17,473 ----------------------------------------------------------------------------------------------------
2023-10-24 11:49:26,802 epoch 10 - iter 147/1476 - loss 0.00520916 - time (sec): 9.33 - samples/sec: 1666.67 - lr: 0.000003 - momentum: 0.000000
2023-10-24 11:49:36,575 epoch 10 - iter 294/1476 - loss 0.00900406 - time (sec): 19.10 - samples/sec: 1747.49 - lr: 0.000003 - momentum: 0.000000
2023-10-24 11:49:46,668 epoch 10 - iter 441/1476 - loss 0.00793928 - time (sec): 29.19 - samples/sec: 1773.84 - lr: 0.000002 - momentum: 0.000000
2023-10-24 11:49:55,920 epoch 10 - iter 588/1476 - loss 0.00756196 - time (sec): 38.45 - samples/sec: 1742.28 - lr: 0.000002 - momentum: 0.000000
2023-10-24 11:50:05,747 epoch 10 - iter 735/1476 - loss 0.00720491 - time (sec): 48.27 - samples/sec: 1746.82 - lr: 0.000002 - momentum: 0.000000
2023-10-24 11:50:15,241 epoch 10 - iter 882/1476 - loss 0.00669091 - time (sec): 57.77 - samples/sec: 1738.90 - lr: 0.000001 - momentum: 0.000000
2023-10-24 11:50:25,224 epoch 10 - iter 1029/1476 - loss 0.00658869 - time (sec): 67.75 - samples/sec: 1752.74 - lr: 0.000001 - momentum: 0.000000
2023-10-24 11:50:34,706 epoch 10 - iter 1176/1476 - loss 0.00770641 - time (sec): 77.23 - samples/sec: 1746.61 - lr: 0.000001 - momentum: 0.000000
2023-10-24 11:50:43,872 epoch 10 - iter 1323/1476 - loss 0.00707407 - time (sec): 86.40 - samples/sec: 1744.59 - lr: 0.000000 - momentum: 0.000000
2023-10-24 11:50:53,041 epoch 10 - iter 1470/1476 - loss 0.00668891 - time (sec): 95.57 - samples/sec: 1734.36 - lr: 0.000000 - momentum: 0.000000
2023-10-24 11:50:53,402 ----------------------------------------------------------------------------------------------------
2023-10-24 11:50:53,403 EPOCH 10 done: loss 0.0068 - lr: 0.000000
2023-10-24 11:51:01,961 DEV : loss 0.23608621954917908 - f1-score (micro avg) 0.8291
2023-10-24 11:51:02,537 ----------------------------------------------------------------------------------------------------
2023-10-24 11:51:02,538 Loading model from best epoch ...
2023-10-24 11:51:04,405 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-time, B-time, E-time, I-time, S-prod, B-prod, E-prod, I-prod
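
Once best-model.pt is loaded, the tagger can be used directly for inference. A minimal usage sketch; the path is the training base path logged above plus best-model.pt, and the example sentence is made up:

from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load(
    "hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4/best-model.pt"
)

sentence = Sentence("Le conseil municipal de Genève s'est réuni hier .")
tagger.predict(sentence)
for span in sentence.get_spans("ner"):  # spans decoded from the BIOES tags listed above
    print(span.text, span.get_label("ner").value, span.get_label("ner").score)
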
2023-10-24 11:51:10,712
Results:
- F-score (micro) 0.792
- F-score (macro) 0.7045
- Accuracy 0.6809
By class:
              precision    recall  f1-score   support

         loc     0.8640    0.8660    0.8650       858
        pers     0.7417    0.7914    0.7658       537
         org     0.5248    0.5606    0.5421       132
        time     0.5373    0.6667    0.5950        54
        prod     0.8113    0.7049    0.7544        61

   micro avg     0.7798    0.8045    0.7920      1642
   macro avg     0.6958    0.7179    0.7045      1642
weighted avg     0.7840    0.8045    0.7936      1642
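
The micro row pools predictions across all 1642 gold spans, while the macro row is the unweighted mean of the five per-class F1 scores; both summary figures are consistent with the per-class table up to rounding, as this quick check shows:

# Consistency check with values copied from the table above.
per_class_f1 = [0.8650, 0.7658, 0.5421, 0.5950, 0.7544]  # loc, pers, org, time, prod
micro_p, micro_r = 0.7798, 0.8045

print(round(2 * micro_p * micro_r / (micro_p + micro_r), 4))  # ~0.792  (micro avg f1-score)
print(round(sum(per_class_f1) / len(per_class_f1), 4))        # 0.7045 (macro avg f1-score)
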
2023-10-24 11:51:10,712 ----------------------------------------------------------------------------------------------------