2023-10-19 19:51:06,768 ----------------------------------------------------------------------------------------------------
2023-10-19 19:51:06,768 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 128)
        (position_embeddings): Embedding(512, 128)
        (token_type_embeddings): Embedding(2, 128)
        (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-1): 2 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=128, out_features=128, bias=True)
                (key): Linear(in_features=128, out_features=128, bias=True)
                (value): Linear(in_features=128, out_features=128, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=128, out_features=128, bias=True)
                (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=128, out_features=512, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=512, out_features=128, bias=True)
              (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=128, out_features=128, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=128, out_features=17, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-19 19:51:06,768 ----------------------------------------------------------------------------------------------------
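The summary above describes a Flair SequenceTagger with a 2-layer, 128-dimensional historic BERT backbone, no CRF, no RNN, and a plain Linear(128 -> 17) tag head. The following is a minimal sketch (not the original training script) of how such a model could be assembled: the backbone name, layer selection, sub-token pooling and CRF setting are inferred from the base path reported further down, and the corpus-loader arguments are assumptions about Flair's NER_HIPE_2022 API.

```python
# Sketch only: rebuilds the architecture printed above with Flair.
from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger

# HIPE-2022 NewsEye French corpus (7142 train / 698 dev / 2570 test sentences, listed below).
corpus = NER_HIPE_2022(dataset_name="newseye", language="fr")
label_dictionary = corpus.make_label_dictionary(label_type="ner")  # 17 BIOES tags (see end of log)

embeddings = TransformerWordEmbeddings(
    model="dbmdz/bert-tiny-historic-multilingual-cased",  # the 2-layer, 128-dim BertModel above
    layers="-1",               # "layers-1" in the base path: last layer only
    subtoken_pooling="first",  # "poolingfirst": first sub-token represents each word
    fine_tune=True,            # backbone weights are updated during training
)

tagger = SequenceTagger(
    embeddings=embeddings,
    tag_dictionary=label_dictionary,
    tag_type="ner",
    use_crf=False,               # "crfFalse" in the base path
    use_rnn=False,               # no LSTM between embeddings and the tag head
    reproject_embeddings=False,  # keeps the plain Linear(128 -> 17) head shown above
)
```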
2023-10-19 19:51:06,768 MultiCorpus: 7142 train + 698 dev + 2570 test sentences
 - NER_HIPE_2022 Corpus: 7142 train + 698 dev + 2570 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/newseye/fr/with_doc_seperator
2023-10-19 19:51:06,768 ----------------------------------------------------------------------------------------------------
2023-10-19 19:51:06,768 Train: 7142 sentences
2023-10-19 19:51:06,768 (train_with_dev=False, train_with_test=False)
2023-10-19 19:51:06,768 ----------------------------------------------------------------------------------------------------
2023-10-19 19:51:06,768 Training Params:
2023-10-19 19:51:06,768 - learning_rate: "3e-05"
2023-10-19 19:51:06,768 - mini_batch_size: "4"
2023-10-19 19:51:06,769 - max_epochs: "10"
2023-10-19 19:51:06,769 - shuffle: "True"
2023-10-19 19:51:06,769 ----------------------------------------------------------------------------------------------------
2023-10-19 19:51:06,769 Plugins:
2023-10-19 19:51:06,769 - TensorboardLogger
2023-10-19 19:51:06,769 - LinearScheduler | warmup_fraction: '0.1'
2023-10-19 19:51:06,769 ----------------------------------------------------------------------------------------------------
2023-10-19 19:51:06,769 Final evaluation on model from best epoch (best-model.pt)
2023-10-19 19:51:06,769 - metric: "('micro avg', 'f1-score')"
2023-10-19 19:51:06,769 ----------------------------------------------------------------------------------------------------
2023-10-19 19:51:06,769 Computation:
2023-10-19 19:51:06,769 - compute on device: cuda:0
2023-10-19 19:51:06,769 - embedding storage: none
2023-10-19 19:51:06,769 ----------------------------------------------------------------------------------------------------
2023-10-19 19:51:06,769 Model training base path: "hmbench-newseye/fr-dbmdz/bert-tiny-historic-multilingual-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2"
2023-10-19 19:51:06,769 ----------------------------------------------------------------------------------------------------
2023-10-19 19:51:06,769 ----------------------------------------------------------------------------------------------------
2023-10-19 19:51:06,769 Logging anything other than scalars to TensorBoard is currently not supported.
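The parameters, plugins, evaluation metric and device listed above correspond to a standard Flair fine-tuning run. Below is a hedged sketch of such a call, continuing from the snippet above; the keyword names follow recent Flair releases and may differ in other versions, and the TensorboardLogger import path and no-argument construction are assumptions rather than something this log confirms.

```python
# Sketch only: a fine-tuning call matching the parameters and plugins reported above.
import torch
import flair
from flair.trainers import ModelTrainer
from flair.trainers.plugins import TensorboardLogger  # import path is an assumption

flair.device = torch.device("cuda:0")  # "compute on device: cuda:0"

trainer = ModelTrainer(tagger, corpus)  # tagger and corpus from the previous sketch

trainer.fine_tune(
    "hmbench-newseye/fr-dbmdz/bert-tiny-historic-multilingual-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2",
    learning_rate=3e-05,
    mini_batch_size=4,
    max_epochs=10,
    shuffle=True,
    warmup_fraction=0.1,                               # LinearScheduler warmup listed above
    main_evaluation_metric=("micro avg", "f1-score"),  # metric used to pick best-model.pt
    embeddings_storage_mode="none",                    # "embedding storage: none"
    plugins=[TensorboardLogger()],                     # no-arg construction is an assumption
)
```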
|
2023-10-19 19:51:09,948 epoch 1 - iter 178/1786 - loss 2.79687367 - time (sec): 3.18 - samples/sec: 8405.70 - lr: 0.000003 - momentum: 0.000000 |
|
2023-10-19 19:51:13,196 epoch 1 - iter 356/1786 - loss 2.50644631 - time (sec): 6.43 - samples/sec: 7917.10 - lr: 0.000006 - momentum: 0.000000 |
|
2023-10-19 19:51:16,242 epoch 1 - iter 534/1786 - loss 2.11669177 - time (sec): 9.47 - samples/sec: 7847.71 - lr: 0.000009 - momentum: 0.000000 |
|
2023-10-19 19:51:19,447 epoch 1 - iter 712/1786 - loss 1.78468271 - time (sec): 12.68 - samples/sec: 7832.18 - lr: 0.000012 - momentum: 0.000000 |
|
2023-10-19 19:51:22,749 epoch 1 - iter 890/1786 - loss 1.58099334 - time (sec): 15.98 - samples/sec: 7696.48 - lr: 0.000015 - momentum: 0.000000 |
|
2023-10-19 19:51:25,906 epoch 1 - iter 1068/1786 - loss 1.44626933 - time (sec): 19.14 - samples/sec: 7688.35 - lr: 0.000018 - momentum: 0.000000 |
|
2023-10-19 19:51:29,188 epoch 1 - iter 1246/1786 - loss 1.32135288 - time (sec): 22.42 - samples/sec: 7711.92 - lr: 0.000021 - momentum: 0.000000 |
|
2023-10-19 19:51:32,297 epoch 1 - iter 1424/1786 - loss 1.22571550 - time (sec): 25.53 - samples/sec: 7701.62 - lr: 0.000024 - momentum: 0.000000 |
|
2023-10-19 19:51:35,449 epoch 1 - iter 1602/1786 - loss 1.14718597 - time (sec): 28.68 - samples/sec: 7738.29 - lr: 0.000027 - momentum: 0.000000 |
|
2023-10-19 19:51:38,565 epoch 1 - iter 1780/1786 - loss 1.08157944 - time (sec): 31.80 - samples/sec: 7802.23 - lr: 0.000030 - momentum: 0.000000 |
|
2023-10-19 19:51:38,668 ---------------------------------------------------------------------------------------------------- |
|
2023-10-19 19:51:38,668 EPOCH 1 done: loss 1.0801 - lr: 0.000030 |
|
2023-10-19 19:51:40,134 DEV : loss 0.3227035701274872 - f1-score (micro avg) 0.1395 |
|
2023-10-19 19:51:40,149 saving best model |
|
2023-10-19 19:51:40,183 ---------------------------------------------------------------------------------------------------- |
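The lr column in epoch 1 shows the LinearScheduler's warmup: with warmup_fraction 0.1 over 10 epochs of 1,786 batches, the rate climbs from roughly 3e-06 to its 3e-05 peak during exactly the first epoch, then decays linearly towards zero by the end of epoch 10 (visible in the later epochs below). The following is a small standalone sketch of that schedule shape, using the step counts from this run; it illustrates the shape only and is not the scheduler's exact implementation.

```python
# Linear warmup followed by linear decay, matching the lr values logged per iteration.
PEAK_LR = 3e-05          # "learning_rate" from the Training Params section
WARMUP_FRACTION = 0.1    # "warmup_fraction" of the LinearScheduler plugin
STEPS_PER_EPOCH = 1786   # 7142 sentences / mini_batch_size 4
TOTAL_STEPS = 10 * STEPS_PER_EPOCH
WARMUP_STEPS = int(WARMUP_FRACTION * TOTAL_STEPS)  # 1786, i.e. exactly epoch 1

def lr_at(step: int) -> float:
    """Learning rate after `step` optimizer updates."""
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS                   # ramp up during epoch 1
    remaining = TOTAL_STEPS - step
    return PEAK_LR * remaining / (TOTAL_STEPS - WARMUP_STEPS)  # decay to 0 by epoch 10

# ~3e-06 after 178 steps, 3e-05 at step 1786, ~0 at the final step:
print(lr_at(178), lr_at(1786), lr_at(TOTAL_STEPS))
```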
|
2023-10-19 19:51:43,289 epoch 2 - iter 178/1786 - loss 0.50600833 - time (sec): 3.11 - samples/sec: 7616.92 - lr: 0.000030 - momentum: 0.000000 |
|
2023-10-19 19:51:46,333 epoch 2 - iter 356/1786 - loss 0.46557022 - time (sec): 6.15 - samples/sec: 7898.55 - lr: 0.000029 - momentum: 0.000000 |
|
2023-10-19 19:51:49,371 epoch 2 - iter 534/1786 - loss 0.46513364 - time (sec): 9.19 - samples/sec: 7766.67 - lr: 0.000029 - momentum: 0.000000 |
|
2023-10-19 19:51:52,594 epoch 2 - iter 712/1786 - loss 0.45220596 - time (sec): 12.41 - samples/sec: 7737.80 - lr: 0.000029 - momentum: 0.000000 |
|
2023-10-19 19:51:56,019 epoch 2 - iter 890/1786 - loss 0.45404700 - time (sec): 15.84 - samples/sec: 7667.36 - lr: 0.000028 - momentum: 0.000000 |
|
2023-10-19 19:51:59,181 epoch 2 - iter 1068/1786 - loss 0.44855842 - time (sec): 19.00 - samples/sec: 7742.35 - lr: 0.000028 - momentum: 0.000000 |
|
2023-10-19 19:52:02,417 epoch 2 - iter 1246/1786 - loss 0.44009549 - time (sec): 22.23 - samples/sec: 7836.30 - lr: 0.000028 - momentum: 0.000000 |
|
2023-10-19 19:52:05,463 epoch 2 - iter 1424/1786 - loss 0.43715972 - time (sec): 25.28 - samples/sec: 7889.50 - lr: 0.000027 - momentum: 0.000000 |
|
2023-10-19 19:52:08,578 epoch 2 - iter 1602/1786 - loss 0.43518582 - time (sec): 28.39 - samples/sec: 7891.60 - lr: 0.000027 - momentum: 0.000000 |
|
2023-10-19 19:52:11,635 epoch 2 - iter 1780/1786 - loss 0.43341001 - time (sec): 31.45 - samples/sec: 7891.97 - lr: 0.000027 - momentum: 0.000000 |
|
2023-10-19 19:52:11,727 ---------------------------------------------------------------------------------------------------- |
|
2023-10-19 19:52:11,727 EPOCH 2 done: loss 0.4333 - lr: 0.000027 |
|
2023-10-19 19:52:14,547 DEV : loss 0.2583433985710144 - f1-score (micro avg) 0.3512 |
|
2023-10-19 19:52:14,562 saving best model |
|
2023-10-19 19:52:14,594 ---------------------------------------------------------------------------------------------------- |
|
2023-10-19 19:52:17,843 epoch 3 - iter 178/1786 - loss 0.36689807 - time (sec): 3.25 - samples/sec: 7327.59 - lr: 0.000026 - momentum: 0.000000 |
|
2023-10-19 19:52:21,112 epoch 3 - iter 356/1786 - loss 0.34773508 - time (sec): 6.52 - samples/sec: 7774.91 - lr: 0.000026 - momentum: 0.000000 |
|
2023-10-19 19:52:24,217 epoch 3 - iter 534/1786 - loss 0.34418896 - time (sec): 9.62 - samples/sec: 7853.08 - lr: 0.000026 - momentum: 0.000000 |
|
2023-10-19 19:52:27,388 epoch 3 - iter 712/1786 - loss 0.35168554 - time (sec): 12.79 - samples/sec: 7808.77 - lr: 0.000025 - momentum: 0.000000 |
|
2023-10-19 19:52:30,291 epoch 3 - iter 890/1786 - loss 0.35595638 - time (sec): 15.70 - samples/sec: 8022.78 - lr: 0.000025 - momentum: 0.000000 |
|
2023-10-19 19:52:33,085 epoch 3 - iter 1068/1786 - loss 0.35812865 - time (sec): 18.49 - samples/sec: 8195.12 - lr: 0.000025 - momentum: 0.000000 |
|
2023-10-19 19:52:36,081 epoch 3 - iter 1246/1786 - loss 0.35668010 - time (sec): 21.49 - samples/sec: 8128.47 - lr: 0.000024 - momentum: 0.000000 |
|
2023-10-19 19:52:39,100 epoch 3 - iter 1424/1786 - loss 0.35997953 - time (sec): 24.50 - samples/sec: 8132.30 - lr: 0.000024 - momentum: 0.000000 |
|
2023-10-19 19:52:42,129 epoch 3 - iter 1602/1786 - loss 0.36107656 - time (sec): 27.53 - samples/sec: 8152.51 - lr: 0.000024 - momentum: 0.000000 |
|
2023-10-19 19:52:45,104 epoch 3 - iter 1780/1786 - loss 0.35849403 - time (sec): 30.51 - samples/sec: 8127.99 - lr: 0.000023 - momentum: 0.000000 |
|
2023-10-19 19:52:45,209 ---------------------------------------------------------------------------------------------------- |
|
2023-10-19 19:52:45,209 EPOCH 3 done: loss 0.3587 - lr: 0.000023 |
|
2023-10-19 19:52:47,579 DEV : loss 0.2307889312505722 - f1-score (micro avg) 0.4343 |
|
2023-10-19 19:52:47,594 saving best model |
|
2023-10-19 19:52:47,627 ---------------------------------------------------------------------------------------------------- |
|
2023-10-19 19:52:50,603 epoch 4 - iter 178/1786 - loss 0.32503973 - time (sec): 2.98 - samples/sec: 8709.72 - lr: 0.000023 - momentum: 0.000000 |
|
2023-10-19 19:52:53,661 epoch 4 - iter 356/1786 - loss 0.33876825 - time (sec): 6.03 - samples/sec: 8239.92 - lr: 0.000023 - momentum: 0.000000 |
|
2023-10-19 19:52:56,706 epoch 4 - iter 534/1786 - loss 0.34909168 - time (sec): 9.08 - samples/sec: 8139.72 - lr: 0.000022 - momentum: 0.000000 |
|
2023-10-19 19:52:59,800 epoch 4 - iter 712/1786 - loss 0.33361868 - time (sec): 12.17 - samples/sec: 8180.84 - lr: 0.000022 - momentum: 0.000000 |
|
2023-10-19 19:53:02,796 epoch 4 - iter 890/1786 - loss 0.32921053 - time (sec): 15.17 - samples/sec: 8202.07 - lr: 0.000022 - momentum: 0.000000 |
|
2023-10-19 19:53:05,850 epoch 4 - iter 1068/1786 - loss 0.32434636 - time (sec): 18.22 - samples/sec: 8142.54 - lr: 0.000021 - momentum: 0.000000 |
|
2023-10-19 19:53:08,954 epoch 4 - iter 1246/1786 - loss 0.32469591 - time (sec): 21.33 - samples/sec: 8072.21 - lr: 0.000021 - momentum: 0.000000 |
|
2023-10-19 19:53:11,970 epoch 4 - iter 1424/1786 - loss 0.32393392 - time (sec): 24.34 - samples/sec: 8076.36 - lr: 0.000021 - momentum: 0.000000 |
|
2023-10-19 19:53:15,084 epoch 4 - iter 1602/1786 - loss 0.32421030 - time (sec): 27.46 - samples/sec: 8099.84 - lr: 0.000020 - momentum: 0.000000 |
|
2023-10-19 19:53:18,196 epoch 4 - iter 1780/1786 - loss 0.32088289 - time (sec): 30.57 - samples/sec: 8119.54 - lr: 0.000020 - momentum: 0.000000 |
|
2023-10-19 19:53:18,287 ---------------------------------------------------------------------------------------------------- |
|
2023-10-19 19:53:18,287 EPOCH 4 done: loss 0.3211 - lr: 0.000020 |
|
2023-10-19 19:53:21,110 DEV : loss 0.2131872922182083 - f1-score (micro avg) 0.4691 |
|
2023-10-19 19:53:21,125 saving best model |
|
2023-10-19 19:53:21,160 ---------------------------------------------------------------------------------------------------- |
|
2023-10-19 19:53:24,078 epoch 5 - iter 178/1786 - loss 0.31966126 - time (sec): 2.92 - samples/sec: 8571.95 - lr: 0.000020 - momentum: 0.000000 |
|
2023-10-19 19:53:27,158 epoch 5 - iter 356/1786 - loss 0.30979983 - time (sec): 6.00 - samples/sec: 8542.72 - lr: 0.000019 - momentum: 0.000000 |
|
2023-10-19 19:53:30,224 epoch 5 - iter 534/1786 - loss 0.30109050 - time (sec): 9.06 - samples/sec: 8366.97 - lr: 0.000019 - momentum: 0.000000 |
|
2023-10-19 19:53:33,352 epoch 5 - iter 712/1786 - loss 0.30298422 - time (sec): 12.19 - samples/sec: 8163.31 - lr: 0.000019 - momentum: 0.000000 |
|
2023-10-19 19:53:36,499 epoch 5 - iter 890/1786 - loss 0.30175508 - time (sec): 15.34 - samples/sec: 7974.28 - lr: 0.000018 - momentum: 0.000000 |
|
2023-10-19 19:53:39,560 epoch 5 - iter 1068/1786 - loss 0.29354489 - time (sec): 18.40 - samples/sec: 8027.67 - lr: 0.000018 - momentum: 0.000000 |
|
2023-10-19 19:53:42,547 epoch 5 - iter 1246/1786 - loss 0.29628320 - time (sec): 21.39 - samples/sec: 7996.02 - lr: 0.000018 - momentum: 0.000000 |
|
2023-10-19 19:53:45,679 epoch 5 - iter 1424/1786 - loss 0.29373074 - time (sec): 24.52 - samples/sec: 8009.08 - lr: 0.000017 - momentum: 0.000000 |
|
2023-10-19 19:53:48,821 epoch 5 - iter 1602/1786 - loss 0.29243459 - time (sec): 27.66 - samples/sec: 8049.63 - lr: 0.000017 - momentum: 0.000000 |
|
2023-10-19 19:53:51,974 epoch 5 - iter 1780/1786 - loss 0.29269761 - time (sec): 30.81 - samples/sec: 8046.10 - lr: 0.000017 - momentum: 0.000000 |
|
2023-10-19 19:53:52,092 ---------------------------------------------------------------------------------------------------- |
|
2023-10-19 19:53:52,092 EPOCH 5 done: loss 0.2925 - lr: 0.000017 |
|
2023-10-19 19:53:54,463 DEV : loss 0.20653365552425385 - f1-score (micro avg) 0.4815 |
|
2023-10-19 19:53:54,477 saving best model |
|
2023-10-19 19:53:54,510 ---------------------------------------------------------------------------------------------------- |
|
2023-10-19 19:53:57,812 epoch 6 - iter 178/1786 - loss 0.26481799 - time (sec): 3.30 - samples/sec: 7606.85 - lr: 0.000016 - momentum: 0.000000 |
|
2023-10-19 19:54:00,948 epoch 6 - iter 356/1786 - loss 0.26935936 - time (sec): 6.44 - samples/sec: 7536.84 - lr: 0.000016 - momentum: 0.000000 |
|
2023-10-19 19:54:04,139 epoch 6 - iter 534/1786 - loss 0.27348573 - time (sec): 9.63 - samples/sec: 7506.83 - lr: 0.000016 - momentum: 0.000000 |
|
2023-10-19 19:54:07,207 epoch 6 - iter 712/1786 - loss 0.27134878 - time (sec): 12.70 - samples/sec: 7760.38 - lr: 0.000015 - momentum: 0.000000 |
|
2023-10-19 19:54:10,303 epoch 6 - iter 890/1786 - loss 0.26920475 - time (sec): 15.79 - samples/sec: 7915.32 - lr: 0.000015 - momentum: 0.000000 |
|
2023-10-19 19:54:13,384 epoch 6 - iter 1068/1786 - loss 0.27009143 - time (sec): 18.87 - samples/sec: 7904.76 - lr: 0.000015 - momentum: 0.000000 |
|
2023-10-19 19:54:16,439 epoch 6 - iter 1246/1786 - loss 0.27126902 - time (sec): 21.93 - samples/sec: 7886.87 - lr: 0.000014 - momentum: 0.000000 |
|
2023-10-19 19:54:19,503 epoch 6 - iter 1424/1786 - loss 0.27292987 - time (sec): 24.99 - samples/sec: 7902.13 - lr: 0.000014 - momentum: 0.000000 |
|
2023-10-19 19:54:22,691 epoch 6 - iter 1602/1786 - loss 0.27225979 - time (sec): 28.18 - samples/sec: 7926.75 - lr: 0.000014 - momentum: 0.000000 |
|
2023-10-19 19:54:25,821 epoch 6 - iter 1780/1786 - loss 0.27262868 - time (sec): 31.31 - samples/sec: 7923.30 - lr: 0.000013 - momentum: 0.000000 |
|
2023-10-19 19:54:25,918 ---------------------------------------------------------------------------------------------------- |
|
2023-10-19 19:54:25,918 EPOCH 6 done: loss 0.2727 - lr: 0.000013 |
|
2023-10-19 19:54:28,745 DEV : loss 0.2003042846918106 - f1-score (micro avg) 0.4843 |
|
2023-10-19 19:54:28,759 saving best model |
|
2023-10-19 19:54:28,791 ---------------------------------------------------------------------------------------------------- |
|
2023-10-19 19:54:31,964 epoch 7 - iter 178/1786 - loss 0.24380929 - time (sec): 3.17 - samples/sec: 8323.54 - lr: 0.000013 - momentum: 0.000000 |
|
2023-10-19 19:54:35,071 epoch 7 - iter 356/1786 - loss 0.25726475 - time (sec): 6.28 - samples/sec: 8238.91 - lr: 0.000013 - momentum: 0.000000 |
|
2023-10-19 19:54:38,117 epoch 7 - iter 534/1786 - loss 0.25410518 - time (sec): 9.33 - samples/sec: 8058.86 - lr: 0.000012 - momentum: 0.000000 |
|
2023-10-19 19:54:41,153 epoch 7 - iter 712/1786 - loss 0.25769061 - time (sec): 12.36 - samples/sec: 7964.33 - lr: 0.000012 - momentum: 0.000000 |
|
2023-10-19 19:54:44,262 epoch 7 - iter 890/1786 - loss 0.25823402 - time (sec): 15.47 - samples/sec: 7972.49 - lr: 0.000012 - momentum: 0.000000 |
|
2023-10-19 19:54:47,394 epoch 7 - iter 1068/1786 - loss 0.25747742 - time (sec): 18.60 - samples/sec: 7959.44 - lr: 0.000011 - momentum: 0.000000 |
|
2023-10-19 19:54:50,595 epoch 7 - iter 1246/1786 - loss 0.25719904 - time (sec): 21.80 - samples/sec: 7962.56 - lr: 0.000011 - momentum: 0.000000 |
|
2023-10-19 19:54:53,694 epoch 7 - iter 1424/1786 - loss 0.25589988 - time (sec): 24.90 - samples/sec: 8050.60 - lr: 0.000011 - momentum: 0.000000 |
|
2023-10-19 19:54:56,745 epoch 7 - iter 1602/1786 - loss 0.25845771 - time (sec): 27.95 - samples/sec: 8021.59 - lr: 0.000010 - momentum: 0.000000 |
|
2023-10-19 19:54:59,724 epoch 7 - iter 1780/1786 - loss 0.25787745 - time (sec): 30.93 - samples/sec: 8027.94 - lr: 0.000010 - momentum: 0.000000 |
|
2023-10-19 19:54:59,817 ---------------------------------------------------------------------------------------------------- |
|
2023-10-19 19:54:59,818 EPOCH 7 done: loss 0.2580 - lr: 0.000010 |
|
2023-10-19 19:55:02,193 DEV : loss 0.2004682868719101 - f1-score (micro avg) 0.5019 |
|
2023-10-19 19:55:02,209 saving best model |
|
2023-10-19 19:55:02,246 ---------------------------------------------------------------------------------------------------- |
|
2023-10-19 19:55:05,316 epoch 8 - iter 178/1786 - loss 0.24861152 - time (sec): 3.07 - samples/sec: 8175.43 - lr: 0.000010 - momentum: 0.000000 |
|
2023-10-19 19:55:08,419 epoch 8 - iter 356/1786 - loss 0.24086117 - time (sec): 6.17 - samples/sec: 8098.43 - lr: 0.000009 - momentum: 0.000000 |
|
2023-10-19 19:55:11,525 epoch 8 - iter 534/1786 - loss 0.24815113 - time (sec): 9.28 - samples/sec: 7956.56 - lr: 0.000009 - momentum: 0.000000 |
|
2023-10-19 19:55:14,603 epoch 8 - iter 712/1786 - loss 0.24997628 - time (sec): 12.36 - samples/sec: 7987.55 - lr: 0.000009 - momentum: 0.000000 |
|
2023-10-19 19:55:17,675 epoch 8 - iter 890/1786 - loss 0.24787170 - time (sec): 15.43 - samples/sec: 8019.54 - lr: 0.000008 - momentum: 0.000000 |
|
2023-10-19 19:55:20,753 epoch 8 - iter 1068/1786 - loss 0.24807736 - time (sec): 18.51 - samples/sec: 8065.54 - lr: 0.000008 - momentum: 0.000000 |
|
2023-10-19 19:55:23,762 epoch 8 - iter 1246/1786 - loss 0.24768224 - time (sec): 21.52 - samples/sec: 8087.21 - lr: 0.000008 - momentum: 0.000000 |
|
2023-10-19 19:55:26,840 epoch 8 - iter 1424/1786 - loss 0.24665818 - time (sec): 24.59 - samples/sec: 8095.57 - lr: 0.000007 - momentum: 0.000000 |
|
2023-10-19 19:55:29,829 epoch 8 - iter 1602/1786 - loss 0.24985874 - time (sec): 27.58 - samples/sec: 8098.97 - lr: 0.000007 - momentum: 0.000000 |
|
2023-10-19 19:55:32,995 epoch 8 - iter 1780/1786 - loss 0.25000798 - time (sec): 30.75 - samples/sec: 8064.42 - lr: 0.000007 - momentum: 0.000000 |
|
2023-10-19 19:55:33,104 ---------------------------------------------------------------------------------------------------- |
|
2023-10-19 19:55:33,104 EPOCH 8 done: loss 0.2499 - lr: 0.000007 |
|
2023-10-19 19:55:35,972 DEV : loss 0.19668100774288177 - f1-score (micro avg) 0.5031 |
|
2023-10-19 19:55:35,986 saving best model |
|
2023-10-19 19:55:36,020 ---------------------------------------------------------------------------------------------------- |
|
2023-10-19 19:55:39,211 epoch 9 - iter 178/1786 - loss 0.23800143 - time (sec): 3.19 - samples/sec: 8302.38 - lr: 0.000006 - momentum: 0.000000 |
|
2023-10-19 19:55:42,260 epoch 9 - iter 356/1786 - loss 0.23583672 - time (sec): 6.24 - samples/sec: 8301.37 - lr: 0.000006 - momentum: 0.000000 |
|
2023-10-19 19:55:45,281 epoch 9 - iter 534/1786 - loss 0.23499003 - time (sec): 9.26 - samples/sec: 8227.23 - lr: 0.000006 - momentum: 0.000000 |
|
2023-10-19 19:55:48,244 epoch 9 - iter 712/1786 - loss 0.23888394 - time (sec): 12.22 - samples/sec: 8139.72 - lr: 0.000005 - momentum: 0.000000 |
|
2023-10-19 19:55:51,288 epoch 9 - iter 890/1786 - loss 0.24062823 - time (sec): 15.27 - samples/sec: 8068.23 - lr: 0.000005 - momentum: 0.000000 |
|
2023-10-19 19:55:54,409 epoch 9 - iter 1068/1786 - loss 0.24208841 - time (sec): 18.39 - samples/sec: 8095.06 - lr: 0.000005 - momentum: 0.000000 |
|
2023-10-19 19:55:57,156 epoch 9 - iter 1246/1786 - loss 0.24572741 - time (sec): 21.13 - samples/sec: 8235.25 - lr: 0.000004 - momentum: 0.000000 |
|
2023-10-19 19:56:00,158 epoch 9 - iter 1424/1786 - loss 0.24461181 - time (sec): 24.14 - samples/sec: 8218.35 - lr: 0.000004 - momentum: 0.000000 |
|
2023-10-19 19:56:03,242 epoch 9 - iter 1602/1786 - loss 0.24434313 - time (sec): 27.22 - samples/sec: 8220.44 - lr: 0.000004 - momentum: 0.000000 |
|
2023-10-19 19:56:06,397 epoch 9 - iter 1780/1786 - loss 0.24230508 - time (sec): 30.38 - samples/sec: 8165.70 - lr: 0.000003 - momentum: 0.000000 |
|
2023-10-19 19:56:06,497 ---------------------------------------------------------------------------------------------------- |
|
2023-10-19 19:56:06,497 EPOCH 9 done: loss 0.2419 - lr: 0.000003 |
|
2023-10-19 19:56:08,853 DEV : loss 0.1973438411951065 - f1-score (micro avg) 0.508 |
|
2023-10-19 19:56:08,867 saving best model |
|
2023-10-19 19:56:08,900 ---------------------------------------------------------------------------------------------------- |
|
2023-10-19 19:56:11,976 epoch 10 - iter 178/1786 - loss 0.24379983 - time (sec): 3.08 - samples/sec: 7549.01 - lr: 0.000003 - momentum: 0.000000 |
|
2023-10-19 19:56:15,113 epoch 10 - iter 356/1786 - loss 0.25116343 - time (sec): 6.21 - samples/sec: 7700.35 - lr: 0.000003 - momentum: 0.000000 |
|
2023-10-19 19:56:18,116 epoch 10 - iter 534/1786 - loss 0.25060293 - time (sec): 9.22 - samples/sec: 7865.97 - lr: 0.000002 - momentum: 0.000000 |
|
2023-10-19 19:56:21,260 epoch 10 - iter 712/1786 - loss 0.25011516 - time (sec): 12.36 - samples/sec: 7898.81 - lr: 0.000002 - momentum: 0.000000 |
|
2023-10-19 19:56:24,503 epoch 10 - iter 890/1786 - loss 0.24555631 - time (sec): 15.60 - samples/sec: 7783.06 - lr: 0.000002 - momentum: 0.000000 |
|
2023-10-19 19:56:27,557 epoch 10 - iter 1068/1786 - loss 0.24118692 - time (sec): 18.66 - samples/sec: 7834.94 - lr: 0.000001 - momentum: 0.000000 |
|
2023-10-19 19:56:30,299 epoch 10 - iter 1246/1786 - loss 0.23655811 - time (sec): 21.40 - samples/sec: 8048.55 - lr: 0.000001 - momentum: 0.000000 |
|
2023-10-19 19:56:33,323 epoch 10 - iter 1424/1786 - loss 0.23172756 - time (sec): 24.42 - samples/sec: 8110.64 - lr: 0.000001 - momentum: 0.000000 |
|
2023-10-19 19:56:36,431 epoch 10 - iter 1602/1786 - loss 0.23408678 - time (sec): 27.53 - samples/sec: 8114.79 - lr: 0.000000 - momentum: 0.000000 |
|
2023-10-19 19:56:39,501 epoch 10 - iter 1780/1786 - loss 0.23658798 - time (sec): 30.60 - samples/sec: 8109.58 - lr: 0.000000 - momentum: 0.000000 |
|
2023-10-19 19:56:39,599 ---------------------------------------------------------------------------------------------------- |
|
2023-10-19 19:56:39,599 EPOCH 10 done: loss 0.2373 - lr: 0.000000 |
|
2023-10-19 19:56:42,416 DEV : loss 0.19595196843147278 - f1-score (micro avg) 0.5103 |
|
2023-10-19 19:56:42,430 saving best model |
|
2023-10-19 19:56:42,489 ---------------------------------------------------------------------------------------------------- |
|
2023-10-19 19:56:42,489 Loading model from best epoch ...
2023-10-19 19:56:42,562 SequenceTagger predicts: Dictionary with 17 tags: O, S-PER, B-PER, E-PER, I-PER, S-LOC, B-LOC, E-LOC, I-LOC, S-ORG, B-ORG, E-ORG, I-ORG, S-HumanProd, B-HumanProd, E-HumanProd, I-HumanProd
2023-10-19 19:56:47,193
Results:
- F-score (micro) 0.414
- F-score (macro) 0.2562
- Accuracy 0.271

By class:
              precision    recall  f1-score   support

         LOC     0.3964    0.5169    0.4487      1095
         PER     0.4249    0.4951    0.4573      1012
         ORG     0.1581    0.0952    0.1189       357
   HumanProd     0.0000    0.0000    0.0000        33

   micro avg     0.3901    0.4409    0.4140      2497
   macro avg     0.2449    0.2768    0.2562      2497
weighted avg     0.3686    0.4409    0.3991      2497

2023-10-19 19:56:47,193 ----------------------------------------------------------------------------------------------------
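The final micro F1 of 0.414 is measured on the 2,570 test sentences with the checkpoint saved as best-model.pt. Below is a hedged sketch of how that evaluation, and tagging with the resulting model, could be reproduced; the example sentence is invented, and the evaluate arguments follow recent Flair releases.

```python
from flair.data import Sentence
from flair.datasets import NER_HIPE_2022
from flair.models import SequenceTagger

# Checkpoint written whenever the log above reports "saving best model".
base_path = "hmbench-newseye/fr-dbmdz/bert-tiny-historic-multilingual-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2"
tagger = SequenceTagger.load(f"{base_path}/best-model.pt")

# Re-run the final test evaluation (micro avg F1 over the decoded BIOES spans).
corpus = NER_HIPE_2022(dataset_name="newseye", language="fr")
result = tagger.evaluate(corpus.test, gold_label_type="ner", mini_batch_size=4)
print(result.detailed_results)

# Tag an (invented) sentence; spans carry the PER/LOC/ORG/HumanProd labels.
sentence = Sentence("Le Journal de Genève publie une lettre de Victor Hugo .")
tagger.predict(sentence)
for span in sentence.get_spans("ner"):
    print(span.text, span.get_label("ner").value, span.get_label("ner").score)
```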