2023-10-13 10:54:48,408 ----------------------------------------------------------------------------------------------------
2023-10-13 10:54:48,409 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=25, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-13 10:54:48,409 ----------------------------------------------------------------------------------------------------
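As a sanity check (not part of the original log), the parameter count implied by the module shapes printed above can be tallied in plain Python. The numbers below are derived only from the `Embedding`, `Linear`, and `LayerNorm` shapes shown in the model repr and ignore any buffers or tied weights:

```python
# Tally trainable parameters from the module shapes in the log above.
def linear(n_in, n_out):
    # Linear layer with bias: weight matrix plus bias vector.
    return n_in * n_out + n_out

def layer_norm(dim):
    # LayerNorm with elementwise affine: gamma plus beta.
    return 2 * dim

# BertEmbeddings: word/position/token-type embeddings plus LayerNorm.
emb = 32001 * 768 + 512 * 768 + 2 * 768 + layer_norm(768)

# One BertLayer, as printed in the repr.
per_layer = (
    3 * linear(768, 768)      # query, key, value
    + linear(768, 768)        # attention output dense
    + layer_norm(768)         # attention output LayerNorm
    + linear(768, 3072)       # intermediate dense
    + linear(3072, 768)       # output dense
    + layer_norm(768)         # output LayerNorm
)

bert = emb + 12 * per_layer + linear(768, 768)  # 12 layers + pooler
tagger = bert + linear(768, 25)                 # + tag projection head

print(bert)    # 110618112  (~110.6M, in line with a bert-base model)
print(tagger)  # 110637337
```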
2023-10-13 10:54:48,409 MultiCorpus: 966 train + 219 dev + 204 test sentences
- NER_HIPE_2022 Corpus: 966 train + 219 dev + 204 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/ajmc/fr/with_doc_seperator
2023-10-13 10:54:48,409 ----------------------------------------------------------------------------------------------------
2023-10-13 10:54:48,409 Train: 966 sentences
2023-10-13 10:54:48,409 (train_with_dev=False, train_with_test=False)
2023-10-13 10:54:48,409 ----------------------------------------------------------------------------------------------------
2023-10-13 10:54:48,409 Training Params:
2023-10-13 10:54:48,409 - learning_rate: "5e-05"
2023-10-13 10:54:48,409 - mini_batch_size: "8"
2023-10-13 10:54:48,409 - max_epochs: "10"
2023-10-13 10:54:48,409 - shuffle: "True"
2023-10-13 10:54:48,409 ----------------------------------------------------------------------------------------------------
2023-10-13 10:54:48,409 Plugins:
2023-10-13 10:54:48,409 - LinearScheduler | warmup_fraction: '0.1'
2023-10-13 10:54:48,409 ----------------------------------------------------------------------------------------------------
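The `lr` column in the per-iteration lines below follows the LinearScheduler plugin's warmup-then-decay shape. A minimal sketch, assuming the common formulation (linear warmup over the first `warmup_fraction` of total steps, then linear decay to zero, as in `transformers.get_linear_schedule_with_warmup`); `lr_at` is a hypothetical helper, and 121 iterations per epoch follows from 966 train sentences at batch size 8:

```python
# Hypothetical reconstruction of the learning-rate schedule logged below:
# linear warmup over the first 10% of steps, then linear decay to zero.
def lr_at(step, base_lr=5e-5, steps_per_epoch=121, max_epochs=10,
          warmup_fraction=0.1):
    total = steps_per_epoch * max_epochs       # 1210 optimizer steps
    warmup = int(total * warmup_fraction)      # 121 warmup steps
    if step < warmup:
        return base_lr * step / warmup
    return base_lr * (total - step) / (total - warmup)

print(f"{lr_at(12):.6f}")    # 0.000005 -> epoch 1, iter 12/121
print(f"{lr_at(133):.6f}")   # 0.000049 -> epoch 2, iter 12/121
print(f"{lr_at(1209):.6f}")  # 0.000000 -> epoch 10, iter 120/121
```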
2023-10-13 10:54:48,409 Final evaluation on model from best epoch (best-model.pt)
2023-10-13 10:54:48,409 - metric: "('micro avg', 'f1-score')"
2023-10-13 10:54:48,410 ----------------------------------------------------------------------------------------------------
2023-10-13 10:54:48,412 Computation:
2023-10-13 10:54:48,412 - compute on device: cuda:0
2023-10-13 10:54:48,412 - embedding storage: none
2023-10-13 10:54:48,412 ----------------------------------------------------------------------------------------------------
2023-10-13 10:54:48,412 Model training base path: "hmbench-ajmc/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-3"
2023-10-13 10:54:48,412 ----------------------------------------------------------------------------------------------------
2023-10-13 10:54:48,412 ----------------------------------------------------------------------------------------------------
2023-10-13 10:54:49,181 epoch 1 - iter 12/121 - loss 3.14771362 - time (sec): 0.77 - samples/sec: 3246.69 - lr: 0.000005 - momentum: 0.000000
2023-10-13 10:54:49,914 epoch 1 - iter 24/121 - loss 2.87447029 - time (sec): 1.50 - samples/sec: 3321.26 - lr: 0.000010 - momentum: 0.000000
2023-10-13 10:54:50,648 epoch 1 - iter 36/121 - loss 2.30639468 - time (sec): 2.23 - samples/sec: 3313.60 - lr: 0.000014 - momentum: 0.000000
2023-10-13 10:54:51,393 epoch 1 - iter 48/121 - loss 1.86202900 - time (sec): 2.98 - samples/sec: 3386.82 - lr: 0.000019 - momentum: 0.000000
2023-10-13 10:54:52,089 epoch 1 - iter 60/121 - loss 1.62992931 - time (sec): 3.68 - samples/sec: 3364.35 - lr: 0.000024 - momentum: 0.000000
2023-10-13 10:54:52,804 epoch 1 - iter 72/121 - loss 1.46714359 - time (sec): 4.39 - samples/sec: 3328.47 - lr: 0.000029 - momentum: 0.000000
2023-10-13 10:54:53,553 epoch 1 - iter 84/121 - loss 1.32456118 - time (sec): 5.14 - samples/sec: 3352.53 - lr: 0.000034 - momentum: 0.000000
2023-10-13 10:54:54,255 epoch 1 - iter 96/121 - loss 1.20144712 - time (sec): 5.84 - samples/sec: 3370.14 - lr: 0.000039 - momentum: 0.000000
2023-10-13 10:54:54,961 epoch 1 - iter 108/121 - loss 1.10783645 - time (sec): 6.55 - samples/sec: 3354.64 - lr: 0.000044 - momentum: 0.000000
2023-10-13 10:54:55,709 epoch 1 - iter 120/121 - loss 1.01683226 - time (sec): 7.30 - samples/sec: 3374.04 - lr: 0.000049 - momentum: 0.000000
2023-10-13 10:54:55,757 ----------------------------------------------------------------------------------------------------
2023-10-13 10:54:55,757 EPOCH 1 done: loss 1.0138 - lr: 0.000049
2023-10-13 10:54:56,361 DEV : loss 0.23680388927459717 - f1-score (micro avg) 0.5388
2023-10-13 10:54:56,366 saving best model
2023-10-13 10:54:56,730 ----------------------------------------------------------------------------------------------------
2023-10-13 10:54:57,427 epoch 2 - iter 12/121 - loss 0.20255724 - time (sec): 0.69 - samples/sec: 3361.30 - lr: 0.000049 - momentum: 0.000000
2023-10-13 10:54:58,237 epoch 2 - iter 24/121 - loss 0.21231262 - time (sec): 1.51 - samples/sec: 3407.31 - lr: 0.000049 - momentum: 0.000000
2023-10-13 10:54:58,957 epoch 2 - iter 36/121 - loss 0.23029475 - time (sec): 2.23 - samples/sec: 3408.61 - lr: 0.000048 - momentum: 0.000000
2023-10-13 10:54:59,663 epoch 2 - iter 48/121 - loss 0.21384997 - time (sec): 2.93 - samples/sec: 3428.03 - lr: 0.000048 - momentum: 0.000000
2023-10-13 10:55:00,490 epoch 2 - iter 60/121 - loss 0.21192044 - time (sec): 3.76 - samples/sec: 3293.28 - lr: 0.000047 - momentum: 0.000000
2023-10-13 10:55:01,197 epoch 2 - iter 72/121 - loss 0.20341878 - time (sec): 4.47 - samples/sec: 3361.68 - lr: 0.000047 - momentum: 0.000000
2023-10-13 10:55:01,975 epoch 2 - iter 84/121 - loss 0.19455859 - time (sec): 5.24 - samples/sec: 3354.58 - lr: 0.000046 - momentum: 0.000000
2023-10-13 10:55:02,628 epoch 2 - iter 96/121 - loss 0.19720687 - time (sec): 5.90 - samples/sec: 3344.77 - lr: 0.000046 - momentum: 0.000000
2023-10-13 10:55:03,301 epoch 2 - iter 108/121 - loss 0.19154190 - time (sec): 6.57 - samples/sec: 3349.15 - lr: 0.000045 - momentum: 0.000000
2023-10-13 10:55:03,993 epoch 2 - iter 120/121 - loss 0.18515556 - time (sec): 7.26 - samples/sec: 3382.82 - lr: 0.000045 - momentum: 0.000000
2023-10-13 10:55:04,064 ----------------------------------------------------------------------------------------------------
2023-10-13 10:55:04,064 EPOCH 2 done: loss 0.1843 - lr: 0.000045
2023-10-13 10:55:04,888 DEV : loss 0.10966917872428894 - f1-score (micro avg) 0.8175
2023-10-13 10:55:04,893 saving best model
2023-10-13 10:55:05,377 ----------------------------------------------------------------------------------------------------
2023-10-13 10:55:06,198 epoch 3 - iter 12/121 - loss 0.11076705 - time (sec): 0.81 - samples/sec: 3086.17 - lr: 0.000044 - momentum: 0.000000
2023-10-13 10:55:06,938 epoch 3 - iter 24/121 - loss 0.10298905 - time (sec): 1.55 - samples/sec: 3074.98 - lr: 0.000043 - momentum: 0.000000
2023-10-13 10:55:07,663 epoch 3 - iter 36/121 - loss 0.10131589 - time (sec): 2.28 - samples/sec: 3144.09 - lr: 0.000043 - momentum: 0.000000
2023-10-13 10:55:08,379 epoch 3 - iter 48/121 - loss 0.11446314 - time (sec): 2.99 - samples/sec: 3190.59 - lr: 0.000042 - momentum: 0.000000
2023-10-13 10:55:09,047 epoch 3 - iter 60/121 - loss 0.11596778 - time (sec): 3.66 - samples/sec: 3258.43 - lr: 0.000042 - momentum: 0.000000
2023-10-13 10:55:09,821 epoch 3 - iter 72/121 - loss 0.10976747 - time (sec): 4.43 - samples/sec: 3271.32 - lr: 0.000041 - momentum: 0.000000
2023-10-13 10:55:10,527 epoch 3 - iter 84/121 - loss 0.10180285 - time (sec): 5.14 - samples/sec: 3296.57 - lr: 0.000041 - momentum: 0.000000
2023-10-13 10:55:11,274 epoch 3 - iter 96/121 - loss 0.10085095 - time (sec): 5.89 - samples/sec: 3329.28 - lr: 0.000040 - momentum: 0.000000
2023-10-13 10:55:11,972 epoch 3 - iter 108/121 - loss 0.10286277 - time (sec): 6.58 - samples/sec: 3306.52 - lr: 0.000040 - momentum: 0.000000
2023-10-13 10:55:12,740 epoch 3 - iter 120/121 - loss 0.10356862 - time (sec): 7.35 - samples/sec: 3347.46 - lr: 0.000039 - momentum: 0.000000
2023-10-13 10:55:12,792 ----------------------------------------------------------------------------------------------------
2023-10-13 10:55:12,792 EPOCH 3 done: loss 0.1035 - lr: 0.000039
2023-10-13 10:55:13,606 DEV : loss 0.11740203946828842 - f1-score (micro avg) 0.8245
2023-10-13 10:55:13,611 saving best model
2023-10-13 10:55:14,084 ----------------------------------------------------------------------------------------------------
2023-10-13 10:55:14,818 epoch 4 - iter 12/121 - loss 0.05964679 - time (sec): 0.72 - samples/sec: 3489.33 - lr: 0.000038 - momentum: 0.000000
2023-10-13 10:55:15,519 epoch 4 - iter 24/121 - loss 0.06408530 - time (sec): 1.43 - samples/sec: 3394.82 - lr: 0.000038 - momentum: 0.000000
2023-10-13 10:55:16,224 epoch 4 - iter 36/121 - loss 0.05388743 - time (sec): 2.13 - samples/sec: 3467.30 - lr: 0.000037 - momentum: 0.000000
2023-10-13 10:55:16,979 epoch 4 - iter 48/121 - loss 0.05887078 - time (sec): 2.88 - samples/sec: 3496.77 - lr: 0.000037 - momentum: 0.000000
2023-10-13 10:55:17,738 epoch 4 - iter 60/121 - loss 0.06614795 - time (sec): 3.64 - samples/sec: 3427.29 - lr: 0.000036 - momentum: 0.000000
2023-10-13 10:55:18,445 epoch 4 - iter 72/121 - loss 0.06109136 - time (sec): 4.35 - samples/sec: 3359.56 - lr: 0.000036 - momentum: 0.000000
2023-10-13 10:55:19,232 epoch 4 - iter 84/121 - loss 0.06240943 - time (sec): 5.14 - samples/sec: 3309.75 - lr: 0.000035 - momentum: 0.000000
2023-10-13 10:55:19,966 epoch 4 - iter 96/121 - loss 0.06551218 - time (sec): 5.87 - samples/sec: 3330.42 - lr: 0.000035 - momentum: 0.000000
2023-10-13 10:55:20,676 epoch 4 - iter 108/121 - loss 0.06868814 - time (sec): 6.58 - samples/sec: 3363.35 - lr: 0.000034 - momentum: 0.000000
2023-10-13 10:55:21,399 epoch 4 - iter 120/121 - loss 0.06821591 - time (sec): 7.31 - samples/sec: 3377.92 - lr: 0.000034 - momentum: 0.000000
2023-10-13 10:55:21,443 ----------------------------------------------------------------------------------------------------
2023-10-13 10:55:21,443 EPOCH 4 done: loss 0.0681 - lr: 0.000034
2023-10-13 10:55:22,264 DEV : loss 0.10582716763019562 - f1-score (micro avg) 0.8177
2023-10-13 10:55:22,269 ----------------------------------------------------------------------------------------------------
2023-10-13 10:55:23,008 epoch 5 - iter 12/121 - loss 0.04243046 - time (sec): 0.74 - samples/sec: 3473.16 - lr: 0.000033 - momentum: 0.000000
2023-10-13 10:55:23,714 epoch 5 - iter 24/121 - loss 0.04252731 - time (sec): 1.44 - samples/sec: 3585.21 - lr: 0.000032 - momentum: 0.000000
2023-10-13 10:55:24,474 epoch 5 - iter 36/121 - loss 0.04475971 - time (sec): 2.20 - samples/sec: 3472.99 - lr: 0.000032 - momentum: 0.000000
2023-10-13 10:55:25,222 epoch 5 - iter 48/121 - loss 0.04517899 - time (sec): 2.95 - samples/sec: 3387.59 - lr: 0.000031 - momentum: 0.000000
2023-10-13 10:55:25,871 epoch 5 - iter 60/121 - loss 0.04598182 - time (sec): 3.60 - samples/sec: 3464.70 - lr: 0.000031 - momentum: 0.000000
2023-10-13 10:55:26,537 epoch 5 - iter 72/121 - loss 0.04392902 - time (sec): 4.27 - samples/sec: 3455.50 - lr: 0.000030 - momentum: 0.000000
2023-10-13 10:55:27,236 epoch 5 - iter 84/121 - loss 0.04396254 - time (sec): 4.97 - samples/sec: 3425.14 - lr: 0.000030 - momentum: 0.000000
2023-10-13 10:55:27,977 epoch 5 - iter 96/121 - loss 0.04645070 - time (sec): 5.71 - samples/sec: 3410.91 - lr: 0.000029 - momentum: 0.000000
2023-10-13 10:55:28,713 epoch 5 - iter 108/121 - loss 0.04401297 - time (sec): 6.44 - samples/sec: 3437.03 - lr: 0.000029 - momentum: 0.000000
2023-10-13 10:55:29,440 epoch 5 - iter 120/121 - loss 0.04427949 - time (sec): 7.17 - samples/sec: 3433.91 - lr: 0.000028 - momentum: 0.000000
2023-10-13 10:55:29,492 ----------------------------------------------------------------------------------------------------
2023-10-13 10:55:29,492 EPOCH 5 done: loss 0.0451 - lr: 0.000028
2023-10-13 10:55:30,247 DEV : loss 0.11472862958908081 - f1-score (micro avg) 0.8394
2023-10-13 10:55:30,251 saving best model
2023-10-13 10:55:30,714 ----------------------------------------------------------------------------------------------------
2023-10-13 10:55:31,439 epoch 6 - iter 12/121 - loss 0.02774532 - time (sec): 0.72 - samples/sec: 3251.96 - lr: 0.000027 - momentum: 0.000000
2023-10-13 10:55:32,156 epoch 6 - iter 24/121 - loss 0.02954858 - time (sec): 1.44 - samples/sec: 3403.58 - lr: 0.000027 - momentum: 0.000000
2023-10-13 10:55:32,830 epoch 6 - iter 36/121 - loss 0.02876125 - time (sec): 2.11 - samples/sec: 3304.54 - lr: 0.000026 - momentum: 0.000000
2023-10-13 10:55:33,632 epoch 6 - iter 48/121 - loss 0.03261705 - time (sec): 2.92 - samples/sec: 3380.44 - lr: 0.000026 - momentum: 0.000000
2023-10-13 10:55:34,395 epoch 6 - iter 60/121 - loss 0.02966616 - time (sec): 3.68 - samples/sec: 3363.67 - lr: 0.000025 - momentum: 0.000000
2023-10-13 10:55:35,133 epoch 6 - iter 72/121 - loss 0.03012673 - time (sec): 4.42 - samples/sec: 3340.41 - lr: 0.000025 - momentum: 0.000000
2023-10-13 10:55:35,872 epoch 6 - iter 84/121 - loss 0.02742035 - time (sec): 5.16 - samples/sec: 3331.72 - lr: 0.000024 - momentum: 0.000000
2023-10-13 10:55:36,612 epoch 6 - iter 96/121 - loss 0.02652985 - time (sec): 5.90 - samples/sec: 3354.03 - lr: 0.000024 - momentum: 0.000000
2023-10-13 10:55:37,366 epoch 6 - iter 108/121 - loss 0.03185571 - time (sec): 6.65 - samples/sec: 3345.13 - lr: 0.000023 - momentum: 0.000000
2023-10-13 10:55:38,078 epoch 6 - iter 120/121 - loss 0.03129772 - time (sec): 7.36 - samples/sec: 3338.44 - lr: 0.000022 - momentum: 0.000000
2023-10-13 10:55:38,132 ----------------------------------------------------------------------------------------------------
2023-10-13 10:55:38,132 EPOCH 6 done: loss 0.0311 - lr: 0.000022
2023-10-13 10:55:39,020 DEV : loss 0.13764505088329315 - f1-score (micro avg) 0.8389
2023-10-13 10:55:39,028 ----------------------------------------------------------------------------------------------------
2023-10-13 10:55:39,864 epoch 7 - iter 12/121 - loss 0.01945120 - time (sec): 0.83 - samples/sec: 2868.55 - lr: 0.000022 - momentum: 0.000000
2023-10-13 10:55:40,660 epoch 7 - iter 24/121 - loss 0.01862937 - time (sec): 1.63 - samples/sec: 2970.72 - lr: 0.000021 - momentum: 0.000000
2023-10-13 10:55:41,384 epoch 7 - iter 36/121 - loss 0.02012006 - time (sec): 2.35 - samples/sec: 3228.08 - lr: 0.000021 - momentum: 0.000000
2023-10-13 10:55:42,178 epoch 7 - iter 48/121 - loss 0.02303103 - time (sec): 3.15 - samples/sec: 3216.43 - lr: 0.000020 - momentum: 0.000000
2023-10-13 10:55:42,956 epoch 7 - iter 60/121 - loss 0.02222705 - time (sec): 3.93 - samples/sec: 3200.39 - lr: 0.000020 - momentum: 0.000000
2023-10-13 10:55:43,678 epoch 7 - iter 72/121 - loss 0.02355750 - time (sec): 4.65 - samples/sec: 3179.16 - lr: 0.000019 - momentum: 0.000000
2023-10-13 10:55:44,340 epoch 7 - iter 84/121 - loss 0.02354553 - time (sec): 5.31 - samples/sec: 3186.74 - lr: 0.000019 - momentum: 0.000000
2023-10-13 10:55:45,150 epoch 7 - iter 96/121 - loss 0.02328021 - time (sec): 6.12 - samples/sec: 3195.76 - lr: 0.000018 - momentum: 0.000000
2023-10-13 10:55:45,856 epoch 7 - iter 108/121 - loss 0.02189394 - time (sec): 6.83 - samples/sec: 3212.90 - lr: 0.000017 - momentum: 0.000000
2023-10-13 10:55:46,580 epoch 7 - iter 120/121 - loss 0.02167211 - time (sec): 7.55 - samples/sec: 3248.85 - lr: 0.000017 - momentum: 0.000000
2023-10-13 10:55:46,645 ----------------------------------------------------------------------------------------------------
2023-10-13 10:55:46,646 EPOCH 7 done: loss 0.0217 - lr: 0.000017
2023-10-13 10:55:47,543 DEV : loss 0.16001836955547333 - f1-score (micro avg) 0.8126
2023-10-13 10:55:47,551 ----------------------------------------------------------------------------------------------------
2023-10-13 10:55:48,303 epoch 8 - iter 12/121 - loss 0.01474380 - time (sec): 0.75 - samples/sec: 3252.24 - lr: 0.000016 - momentum: 0.000000
2023-10-13 10:55:49,113 epoch 8 - iter 24/121 - loss 0.01298771 - time (sec): 1.56 - samples/sec: 3154.58 - lr: 0.000016 - momentum: 0.000000
2023-10-13 10:55:49,884 epoch 8 - iter 36/121 - loss 0.01233272 - time (sec): 2.33 - samples/sec: 3049.89 - lr: 0.000015 - momentum: 0.000000
2023-10-13 10:55:50,617 epoch 8 - iter 48/121 - loss 0.01327043 - time (sec): 3.06 - samples/sec: 3075.12 - lr: 0.000015 - momentum: 0.000000
2023-10-13 10:55:51,365 epoch 8 - iter 60/121 - loss 0.01111132 - time (sec): 3.81 - samples/sec: 3134.17 - lr: 0.000014 - momentum: 0.000000
2023-10-13 10:55:52,069 epoch 8 - iter 72/121 - loss 0.01595594 - time (sec): 4.52 - samples/sec: 3183.84 - lr: 0.000014 - momentum: 0.000000
2023-10-13 10:55:52,805 epoch 8 - iter 84/121 - loss 0.01603433 - time (sec): 5.25 - samples/sec: 3245.07 - lr: 0.000013 - momentum: 0.000000
2023-10-13 10:55:53,606 epoch 8 - iter 96/121 - loss 0.01660554 - time (sec): 6.05 - samples/sec: 3220.79 - lr: 0.000013 - momentum: 0.000000
2023-10-13 10:55:54,329 epoch 8 - iter 108/121 - loss 0.01569970 - time (sec): 6.78 - samples/sec: 3206.30 - lr: 0.000012 - momentum: 0.000000
2023-10-13 10:55:55,136 epoch 8 - iter 120/121 - loss 0.01428220 - time (sec): 7.58 - samples/sec: 3236.16 - lr: 0.000011 - momentum: 0.000000
2023-10-13 10:55:55,198 ----------------------------------------------------------------------------------------------------
2023-10-13 10:55:55,198 EPOCH 8 done: loss 0.0142 - lr: 0.000011
2023-10-13 10:55:56,026 DEV : loss 0.1622304916381836 - f1-score (micro avg) 0.8333
2023-10-13 10:55:56,034 ----------------------------------------------------------------------------------------------------
2023-10-13 10:55:56,823 epoch 9 - iter 12/121 - loss 0.00734276 - time (sec): 0.79 - samples/sec: 3161.19 - lr: 0.000011 - momentum: 0.000000
2023-10-13 10:55:57,580 epoch 9 - iter 24/121 - loss 0.00823965 - time (sec): 1.54 - samples/sec: 3157.20 - lr: 0.000010 - momentum: 0.000000
2023-10-13 10:55:58,388 epoch 9 - iter 36/121 - loss 0.00744907 - time (sec): 2.35 - samples/sec: 3206.62 - lr: 0.000010 - momentum: 0.000000
2023-10-13 10:55:59,209 epoch 9 - iter 48/121 - loss 0.00934815 - time (sec): 3.17 - samples/sec: 3222.99 - lr: 0.000009 - momentum: 0.000000
2023-10-13 10:55:59,958 epoch 9 - iter 60/121 - loss 0.01336121 - time (sec): 3.92 - samples/sec: 3263.50 - lr: 0.000009 - momentum: 0.000000
2023-10-13 10:56:00,688 epoch 9 - iter 72/121 - loss 0.01236710 - time (sec): 4.65 - samples/sec: 3308.87 - lr: 0.000008 - momentum: 0.000000
2023-10-13 10:56:01,428 epoch 9 - iter 84/121 - loss 0.01223896 - time (sec): 5.39 - samples/sec: 3321.90 - lr: 0.000008 - momentum: 0.000000
2023-10-13 10:56:02,122 epoch 9 - iter 96/121 - loss 0.01146277 - time (sec): 6.09 - samples/sec: 3317.36 - lr: 0.000007 - momentum: 0.000000
2023-10-13 10:56:02,789 epoch 9 - iter 108/121 - loss 0.01217657 - time (sec): 6.75 - samples/sec: 3288.56 - lr: 0.000006 - momentum: 0.000000
2023-10-13 10:56:03,446 epoch 9 - iter 120/121 - loss 0.01175700 - time (sec): 7.41 - samples/sec: 3308.98 - lr: 0.000006 - momentum: 0.000000
2023-10-13 10:56:03,513 ----------------------------------------------------------------------------------------------------
2023-10-13 10:56:03,513 EPOCH 9 done: loss 0.0117 - lr: 0.000006
2023-10-13 10:56:04,332 DEV : loss 0.17858679592609406 - f1-score (micro avg) 0.8356
2023-10-13 10:56:04,338 ----------------------------------------------------------------------------------------------------
2023-10-13 10:56:05,033 epoch 10 - iter 12/121 - loss 0.00229284 - time (sec): 0.69 - samples/sec: 3364.56 - lr: 0.000005 - momentum: 0.000000
2023-10-13 10:56:05,772 epoch 10 - iter 24/121 - loss 0.00453598 - time (sec): 1.43 - samples/sec: 3421.09 - lr: 0.000005 - momentum: 0.000000
2023-10-13 10:56:06,590 epoch 10 - iter 36/121 - loss 0.00408723 - time (sec): 2.25 - samples/sec: 3374.08 - lr: 0.000004 - momentum: 0.000000
2023-10-13 10:56:07,272 epoch 10 - iter 48/121 - loss 0.00454802 - time (sec): 2.93 - samples/sec: 3386.03 - lr: 0.000004 - momentum: 0.000000
2023-10-13 10:56:08,021 epoch 10 - iter 60/121 - loss 0.00619839 - time (sec): 3.68 - samples/sec: 3330.19 - lr: 0.000003 - momentum: 0.000000
2023-10-13 10:56:08,831 epoch 10 - iter 72/121 - loss 0.00569237 - time (sec): 4.49 - samples/sec: 3272.97 - lr: 0.000003 - momentum: 0.000000
2023-10-13 10:56:09,561 epoch 10 - iter 84/121 - loss 0.00730347 - time (sec): 5.22 - samples/sec: 3267.79 - lr: 0.000002 - momentum: 0.000000
2023-10-13 10:56:10,268 epoch 10 - iter 96/121 - loss 0.00861050 - time (sec): 5.93 - samples/sec: 3300.12 - lr: 0.000001 - momentum: 0.000000
2023-10-13 10:56:11,038 epoch 10 - iter 108/121 - loss 0.00802835 - time (sec): 6.70 - samples/sec: 3283.30 - lr: 0.000001 - momentum: 0.000000
2023-10-13 10:56:11,770 epoch 10 - iter 120/121 - loss 0.00909751 - time (sec): 7.43 - samples/sec: 3305.64 - lr: 0.000000 - momentum: 0.000000
2023-10-13 10:56:11,823 ----------------------------------------------------------------------------------------------------
2023-10-13 10:56:11,823 EPOCH 10 done: loss 0.0090 - lr: 0.000000
2023-10-13 10:56:12,608 DEV : loss 0.176446795463562 - f1-score (micro avg) 0.8344
2023-10-13 10:56:13,002 ----------------------------------------------------------------------------------------------------
2023-10-13 10:56:13,003 Loading model from best epoch ...
2023-10-13 10:56:14,394 SequenceTagger predicts: Dictionary with 25 tags: O, S-scope, B-scope, E-scope, I-scope, S-pers, B-pers, E-pers, I-pers, S-work, B-work, E-work, I-work, S-loc, B-loc, E-loc, I-loc, S-object, B-object, E-object, I-object, S-date, B-date, E-date, I-date
2023-10-13 10:56:15,250
Results:
- F-score (micro) 0.7978
- F-score (macro) 0.4732
- Accuracy 0.6805
By class:
              precision    recall  f1-score   support

        pers     0.7925    0.9065    0.8456       139
       scope     0.8222    0.8605    0.8409       129
        work     0.6829    0.7000    0.6914        80
         loc     0.7500    0.3333    0.4615         9
        date     0.0000    0.0000    0.0000         3
      object     0.0000    0.0000    0.0000         0

   micro avg     0.7749    0.8222    0.7978       360
   macro avg     0.5079    0.4667    0.4732       360
weighted avg     0.7711    0.8222    0.7930       360
2023-10-13 10:56:15,251 ----------------------------------------------------------------------------------------------------
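The aggregate rows of the final classification report can be reproduced from the per-class values. A quick check in plain Python, with the precision/recall/F1/support values copied from the table above (small discrepancies are possible since the printed values are rounded to four decimals):

```python
# Recompute the aggregate rows of the report from the per-class values.
classes = {  # name: (precision, recall, f1, support)
    "pers":   (0.7925, 0.9065, 0.8456, 139),
    "scope":  (0.8222, 0.8605, 0.8409, 129),
    "work":   (0.6829, 0.7000, 0.6914,  80),
    "loc":    (0.7500, 0.3333, 0.4615,   9),
    "date":   (0.0000, 0.0000, 0.0000,   3),
    "object": (0.0000, 0.0000, 0.0000,   0),
}

# Macro average: unweighted mean over classes (zero-support classes included).
macro_f1 = sum(f1 for _, _, f1, _ in classes.values()) / len(classes)

# Weighted average: mean weighted by class support.
total_support = sum(s for *_, s in classes.values())
weighted_f1 = sum(f1 * s for _, _, f1, s in classes.values()) / total_support

# Micro F1: harmonic mean of the reported micro precision and recall.
micro_p, micro_r = 0.7749, 0.8222
micro_f1 = 2 * micro_p * micro_r / (micro_p + micro_r)

print(round(macro_f1, 4))     # 0.4732
print(round(weighted_f1, 4))  # 0.793  (table: 0.7930)
print(round(micro_f1, 4))     # 0.7978
```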