2023-10-17 10:48:27,032 ----------------------------------------------------------------------------------------------------
2023-10-17 10:48:27,033 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): ElectraModel(
      (embeddings): ElectraEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): ElectraEncoder(
        (layer): ModuleList(
          (0-11): 12 x ElectraLayer(
            (attention): ElectraAttention(
              (self): ElectraSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): ElectraSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): ElectraIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): ElectraOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=25, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-17 10:48:27,033 ----------------------------------------------------------------------------------------------------
2023-10-17 10:48:27,033 MultiCorpus: 966 train + 219 dev + 204 test sentences
- NER_HIPE_2022 Corpus: 966 train + 219 dev + 204 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/ajmc/fr/with_doc_seperator
2023-10-17 10:48:27,033 ----------------------------------------------------------------------------------------------------
2023-10-17 10:48:27,033 Train: 966 sentences
2023-10-17 10:48:27,033 (train_with_dev=False, train_with_test=False)
2023-10-17 10:48:27,033 ----------------------------------------------------------------------------------------------------
2023-10-17 10:48:27,033 Training Params:
2023-10-17 10:48:27,033 - learning_rate: "3e-05"
2023-10-17 10:48:27,033 - mini_batch_size: "8"
2023-10-17 10:48:27,033 - max_epochs: "10"
2023-10-17 10:48:27,033 - shuffle: "True"
2023-10-17 10:48:27,033 ----------------------------------------------------------------------------------------------------
2023-10-17 10:48:27,033 Plugins:
2023-10-17 10:48:27,033 - TensorboardLogger
2023-10-17 10:48:27,033 - LinearScheduler | warmup_fraction: '0.1'
2023-10-17 10:48:27,033 ----------------------------------------------------------------------------------------------------
2023-10-17 10:48:27,033 Final evaluation on model from best epoch (best-model.pt)
2023-10-17 10:48:27,033 - metric: "('micro avg', 'f1-score')"
2023-10-17 10:48:27,033 ----------------------------------------------------------------------------------------------------
2023-10-17 10:48:27,033 Computation:
2023-10-17 10:48:27,033 - compute on device: cuda:0
2023-10-17 10:48:27,034 - embedding storage: none
2023-10-17 10:48:27,034 ----------------------------------------------------------------------------------------------------
2023-10-17 10:48:27,034 Model training base path: "hmbench-ajmc/fr-hmteams/teams-base-historic-multilingual-discriminator-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3"
2023-10-17 10:48:27,034 ----------------------------------------------------------------------------------------------------
2023-10-17 10:48:27,034 ----------------------------------------------------------------------------------------------------
2023-10-17 10:48:27,034 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-17 10:48:27,746 epoch 1 - iter 12/121 - loss 4.39690925 - time (sec): 0.71 - samples/sec: 3399.72 - lr: 0.000003 - momentum: 0.000000
2023-10-17 10:48:28,540 epoch 1 - iter 24/121 - loss 4.10541631 - time (sec): 1.51 - samples/sec: 3389.91 - lr: 0.000006 - momentum: 0.000000
2023-10-17 10:48:29,240 epoch 1 - iter 36/121 - loss 3.51087937 - time (sec): 2.21 - samples/sec: 3446.66 - lr: 0.000009 - momentum: 0.000000
2023-10-17 10:48:29,972 epoch 1 - iter 48/121 - loss 2.88568294 - time (sec): 2.94 - samples/sec: 3430.42 - lr: 0.000012 - momentum: 0.000000
2023-10-17 10:48:30,755 epoch 1 - iter 60/121 - loss 2.39380784 - time (sec): 3.72 - samples/sec: 3397.65 - lr: 0.000015 - momentum: 0.000000
2023-10-17 10:48:31,507 epoch 1 - iter 72/121 - loss 2.10713332 - time (sec): 4.47 - samples/sec: 3363.92 - lr: 0.000018 - momentum: 0.000000
2023-10-17 10:48:32,220 epoch 1 - iter 84/121 - loss 1.89493238 - time (sec): 5.18 - samples/sec: 3352.57 - lr: 0.000021 - momentum: 0.000000
2023-10-17 10:48:32,956 epoch 1 - iter 96/121 - loss 1.71497423 - time (sec): 5.92 - samples/sec: 3342.98 - lr: 0.000024 - momentum: 0.000000
2023-10-17 10:48:33,705 epoch 1 - iter 108/121 - loss 1.57778109 - time (sec): 6.67 - samples/sec: 3313.42 - lr: 0.000027 - momentum: 0.000000
2023-10-17 10:48:34,488 epoch 1 - iter 120/121 - loss 1.44925092 - time (sec): 7.45 - samples/sec: 3299.22 - lr: 0.000030 - momentum: 0.000000
2023-10-17 10:48:34,538 ----------------------------------------------------------------------------------------------------
2023-10-17 10:48:34,539 EPOCH 1 done: loss 1.4411 - lr: 0.000030
2023-10-17 10:48:35,125 DEV : loss 0.25803616642951965 - f1-score (micro avg) 0.5036
2023-10-17 10:48:35,131 saving best model
2023-10-17 10:48:35,590 ----------------------------------------------------------------------------------------------------
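The per-iteration lines above follow a fixed format, so they can be extracted for plotting or analysis with a regular expression (a sketch; the group names are my own):

```python
import re

# Parser for the "epoch N - iter i/total - ..." lines in this log.
ITER_RE = re.compile(
    r"epoch (?P<epoch>\d+) - iter (?P<iter>\d+)/(?P<total>\d+) - "
    r"loss (?P<loss>[\d.]+) - time \(sec\): (?P<time>[\d.]+) - "
    r"samples/sec: (?P<sps>[\d.]+) - lr: (?P<lr>[\d.]+)"
)

line = ("2023-10-17 10:48:27,746 epoch 1 - iter 12/121 - loss 4.39690925 "
        "- time (sec): 0.71 - samples/sec: 3399.72 - lr: 0.000003 "
        "- momentum: 0.000000")
m = ITER_RE.search(line)
print(int(m["epoch"]), float(m["loss"]), float(m["lr"]))  # 1 4.39690925 3e-06
```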
2023-10-17 10:48:36,422 epoch 2 - iter 12/121 - loss 0.29291001 - time (sec): 0.83 - samples/sec: 3144.34 - lr: 0.000030 - momentum: 0.000000
2023-10-17 10:48:37,149 epoch 2 - iter 24/121 - loss 0.31351141 - time (sec): 1.56 - samples/sec: 3179.49 - lr: 0.000029 - momentum: 0.000000
2023-10-17 10:48:37,864 epoch 2 - iter 36/121 - loss 0.28292452 - time (sec): 2.27 - samples/sec: 3282.14 - lr: 0.000029 - momentum: 0.000000
2023-10-17 10:48:38,596 epoch 2 - iter 48/121 - loss 0.26449553 - time (sec): 3.01 - samples/sec: 3255.84 - lr: 0.000029 - momentum: 0.000000
2023-10-17 10:48:39,339 epoch 2 - iter 60/121 - loss 0.25450082 - time (sec): 3.75 - samples/sec: 3204.88 - lr: 0.000028 - momentum: 0.000000
2023-10-17 10:48:40,077 epoch 2 - iter 72/121 - loss 0.24660048 - time (sec): 4.49 - samples/sec: 3243.50 - lr: 0.000028 - momentum: 0.000000
2023-10-17 10:48:40,782 epoch 2 - iter 84/121 - loss 0.24229750 - time (sec): 5.19 - samples/sec: 3239.27 - lr: 0.000028 - momentum: 0.000000
2023-10-17 10:48:41,557 epoch 2 - iter 96/121 - loss 0.22592468 - time (sec): 5.97 - samples/sec: 3300.84 - lr: 0.000027 - momentum: 0.000000
2023-10-17 10:48:42,264 epoch 2 - iter 108/121 - loss 0.21793866 - time (sec): 6.67 - samples/sec: 3306.53 - lr: 0.000027 - momentum: 0.000000
2023-10-17 10:48:43,083 epoch 2 - iter 120/121 - loss 0.21207618 - time (sec): 7.49 - samples/sec: 3279.85 - lr: 0.000027 - momentum: 0.000000
2023-10-17 10:48:43,136 ----------------------------------------------------------------------------------------------------
2023-10-17 10:48:43,136 EPOCH 2 done: loss 0.2118 - lr: 0.000027
2023-10-17 10:48:43,887 DEV : loss 0.12447294592857361 - f1-score (micro avg) 0.7899
2023-10-17 10:48:43,892 saving best model
2023-10-17 10:48:44,583 ----------------------------------------------------------------------------------------------------
2023-10-17 10:48:45,341 epoch 3 - iter 12/121 - loss 0.18133639 - time (sec): 0.75 - samples/sec: 3139.88 - lr: 0.000026 - momentum: 0.000000
2023-10-17 10:48:46,147 epoch 3 - iter 24/121 - loss 0.14051621 - time (sec): 1.56 - samples/sec: 3138.88 - lr: 0.000026 - momentum: 0.000000
2023-10-17 10:48:46,917 epoch 3 - iter 36/121 - loss 0.11820910 - time (sec): 2.33 - samples/sec: 3092.23 - lr: 0.000026 - momentum: 0.000000
2023-10-17 10:48:47,624 epoch 3 - iter 48/121 - loss 0.11595470 - time (sec): 3.04 - samples/sec: 3187.56 - lr: 0.000025 - momentum: 0.000000
2023-10-17 10:48:48,351 epoch 3 - iter 60/121 - loss 0.12463239 - time (sec): 3.76 - samples/sec: 3182.21 - lr: 0.000025 - momentum: 0.000000
2023-10-17 10:48:49,071 epoch 3 - iter 72/121 - loss 0.12167198 - time (sec): 4.48 - samples/sec: 3258.11 - lr: 0.000025 - momentum: 0.000000
2023-10-17 10:48:49,837 epoch 3 - iter 84/121 - loss 0.12062476 - time (sec): 5.25 - samples/sec: 3277.28 - lr: 0.000024 - momentum: 0.000000
2023-10-17 10:48:50,593 epoch 3 - iter 96/121 - loss 0.11846904 - time (sec): 6.01 - samples/sec: 3264.80 - lr: 0.000024 - momentum: 0.000000
2023-10-17 10:48:51,373 epoch 3 - iter 108/121 - loss 0.11360768 - time (sec): 6.79 - samples/sec: 3237.62 - lr: 0.000024 - momentum: 0.000000
2023-10-17 10:48:52,169 epoch 3 - iter 120/121 - loss 0.11329058 - time (sec): 7.58 - samples/sec: 3245.58 - lr: 0.000023 - momentum: 0.000000
2023-10-17 10:48:52,238 ----------------------------------------------------------------------------------------------------
2023-10-17 10:48:52,238 EPOCH 3 done: loss 0.1128 - lr: 0.000023
2023-10-17 10:48:52,992 DEV : loss 0.1195056214928627 - f1-score (micro avg) 0.813
2023-10-17 10:48:52,997 saving best model
2023-10-17 10:48:53,466 ----------------------------------------------------------------------------------------------------
2023-10-17 10:48:54,243 epoch 4 - iter 12/121 - loss 0.08908308 - time (sec): 0.78 - samples/sec: 3088.53 - lr: 0.000023 - momentum: 0.000000
2023-10-17 10:48:55,010 epoch 4 - iter 24/121 - loss 0.06552881 - time (sec): 1.54 - samples/sec: 3254.25 - lr: 0.000023 - momentum: 0.000000
2023-10-17 10:48:55,721 epoch 4 - iter 36/121 - loss 0.07341600 - time (sec): 2.25 - samples/sec: 3305.99 - lr: 0.000022 - momentum: 0.000000
2023-10-17 10:48:56,475 epoch 4 - iter 48/121 - loss 0.08798614 - time (sec): 3.01 - samples/sec: 3313.41 - lr: 0.000022 - momentum: 0.000000
2023-10-17 10:48:57,211 epoch 4 - iter 60/121 - loss 0.08929128 - time (sec): 3.74 - samples/sec: 3279.83 - lr: 0.000022 - momentum: 0.000000
2023-10-17 10:48:57,963 epoch 4 - iter 72/121 - loss 0.08650757 - time (sec): 4.50 - samples/sec: 3319.01 - lr: 0.000021 - momentum: 0.000000
2023-10-17 10:48:58,699 epoch 4 - iter 84/121 - loss 0.08773390 - time (sec): 5.23 - samples/sec: 3297.14 - lr: 0.000021 - momentum: 0.000000
2023-10-17 10:48:59,576 epoch 4 - iter 96/121 - loss 0.08444612 - time (sec): 6.11 - samples/sec: 3250.44 - lr: 0.000021 - momentum: 0.000000
2023-10-17 10:49:00,286 epoch 4 - iter 108/121 - loss 0.08465591 - time (sec): 6.82 - samples/sec: 3244.90 - lr: 0.000020 - momentum: 0.000000
2023-10-17 10:49:01,059 epoch 4 - iter 120/121 - loss 0.08053900 - time (sec): 7.59 - samples/sec: 3223.96 - lr: 0.000020 - momentum: 0.000000
2023-10-17 10:49:01,140 ----------------------------------------------------------------------------------------------------
2023-10-17 10:49:01,140 EPOCH 4 done: loss 0.0801 - lr: 0.000020
2023-10-17 10:49:01,935 DEV : loss 0.13317343592643738 - f1-score (micro avg) 0.8234
2023-10-17 10:49:01,940 saving best model
2023-10-17 10:49:02,420 ----------------------------------------------------------------------------------------------------
2023-10-17 10:49:03,175 epoch 5 - iter 12/121 - loss 0.08132784 - time (sec): 0.75 - samples/sec: 3518.09 - lr: 0.000020 - momentum: 0.000000
2023-10-17 10:49:03,995 epoch 5 - iter 24/121 - loss 0.06780759 - time (sec): 1.57 - samples/sec: 3234.44 - lr: 0.000019 - momentum: 0.000000
2023-10-17 10:49:04,796 epoch 5 - iter 36/121 - loss 0.06490643 - time (sec): 2.37 - samples/sec: 3158.71 - lr: 0.000019 - momentum: 0.000000
2023-10-17 10:49:05,595 epoch 5 - iter 48/121 - loss 0.06448408 - time (sec): 3.17 - samples/sec: 3177.78 - lr: 0.000019 - momentum: 0.000000
2023-10-17 10:49:06,394 epoch 5 - iter 60/121 - loss 0.06639264 - time (sec): 3.97 - samples/sec: 3132.86 - lr: 0.000018 - momentum: 0.000000
2023-10-17 10:49:07,160 epoch 5 - iter 72/121 - loss 0.06146937 - time (sec): 4.74 - samples/sec: 3157.65 - lr: 0.000018 - momentum: 0.000000
2023-10-17 10:49:07,919 epoch 5 - iter 84/121 - loss 0.05833790 - time (sec): 5.50 - samples/sec: 3186.82 - lr: 0.000018 - momentum: 0.000000
2023-10-17 10:49:08,688 epoch 5 - iter 96/121 - loss 0.05661470 - time (sec): 6.27 - samples/sec: 3158.29 - lr: 0.000017 - momentum: 0.000000
2023-10-17 10:49:09,452 epoch 5 - iter 108/121 - loss 0.05633452 - time (sec): 7.03 - samples/sec: 3157.92 - lr: 0.000017 - momentum: 0.000000
2023-10-17 10:49:10,163 epoch 5 - iter 120/121 - loss 0.05668629 - time (sec): 7.74 - samples/sec: 3168.00 - lr: 0.000017 - momentum: 0.000000
2023-10-17 10:49:10,217 ----------------------------------------------------------------------------------------------------
2023-10-17 10:49:10,217 EPOCH 5 done: loss 0.0564 - lr: 0.000017
2023-10-17 10:49:11,000 DEV : loss 0.14495401084423065 - f1-score (micro avg) 0.8389
2023-10-17 10:49:11,006 saving best model
2023-10-17 10:49:11,497 ----------------------------------------------------------------------------------------------------
2023-10-17 10:49:12,262 epoch 6 - iter 12/121 - loss 0.06776906 - time (sec): 0.76 - samples/sec: 3231.96 - lr: 0.000016 - momentum: 0.000000
2023-10-17 10:49:13,028 epoch 6 - iter 24/121 - loss 0.04794869 - time (sec): 1.53 - samples/sec: 3133.17 - lr: 0.000016 - momentum: 0.000000
2023-10-17 10:49:13,755 epoch 6 - iter 36/121 - loss 0.05253700 - time (sec): 2.26 - samples/sec: 3184.30 - lr: 0.000016 - momentum: 0.000000
2023-10-17 10:49:14,534 epoch 6 - iter 48/121 - loss 0.04783436 - time (sec): 3.03 - samples/sec: 3220.44 - lr: 0.000015 - momentum: 0.000000
2023-10-17 10:49:15,304 epoch 6 - iter 60/121 - loss 0.04533641 - time (sec): 3.80 - samples/sec: 3224.21 - lr: 0.000015 - momentum: 0.000000
2023-10-17 10:49:16,122 epoch 6 - iter 72/121 - loss 0.04174276 - time (sec): 4.62 - samples/sec: 3228.29 - lr: 0.000015 - momentum: 0.000000
2023-10-17 10:49:16,886 epoch 6 - iter 84/121 - loss 0.04556145 - time (sec): 5.39 - samples/sec: 3238.63 - lr: 0.000014 - momentum: 0.000000
2023-10-17 10:49:17,634 epoch 6 - iter 96/121 - loss 0.04338405 - time (sec): 6.13 - samples/sec: 3217.45 - lr: 0.000014 - momentum: 0.000000
2023-10-17 10:49:18,400 epoch 6 - iter 108/121 - loss 0.04250434 - time (sec): 6.90 - samples/sec: 3210.10 - lr: 0.000014 - momentum: 0.000000
2023-10-17 10:49:19,173 epoch 6 - iter 120/121 - loss 0.04266793 - time (sec): 7.67 - samples/sec: 3211.56 - lr: 0.000013 - momentum: 0.000000
2023-10-17 10:49:19,227 ----------------------------------------------------------------------------------------------------
2023-10-17 10:49:19,227 EPOCH 6 done: loss 0.0425 - lr: 0.000013
2023-10-17 10:49:19,973 DEV : loss 0.14539101719856262 - f1-score (micro avg) 0.8327
2023-10-17 10:49:19,978 ----------------------------------------------------------------------------------------------------
2023-10-17 10:49:20,802 epoch 7 - iter 12/121 - loss 0.04067018 - time (sec): 0.82 - samples/sec: 3303.77 - lr: 0.000013 - momentum: 0.000000
2023-10-17 10:49:21,503 epoch 7 - iter 24/121 - loss 0.03984650 - time (sec): 1.52 - samples/sec: 3143.09 - lr: 0.000013 - momentum: 0.000000
2023-10-17 10:49:22,263 epoch 7 - iter 36/121 - loss 0.04274470 - time (sec): 2.28 - samples/sec: 3172.44 - lr: 0.000012 - momentum: 0.000000
2023-10-17 10:49:22,987 epoch 7 - iter 48/121 - loss 0.03734536 - time (sec): 3.01 - samples/sec: 3196.91 - lr: 0.000012 - momentum: 0.000000
2023-10-17 10:49:23,752 epoch 7 - iter 60/121 - loss 0.03564571 - time (sec): 3.77 - samples/sec: 3234.54 - lr: 0.000012 - momentum: 0.000000
2023-10-17 10:49:24,611 epoch 7 - iter 72/121 - loss 0.03504219 - time (sec): 4.63 - samples/sec: 3243.18 - lr: 0.000011 - momentum: 0.000000
2023-10-17 10:49:25,341 epoch 7 - iter 84/121 - loss 0.03291674 - time (sec): 5.36 - samples/sec: 3214.94 - lr: 0.000011 - momentum: 0.000000
2023-10-17 10:49:26,098 epoch 7 - iter 96/121 - loss 0.03094488 - time (sec): 6.12 - samples/sec: 3256.62 - lr: 0.000011 - momentum: 0.000000
2023-10-17 10:49:26,804 epoch 7 - iter 108/121 - loss 0.03005752 - time (sec): 6.82 - samples/sec: 3245.60 - lr: 0.000010 - momentum: 0.000000
2023-10-17 10:49:27,529 epoch 7 - iter 120/121 - loss 0.02989960 - time (sec): 7.55 - samples/sec: 3253.54 - lr: 0.000010 - momentum: 0.000000
2023-10-17 10:49:27,579 ----------------------------------------------------------------------------------------------------
2023-10-17 10:49:27,579 EPOCH 7 done: loss 0.0297 - lr: 0.000010
2023-10-17 10:49:28,324 DEV : loss 0.16793492436408997 - f1-score (micro avg) 0.8396
2023-10-17 10:49:28,329 saving best model
2023-10-17 10:49:28,896 ----------------------------------------------------------------------------------------------------
2023-10-17 10:49:29,689 epoch 8 - iter 12/121 - loss 0.03222385 - time (sec): 0.79 - samples/sec: 2907.27 - lr: 0.000010 - momentum: 0.000000
2023-10-17 10:49:30,469 epoch 8 - iter 24/121 - loss 0.02979099 - time (sec): 1.57 - samples/sec: 3201.32 - lr: 0.000009 - momentum: 0.000000
2023-10-17 10:49:31,176 epoch 8 - iter 36/121 - loss 0.02454332 - time (sec): 2.28 - samples/sec: 3234.31 - lr: 0.000009 - momentum: 0.000000
2023-10-17 10:49:31,916 epoch 8 - iter 48/121 - loss 0.02843440 - time (sec): 3.02 - samples/sec: 3310.66 - lr: 0.000009 - momentum: 0.000000
2023-10-17 10:49:32,698 epoch 8 - iter 60/121 - loss 0.02741868 - time (sec): 3.80 - samples/sec: 3252.85 - lr: 0.000008 - momentum: 0.000000
2023-10-17 10:49:33,479 epoch 8 - iter 72/121 - loss 0.02524621 - time (sec): 4.58 - samples/sec: 3253.05 - lr: 0.000008 - momentum: 0.000000
2023-10-17 10:49:34,229 epoch 8 - iter 84/121 - loss 0.02507580 - time (sec): 5.33 - samples/sec: 3273.93 - lr: 0.000008 - momentum: 0.000000
2023-10-17 10:49:34,953 epoch 8 - iter 96/121 - loss 0.02654991 - time (sec): 6.05 - samples/sec: 3259.99 - lr: 0.000008 - momentum: 0.000000
2023-10-17 10:49:35,681 epoch 8 - iter 108/121 - loss 0.02521198 - time (sec): 6.78 - samples/sec: 3268.71 - lr: 0.000007 - momentum: 0.000000
2023-10-17 10:49:36,431 epoch 8 - iter 120/121 - loss 0.02447953 - time (sec): 7.53 - samples/sec: 3269.97 - lr: 0.000007 - momentum: 0.000000
2023-10-17 10:49:36,480 ----------------------------------------------------------------------------------------------------
2023-10-17 10:49:36,480 EPOCH 8 done: loss 0.0244 - lr: 0.000007
2023-10-17 10:49:37,230 DEV : loss 0.1799156218767166 - f1-score (micro avg) 0.8404
2023-10-17 10:49:37,235 saving best model
2023-10-17 10:49:37,790 ----------------------------------------------------------------------------------------------------
2023-10-17 10:49:38,553 epoch 9 - iter 12/121 - loss 0.00753185 - time (sec): 0.76 - samples/sec: 3324.13 - lr: 0.000006 - momentum: 0.000000
2023-10-17 10:49:39,344 epoch 9 - iter 24/121 - loss 0.00937456 - time (sec): 1.55 - samples/sec: 3291.04 - lr: 0.000006 - momentum: 0.000000
2023-10-17 10:49:40,066 epoch 9 - iter 36/121 - loss 0.01343081 - time (sec): 2.27 - samples/sec: 3338.27 - lr: 0.000006 - momentum: 0.000000
2023-10-17 10:49:40,807 epoch 9 - iter 48/121 - loss 0.01562412 - time (sec): 3.02 - samples/sec: 3281.06 - lr: 0.000006 - momentum: 0.000000
2023-10-17 10:49:41,525 epoch 9 - iter 60/121 - loss 0.01795150 - time (sec): 3.73 - samples/sec: 3281.03 - lr: 0.000005 - momentum: 0.000000
2023-10-17 10:49:42,216 epoch 9 - iter 72/121 - loss 0.01869381 - time (sec): 4.42 - samples/sec: 3259.42 - lr: 0.000005 - momentum: 0.000000
2023-10-17 10:49:42,996 epoch 9 - iter 84/121 - loss 0.01890742 - time (sec): 5.20 - samples/sec: 3272.74 - lr: 0.000005 - momentum: 0.000000
2023-10-17 10:49:43,733 epoch 9 - iter 96/121 - loss 0.01752654 - time (sec): 5.94 - samples/sec: 3291.33 - lr: 0.000004 - momentum: 0.000000
2023-10-17 10:49:44,576 epoch 9 - iter 108/121 - loss 0.01785410 - time (sec): 6.79 - samples/sec: 3273.13 - lr: 0.000004 - momentum: 0.000000
2023-10-17 10:49:45,293 epoch 9 - iter 120/121 - loss 0.01725842 - time (sec): 7.50 - samples/sec: 3272.43 - lr: 0.000004 - momentum: 0.000000
2023-10-17 10:49:45,348 ----------------------------------------------------------------------------------------------------
2023-10-17 10:49:45,348 EPOCH 9 done: loss 0.0172 - lr: 0.000004
2023-10-17 10:49:46,099 DEV : loss 0.18883700668811798 - f1-score (micro avg) 0.8511
2023-10-17 10:49:46,104 saving best model
2023-10-17 10:49:46,620 ----------------------------------------------------------------------------------------------------
2023-10-17 10:49:47,316 epoch 10 - iter 12/121 - loss 0.00707437 - time (sec): 0.69 - samples/sec: 3395.86 - lr: 0.000003 - momentum: 0.000000
2023-10-17 10:49:48,092 epoch 10 - iter 24/121 - loss 0.01800327 - time (sec): 1.47 - samples/sec: 3314.42 - lr: 0.000003 - momentum: 0.000000
2023-10-17 10:49:48,789 epoch 10 - iter 36/121 - loss 0.01428292 - time (sec): 2.16 - samples/sec: 3324.23 - lr: 0.000003 - momentum: 0.000000
2023-10-17 10:49:49,560 epoch 10 - iter 48/121 - loss 0.01267723 - time (sec): 2.93 - samples/sec: 3369.46 - lr: 0.000002 - momentum: 0.000000
2023-10-17 10:49:50,427 epoch 10 - iter 60/121 - loss 0.01179351 - time (sec): 3.80 - samples/sec: 3316.39 - lr: 0.000002 - momentum: 0.000000
2023-10-17 10:49:51,141 epoch 10 - iter 72/121 - loss 0.01364591 - time (sec): 4.51 - samples/sec: 3309.51 - lr: 0.000002 - momentum: 0.000000
2023-10-17 10:49:51,853 epoch 10 - iter 84/121 - loss 0.01442005 - time (sec): 5.23 - samples/sec: 3334.66 - lr: 0.000001 - momentum: 0.000000
2023-10-17 10:49:52,631 epoch 10 - iter 96/121 - loss 0.01357433 - time (sec): 6.00 - samples/sec: 3330.07 - lr: 0.000001 - momentum: 0.000000
2023-10-17 10:49:53,327 epoch 10 - iter 108/121 - loss 0.01494689 - time (sec): 6.70 - samples/sec: 3327.04 - lr: 0.000001 - momentum: 0.000000
2023-10-17 10:49:54,041 epoch 10 - iter 120/121 - loss 0.01558579 - time (sec): 7.41 - samples/sec: 3324.68 - lr: 0.000000 - momentum: 0.000000
2023-10-17 10:49:54,088 ----------------------------------------------------------------------------------------------------
2023-10-17 10:49:54,088 EPOCH 10 done: loss 0.0155 - lr: 0.000000
2023-10-17 10:49:54,831 DEV : loss 0.19253204762935638 - f1-score (micro avg) 0.8403
2023-10-17 10:49:55,277 ----------------------------------------------------------------------------------------------------
2023-10-17 10:49:55,278 Loading model from best epoch ...
2023-10-17 10:49:56,648 SequenceTagger predicts: Dictionary with 25 tags: O, S-scope, B-scope, E-scope, I-scope, S-pers, B-pers, E-pers, I-pers, S-work, B-work, E-work, I-work, S-loc, B-loc, E-loc, I-loc, S-object, B-object, E-object, I-object, S-date, B-date, E-date, I-date
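The 25-tag dictionary above uses the BIOES scheme (S = single-token entity, B/I/E = begin/inside/end of a multi-token entity, O = outside) over six entity types. A minimal decoder from a BIOES tag sequence to (type, start, end) spans, as a sketch of how such predictions are consumed:

```python
def bioes_spans(tags):
    """Decode a BIOES tag sequence into (entity_type, start, end) spans.

    Minimal sketch; Flair's own span decoding is more defensive about
    malformed sequences.
    """
    spans, start = [], None
    for i, tag in enumerate(tags):
        if tag == "O":
            start = None
            continue
        prefix, etype = tag.split("-", 1)
        if prefix == "S":
            spans.append((etype, i, i))
        elif prefix == "B":
            start = i
        elif prefix == "E" and start is not None:
            spans.append((etype, start, i))
            start = None
    return spans

print(bioes_spans(["S-pers", "O", "B-work", "I-work", "E-work"]))
# [('pers', 0, 0), ('work', 2, 4)]
```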
2023-10-17 10:49:57,492
Results:
- F-score (micro) 0.8253
- F-score (macro) 0.5605
- Accuracy 0.7224
By class:

              precision    recall  f1-score   support

        pers     0.8435    0.8921    0.8671       139
       scope     0.8603    0.9070    0.8830       129
        work     0.6531    0.8000    0.7191        80
         loc     0.6667    0.2222    0.3333         9
        date     0.0000    0.0000    0.0000         3

   micro avg     0.7995    0.8528    0.8253       360
   macro avg     0.6047    0.5643    0.5605       360
weighted avg     0.7958    0.8528    0.8194       360
2023-10-17 10:49:57,492 ----------------------------------------------------------------------------------------------------
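As a sanity check, the aggregate rows of the final report can be reproduced from the per-class values using the standard micro/macro F1 definitions:

```python
# Per-class (precision, recall, f1, support) from the report above.
per_class = {
    "pers":  (0.8435, 0.8921, 0.8671, 139),
    "scope": (0.8603, 0.9070, 0.8830, 129),
    "work":  (0.6531, 0.8000, 0.7191, 80),
    "loc":   (0.6667, 0.2222, 0.3333, 9),
    "date":  (0.0000, 0.0000, 0.0000, 3),
}

# Macro F1: unweighted mean of the per-class F1 scores.
macro_f1 = sum(f1 for _, _, f1, _ in per_class.values()) / len(per_class)
print(round(macro_f1, 4))  # 0.5605

# Micro F1: harmonic mean of micro precision and micro recall.
p, r = 0.7995, 0.8528
micro_f1 = 2 * p * r / (p + r)
print(round(micro_f1, 4))  # 0.8253
```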