2023-10-08 22:06:30,588 ----------------------------------------------------------------------------------------------------
2023-10-08 22:06:30,589 Model: "SequenceTagger(
  (embeddings): ByT5Embeddings(
    (model): T5EncoderModel(
      (shared): Embedding(384, 1472)
      (encoder): T5Stack(
        (embed_tokens): Embedding(384, 1472)
        (block): ModuleList(
          (0): T5Block(
            (layer): ModuleList(
              (0): T5LayerSelfAttention(
                (SelfAttention): T5Attention(
                  (q): Linear(in_features=1472, out_features=384, bias=False)
                  (k): Linear(in_features=1472, out_features=384, bias=False)
                  (v): Linear(in_features=1472, out_features=384, bias=False)
                  (o): Linear(in_features=384, out_features=1472, bias=False)
                  (relative_attention_bias): Embedding(32, 6)
                )
                (layer_norm): T5LayerNorm()
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (1): T5LayerFF(
                (DenseReluDense): T5DenseGatedActDense(
                  (wi_0): Linear(in_features=1472, out_features=3584, bias=False)
                  (wi_1): Linear(in_features=1472, out_features=3584, bias=False)
                  (wo): Linear(in_features=3584, out_features=1472, bias=False)
                  (dropout): Dropout(p=0.1, inplace=False)
                  (act): NewGELUActivation()
                )
                (layer_norm): T5LayerNorm()
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
          )
          (1-11): 11 x T5Block(
            (layer): ModuleList(
              (0): T5LayerSelfAttention(
                (SelfAttention): T5Attention(
                  (q): Linear(in_features=1472, out_features=384, bias=False)
                  (k): Linear(in_features=1472, out_features=384, bias=False)
                  (v): Linear(in_features=1472, out_features=384, bias=False)
                  (o): Linear(in_features=384, out_features=1472, bias=False)
                )
                (layer_norm): T5LayerNorm()
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (1): T5LayerFF(
                (DenseReluDense): T5DenseGatedActDense(
                  (wi_0): Linear(in_features=1472, out_features=3584, bias=False)
                  (wi_1): Linear(in_features=1472, out_features=3584, bias=False)
                  (wo): Linear(in_features=3584, out_features=1472, bias=False)
                  (dropout): Dropout(p=0.1, inplace=False)
                  (act): NewGELUActivation()
                )
                (layer_norm): T5LayerNorm()
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
          )
        )
        (final_layer_norm): T5LayerNorm()
        (dropout): Dropout(p=0.1, inplace=False)
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=1472, out_features=25, bias=True)
  (loss_function): CrossEntropyLoss()
)"
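
Note that the attention projections printed above are not square: ByT5-small pairs a hidden size of 1472 with a much smaller inner attention dimension of 384 (6 heads of 64 dimensions each), so q/k/v map 1472 -> 384 and the output projection o maps 384 -> 1472. A minimal shape sketch (values read off the printout above):

```python
# Shapes from the T5Attention printout above: q/k/v project d_model down to
# the inner attention dimension; o projects back up to d_model.
d_model = 1472               # hidden size of the ByT5-small encoder
num_heads, head_dim = 6, 64  # relative_attention_bias has 6 columns, one per head
inner_dim = num_heads * head_dim

qkv_shape = (d_model, inner_dim)  # (1472, 384), matching (q)/(k)/(v) above
o_shape = (inner_dim, d_model)    # (384, 1472), matching (o) above
print(qkv_shape, o_shape)
```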
2023-10-08 22:06:30,589 ----------------------------------------------------------------------------------------------------
2023-10-08 22:06:30,590 MultiCorpus: 966 train + 219 dev + 204 test sentences
- NER_HIPE_2022 Corpus: 966 train + 219 dev + 204 test sentences - /app/.flair/datasets/ner_hipe_2022/v2.1/ajmc/fr/with_doc_seperator
2023-10-08 22:06:30,590 ----------------------------------------------------------------------------------------------------
2023-10-08 22:06:30,590 Train: 966 sentences
2023-10-08 22:06:30,590 (train_with_dev=False, train_with_test=False)
2023-10-08 22:06:30,590 ----------------------------------------------------------------------------------------------------
2023-10-08 22:06:30,590 Training Params:
2023-10-08 22:06:30,590 - learning_rate: "0.00016"
2023-10-08 22:06:30,590 - mini_batch_size: "4"
2023-10-08 22:06:30,590 - max_epochs: "10"
2023-10-08 22:06:30,590 - shuffle: "True"
2023-10-08 22:06:30,590 ----------------------------------------------------------------------------------------------------
2023-10-08 22:06:30,590 Plugins:
2023-10-08 22:06:30,590 - TensorboardLogger
2023-10-08 22:06:30,590 - LinearScheduler | warmup_fraction: '0.1'
2023-10-08 22:06:30,590 ----------------------------------------------------------------------------------------------------
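The LinearScheduler plugin with warmup_fraction 0.1 ramps the learning rate linearly from 0 to the configured peak (0.00016) over the first 10% of all batch steps, then decays it linearly back to 0 — consistent with the lr column below (~0.000158 at the end of epoch 1, 0 at the last step). A minimal sketch of that schedule (hypothetical helper, not Flair's implementation):

```python
def linear_warmup_lr(step, total_steps, peak_lr, warmup_fraction=0.1):
    """Linear warmup to peak_lr, then linear decay to 0 (sketch, not Flair's code)."""
    warmup_steps = int(total_steps * warmup_fraction)
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

# 242 batches/epoch x 10 epochs = 2420 steps; warmup covers the first 242.
total = 242 * 10
print(linear_warmup_lr(240, total, 0.00016))    # near the peak, cf. end of epoch 1
print(linear_warmup_lr(total, total, 0.00016))  # 0.0 at the final step
```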
2023-10-08 22:06:30,590 Final evaluation on model from best epoch (best-model.pt)
2023-10-08 22:06:30,591 - metric: "('micro avg', 'f1-score')"
2023-10-08 22:06:30,591 ----------------------------------------------------------------------------------------------------
2023-10-08 22:06:30,591 Computation:
2023-10-08 22:06:30,591 - compute on device: cuda:0
2023-10-08 22:06:30,591 - embedding storage: none
2023-10-08 22:06:30,591 ----------------------------------------------------------------------------------------------------
2023-10-08 22:06:30,591 Model training base path: "hmbench-ajmc/fr-hmbyt5-preliminary/byt5-small-historic-multilingual-span20-flax-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-4"
2023-10-08 22:06:30,591 ----------------------------------------------------------------------------------------------------
2023-10-08 22:06:30,591 ----------------------------------------------------------------------------------------------------
2023-10-08 22:06:30,591 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-08 22:06:40,277 epoch 1 - iter 24/242 - loss 3.24915760 - time (sec): 9.69 - samples/sec: 259.78 - lr: 0.000015 - momentum: 0.000000
2023-10-08 22:06:49,531 epoch 1 - iter 48/242 - loss 3.23978516 - time (sec): 18.94 - samples/sec: 254.13 - lr: 0.000031 - momentum: 0.000000
2023-10-08 22:06:59,576 epoch 1 - iter 72/242 - loss 3.21788354 - time (sec): 28.98 - samples/sec: 256.41 - lr: 0.000047 - momentum: 0.000000
2023-10-08 22:07:09,201 epoch 1 - iter 96/242 - loss 3.17033478 - time (sec): 38.61 - samples/sec: 255.02 - lr: 0.000063 - momentum: 0.000000
2023-10-08 22:07:19,272 epoch 1 - iter 120/242 - loss 3.08254555 - time (sec): 48.68 - samples/sec: 256.10 - lr: 0.000079 - momentum: 0.000000
2023-10-08 22:07:28,945 epoch 1 - iter 144/242 - loss 2.97907250 - time (sec): 58.35 - samples/sec: 255.33 - lr: 0.000095 - momentum: 0.000000
2023-10-08 22:07:38,730 epoch 1 - iter 168/242 - loss 2.86725038 - time (sec): 68.14 - samples/sec: 254.18 - lr: 0.000110 - momentum: 0.000000
2023-10-08 22:07:48,441 epoch 1 - iter 192/242 - loss 2.75588471 - time (sec): 77.85 - samples/sec: 252.26 - lr: 0.000126 - momentum: 0.000000
2023-10-08 22:07:58,708 epoch 1 - iter 216/242 - loss 2.62216799 - time (sec): 88.12 - samples/sec: 252.70 - lr: 0.000142 - momentum: 0.000000
2023-10-08 22:08:08,358 epoch 1 - iter 240/242 - loss 2.49259107 - time (sec): 97.77 - samples/sec: 252.07 - lr: 0.000158 - momentum: 0.000000
2023-10-08 22:08:08,901 ----------------------------------------------------------------------------------------------------
2023-10-08 22:08:08,901 EPOCH 1 done: loss 2.4872 - lr: 0.000158
2023-10-08 22:08:15,196 DEV : loss 1.084664225578308 - f1-score (micro avg) 0.0
2023-10-08 22:08:15,202 ----------------------------------------------------------------------------------------------------
2023-10-08 22:08:25,222 epoch 2 - iter 24/242 - loss 1.02334536 - time (sec): 10.02 - samples/sec: 244.53 - lr: 0.000158 - momentum: 0.000000
2023-10-08 22:08:36,087 epoch 2 - iter 48/242 - loss 0.87413933 - time (sec): 20.88 - samples/sec: 246.51 - lr: 0.000157 - momentum: 0.000000
2023-10-08 22:08:45,543 epoch 2 - iter 72/242 - loss 0.81939027 - time (sec): 30.34 - samples/sec: 241.73 - lr: 0.000155 - momentum: 0.000000
2023-10-08 22:08:55,602 epoch 2 - iter 96/242 - loss 0.76488837 - time (sec): 40.40 - samples/sec: 245.60 - lr: 0.000153 - momentum: 0.000000
2023-10-08 22:09:05,493 epoch 2 - iter 120/242 - loss 0.74692410 - time (sec): 50.29 - samples/sec: 247.01 - lr: 0.000151 - momentum: 0.000000
2023-10-08 22:09:15,125 epoch 2 - iter 144/242 - loss 0.70119401 - time (sec): 59.92 - samples/sec: 246.24 - lr: 0.000150 - momentum: 0.000000
2023-10-08 22:09:25,491 epoch 2 - iter 168/242 - loss 0.66588543 - time (sec): 70.29 - samples/sec: 246.46 - lr: 0.000148 - momentum: 0.000000
2023-10-08 22:09:36,351 epoch 2 - iter 192/242 - loss 0.63136730 - time (sec): 81.15 - samples/sec: 246.75 - lr: 0.000146 - momentum: 0.000000
2023-10-08 22:09:45,953 epoch 2 - iter 216/242 - loss 0.61412928 - time (sec): 90.75 - samples/sec: 245.32 - lr: 0.000144 - momentum: 0.000000
2023-10-08 22:09:55,858 epoch 2 - iter 240/242 - loss 0.58699846 - time (sec): 100.65 - samples/sec: 244.46 - lr: 0.000142 - momentum: 0.000000
2023-10-08 22:09:56,452 ----------------------------------------------------------------------------------------------------
2023-10-08 22:09:56,453 EPOCH 2 done: loss 0.5880 - lr: 0.000142
2023-10-08 22:10:02,913 DEV : loss 0.38607537746429443 - f1-score (micro avg) 0.3816
2023-10-08 22:10:02,919 saving best model
2023-10-08 22:10:03,778 ----------------------------------------------------------------------------------------------------
2023-10-08 22:10:13,254 epoch 3 - iter 24/242 - loss 0.35598581 - time (sec): 9.47 - samples/sec: 235.36 - lr: 0.000141 - momentum: 0.000000
2023-10-08 22:10:23,109 epoch 3 - iter 48/242 - loss 0.33627778 - time (sec): 19.33 - samples/sec: 235.75 - lr: 0.000139 - momentum: 0.000000
2023-10-08 22:10:33,871 epoch 3 - iter 72/242 - loss 0.31806277 - time (sec): 30.09 - samples/sec: 239.34 - lr: 0.000137 - momentum: 0.000000
2023-10-08 22:10:43,309 epoch 3 - iter 96/242 - loss 0.31423604 - time (sec): 39.53 - samples/sec: 239.62 - lr: 0.000135 - momentum: 0.000000
2023-10-08 22:10:53,421 epoch 3 - iter 120/242 - loss 0.30539205 - time (sec): 49.64 - samples/sec: 240.54 - lr: 0.000134 - momentum: 0.000000
2023-10-08 22:11:03,708 epoch 3 - iter 144/242 - loss 0.30226881 - time (sec): 59.93 - samples/sec: 242.74 - lr: 0.000132 - momentum: 0.000000
2023-10-08 22:11:14,466 epoch 3 - iter 168/242 - loss 0.29546652 - time (sec): 70.69 - samples/sec: 245.26 - lr: 0.000130 - momentum: 0.000000
2023-10-08 22:11:25,423 epoch 3 - iter 192/242 - loss 0.28100684 - time (sec): 81.64 - samples/sec: 246.19 - lr: 0.000128 - momentum: 0.000000
2023-10-08 22:11:34,784 epoch 3 - iter 216/242 - loss 0.27434761 - time (sec): 91.00 - samples/sec: 243.75 - lr: 0.000126 - momentum: 0.000000
2023-10-08 22:11:44,728 epoch 3 - iter 240/242 - loss 0.26765760 - time (sec): 100.95 - samples/sec: 243.95 - lr: 0.000125 - momentum: 0.000000
2023-10-08 22:11:45,311 ----------------------------------------------------------------------------------------------------
2023-10-08 22:11:45,311 EPOCH 3 done: loss 0.2667 - lr: 0.000125
2023-10-08 22:11:51,764 DEV : loss 0.2302773892879486 - f1-score (micro avg) 0.5892
2023-10-08 22:11:51,770 saving best model
2023-10-08 22:11:56,153 ----------------------------------------------------------------------------------------------------
2023-10-08 22:12:05,905 epoch 4 - iter 24/242 - loss 0.26852391 - time (sec): 9.75 - samples/sec: 241.21 - lr: 0.000123 - momentum: 0.000000
2023-10-08 22:12:16,153 epoch 4 - iter 48/242 - loss 0.23078327 - time (sec): 20.00 - samples/sec: 249.47 - lr: 0.000121 - momentum: 0.000000
2023-10-08 22:12:27,083 epoch 4 - iter 72/242 - loss 0.20260681 - time (sec): 30.93 - samples/sec: 248.86 - lr: 0.000119 - momentum: 0.000000
2023-10-08 22:12:36,829 epoch 4 - iter 96/242 - loss 0.19323065 - time (sec): 40.67 - samples/sec: 247.06 - lr: 0.000118 - momentum: 0.000000
2023-10-08 22:12:47,242 epoch 4 - iter 120/242 - loss 0.18628416 - time (sec): 51.09 - samples/sec: 248.00 - lr: 0.000116 - momentum: 0.000000
2023-10-08 22:12:58,143 epoch 4 - iter 144/242 - loss 0.17465326 - time (sec): 61.99 - samples/sec: 249.27 - lr: 0.000114 - momentum: 0.000000
2023-10-08 22:13:08,199 epoch 4 - iter 168/242 - loss 0.17023713 - time (sec): 72.05 - samples/sec: 248.72 - lr: 0.000112 - momentum: 0.000000
2023-10-08 22:13:18,312 epoch 4 - iter 192/242 - loss 0.16790202 - time (sec): 82.16 - samples/sec: 247.24 - lr: 0.000110 - momentum: 0.000000
2023-10-08 22:13:27,813 epoch 4 - iter 216/242 - loss 0.16372656 - time (sec): 91.66 - samples/sec: 245.96 - lr: 0.000109 - momentum: 0.000000
2023-10-08 22:13:37,011 epoch 4 - iter 240/242 - loss 0.16336523 - time (sec): 100.86 - samples/sec: 244.64 - lr: 0.000107 - momentum: 0.000000
2023-10-08 22:13:37,487 ----------------------------------------------------------------------------------------------------
2023-10-08 22:13:37,488 EPOCH 4 done: loss 0.1634 - lr: 0.000107
2023-10-08 22:13:43,972 DEV : loss 0.15603280067443848 - f1-score (micro avg) 0.776
2023-10-08 22:13:43,978 saving best model
2023-10-08 22:13:48,454 ----------------------------------------------------------------------------------------------------
2023-10-08 22:13:57,940 epoch 5 - iter 24/242 - loss 0.10265278 - time (sec): 9.48 - samples/sec: 242.83 - lr: 0.000105 - momentum: 0.000000
2023-10-08 22:14:07,851 epoch 5 - iter 48/242 - loss 0.09738916 - time (sec): 19.40 - samples/sec: 241.55 - lr: 0.000103 - momentum: 0.000000
2023-10-08 22:14:18,302 epoch 5 - iter 72/242 - loss 0.10267497 - time (sec): 29.85 - samples/sec: 246.46 - lr: 0.000102 - momentum: 0.000000
2023-10-08 22:14:28,312 epoch 5 - iter 96/242 - loss 0.10645004 - time (sec): 39.86 - samples/sec: 244.25 - lr: 0.000100 - momentum: 0.000000
2023-10-08 22:14:38,591 epoch 5 - iter 120/242 - loss 0.10330361 - time (sec): 50.13 - samples/sec: 245.80 - lr: 0.000098 - momentum: 0.000000
2023-10-08 22:14:48,687 epoch 5 - iter 144/242 - loss 0.10278544 - time (sec): 60.23 - samples/sec: 245.07 - lr: 0.000096 - momentum: 0.000000
2023-10-08 22:14:58,918 epoch 5 - iter 168/242 - loss 0.10950217 - time (sec): 70.46 - samples/sec: 245.61 - lr: 0.000094 - momentum: 0.000000
2023-10-08 22:15:09,356 epoch 5 - iter 192/242 - loss 0.10957079 - time (sec): 80.90 - samples/sec: 244.67 - lr: 0.000093 - momentum: 0.000000
2023-10-08 22:15:18,601 epoch 5 - iter 216/242 - loss 0.10715778 - time (sec): 90.15 - samples/sec: 243.14 - lr: 0.000091 - momentum: 0.000000
2023-10-08 22:15:28,774 epoch 5 - iter 240/242 - loss 0.10640201 - time (sec): 100.32 - samples/sec: 243.98 - lr: 0.000089 - momentum: 0.000000
2023-10-08 22:15:29,770 ----------------------------------------------------------------------------------------------------
2023-10-08 22:15:29,771 EPOCH 5 done: loss 0.1060 - lr: 0.000089
2023-10-08 22:15:36,257 DEV : loss 0.1357332170009613 - f1-score (micro avg) 0.808
2023-10-08 22:15:36,263 saving best model
2023-10-08 22:15:41,459 ----------------------------------------------------------------------------------------------------
2023-10-08 22:15:51,282 epoch 6 - iter 24/242 - loss 0.07483210 - time (sec): 9.82 - samples/sec: 242.65 - lr: 0.000087 - momentum: 0.000000
2023-10-08 22:16:01,599 epoch 6 - iter 48/242 - loss 0.07252951 - time (sec): 20.14 - samples/sec: 245.06 - lr: 0.000086 - momentum: 0.000000
2023-10-08 22:16:11,922 epoch 6 - iter 72/242 - loss 0.08117295 - time (sec): 30.46 - samples/sec: 246.94 - lr: 0.000084 - momentum: 0.000000
2023-10-08 22:16:21,583 epoch 6 - iter 96/242 - loss 0.07402080 - time (sec): 40.12 - samples/sec: 242.86 - lr: 0.000082 - momentum: 0.000000
2023-10-08 22:16:31,663 epoch 6 - iter 120/242 - loss 0.07543916 - time (sec): 50.20 - samples/sec: 243.40 - lr: 0.000080 - momentum: 0.000000
2023-10-08 22:16:41,712 epoch 6 - iter 144/242 - loss 0.07705500 - time (sec): 60.25 - samples/sec: 242.83 - lr: 0.000078 - momentum: 0.000000
2023-10-08 22:16:51,738 epoch 6 - iter 168/242 - loss 0.07375354 - time (sec): 70.28 - samples/sec: 243.63 - lr: 0.000077 - momentum: 0.000000
2023-10-08 22:17:01,954 epoch 6 - iter 192/242 - loss 0.07262324 - time (sec): 80.49 - samples/sec: 244.93 - lr: 0.000075 - momentum: 0.000000
2023-10-08 22:17:11,220 epoch 6 - iter 216/242 - loss 0.07298100 - time (sec): 89.76 - samples/sec: 245.44 - lr: 0.000073 - momentum: 0.000000
2023-10-08 22:17:20,807 epoch 6 - iter 240/242 - loss 0.07352327 - time (sec): 99.35 - samples/sec: 246.42 - lr: 0.000071 - momentum: 0.000000
2023-10-08 22:17:21,686 ----------------------------------------------------------------------------------------------------
2023-10-08 22:17:21,686 EPOCH 6 done: loss 0.0736 - lr: 0.000071
2023-10-08 22:17:27,595 DEV : loss 0.1343294084072113 - f1-score (micro avg) 0.8075
2023-10-08 22:17:27,600 ----------------------------------------------------------------------------------------------------
2023-10-08 22:17:37,564 epoch 7 - iter 24/242 - loss 0.03961783 - time (sec): 9.96 - samples/sec: 267.60 - lr: 0.000070 - momentum: 0.000000
2023-10-08 22:17:46,212 epoch 7 - iter 48/242 - loss 0.05670659 - time (sec): 18.61 - samples/sec: 258.67 - lr: 0.000068 - momentum: 0.000000
2023-10-08 22:17:55,224 epoch 7 - iter 72/242 - loss 0.05316227 - time (sec): 27.62 - samples/sec: 257.47 - lr: 0.000066 - momentum: 0.000000
2023-10-08 22:18:04,709 epoch 7 - iter 96/242 - loss 0.05195584 - time (sec): 37.11 - samples/sec: 256.90 - lr: 0.000064 - momentum: 0.000000
2023-10-08 22:18:13,858 epoch 7 - iter 120/242 - loss 0.05323815 - time (sec): 46.26 - samples/sec: 258.32 - lr: 0.000062 - momentum: 0.000000
2023-10-08 22:18:23,153 epoch 7 - iter 144/242 - loss 0.05310530 - time (sec): 55.55 - samples/sec: 258.55 - lr: 0.000061 - momentum: 0.000000
2023-10-08 22:18:32,730 epoch 7 - iter 168/242 - loss 0.05130060 - time (sec): 65.13 - samples/sec: 258.32 - lr: 0.000059 - momentum: 0.000000
2023-10-08 22:18:41,736 epoch 7 - iter 192/242 - loss 0.05050856 - time (sec): 74.13 - samples/sec: 257.99 - lr: 0.000057 - momentum: 0.000000
2023-10-08 22:18:51,843 epoch 7 - iter 216/242 - loss 0.05446243 - time (sec): 84.24 - samples/sec: 260.85 - lr: 0.000055 - momentum: 0.000000
2023-10-08 22:19:01,612 epoch 7 - iter 240/242 - loss 0.05431896 - time (sec): 94.01 - samples/sec: 262.27 - lr: 0.000054 - momentum: 0.000000
2023-10-08 22:19:02,141 ----------------------------------------------------------------------------------------------------
2023-10-08 22:19:02,141 EPOCH 7 done: loss 0.0543 - lr: 0.000054
2023-10-08 22:19:07,981 DEV : loss 0.15673130750656128 - f1-score (micro avg) 0.8161
2023-10-08 22:19:07,987 saving best model
2023-10-08 22:19:12,344 ----------------------------------------------------------------------------------------------------
2023-10-08 22:19:22,073 epoch 8 - iter 24/242 - loss 0.03108551 - time (sec): 9.73 - samples/sec: 267.38 - lr: 0.000052 - momentum: 0.000000
2023-10-08 22:19:31,070 epoch 8 - iter 48/242 - loss 0.03842500 - time (sec): 18.73 - samples/sec: 254.58 - lr: 0.000050 - momentum: 0.000000
2023-10-08 22:19:40,684 epoch 8 - iter 72/242 - loss 0.03788139 - time (sec): 28.34 - samples/sec: 261.41 - lr: 0.000048 - momentum: 0.000000
2023-10-08 22:19:49,772 epoch 8 - iter 96/242 - loss 0.04365955 - time (sec): 37.43 - samples/sec: 260.93 - lr: 0.000046 - momentum: 0.000000
2023-10-08 22:19:58,712 epoch 8 - iter 120/242 - loss 0.04582318 - time (sec): 46.37 - samples/sec: 258.25 - lr: 0.000045 - momentum: 0.000000
2023-10-08 22:20:08,221 epoch 8 - iter 144/242 - loss 0.04746351 - time (sec): 55.88 - samples/sec: 259.59 - lr: 0.000043 - momentum: 0.000000
2023-10-08 22:20:17,552 epoch 8 - iter 168/242 - loss 0.04715385 - time (sec): 65.21 - samples/sec: 259.19 - lr: 0.000041 - momentum: 0.000000
2023-10-08 22:20:27,117 epoch 8 - iter 192/242 - loss 0.04527359 - time (sec): 74.77 - samples/sec: 260.49 - lr: 0.000039 - momentum: 0.000000
2023-10-08 22:20:36,865 epoch 8 - iter 216/242 - loss 0.04448953 - time (sec): 84.52 - samples/sec: 260.41 - lr: 0.000038 - momentum: 0.000000
2023-10-08 22:20:46,599 epoch 8 - iter 240/242 - loss 0.04320868 - time (sec): 94.25 - samples/sec: 260.18 - lr: 0.000036 - momentum: 0.000000
2023-10-08 22:20:47,344 ----------------------------------------------------------------------------------------------------
2023-10-08 22:20:47,344 EPOCH 8 done: loss 0.0429 - lr: 0.000036
2023-10-08 22:20:53,283 DEV : loss 0.15529531240463257 - f1-score (micro avg) 0.8247
2023-10-08 22:20:53,289 saving best model
2023-10-08 22:20:57,656 ----------------------------------------------------------------------------------------------------
2023-10-08 22:21:08,507 epoch 9 - iter 24/242 - loss 0.04325857 - time (sec): 10.85 - samples/sec: 273.54 - lr: 0.000034 - momentum: 0.000000
2023-10-08 22:21:17,785 epoch 9 - iter 48/242 - loss 0.04176713 - time (sec): 20.13 - samples/sec: 264.25 - lr: 0.000032 - momentum: 0.000000
2023-10-08 22:21:26,850 epoch 9 - iter 72/242 - loss 0.03770574 - time (sec): 29.19 - samples/sec: 260.99 - lr: 0.000030 - momentum: 0.000000
2023-10-08 22:21:36,327 epoch 9 - iter 96/242 - loss 0.03875526 - time (sec): 38.67 - samples/sec: 258.86 - lr: 0.000029 - momentum: 0.000000
2023-10-08 22:21:45,951 epoch 9 - iter 120/242 - loss 0.03773614 - time (sec): 48.29 - samples/sec: 257.92 - lr: 0.000027 - momentum: 0.000000
2023-10-08 22:21:54,993 epoch 9 - iter 144/242 - loss 0.03744673 - time (sec): 57.34 - samples/sec: 255.72 - lr: 0.000025 - momentum: 0.000000
2023-10-08 22:22:05,258 epoch 9 - iter 168/242 - loss 0.03770231 - time (sec): 67.60 - samples/sec: 255.17 - lr: 0.000023 - momentum: 0.000000
2023-10-08 22:22:15,009 epoch 9 - iter 192/242 - loss 0.03764059 - time (sec): 77.35 - samples/sec: 255.11 - lr: 0.000022 - momentum: 0.000000
2023-10-08 22:22:24,607 epoch 9 - iter 216/242 - loss 0.03632771 - time (sec): 86.95 - samples/sec: 254.57 - lr: 0.000020 - momentum: 0.000000
2023-10-08 22:22:34,535 epoch 9 - iter 240/242 - loss 0.03696105 - time (sec): 96.88 - samples/sec: 253.96 - lr: 0.000018 - momentum: 0.000000
2023-10-08 22:22:35,167 ----------------------------------------------------------------------------------------------------
2023-10-08 22:22:35,167 EPOCH 9 done: loss 0.0368 - lr: 0.000018
2023-10-08 22:22:41,700 DEV : loss 0.158250093460083 - f1-score (micro avg) 0.8344
2023-10-08 22:22:41,706 saving best model
2023-10-08 22:22:46,175 ----------------------------------------------------------------------------------------------------
2023-10-08 22:22:56,888 epoch 10 - iter 24/242 - loss 0.03560429 - time (sec): 10.71 - samples/sec: 262.81 - lr: 0.000016 - momentum: 0.000000
2023-10-08 22:23:06,146 epoch 10 - iter 48/242 - loss 0.03254209 - time (sec): 19.97 - samples/sec: 245.38 - lr: 0.000014 - momentum: 0.000000
2023-10-08 22:23:15,965 epoch 10 - iter 72/242 - loss 0.02992292 - time (sec): 29.79 - samples/sec: 245.94 - lr: 0.000013 - momentum: 0.000000
2023-10-08 22:23:25,874 epoch 10 - iter 96/242 - loss 0.03056336 - time (sec): 39.70 - samples/sec: 245.13 - lr: 0.000011 - momentum: 0.000000
2023-10-08 22:23:36,189 epoch 10 - iter 120/242 - loss 0.03070508 - time (sec): 50.01 - samples/sec: 243.00 - lr: 0.000009 - momentum: 0.000000
2023-10-08 22:23:46,527 epoch 10 - iter 144/242 - loss 0.03025221 - time (sec): 60.35 - samples/sec: 244.46 - lr: 0.000007 - momentum: 0.000000
2023-10-08 22:23:56,388 epoch 10 - iter 168/242 - loss 0.03159824 - time (sec): 70.21 - samples/sec: 244.83 - lr: 0.000006 - momentum: 0.000000
2023-10-08 22:24:06,289 epoch 10 - iter 192/242 - loss 0.03193099 - time (sec): 80.11 - samples/sec: 244.01 - lr: 0.000004 - momentum: 0.000000
2023-10-08 22:24:16,865 epoch 10 - iter 216/242 - loss 0.03397460 - time (sec): 90.69 - samples/sec: 245.12 - lr: 0.000002 - momentum: 0.000000
2023-10-08 22:24:26,574 epoch 10 - iter 240/242 - loss 0.03227002 - time (sec): 100.40 - samples/sec: 244.33 - lr: 0.000000 - momentum: 0.000000
2023-10-08 22:24:27,380 ----------------------------------------------------------------------------------------------------
2023-10-08 22:24:27,381 EPOCH 10 done: loss 0.0329 - lr: 0.000000
2023-10-08 22:24:33,881 DEV : loss 0.15991616249084473 - f1-score (micro avg) 0.8323
2023-10-08 22:24:34,738 ----------------------------------------------------------------------------------------------------
2023-10-08 22:24:34,739 Loading model from best epoch ...
2023-10-08 22:24:37,534 SequenceTagger predicts: Dictionary with 25 tags: O, S-scope, B-scope, E-scope, I-scope, S-pers, B-pers, E-pers, I-pers, S-work, B-work, E-work, I-work, S-loc, B-loc, E-loc, I-loc, S-object, B-object, E-object, I-object, S-date, B-date, E-date, I-date
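The 25 tags follow the BIOES scheme: each of the six entity types (scope, pers, work, loc, object, date) gets Single, Begin, End and Inside variants, plus the O tag — 6 x 4 + 1 = 25. A minimal sketch of decoding such a tag sequence into entity spans (hypothetical helper, not Flair's decoder):

```python
def bioes_spans(tags):
    """Decode a BIOES tag sequence into (start, end, label) spans (end exclusive)."""
    spans, start = [], None
    for i, tag in enumerate(tags):
        if tag == "O":
            start = None
            continue
        prefix, label = tag.split("-", 1)
        if prefix == "S":            # single-token entity
            spans.append((i, i + 1, label))
            start = None
        elif prefix == "B":          # entity begins
            start = i
        elif prefix == "E" and start is not None:  # entity ends
            spans.append((start, i + 1, label))
            start = None
    return spans

print(bioes_spans(["S-pers", "O", "B-work", "I-work", "E-work"]))
# [(0, 1, 'pers'), (2, 5, 'work')]
```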
2023-10-08 22:24:44,102
Results:
- F-score (micro) 0.8049
- F-score (macro) 0.4359
- Accuracy 0.7024
By class:
              precision    recall  f1-score   support

        pers     0.8429    0.8489    0.8459       139
       scope     0.8321    0.8837    0.8571       129
        work     0.6596    0.7750    0.7126        80
         loc     1.0000    0.1111    0.2000         9
        date     0.0000    0.0000    0.0000         3
      object     0.0000    0.0000    0.0000         0

   micro avg     0.7909    0.8194    0.8049       360
   macro avg     0.5558    0.4365    0.4359       360
weighted avg     0.7952    0.8194    0.7971       360
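
The micro-averaged F1 is the harmonic mean of micro precision and recall, while the macro F1 is the unweighted mean of the per-class F1 scores; a quick check against the numbers above (standard formulas, not Flair-specific):

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# micro avg row: precision 0.7909, recall 0.8194 -> f1 0.8049
print(round(f1(0.7909, 0.8194), 4))

# macro avg: unweighted mean of the six per-class f1 scores -> 0.4359
per_class_f1 = [0.8459, 0.8571, 0.7126, 0.2000, 0.0000, 0.0000]
print(round(sum(per_class_f1) / len(per_class_f1), 4))
```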
2023-10-08 22:24:44,102 ----------------------------------------------------------------------------------------------------