2023-10-06 12:45:23,329 ----------------------------------------------------------------------------------------------------
2023-10-06 12:45:23,330 Model: "SequenceTagger(
  (embeddings): ByT5Embeddings(
    (model): T5EncoderModel(
      (shared): Embedding(384, 1472)
      (encoder): T5Stack(
        (embed_tokens): Embedding(384, 1472)
        (block): ModuleList(
          (0): T5Block(
            (layer): ModuleList(
              (0): T5LayerSelfAttention(
                (SelfAttention): T5Attention(
                  (q): Linear(in_features=1472, out_features=384, bias=False)
                  (k): Linear(in_features=1472, out_features=384, bias=False)
                  (v): Linear(in_features=1472, out_features=384, bias=False)
                  (o): Linear(in_features=384, out_features=1472, bias=False)
                  (relative_attention_bias): Embedding(32, 6)
                )
                (layer_norm): T5LayerNorm()
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (1): T5LayerFF(
                (DenseReluDense): T5DenseGatedActDense(
                  (wi_0): Linear(in_features=1472, out_features=3584, bias=False)
                  (wi_1): Linear(in_features=1472, out_features=3584, bias=False)
                  (wo): Linear(in_features=3584, out_features=1472, bias=False)
                  (dropout): Dropout(p=0.1, inplace=False)
                  (act): NewGELUActivation()
                )
                (layer_norm): T5LayerNorm()
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
          )
          (1-11): 11 x T5Block(
            (layer): ModuleList(
              (0): T5LayerSelfAttention(
                (SelfAttention): T5Attention(
                  (q): Linear(in_features=1472, out_features=384, bias=False)
                  (k): Linear(in_features=1472, out_features=384, bias=False)
                  (v): Linear(in_features=1472, out_features=384, bias=False)
                  (o): Linear(in_features=384, out_features=1472, bias=False)
                )
                (layer_norm): T5LayerNorm()
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (1): T5LayerFF(
                (DenseReluDense): T5DenseGatedActDense(
                  (wi_0): Linear(in_features=1472, out_features=3584, bias=False)
                  (wi_1): Linear(in_features=1472, out_features=3584, bias=False)
                  (wo): Linear(in_features=3584, out_features=1472, bias=False)
                  (dropout): Dropout(p=0.1, inplace=False)
                  (act): NewGELUActivation()
                )
                (layer_norm): T5LayerNorm()
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
          )
        )
        (final_layer_norm): T5LayerNorm()
        (dropout): Dropout(p=0.1, inplace=False)
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=1472, out_features=25, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-06 12:45:23,330 ----------------------------------------------------------------------------------------------------
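The shapes in the module dump above fully determine the encoder's size. A minimal arithmetic sketch, assuming the T5LayerNorm modules each hold a single d_model-sized weight (RMSNorm-style, no bias) and that `shared` and `embed_tokens` are tied (counted once):

```python
# Parameter count of the ByT5-small encoder, read off the module dump above.
d_model, d_inner, d_ff, vocab = 1472, 384, 3584, 384

attention = 4 * d_model * d_inner   # q, k, v, o projections (all bias=False)
ffn = 3 * d_model * d_ff            # wi_0, wi_1, wo of the gated-GELU FFN
norms = 2 * d_model                 # assumed: one weight vector per T5LayerNorm
block = attention + ffn + norms     # one T5Block

encoder = (
    vocab * d_model                 # embed_tokens (assumed tied with `shared`)
    + 12 * block                    # blocks 0-11
    + 32 * 6                        # relative_attention_bias (block 0 only)
    + d_model                       # final_layer_norm
)
print(f"{encoder:,} encoder parameters")
```

Roughly 218M parameters sit in the frozen-or-fine-tuned encoder, dwarfing the 1472×25 linear tagging head.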
2023-10-06 12:45:23,330 MultiCorpus: 1214 train + 266 dev + 251 test sentences
- NER_HIPE_2022 Corpus: 1214 train + 266 dev + 251 test sentences - /app/.flair/datasets/ner_hipe_2022/v2.1/ajmc/en/with_doc_seperator
2023-10-06 12:45:23,330 ----------------------------------------------------------------------------------------------------
2023-10-06 12:45:23,330 Train: 1214 sentences
2023-10-06 12:45:23,331 (train_with_dev=False, train_with_test=False)
2023-10-06 12:45:23,331 ----------------------------------------------------------------------------------------------------
2023-10-06 12:45:23,331 Training Params:
2023-10-06 12:45:23,331 - learning_rate: "0.00016"
2023-10-06 12:45:23,331 - mini_batch_size: "4"
2023-10-06 12:45:23,331 - max_epochs: "10"
2023-10-06 12:45:23,331 - shuffle: "True"
2023-10-06 12:45:23,331 ----------------------------------------------------------------------------------------------------
2023-10-06 12:45:23,331 Plugins:
2023-10-06 12:45:23,331 - TensorboardLogger
2023-10-06 12:45:23,331 - LinearScheduler | warmup_fraction: '0.1'
2023-10-06 12:45:23,331 ----------------------------------------------------------------------------------------------------
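The LinearScheduler with `warmup_fraction: 0.1` explains the `lr:` column in the iteration lines below: the rate climbs linearly toward the 0.00016 peak over the first 10% of the 3,040 total steps (304 iterations × 10 epochs), then decays linearly to zero. A minimal sketch of that shape (not Flair's implementation; exact values can differ from the log by one step depending on where the counter starts):

```python
# Linear warmup + linear decay, the shape logged in the `lr:` column below.
PEAK_LR = 0.00016
TOTAL_STEPS = 304 * 10                  # 304 iterations per epoch, 10 epochs
WARMUP_STEPS = int(TOTAL_STEPS * 0.1)   # warmup_fraction: 0.1 -> 304 steps

def lr_at(step: int) -> float:
    """Learning rate after `step` optimizer updates."""
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS
    # linear decay from the peak down to zero over the remaining steps
    return PEAK_LR * (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)

for step in (30, 304, 1520, 3040):
    print(f"step {step:4d}: lr {lr_at(step):.6f}")
```

Note that warmup ends almost exactly at the epoch 1 boundary, which is why the logged lr peaks near 0.000158 at the end of epoch 1 and falls steadily from epoch 2 onward.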
2023-10-06 12:45:23,331 Final evaluation on model from best epoch (best-model.pt)
2023-10-06 12:45:23,331 - metric: "('micro avg', 'f1-score')"
2023-10-06 12:45:23,331 ----------------------------------------------------------------------------------------------------
2023-10-06 12:45:23,331 Computation:
2023-10-06 12:45:23,331 - compute on device: cuda:0
2023-10-06 12:45:23,331 - embedding storage: none
2023-10-06 12:45:23,332 ----------------------------------------------------------------------------------------------------
2023-10-06 12:45:23,332 Model training base path: "hmbench-ajmc/en-hmbyt5-preliminary/byt5-small-historic-multilingual-span20-flax-bs4-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-3"
2023-10-06 12:45:23,332 ----------------------------------------------------------------------------------------------------
2023-10-06 12:45:23,332 ----------------------------------------------------------------------------------------------------
2023-10-06 12:45:23,332 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-06 12:45:35,007 epoch 1 - iter 30/304 - loss 3.21102793 - time (sec): 11.67 - samples/sec: 252.79 - lr: 0.000015 - momentum: 0.000000
2023-10-06 12:45:46,928 epoch 1 - iter 60/304 - loss 3.19786369 - time (sec): 23.60 - samples/sec: 253.23 - lr: 0.000031 - momentum: 0.000000
2023-10-06 12:45:58,966 epoch 1 - iter 90/304 - loss 3.17645681 - time (sec): 35.63 - samples/sec: 254.51 - lr: 0.000047 - momentum: 0.000000
2023-10-06 12:46:10,717 epoch 1 - iter 120/304 - loss 3.12340824 - time (sec): 47.38 - samples/sec: 251.60 - lr: 0.000063 - momentum: 0.000000
2023-10-06 12:46:22,772 epoch 1 - iter 150/304 - loss 3.01813255 - time (sec): 59.44 - samples/sec: 251.73 - lr: 0.000078 - momentum: 0.000000
2023-10-06 12:46:34,699 epoch 1 - iter 180/304 - loss 2.89874743 - time (sec): 71.37 - samples/sec: 251.63 - lr: 0.000094 - momentum: 0.000000
2023-10-06 12:46:46,759 epoch 1 - iter 210/304 - loss 2.76573906 - time (sec): 83.43 - samples/sec: 252.07 - lr: 0.000110 - momentum: 0.000000
2023-10-06 12:46:59,330 epoch 1 - iter 240/304 - loss 2.61683815 - time (sec): 96.00 - samples/sec: 254.08 - lr: 0.000126 - momentum: 0.000000
2023-10-06 12:47:11,269 epoch 1 - iter 270/304 - loss 2.47285369 - time (sec): 107.94 - samples/sec: 254.42 - lr: 0.000142 - momentum: 0.000000
2023-10-06 12:47:23,078 epoch 1 - iter 300/304 - loss 2.32218897 - time (sec): 119.75 - samples/sec: 255.77 - lr: 0.000157 - momentum: 0.000000
2023-10-06 12:47:24,490 ----------------------------------------------------------------------------------------------------
2023-10-06 12:47:24,491 EPOCH 1 done: loss 2.3061 - lr: 0.000157
2023-10-06 12:47:32,333 DEV : loss 0.8873188495635986 - f1-score (micro avg) 0.0
2023-10-06 12:47:32,341 ----------------------------------------------------------------------------------------------------
2023-10-06 12:47:44,353 epoch 2 - iter 30/304 - loss 0.86710408 - time (sec): 12.01 - samples/sec: 262.78 - lr: 0.000158 - momentum: 0.000000
2023-10-06 12:47:56,502 epoch 2 - iter 60/304 - loss 0.78735801 - time (sec): 24.16 - samples/sec: 261.93 - lr: 0.000157 - momentum: 0.000000
2023-10-06 12:48:07,841 epoch 2 - iter 90/304 - loss 0.70808845 - time (sec): 35.50 - samples/sec: 254.46 - lr: 0.000155 - momentum: 0.000000
2023-10-06 12:48:19,926 epoch 2 - iter 120/304 - loss 0.68252424 - time (sec): 47.58 - samples/sec: 253.83 - lr: 0.000153 - momentum: 0.000000
2023-10-06 12:48:32,013 epoch 2 - iter 150/304 - loss 0.65418758 - time (sec): 59.67 - samples/sec: 253.36 - lr: 0.000151 - momentum: 0.000000
2023-10-06 12:48:44,000 epoch 2 - iter 180/304 - loss 0.61886426 - time (sec): 71.66 - samples/sec: 253.23 - lr: 0.000150 - momentum: 0.000000
2023-10-06 12:48:55,684 epoch 2 - iter 210/304 - loss 0.57580239 - time (sec): 83.34 - samples/sec: 253.85 - lr: 0.000148 - momentum: 0.000000
2023-10-06 12:49:07,416 epoch 2 - iter 240/304 - loss 0.54630658 - time (sec): 95.07 - samples/sec: 254.46 - lr: 0.000146 - momentum: 0.000000
2023-10-06 12:49:19,759 epoch 2 - iter 270/304 - loss 0.51256755 - time (sec): 107.42 - samples/sec: 255.33 - lr: 0.000144 - momentum: 0.000000
2023-10-06 12:49:32,169 epoch 2 - iter 300/304 - loss 0.48741370 - time (sec): 119.83 - samples/sec: 256.20 - lr: 0.000143 - momentum: 0.000000
2023-10-06 12:49:33,394 ----------------------------------------------------------------------------------------------------
2023-10-06 12:49:33,394 EPOCH 2 done: loss 0.4855 - lr: 0.000143
2023-10-06 12:49:41,298 DEV : loss 0.3283654451370239 - f1-score (micro avg) 0.5362
2023-10-06 12:49:41,307 saving best model
2023-10-06 12:49:42,173 ----------------------------------------------------------------------------------------------------
2023-10-06 12:49:53,782 epoch 3 - iter 30/304 - loss 0.29136616 - time (sec): 11.61 - samples/sec: 251.91 - lr: 0.000141 - momentum: 0.000000
2023-10-06 12:50:05,419 epoch 3 - iter 60/304 - loss 0.27081873 - time (sec): 23.24 - samples/sec: 248.92 - lr: 0.000139 - momentum: 0.000000
2023-10-06 12:50:17,484 epoch 3 - iter 90/304 - loss 0.25111837 - time (sec): 35.31 - samples/sec: 251.49 - lr: 0.000137 - momentum: 0.000000
2023-10-06 12:50:30,122 epoch 3 - iter 120/304 - loss 0.23987061 - time (sec): 47.95 - samples/sec: 254.03 - lr: 0.000135 - momentum: 0.000000
2023-10-06 12:50:41,731 epoch 3 - iter 150/304 - loss 0.22015916 - time (sec): 59.56 - samples/sec: 252.95 - lr: 0.000134 - momentum: 0.000000
2023-10-06 12:50:54,171 epoch 3 - iter 180/304 - loss 0.21867332 - time (sec): 72.00 - samples/sec: 252.50 - lr: 0.000132 - momentum: 0.000000
2023-10-06 12:51:06,320 epoch 3 - iter 210/304 - loss 0.21870393 - time (sec): 84.15 - samples/sec: 253.55 - lr: 0.000130 - momentum: 0.000000
2023-10-06 12:51:18,147 epoch 3 - iter 240/304 - loss 0.21086814 - time (sec): 95.97 - samples/sec: 253.43 - lr: 0.000128 - momentum: 0.000000
2023-10-06 12:51:29,799 epoch 3 - iter 270/304 - loss 0.20454206 - time (sec): 107.62 - samples/sec: 252.60 - lr: 0.000127 - momentum: 0.000000
2023-10-06 12:51:42,391 epoch 3 - iter 300/304 - loss 0.19768497 - time (sec): 120.22 - samples/sec: 254.64 - lr: 0.000125 - momentum: 0.000000
2023-10-06 12:51:43,887 ----------------------------------------------------------------------------------------------------
2023-10-06 12:51:43,888 EPOCH 3 done: loss 0.1991 - lr: 0.000125
2023-10-06 12:51:51,708 DEV : loss 0.18981128931045532 - f1-score (micro avg) 0.7063
2023-10-06 12:51:51,715 saving best model
2023-10-06 12:51:56,041 ----------------------------------------------------------------------------------------------------
2023-10-06 12:52:08,355 epoch 4 - iter 30/304 - loss 0.12926591 - time (sec): 12.31 - samples/sec: 253.96 - lr: 0.000123 - momentum: 0.000000
2023-10-06 12:52:19,916 epoch 4 - iter 60/304 - loss 0.13502345 - time (sec): 23.87 - samples/sec: 255.72 - lr: 0.000121 - momentum: 0.000000
2023-10-06 12:52:32,469 epoch 4 - iter 90/304 - loss 0.13018387 - time (sec): 36.43 - samples/sec: 259.40 - lr: 0.000119 - momentum: 0.000000
2023-10-06 12:52:44,999 epoch 4 - iter 120/304 - loss 0.13138157 - time (sec): 48.96 - samples/sec: 259.47 - lr: 0.000118 - momentum: 0.000000
2023-10-06 12:52:56,975 epoch 4 - iter 150/304 - loss 0.12863509 - time (sec): 60.93 - samples/sec: 260.20 - lr: 0.000116 - momentum: 0.000000
2023-10-06 12:53:09,181 epoch 4 - iter 180/304 - loss 0.12083582 - time (sec): 73.14 - samples/sec: 260.04 - lr: 0.000114 - momentum: 0.000000
2023-10-06 12:53:20,223 epoch 4 - iter 210/304 - loss 0.12032196 - time (sec): 84.18 - samples/sec: 258.84 - lr: 0.000112 - momentum: 0.000000
2023-10-06 12:53:32,259 epoch 4 - iter 240/304 - loss 0.11985667 - time (sec): 96.22 - samples/sec: 257.47 - lr: 0.000111 - momentum: 0.000000
2023-10-06 12:53:43,825 epoch 4 - iter 270/304 - loss 0.12052676 - time (sec): 107.78 - samples/sec: 256.57 - lr: 0.000109 - momentum: 0.000000
2023-10-06 12:53:55,678 epoch 4 - iter 300/304 - loss 0.11444225 - time (sec): 119.64 - samples/sec: 255.57 - lr: 0.000107 - momentum: 0.000000
2023-10-06 12:53:57,244 ----------------------------------------------------------------------------------------------------
2023-10-06 12:53:57,245 EPOCH 4 done: loss 0.1131 - lr: 0.000107
2023-10-06 12:54:05,037 DEV : loss 0.15284717082977295 - f1-score (micro avg) 0.8033
2023-10-06 12:54:05,044 saving best model
2023-10-06 12:54:05,951 ----------------------------------------------------------------------------------------------------
2023-10-06 12:54:17,908 epoch 5 - iter 30/304 - loss 0.04575619 - time (sec): 11.96 - samples/sec: 256.20 - lr: 0.000105 - momentum: 0.000000
2023-10-06 12:54:29,739 epoch 5 - iter 60/304 - loss 0.07299153 - time (sec): 23.79 - samples/sec: 253.81 - lr: 0.000103 - momentum: 0.000000
2023-10-06 12:54:42,199 epoch 5 - iter 90/304 - loss 0.06582040 - time (sec): 36.25 - samples/sec: 254.90 - lr: 0.000102 - momentum: 0.000000
2023-10-06 12:54:54,338 epoch 5 - iter 120/304 - loss 0.07450045 - time (sec): 48.38 - samples/sec: 255.99 - lr: 0.000100 - momentum: 0.000000
2023-10-06 12:55:06,364 epoch 5 - iter 150/304 - loss 0.07376260 - time (sec): 60.41 - samples/sec: 256.48 - lr: 0.000098 - momentum: 0.000000
2023-10-06 12:55:18,187 epoch 5 - iter 180/304 - loss 0.07526199 - time (sec): 72.23 - samples/sec: 255.49 - lr: 0.000096 - momentum: 0.000000
2023-10-06 12:55:29,494 epoch 5 - iter 210/304 - loss 0.07185715 - time (sec): 83.54 - samples/sec: 254.41 - lr: 0.000094 - momentum: 0.000000
2023-10-06 12:55:42,056 epoch 5 - iter 240/304 - loss 0.07090551 - time (sec): 96.10 - samples/sec: 255.94 - lr: 0.000093 - momentum: 0.000000
2023-10-06 12:55:53,830 epoch 5 - iter 270/304 - loss 0.07145145 - time (sec): 107.88 - samples/sec: 255.55 - lr: 0.000091 - momentum: 0.000000
2023-10-06 12:56:05,564 epoch 5 - iter 300/304 - loss 0.07325505 - time (sec): 119.61 - samples/sec: 255.38 - lr: 0.000089 - momentum: 0.000000
2023-10-06 12:56:07,260 ----------------------------------------------------------------------------------------------------
2023-10-06 12:56:07,260 EPOCH 5 done: loss 0.0728 - lr: 0.000089
2023-10-06 12:56:15,174 DEV : loss 0.142047718167305 - f1-score (micro avg) 0.814
2023-10-06 12:56:15,183 saving best model
2023-10-06 12:56:19,505 ----------------------------------------------------------------------------------------------------
2023-10-06 12:56:31,227 epoch 6 - iter 30/304 - loss 0.02719635 - time (sec): 11.72 - samples/sec: 252.81 - lr: 0.000087 - momentum: 0.000000
2023-10-06 12:56:43,020 epoch 6 - iter 60/304 - loss 0.05139271 - time (sec): 23.51 - samples/sec: 249.52 - lr: 0.000085 - momentum: 0.000000
2023-10-06 12:56:55,522 epoch 6 - iter 90/304 - loss 0.04460360 - time (sec): 36.02 - samples/sec: 251.64 - lr: 0.000084 - momentum: 0.000000
2023-10-06 12:57:07,716 epoch 6 - iter 120/304 - loss 0.05081156 - time (sec): 48.21 - samples/sec: 254.08 - lr: 0.000082 - momentum: 0.000000
2023-10-06 12:57:19,480 epoch 6 - iter 150/304 - loss 0.04911029 - time (sec): 59.97 - samples/sec: 253.24 - lr: 0.000080 - momentum: 0.000000
2023-10-06 12:57:31,617 epoch 6 - iter 180/304 - loss 0.04644365 - time (sec): 72.11 - samples/sec: 253.34 - lr: 0.000078 - momentum: 0.000000
2023-10-06 12:57:43,643 epoch 6 - iter 210/304 - loss 0.04891098 - time (sec): 84.14 - samples/sec: 252.02 - lr: 0.000077 - momentum: 0.000000
2023-10-06 12:57:55,770 epoch 6 - iter 240/304 - loss 0.05356947 - time (sec): 96.26 - samples/sec: 252.68 - lr: 0.000075 - momentum: 0.000000
2023-10-06 12:58:07,773 epoch 6 - iter 270/304 - loss 0.05147715 - time (sec): 108.27 - samples/sec: 253.72 - lr: 0.000073 - momentum: 0.000000
2023-10-06 12:58:19,809 epoch 6 - iter 300/304 - loss 0.05598668 - time (sec): 120.30 - samples/sec: 254.28 - lr: 0.000071 - momentum: 0.000000
2023-10-06 12:58:21,304 ----------------------------------------------------------------------------------------------------
2023-10-06 12:58:21,305 EPOCH 6 done: loss 0.0559 - lr: 0.000071
2023-10-06 12:58:29,080 DEV : loss 0.1495695859193802 - f1-score (micro avg) 0.832
2023-10-06 12:58:29,087 saving best model
2023-10-06 12:58:33,422 ----------------------------------------------------------------------------------------------------
2023-10-06 12:58:45,316 epoch 7 - iter 30/304 - loss 0.07676834 - time (sec): 11.89 - samples/sec: 250.49 - lr: 0.000069 - momentum: 0.000000
2023-10-06 12:58:57,612 epoch 7 - iter 60/304 - loss 0.06182160 - time (sec): 24.19 - samples/sec: 257.73 - lr: 0.000068 - momentum: 0.000000
2023-10-06 12:59:09,523 epoch 7 - iter 90/304 - loss 0.04963749 - time (sec): 36.10 - samples/sec: 254.96 - lr: 0.000066 - momentum: 0.000000
2023-10-06 12:59:22,086 epoch 7 - iter 120/304 - loss 0.04620244 - time (sec): 48.66 - samples/sec: 256.13 - lr: 0.000064 - momentum: 0.000000
2023-10-06 12:59:34,304 epoch 7 - iter 150/304 - loss 0.04937957 - time (sec): 60.88 - samples/sec: 254.91 - lr: 0.000062 - momentum: 0.000000
2023-10-06 12:59:45,765 epoch 7 - iter 180/304 - loss 0.04586637 - time (sec): 72.34 - samples/sec: 252.30 - lr: 0.000061 - momentum: 0.000000
2023-10-06 12:59:57,410 epoch 7 - iter 210/304 - loss 0.04655117 - time (sec): 83.99 - samples/sec: 252.05 - lr: 0.000059 - momentum: 0.000000
2023-10-06 13:00:09,583 epoch 7 - iter 240/304 - loss 0.04471897 - time (sec): 96.16 - samples/sec: 253.31 - lr: 0.000057 - momentum: 0.000000
2023-10-06 13:00:21,708 epoch 7 - iter 270/304 - loss 0.04843572 - time (sec): 108.28 - samples/sec: 255.10 - lr: 0.000055 - momentum: 0.000000
2023-10-06 13:00:33,453 epoch 7 - iter 300/304 - loss 0.04574001 - time (sec): 120.03 - samples/sec: 255.21 - lr: 0.000054 - momentum: 0.000000
2023-10-06 13:00:34,898 ----------------------------------------------------------------------------------------------------
2023-10-06 13:00:34,898 EPOCH 7 done: loss 0.0453 - lr: 0.000054
2023-10-06 13:00:42,598 DEV : loss 0.15644758939743042 - f1-score (micro avg) 0.8327
2023-10-06 13:00:42,606 saving best model
2023-10-06 13:00:46,952 ----------------------------------------------------------------------------------------------------
2023-10-06 13:00:58,792 epoch 8 - iter 30/304 - loss 0.03337648 - time (sec): 11.84 - samples/sec: 265.41 - lr: 0.000052 - momentum: 0.000000
2023-10-06 13:01:11,174 epoch 8 - iter 60/304 - loss 0.03929732 - time (sec): 24.22 - samples/sec: 267.34 - lr: 0.000050 - momentum: 0.000000
2023-10-06 13:01:23,864 epoch 8 - iter 90/304 - loss 0.04373362 - time (sec): 36.91 - samples/sec: 265.92 - lr: 0.000048 - momentum: 0.000000
2023-10-06 13:01:36,022 epoch 8 - iter 120/304 - loss 0.04058002 - time (sec): 49.07 - samples/sec: 262.92 - lr: 0.000046 - momentum: 0.000000
2023-10-06 13:01:48,448 epoch 8 - iter 150/304 - loss 0.03808828 - time (sec): 61.49 - samples/sec: 261.08 - lr: 0.000045 - momentum: 0.000000
2023-10-06 13:02:00,787 epoch 8 - iter 180/304 - loss 0.03804255 - time (sec): 73.83 - samples/sec: 257.93 - lr: 0.000043 - momentum: 0.000000
2023-10-06 13:02:12,182 epoch 8 - iter 210/304 - loss 0.03612001 - time (sec): 85.23 - samples/sec: 255.09 - lr: 0.000041 - momentum: 0.000000
2023-10-06 13:02:24,337 epoch 8 - iter 240/304 - loss 0.03318245 - time (sec): 97.38 - samples/sec: 254.53 - lr: 0.000039 - momentum: 0.000000
2023-10-06 13:02:36,230 epoch 8 - iter 270/304 - loss 0.03435426 - time (sec): 109.28 - samples/sec: 253.63 - lr: 0.000038 - momentum: 0.000000
2023-10-06 13:02:47,914 epoch 8 - iter 300/304 - loss 0.03337071 - time (sec): 120.96 - samples/sec: 252.50 - lr: 0.000036 - momentum: 0.000000
2023-10-06 13:02:49,501 ----------------------------------------------------------------------------------------------------
2023-10-06 13:02:49,501 EPOCH 8 done: loss 0.0352 - lr: 0.000036
2023-10-06 13:02:57,327 DEV : loss 0.15827415883541107 - f1-score (micro avg) 0.8398
2023-10-06 13:02:57,335 saving best model
2023-10-06 13:03:01,689 ----------------------------------------------------------------------------------------------------
2023-10-06 13:03:13,919 epoch 9 - iter 30/304 - loss 0.01575648 - time (sec): 12.23 - samples/sec: 259.31 - lr: 0.000034 - momentum: 0.000000
2023-10-06 13:03:25,433 epoch 9 - iter 60/304 - loss 0.02056250 - time (sec): 23.74 - samples/sec: 251.53 - lr: 0.000032 - momentum: 0.000000
2023-10-06 13:03:37,200 epoch 9 - iter 90/304 - loss 0.02079922 - time (sec): 35.51 - samples/sec: 248.66 - lr: 0.000030 - momentum: 0.000000
2023-10-06 13:03:49,939 epoch 9 - iter 120/304 - loss 0.02414929 - time (sec): 48.25 - samples/sec: 253.38 - lr: 0.000029 - momentum: 0.000000
2023-10-06 13:04:01,757 epoch 9 - iter 150/304 - loss 0.02792466 - time (sec): 60.07 - samples/sec: 252.45 - lr: 0.000027 - momentum: 0.000000
2023-10-06 13:04:13,548 epoch 9 - iter 180/304 - loss 0.02815304 - time (sec): 71.86 - samples/sec: 252.54 - lr: 0.000025 - momentum: 0.000000
2023-10-06 13:04:25,652 epoch 9 - iter 210/304 - loss 0.03010583 - time (sec): 83.96 - samples/sec: 253.46 - lr: 0.000023 - momentum: 0.000000
2023-10-06 13:04:37,602 epoch 9 - iter 240/304 - loss 0.03102381 - time (sec): 95.91 - samples/sec: 253.15 - lr: 0.000022 - momentum: 0.000000
2023-10-06 13:04:50,279 epoch 9 - iter 270/304 - loss 0.03028348 - time (sec): 108.59 - samples/sec: 254.02 - lr: 0.000020 - momentum: 0.000000
2023-10-06 13:05:02,263 epoch 9 - iter 300/304 - loss 0.03207685 - time (sec): 120.57 - samples/sec: 253.60 - lr: 0.000018 - momentum: 0.000000
2023-10-06 13:05:03,866 ----------------------------------------------------------------------------------------------------
2023-10-06 13:05:03,866 EPOCH 9 done: loss 0.0319 - lr: 0.000018
2023-10-06 13:05:11,921 DEV : loss 0.15947996079921722 - f1-score (micro avg) 0.8394
2023-10-06 13:05:11,930 ----------------------------------------------------------------------------------------------------
2023-10-06 13:05:23,841 epoch 10 - iter 30/304 - loss 0.01644045 - time (sec): 11.91 - samples/sec: 251.22 - lr: 0.000016 - momentum: 0.000000
2023-10-06 13:05:36,089 epoch 10 - iter 60/304 - loss 0.03068829 - time (sec): 24.16 - samples/sec: 253.67 - lr: 0.000014 - momentum: 0.000000
2023-10-06 13:05:47,453 epoch 10 - iter 90/304 - loss 0.03264622 - time (sec): 35.52 - samples/sec: 251.62 - lr: 0.000013 - momentum: 0.000000
2023-10-06 13:05:59,056 epoch 10 - iter 120/304 - loss 0.02975930 - time (sec): 47.12 - samples/sec: 251.89 - lr: 0.000011 - momentum: 0.000000
2023-10-06 13:06:11,073 epoch 10 - iter 150/304 - loss 0.02762064 - time (sec): 59.14 - samples/sec: 254.19 - lr: 0.000009 - momentum: 0.000000
2023-10-06 13:06:22,912 epoch 10 - iter 180/304 - loss 0.03191522 - time (sec): 70.98 - samples/sec: 255.35 - lr: 0.000007 - momentum: 0.000000
2023-10-06 13:06:34,545 epoch 10 - iter 210/304 - loss 0.02939243 - time (sec): 82.61 - samples/sec: 255.02 - lr: 0.000006 - momentum: 0.000000
2023-10-06 13:06:46,805 epoch 10 - iter 240/304 - loss 0.03079156 - time (sec): 94.87 - samples/sec: 255.94 - lr: 0.000004 - momentum: 0.000000
2023-10-06 13:06:59,669 epoch 10 - iter 270/304 - loss 0.02970349 - time (sec): 107.74 - samples/sec: 256.52 - lr: 0.000002 - momentum: 0.000000
2023-10-06 13:07:11,360 epoch 10 - iter 300/304 - loss 0.02859991 - time (sec): 119.43 - samples/sec: 256.14 - lr: 0.000000 - momentum: 0.000000
2023-10-06 13:07:12,832 ----------------------------------------------------------------------------------------------------
2023-10-06 13:07:12,832 EPOCH 10 done: loss 0.0283 - lr: 0.000000
2023-10-06 13:07:20,906 DEV : loss 0.1603294312953949 - f1-score (micro avg) 0.8407
2023-10-06 13:07:20,913 saving best model
2023-10-06 13:07:22,828 ----------------------------------------------------------------------------------------------------
2023-10-06 13:07:22,855 Loading model from best epoch ...
2023-10-06 13:07:26,506 SequenceTagger predicts: Dictionary with 25 tags: O, S-scope, B-scope, E-scope, I-scope, S-pers, B-pers, E-pers, I-pers, S-work, B-work, E-work, I-work, S-loc, B-loc, E-loc, I-loc, S-date, B-date, E-date, I-date, S-object, B-object, E-object, I-object
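The 25-tag dictionary above is exactly the BIOES scheme over the six AJMC span types: one O tag plus S-/B-/E-/I- variants for each of scope, pers, work, loc, date, and object. A quick sketch of how that inventory expands (hypothetical helper, not Flair code):

```python
# Expand six span types into the 25-entry BIOES tag inventory printed above.
SPAN_TYPES = ["scope", "pers", "work", "loc", "date", "object"]

def bioes_tags(types):
    """O plus S-(single), B-(begin), E-(end), I-(inside) tags per span type."""
    tags = ["O"]
    for t in types:
        tags += [f"{prefix}-{t}" for prefix in "SBEI"]
    return tags

tags = bioes_tags(SPAN_TYPES)
print(len(tags), tags[:5])  # 25 ['O', 'S-scope', 'B-scope', 'E-scope', 'I-scope']
```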
2023-10-06 13:07:33,865
Results:
- F-score (micro) 0.812
- F-score (macro) 0.6518
- Accuracy 0.6914
By class:
                  precision    recall  f1-score   support

           scope     0.7640    0.8146    0.7885       151
            pers     0.7913    0.9479    0.8626        96
            work     0.7593    0.8632    0.8079        95
             loc     1.0000    0.6667    0.8000         3
            date     0.0000    0.0000    0.0000         3

       micro avg     0.7720    0.8563    0.8120       348
       macro avg     0.6629    0.6585    0.6518       348
    weighted avg     0.7657    0.8563    0.8075       348
2023-10-06 13:07:33,865 ----------------------------------------------------------------------------------------------------
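The averaged rows in the table can be reproduced from the per-class scores: macro-F1 is the unweighted mean over the five classes (the date class's 0.0 on only 3 test spans drags it well below micro-F1), micro-F1 is the harmonic mean of the pooled precision and recall, and the weighted average weights class F1 by support. A sketch of that arithmetic, using the rounded values as logged:

```python
# Recompute the averaged scores from the per-class table above.
# Each entry: (precision, recall, f1, support), as logged.
per_class = {
    "scope": (0.7640, 0.8146, 0.7885, 151),
    "pers":  (0.7913, 0.9479, 0.8626,  96),
    "work":  (0.7593, 0.8632, 0.8079,  95),
    "loc":   (1.0000, 0.6667, 0.8000,   3),
    "date":  (0.0000, 0.0000, 0.0000,   3),
}

# Macro average: unweighted mean over classes.
macro_f1 = sum(f1 for _, _, f1, _ in per_class.values()) / len(per_class)

# Micro average: harmonic mean of the pooled precision and recall.
micro_p, micro_r = 0.7720, 0.8563
micro_f1 = 2 * micro_p * micro_r / (micro_p + micro_r)

# Weighted average: class F1 weighted by support.
total = sum(s for *_, s in per_class.values())
weighted_f1 = sum(f1 * s for _, _, f1, s in per_class.values()) / total

print(f"micro {micro_f1:.4f}  macro {macro_f1:.4f}  weighted {weighted_f1:.4f}")
```

The dev-vs-test gap (0.8407 dev vs 0.812 test micro-F1) is modest, but the macro score shows the model never learned the two 3-example minority classes well.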