2023-10-08 23:51:59,028 ----------------------------------------------------------------------------------------------------
2023-10-08 23:51:59,029 Model: "SequenceTagger(
(embeddings): ByT5Embeddings(
(model): T5EncoderModel(
(shared): Embedding(384, 1472)
(encoder): T5Stack(
(embed_tokens): Embedding(384, 1472)
(block): ModuleList(
(0): T5Block(
(layer): ModuleList(
(0): T5LayerSelfAttention(
(SelfAttention): T5Attention(
(q): Linear(in_features=1472, out_features=384, bias=False)
(k): Linear(in_features=1472, out_features=384, bias=False)
(v): Linear(in_features=1472, out_features=384, bias=False)
(o): Linear(in_features=384, out_features=1472, bias=False)
(relative_attention_bias): Embedding(32, 6)
)
(layer_norm): T5LayerNorm()
(dropout): Dropout(p=0.1, inplace=False)
)
(1): T5LayerFF(
(DenseReluDense): T5DenseGatedActDense(
(wi_0): Linear(in_features=1472, out_features=3584, bias=False)
(wi_1): Linear(in_features=1472, out_features=3584, bias=False)
(wo): Linear(in_features=3584, out_features=1472, bias=False)
(dropout): Dropout(p=0.1, inplace=False)
(act): NewGELUActivation()
)
(layer_norm): T5LayerNorm()
(dropout): Dropout(p=0.1, inplace=False)
)
)
)
(1-11): 11 x T5Block(
(layer): ModuleList(
(0): T5LayerSelfAttention(
(SelfAttention): T5Attention(
(q): Linear(in_features=1472, out_features=384, bias=False)
(k): Linear(in_features=1472, out_features=384, bias=False)
(v): Linear(in_features=1472, out_features=384, bias=False)
(o): Linear(in_features=384, out_features=1472, bias=False)
)
(layer_norm): T5LayerNorm()
(dropout): Dropout(p=0.1, inplace=False)
)
(1): T5LayerFF(
(DenseReluDense): T5DenseGatedActDense(
(wi_0): Linear(in_features=1472, out_features=3584, bias=False)
(wi_1): Linear(in_features=1472, out_features=3584, bias=False)
(wo): Linear(in_features=3584, out_features=1472, bias=False)
(dropout): Dropout(p=0.1, inplace=False)
(act): NewGELUActivation()
)
(layer_norm): T5LayerNorm()
(dropout): Dropout(p=0.1, inplace=False)
)
)
)
)
(final_layer_norm): T5LayerNorm()
(dropout): Dropout(p=0.1, inplace=False)
)
)
)
(locked_dropout): LockedDropout(p=0.5)
(linear): Linear(in_features=1472, out_features=25, bias=True)
(loss_function): CrossEntropyLoss()
)"
2023-10-08 23:51:59,029 ----------------------------------------------------------------------------------------------------
2023-10-08 23:51:59,029 MultiCorpus: 966 train + 219 dev + 204 test sentences
- NER_HIPE_2022 Corpus: 966 train + 219 dev + 204 test sentences - /app/.flair/datasets/ner_hipe_2022/v2.1/ajmc/fr/with_doc_seperator
2023-10-08 23:51:59,029 ----------------------------------------------------------------------------------------------------
2023-10-08 23:51:59,029 Train: 966 sentences
2023-10-08 23:51:59,029 (train_with_dev=False, train_with_test=False)
2023-10-08 23:51:59,029 ----------------------------------------------------------------------------------------------------
2023-10-08 23:51:59,029 Training Params:
2023-10-08 23:51:59,029 - learning_rate: "0.00016"
2023-10-08 23:51:59,030 - mini_batch_size: "8"
2023-10-08 23:51:59,030 - max_epochs: "10"
2023-10-08 23:51:59,030 - shuffle: "True"
2023-10-08 23:51:59,030 ----------------------------------------------------------------------------------------------------
2023-10-08 23:51:59,030 Plugins:
2023-10-08 23:51:59,030 - TensorboardLogger
2023-10-08 23:51:59,030 - LinearScheduler | warmup_fraction: '0.1'
2023-10-08 23:51:59,030 ----------------------------------------------------------------------------------------------------
2023-10-08 23:51:59,030 Final evaluation on model from best epoch (best-model.pt)
2023-10-08 23:51:59,030 - metric: "('micro avg', 'f1-score')"
2023-10-08 23:51:59,030 ----------------------------------------------------------------------------------------------------
2023-10-08 23:51:59,030 Computation:
2023-10-08 23:51:59,030 - compute on device: cuda:0
2023-10-08 23:51:59,030 - embedding storage: none
2023-10-08 23:51:59,030 ----------------------------------------------------------------------------------------------------
2023-10-08 23:51:59,030 Model training base path: "hmbench-ajmc/fr-hmbyt5-preliminary/byt5-small-historic-multilingual-span20-flax-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-5"
2023-10-08 23:51:59,030 ----------------------------------------------------------------------------------------------------
2023-10-08 23:51:59,030 ----------------------------------------------------------------------------------------------------
2023-10-08 23:51:59,031 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-08 23:52:08,149 epoch 1 - iter 12/121 - loss 3.24007879 - time (sec): 9.12 - samples/sec: 251.61 - lr: 0.000015 - momentum: 0.000000
2023-10-08 23:52:17,986 epoch 1 - iter 24/121 - loss 3.23312334 - time (sec): 18.95 - samples/sec: 266.49 - lr: 0.000030 - momentum: 0.000000
2023-10-08 23:52:27,673 epoch 1 - iter 36/121 - loss 3.22226813 - time (sec): 28.64 - samples/sec: 269.54 - lr: 0.000046 - momentum: 0.000000
2023-10-08 23:52:36,996 epoch 1 - iter 48/121 - loss 3.20423696 - time (sec): 37.96 - samples/sec: 266.01 - lr: 0.000062 - momentum: 0.000000
2023-10-08 23:52:46,400 epoch 1 - iter 60/121 - loss 3.16654359 - time (sec): 47.37 - samples/sec: 267.86 - lr: 0.000078 - momentum: 0.000000
2023-10-08 23:52:55,301 epoch 1 - iter 72/121 - loss 3.11298351 - time (sec): 56.27 - samples/sec: 266.68 - lr: 0.000094 - momentum: 0.000000
2023-10-08 23:53:04,478 epoch 1 - iter 84/121 - loss 3.04136866 - time (sec): 65.45 - samples/sec: 268.62 - lr: 0.000110 - momentum: 0.000000
2023-10-08 23:53:13,407 epoch 1 - iter 96/121 - loss 2.96800067 - time (sec): 74.38 - samples/sec: 267.70 - lr: 0.000126 - momentum: 0.000000
2023-10-08 23:53:22,478 epoch 1 - iter 108/121 - loss 2.89081058 - time (sec): 83.45 - samples/sec: 265.76 - lr: 0.000141 - momentum: 0.000000
2023-10-08 23:53:31,862 epoch 1 - iter 120/121 - loss 2.80642603 - time (sec): 92.83 - samples/sec: 264.90 - lr: 0.000157 - momentum: 0.000000
2023-10-08 23:53:32,507 ----------------------------------------------------------------------------------------------------
2023-10-08 23:53:32,507 EPOCH 1 done: loss 2.8004 - lr: 0.000157
2023-10-08 23:53:38,960 DEV : loss 1.8004897832870483 - f1-score (micro avg) 0.0
2023-10-08 23:53:38,965 ----------------------------------------------------------------------------------------------------
2023-10-08 23:53:48,544 epoch 2 - iter 12/121 - loss 1.76018514 - time (sec): 9.58 - samples/sec: 258.42 - lr: 0.000158 - momentum: 0.000000
2023-10-08 23:53:58,106 epoch 2 - iter 24/121 - loss 1.62503441 - time (sec): 19.14 - samples/sec: 264.01 - lr: 0.000157 - momentum: 0.000000
2023-10-08 23:54:06,882 epoch 2 - iter 36/121 - loss 1.54613692 - time (sec): 27.92 - samples/sec: 260.04 - lr: 0.000155 - momentum: 0.000000
2023-10-08 23:54:15,815 epoch 2 - iter 48/121 - loss 1.44473077 - time (sec): 36.85 - samples/sec: 259.41 - lr: 0.000153 - momentum: 0.000000
2023-10-08 23:54:24,923 epoch 2 - iter 60/121 - loss 1.35458956 - time (sec): 45.96 - samples/sec: 258.61 - lr: 0.000151 - momentum: 0.000000
2023-10-08 23:54:34,120 epoch 2 - iter 72/121 - loss 1.28907501 - time (sec): 55.15 - samples/sec: 257.95 - lr: 0.000150 - momentum: 0.000000
2023-10-08 23:54:42,930 epoch 2 - iter 84/121 - loss 1.22982609 - time (sec): 63.96 - samples/sec: 257.27 - lr: 0.000148 - momentum: 0.000000
2023-10-08 23:54:52,296 epoch 2 - iter 96/121 - loss 1.15840086 - time (sec): 73.33 - samples/sec: 258.87 - lr: 0.000146 - momentum: 0.000000
2023-10-08 23:55:02,352 epoch 2 - iter 108/121 - loss 1.09622112 - time (sec): 83.39 - samples/sec: 260.72 - lr: 0.000144 - momentum: 0.000000
2023-10-08 23:55:12,259 epoch 2 - iter 120/121 - loss 1.04613598 - time (sec): 93.29 - samples/sec: 263.89 - lr: 0.000143 - momentum: 0.000000
2023-10-08 23:55:12,770 ----------------------------------------------------------------------------------------------------
2023-10-08 23:55:12,771 EPOCH 2 done: loss 1.0433 - lr: 0.000143
2023-10-08 23:55:19,184 DEV : loss 0.6442933082580566 - f1-score (micro avg) 0.0
2023-10-08 23:55:19,189 ----------------------------------------------------------------------------------------------------
2023-10-08 23:55:28,244 epoch 3 - iter 12/121 - loss 0.65213060 - time (sec): 9.05 - samples/sec: 258.80 - lr: 0.000141 - momentum: 0.000000
2023-10-08 23:55:38,280 epoch 3 - iter 24/121 - loss 0.56342249 - time (sec): 19.09 - samples/sec: 266.42 - lr: 0.000139 - momentum: 0.000000
2023-10-08 23:55:47,640 epoch 3 - iter 36/121 - loss 0.57699800 - time (sec): 28.45 - samples/sec: 265.17 - lr: 0.000137 - momentum: 0.000000
2023-10-08 23:55:57,112 epoch 3 - iter 48/121 - loss 0.57846581 - time (sec): 37.92 - samples/sec: 266.42 - lr: 0.000135 - momentum: 0.000000
2023-10-08 23:56:06,567 epoch 3 - iter 60/121 - loss 0.57006715 - time (sec): 47.38 - samples/sec: 266.23 - lr: 0.000134 - momentum: 0.000000
2023-10-08 23:56:15,581 epoch 3 - iter 72/121 - loss 0.56315309 - time (sec): 56.39 - samples/sec: 264.28 - lr: 0.000132 - momentum: 0.000000
2023-10-08 23:56:24,804 epoch 3 - iter 84/121 - loss 0.53809127 - time (sec): 65.61 - samples/sec: 263.05 - lr: 0.000130 - momentum: 0.000000
2023-10-08 23:56:34,535 epoch 3 - iter 96/121 - loss 0.51761936 - time (sec): 75.34 - samples/sec: 263.83 - lr: 0.000128 - momentum: 0.000000
2023-10-08 23:56:43,250 epoch 3 - iter 108/121 - loss 0.50831521 - time (sec): 84.06 - samples/sec: 262.67 - lr: 0.000127 - momentum: 0.000000
2023-10-08 23:56:52,692 epoch 3 - iter 120/121 - loss 0.49908619 - time (sec): 93.50 - samples/sec: 263.54 - lr: 0.000125 - momentum: 0.000000
2023-10-08 23:56:53,222 ----------------------------------------------------------------------------------------------------
2023-10-08 23:56:53,222 EPOCH 3 done: loss 0.4982 - lr: 0.000125
2023-10-08 23:56:59,729 DEV : loss 0.3734220862388611 - f1-score (micro avg) 0.0
2023-10-08 23:56:59,735 ----------------------------------------------------------------------------------------------------
2023-10-08 23:57:09,609 epoch 4 - iter 12/121 - loss 0.35649156 - time (sec): 9.87 - samples/sec: 275.61 - lr: 0.000123 - momentum: 0.000000
2023-10-08 23:57:19,546 epoch 4 - iter 24/121 - loss 0.36873843 - time (sec): 19.81 - samples/sec: 273.86 - lr: 0.000121 - momentum: 0.000000
2023-10-08 23:57:28,658 epoch 4 - iter 36/121 - loss 0.33494893 - time (sec): 28.92 - samples/sec: 268.38 - lr: 0.000120 - momentum: 0.000000
2023-10-08 23:57:38,638 epoch 4 - iter 48/121 - loss 0.32813231 - time (sec): 38.90 - samples/sec: 266.49 - lr: 0.000118 - momentum: 0.000000
2023-10-08 23:57:48,326 epoch 4 - iter 60/121 - loss 0.32351570 - time (sec): 48.59 - samples/sec: 265.65 - lr: 0.000116 - momentum: 0.000000
2023-10-08 23:57:57,213 epoch 4 - iter 72/121 - loss 0.32871155 - time (sec): 57.48 - samples/sec: 264.70 - lr: 0.000114 - momentum: 0.000000
2023-10-08 23:58:05,422 epoch 4 - iter 84/121 - loss 0.32152572 - time (sec): 65.69 - samples/sec: 264.62 - lr: 0.000113 - momentum: 0.000000
2023-10-08 23:58:14,425 epoch 4 - iter 96/121 - loss 0.31510401 - time (sec): 74.69 - samples/sec: 266.13 - lr: 0.000111 - momentum: 0.000000
2023-10-08 23:58:22,892 epoch 4 - iter 108/121 - loss 0.30765558 - time (sec): 83.16 - samples/sec: 266.93 - lr: 0.000109 - momentum: 0.000000
2023-10-08 23:58:31,247 epoch 4 - iter 120/121 - loss 0.29807006 - time (sec): 91.51 - samples/sec: 268.42 - lr: 0.000107 - momentum: 0.000000
2023-10-08 23:58:31,839 ----------------------------------------------------------------------------------------------------
2023-10-08 23:58:31,840 EPOCH 4 done: loss 0.2982 - lr: 0.000107
2023-10-08 23:58:37,715 DEV : loss 0.2557118535041809 - f1-score (micro avg) 0.4965
2023-10-08 23:58:37,721 saving best model
2023-10-08 23:58:38,593 ----------------------------------------------------------------------------------------------------
2023-10-08 23:58:47,348 epoch 5 - iter 12/121 - loss 0.26732774 - time (sec): 8.75 - samples/sec: 291.00 - lr: 0.000105 - momentum: 0.000000
2023-10-08 23:58:55,712 epoch 5 - iter 24/121 - loss 0.21981508 - time (sec): 17.12 - samples/sec: 284.88 - lr: 0.000104 - momentum: 0.000000
2023-10-08 23:59:04,552 epoch 5 - iter 36/121 - loss 0.21932766 - time (sec): 25.96 - samples/sec: 282.21 - lr: 0.000102 - momentum: 0.000000
2023-10-08 23:59:12,919 epoch 5 - iter 48/121 - loss 0.20749509 - time (sec): 34.32 - samples/sec: 282.06 - lr: 0.000100 - momentum: 0.000000
2023-10-08 23:59:21,455 epoch 5 - iter 60/121 - loss 0.21202799 - time (sec): 42.86 - samples/sec: 282.70 - lr: 0.000098 - momentum: 0.000000
2023-10-08 23:59:29,796 epoch 5 - iter 72/121 - loss 0.21482240 - time (sec): 51.20 - samples/sec: 281.82 - lr: 0.000097 - momentum: 0.000000
2023-10-08 23:59:38,418 epoch 5 - iter 84/121 - loss 0.22082266 - time (sec): 59.82 - samples/sec: 281.70 - lr: 0.000095 - momentum: 0.000000
2023-10-08 23:59:47,473 epoch 5 - iter 96/121 - loss 0.21824275 - time (sec): 68.88 - samples/sec: 282.95 - lr: 0.000093 - momentum: 0.000000
2023-10-08 23:59:56,488 epoch 5 - iter 108/121 - loss 0.21727918 - time (sec): 77.89 - samples/sec: 283.67 - lr: 0.000091 - momentum: 0.000000
2023-10-09 00:00:05,115 epoch 5 - iter 120/121 - loss 0.21066802 - time (sec): 86.52 - samples/sec: 283.48 - lr: 0.000090 - momentum: 0.000000
2023-10-09 00:00:05,784 ----------------------------------------------------------------------------------------------------
2023-10-09 00:00:05,784 EPOCH 5 done: loss 0.2101 - lr: 0.000090
2023-10-09 00:00:11,618 DEV : loss 0.1921490728855133 - f1-score (micro avg) 0.6798
2023-10-09 00:00:11,624 saving best model
2023-10-09 00:00:12,539 ----------------------------------------------------------------------------------------------------
2023-10-09 00:00:21,161 epoch 6 - iter 12/121 - loss 0.19619564 - time (sec): 8.62 - samples/sec: 284.10 - lr: 0.000088 - momentum: 0.000000
2023-10-09 00:00:30,392 epoch 6 - iter 24/121 - loss 0.17489469 - time (sec): 17.85 - samples/sec: 290.68 - lr: 0.000086 - momentum: 0.000000
2023-10-09 00:00:39,027 epoch 6 - iter 36/121 - loss 0.17969154 - time (sec): 26.49 - samples/sec: 289.21 - lr: 0.000084 - momentum: 0.000000
2023-10-09 00:00:47,646 epoch 6 - iter 48/121 - loss 0.17123261 - time (sec): 35.10 - samples/sec: 286.63 - lr: 0.000082 - momentum: 0.000000
2023-10-09 00:00:57,047 epoch 6 - iter 60/121 - loss 0.16359261 - time (sec): 44.51 - samples/sec: 284.41 - lr: 0.000081 - momentum: 0.000000
2023-10-09 00:01:05,558 epoch 6 - iter 72/121 - loss 0.16174851 - time (sec): 53.02 - samples/sec: 286.70 - lr: 0.000079 - momentum: 0.000000
2023-10-09 00:01:14,509 epoch 6 - iter 84/121 - loss 0.16282192 - time (sec): 61.97 - samples/sec: 286.60 - lr: 0.000077 - momentum: 0.000000
2023-10-09 00:01:22,991 epoch 6 - iter 96/121 - loss 0.15970690 - time (sec): 70.45 - samples/sec: 285.49 - lr: 0.000075 - momentum: 0.000000
2023-10-09 00:01:31,204 epoch 6 - iter 108/121 - loss 0.15976311 - time (sec): 78.66 - samples/sec: 283.21 - lr: 0.000074 - momentum: 0.000000
2023-10-09 00:01:39,831 epoch 6 - iter 120/121 - loss 0.15868564 - time (sec): 87.29 - samples/sec: 282.10 - lr: 0.000072 - momentum: 0.000000
2023-10-09 00:01:40,302 ----------------------------------------------------------------------------------------------------
2023-10-09 00:01:40,302 EPOCH 6 done: loss 0.1589 - lr: 0.000072
2023-10-09 00:01:46,235 DEV : loss 0.1633254587650299 - f1-score (micro avg) 0.8191
2023-10-09 00:01:46,240 saving best model
2023-10-09 00:01:47,154 ----------------------------------------------------------------------------------------------------
2023-10-09 00:01:55,767 epoch 7 - iter 12/121 - loss 0.14206947 - time (sec): 8.61 - samples/sec: 273.12 - lr: 0.000070 - momentum: 0.000000
2023-10-09 00:02:04,377 epoch 7 - iter 24/121 - loss 0.14789325 - time (sec): 17.22 - samples/sec: 271.28 - lr: 0.000068 - momentum: 0.000000
2023-10-09 00:02:12,757 epoch 7 - iter 36/121 - loss 0.13906464 - time (sec): 25.60 - samples/sec: 270.84 - lr: 0.000066 - momentum: 0.000000
2023-10-09 00:02:22,034 epoch 7 - iter 48/121 - loss 0.13345979 - time (sec): 34.88 - samples/sec: 274.64 - lr: 0.000065 - momentum: 0.000000
2023-10-09 00:02:30,827 epoch 7 - iter 60/121 - loss 0.13441935 - time (sec): 43.67 - samples/sec: 276.52 - lr: 0.000063 - momentum: 0.000000
2023-10-09 00:02:39,936 epoch 7 - iter 72/121 - loss 0.12810488 - time (sec): 52.78 - samples/sec: 275.86 - lr: 0.000061 - momentum: 0.000000
2023-10-09 00:02:48,983 epoch 7 - iter 84/121 - loss 0.12302416 - time (sec): 61.83 - samples/sec: 274.15 - lr: 0.000059 - momentum: 0.000000
2023-10-09 00:02:58,285 epoch 7 - iter 96/121 - loss 0.12267372 - time (sec): 71.13 - samples/sec: 274.71 - lr: 0.000058 - momentum: 0.000000
2023-10-09 00:03:07,936 epoch 7 - iter 108/121 - loss 0.12189462 - time (sec): 80.78 - samples/sec: 273.79 - lr: 0.000056 - momentum: 0.000000
2023-10-09 00:03:16,934 epoch 7 - iter 120/121 - loss 0.12532403 - time (sec): 89.78 - samples/sec: 273.63 - lr: 0.000054 - momentum: 0.000000
2023-10-09 00:03:17,532 ----------------------------------------------------------------------------------------------------
2023-10-09 00:03:17,532 EPOCH 7 done: loss 0.1249 - lr: 0.000054
2023-10-09 00:03:23,876 DEV : loss 0.14436966180801392 - f1-score (micro avg) 0.8404
2023-10-09 00:03:23,884 saving best model
2023-10-09 00:03:24,808 ----------------------------------------------------------------------------------------------------
2023-10-09 00:03:34,097 epoch 8 - iter 12/121 - loss 0.10346591 - time (sec): 9.29 - samples/sec: 264.56 - lr: 0.000052 - momentum: 0.000000
2023-10-09 00:03:43,149 epoch 8 - iter 24/121 - loss 0.10901018 - time (sec): 18.34 - samples/sec: 265.55 - lr: 0.000051 - momentum: 0.000000
2023-10-09 00:03:52,279 epoch 8 - iter 36/121 - loss 0.11799826 - time (sec): 27.47 - samples/sec: 264.19 - lr: 0.000049 - momentum: 0.000000
2023-10-09 00:04:01,499 epoch 8 - iter 48/121 - loss 0.11452594 - time (sec): 36.69 - samples/sec: 264.03 - lr: 0.000047 - momentum: 0.000000
2023-10-09 00:04:11,128 epoch 8 - iter 60/121 - loss 0.11044455 - time (sec): 46.32 - samples/sec: 264.43 - lr: 0.000045 - momentum: 0.000000
2023-10-09 00:04:20,351 epoch 8 - iter 72/121 - loss 0.10827378 - time (sec): 55.54 - samples/sec: 264.69 - lr: 0.000044 - momentum: 0.000000
2023-10-09 00:04:30,148 epoch 8 - iter 84/121 - loss 0.10461963 - time (sec): 65.34 - samples/sec: 264.87 - lr: 0.000042 - momentum: 0.000000
2023-10-09 00:04:40,036 epoch 8 - iter 96/121 - loss 0.10403268 - time (sec): 75.23 - samples/sec: 264.53 - lr: 0.000040 - momentum: 0.000000
2023-10-09 00:04:49,514 epoch 8 - iter 108/121 - loss 0.10165250 - time (sec): 84.70 - samples/sec: 264.40 - lr: 0.000038 - momentum: 0.000000
2023-10-09 00:04:58,325 epoch 8 - iter 120/121 - loss 0.10194572 - time (sec): 93.52 - samples/sec: 262.89 - lr: 0.000037 - momentum: 0.000000
2023-10-09 00:04:58,923 ----------------------------------------------------------------------------------------------------
2023-10-09 00:04:58,924 EPOCH 8 done: loss 0.1022 - lr: 0.000037
2023-10-09 00:05:05,505 DEV : loss 0.1325424313545227 - f1-score (micro avg) 0.8462
2023-10-09 00:05:05,511 saving best model
2023-10-09 00:05:06,431 ----------------------------------------------------------------------------------------------------
2023-10-09 00:05:15,691 epoch 9 - iter 12/121 - loss 0.09163936 - time (sec): 9.26 - samples/sec: 246.71 - lr: 0.000035 - momentum: 0.000000
2023-10-09 00:05:26,096 epoch 9 - iter 24/121 - loss 0.08650992 - time (sec): 19.66 - samples/sec: 261.71 - lr: 0.000033 - momentum: 0.000000
2023-10-09 00:05:35,480 epoch 9 - iter 36/121 - loss 0.08442869 - time (sec): 29.05 - samples/sec: 263.13 - lr: 0.000031 - momentum: 0.000000
2023-10-09 00:05:45,074 epoch 9 - iter 48/121 - loss 0.07985915 - time (sec): 38.64 - samples/sec: 261.69 - lr: 0.000029 - momentum: 0.000000
2023-10-09 00:05:54,452 epoch 9 - iter 60/121 - loss 0.07948707 - time (sec): 48.02 - samples/sec: 261.15 - lr: 0.000028 - momentum: 0.000000
2023-10-09 00:06:03,731 epoch 9 - iter 72/121 - loss 0.08071963 - time (sec): 57.30 - samples/sec: 262.49 - lr: 0.000026 - momentum: 0.000000
2023-10-09 00:06:12,711 epoch 9 - iter 84/121 - loss 0.08519618 - time (sec): 66.28 - samples/sec: 260.66 - lr: 0.000024 - momentum: 0.000000
2023-10-09 00:06:21,752 epoch 9 - iter 96/121 - loss 0.08669140 - time (sec): 75.32 - samples/sec: 259.96 - lr: 0.000022 - momentum: 0.000000
2023-10-09 00:06:30,804 epoch 9 - iter 108/121 - loss 0.08850931 - time (sec): 84.37 - samples/sec: 260.50 - lr: 0.000021 - momentum: 0.000000
2023-10-09 00:06:40,443 epoch 9 - iter 120/121 - loss 0.09025996 - time (sec): 94.01 - samples/sec: 261.80 - lr: 0.000019 - momentum: 0.000000
2023-10-09 00:06:40,959 ----------------------------------------------------------------------------------------------------
2023-10-09 00:06:40,959 EPOCH 9 done: loss 0.0899 - lr: 0.000019
2023-10-09 00:06:47,426 DEV : loss 0.12830205261707306 - f1-score (micro avg) 0.8383
2023-10-09 00:06:47,432 ----------------------------------------------------------------------------------------------------
2023-10-09 00:06:56,447 epoch 10 - iter 12/121 - loss 0.08773662 - time (sec): 9.01 - samples/sec: 255.96 - lr: 0.000017 - momentum: 0.000000
2023-10-09 00:07:05,846 epoch 10 - iter 24/121 - loss 0.08149445 - time (sec): 18.41 - samples/sec: 261.12 - lr: 0.000015 - momentum: 0.000000
2023-10-09 00:07:15,599 epoch 10 - iter 36/121 - loss 0.08443907 - time (sec): 28.17 - samples/sec: 262.02 - lr: 0.000013 - momentum: 0.000000
2023-10-09 00:07:25,242 epoch 10 - iter 48/121 - loss 0.08259476 - time (sec): 37.81 - samples/sec: 260.42 - lr: 0.000012 - momentum: 0.000000
2023-10-09 00:07:34,749 epoch 10 - iter 60/121 - loss 0.08190697 - time (sec): 47.32 - samples/sec: 261.35 - lr: 0.000010 - momentum: 0.000000
2023-10-09 00:07:44,460 epoch 10 - iter 72/121 - loss 0.08295489 - time (sec): 57.03 - samples/sec: 262.44 - lr: 0.000008 - momentum: 0.000000
2023-10-09 00:07:54,076 epoch 10 - iter 84/121 - loss 0.08353264 - time (sec): 66.64 - samples/sec: 262.85 - lr: 0.000006 - momentum: 0.000000
2023-10-09 00:08:02,457 epoch 10 - iter 96/121 - loss 0.08147989 - time (sec): 75.02 - samples/sec: 261.10 - lr: 0.000005 - momentum: 0.000000
2023-10-09 00:08:11,924 epoch 10 - iter 108/121 - loss 0.08233944 - time (sec): 84.49 - samples/sec: 260.14 - lr: 0.000003 - momentum: 0.000000
2023-10-09 00:08:21,661 epoch 10 - iter 120/121 - loss 0.08077834 - time (sec): 94.23 - samples/sec: 260.68 - lr: 0.000001 - momentum: 0.000000
2023-10-09 00:08:22,311 ----------------------------------------------------------------------------------------------------
2023-10-09 00:08:22,311 EPOCH 10 done: loss 0.0813 - lr: 0.000001
2023-10-09 00:08:28,891 DEV : loss 0.12556366622447968 - f1-score (micro avg) 0.8358
2023-10-09 00:08:29,753 ----------------------------------------------------------------------------------------------------
2023-10-09 00:08:29,758 Loading model from best epoch ...
2023-10-09 00:08:33,914 SequenceTagger predicts: Dictionary with 25 tags: O, S-scope, B-scope, E-scope, I-scope, S-pers, B-pers, E-pers, I-pers, S-work, B-work, E-work, I-work, S-loc, B-loc, E-loc, I-loc, S-object, B-object, E-object, I-object, S-date, B-date, E-date, I-date
2023-10-09 00:08:40,505
Results:
- F-score (micro) 0.8086
- F-score (macro) 0.4848
- Accuracy 0.7089
By class:
                 precision    recall  f1-score   support

         pers      0.8333    0.8633    0.8481       139
        scope      0.8264    0.9225    0.8718       129
         work      0.6364    0.7875    0.7039        80
          loc      0.0000    0.0000    0.0000         9
         date      0.0000    0.0000    0.0000         3

    micro avg      0.7804    0.8389    0.8086       360
    macro avg      0.4592    0.5147    0.4848       360
 weighted avg      0.7593    0.8389    0.7963       360
2023-10-09 00:08:40,505 ----------------------------------------------------------------------------------------------------
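The summary scores above can be cross-checked against the per-class table. As a minimal sanity-check sketch (all values copied from the log, not part of the original training run): micro F1 is the harmonic mean of micro precision and recall, and macro F1 is the unweighted mean of the per-class F1 scores.

```python
# Sanity-check the final test scores reported in the log above.
# All numeric values are taken verbatim from the results table.

# Micro F1 = harmonic mean of micro-averaged precision and recall.
micro_precision = 0.7804
micro_recall = 0.8389
micro_f1 = 2 * micro_precision * micro_recall / (micro_precision + micro_recall)
print(round(micro_f1, 4))  # 0.8086, matching "F-score (micro)"

# Macro F1 = unweighted mean of per-class F1 (loc and date scored 0.0).
per_class_f1 = [0.8481, 0.8718, 0.7039, 0.0000, 0.0000]
macro_f1 = sum(per_class_f1) / len(per_class_f1)
print(round(macro_f1, 4))  # 0.4848, matching "F-score (macro)"
```

Both checks reproduce the logged values, so the summary block is internally consistent with the per-class breakdown.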