2023-10-12 09:35:27,482 ----------------------------------------------------------------------------------------------------
2023-10-12 09:35:27,484 Model: "SequenceTagger(
  (embeddings): ByT5Embeddings(
    (model): T5EncoderModel(
      (shared): Embedding(384, 1472)
      (encoder): T5Stack(
        (embed_tokens): Embedding(384, 1472)
        (block): ModuleList(
          (0): T5Block(
            (layer): ModuleList(
              (0): T5LayerSelfAttention(
                (SelfAttention): T5Attention(
                  (q): Linear(in_features=1472, out_features=384, bias=False)
                  (k): Linear(in_features=1472, out_features=384, bias=False)
                  (v): Linear(in_features=1472, out_features=384, bias=False)
                  (o): Linear(in_features=384, out_features=1472, bias=False)
                  (relative_attention_bias): Embedding(32, 6)
                )
                (layer_norm): FusedRMSNorm(torch.Size([1472]), eps=1e-06, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (1): T5LayerFF(
                (DenseReluDense): T5DenseGatedActDense(
                  (wi_0): Linear(in_features=1472, out_features=3584, bias=False)
                  (wi_1): Linear(in_features=1472, out_features=3584, bias=False)
                  (wo): Linear(in_features=3584, out_features=1472, bias=False)
                  (dropout): Dropout(p=0.1, inplace=False)
                  (act): NewGELUActivation()
                )
                (layer_norm): FusedRMSNorm(torch.Size([1472]), eps=1e-06, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
          )
          (1-11): 11 x T5Block(
            (layer): ModuleList(
              (0): T5LayerSelfAttention(
                (SelfAttention): T5Attention(
                  (q): Linear(in_features=1472, out_features=384, bias=False)
                  (k): Linear(in_features=1472, out_features=384, bias=False)
                  (v): Linear(in_features=1472, out_features=384, bias=False)
                  (o): Linear(in_features=384, out_features=1472, bias=False)
                )
                (layer_norm): FusedRMSNorm(torch.Size([1472]), eps=1e-06, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (1): T5LayerFF(
                (DenseReluDense): T5DenseGatedActDense(
                  (wi_0): Linear(in_features=1472, out_features=3584, bias=False)
                  (wi_1): Linear(in_features=1472, out_features=3584, bias=False)
                  (wo): Linear(in_features=3584, out_features=1472, bias=False)
                  (dropout): Dropout(p=0.1, inplace=False)
                  (act): NewGELUActivation()
                )
                (layer_norm): FusedRMSNorm(torch.Size([1472]), eps=1e-06, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
          )
        )
        (final_layer_norm): FusedRMSNorm(torch.Size([1472]), eps=1e-06, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=1472, out_features=13, bias=True)
  (loss_function): CrossEntropyLoss()
)"
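A note on the attention shapes in the dump above: q/k/v project the d_model=1472 hidden state down to an inner dimension of 384, and o projects back up. The `relative_attention_bias: Embedding(32, 6)` line suggests 6 attention heads (one bias column per head), which with T5's usual per-head dimension of 64 (an assumption, not printed in the log) accounts for the 384. A quick sanity check of the per-block attention parameter count:

```python
# Dimensions read off the module dump above.
d_model = 1472       # hidden size of this ByT5 encoder
num_heads = 6        # inferred from relative_attention_bias: Embedding(32, 6)
d_kv = 64            # assumed per-head dim (the common T5 default; not shown in the log)
inner_dim = num_heads * d_kv
assert inner_dim == 384  # matches out_features of q/k/v and in_features of o

# q, k, v: d_model -> inner_dim; o: inner_dim -> d_model; all bias-free.
attention_params = 3 * d_model * inner_dim + inner_dim * d_model
print(attention_params)  # 2260992 weights per T5Attention module
```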
2023-10-12 09:35:27,484 ----------------------------------------------------------------------------------------------------
2023-10-12 09:35:27,484 MultiCorpus: 7936 train + 992 dev + 992 test sentences
- NER_ICDAR_EUROPEANA Corpus: 7936 train + 992 dev + 992 test sentences - /root/.flair/datasets/ner_icdar_europeana/fr
2023-10-12 09:35:27,484 ----------------------------------------------------------------------------------------------------
2023-10-12 09:35:27,485 Train: 7936 sentences
2023-10-12 09:35:27,485 (train_with_dev=False, train_with_test=False)
2023-10-12 09:35:27,485 ----------------------------------------------------------------------------------------------------
2023-10-12 09:35:27,485 Training Params:
2023-10-12 09:35:27,485 - learning_rate: "0.00016"
2023-10-12 09:35:27,485 - mini_batch_size: "8"
2023-10-12 09:35:27,485 - max_epochs: "10"
2023-10-12 09:35:27,485 - shuffle: "True"
2023-10-12 09:35:27,485 ----------------------------------------------------------------------------------------------------
2023-10-12 09:35:27,485 Plugins:
2023-10-12 09:35:27,485 - TensorboardLogger
2023-10-12 09:35:27,485 - LinearScheduler | warmup_fraction: '0.1'
2023-10-12 09:35:27,485 ----------------------------------------------------------------------------------------------------
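The LinearScheduler plugin with warmup_fraction 0.1 explains the lr column in the iteration lines below: the rate ramps linearly from 0 to the peak 0.00016 over the first 10% of steps, then decays linearly to 0. A minimal sketch, assuming 992 iterations per epoch × 10 epochs = 9,920 total steps (the function name is illustrative, not Flair's API):

```python
def linear_schedule_lr(step, peak_lr=0.00016, total_steps=992 * 10, warmup_fraction=0.1):
    """Linear warmup to peak_lr, then linear decay to zero (illustrative sketch)."""
    warmup_steps = int(total_steps * warmup_fraction)  # 992 steps = first epoch
    if step <= warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

# Reproduce a few lr values from the log:
print(f"{linear_schedule_lr(99):.6f}")        # epoch 1, iter 99  -> 0.000016
print(f"{linear_schedule_lr(992 + 99):.6f}")  # epoch 2, iter 99  -> 0.000158
print(f"{linear_schedule_lr(5 * 992):.6f}")   # end of epoch 5    -> 0.000089
```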
2023-10-12 09:35:27,485 Final evaluation on model from best epoch (best-model.pt)
2023-10-12 09:35:27,485 - metric: "('micro avg', 'f1-score')"
2023-10-12 09:35:27,486 ----------------------------------------------------------------------------------------------------
2023-10-12 09:35:27,486 Computation:
2023-10-12 09:35:27,486 - compute on device: cuda:0
2023-10-12 09:35:27,486 - embedding storage: none
2023-10-12 09:35:27,486 ----------------------------------------------------------------------------------------------------
2023-10-12 09:35:27,486 Model training base path: "hmbench-icdar/fr-hmbyt5-preliminary/byt5-small-historic-multilingual-span20-flax-bs8-wsFalse-e10-lr0.00016-poolingfirst-layers-1-crfFalse-1"
2023-10-12 09:35:27,486 ----------------------------------------------------------------------------------------------------
2023-10-12 09:35:27,486 ----------------------------------------------------------------------------------------------------
2023-10-12 09:35:27,486 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-12 09:36:15,430 epoch 1 - iter 99/992 - loss 2.58479086 - time (sec): 47.94 - samples/sec: 322.62 - lr: 0.000016 - momentum: 0.000000
2023-10-12 09:37:05,745 epoch 1 - iter 198/992 - loss 2.52870939 - time (sec): 98.26 - samples/sec: 320.85 - lr: 0.000032 - momentum: 0.000000
2023-10-12 09:37:55,606 epoch 1 - iter 297/992 - loss 2.31835059 - time (sec): 148.12 - samples/sec: 323.43 - lr: 0.000048 - momentum: 0.000000
2023-10-12 09:38:44,520 epoch 1 - iter 396/992 - loss 2.04948183 - time (sec): 197.03 - samples/sec: 326.45 - lr: 0.000064 - momentum: 0.000000
2023-10-12 09:39:33,893 epoch 1 - iter 495/992 - loss 1.78919303 - time (sec): 246.40 - samples/sec: 325.79 - lr: 0.000080 - momentum: 0.000000
2023-10-12 09:40:23,131 epoch 1 - iter 594/992 - loss 1.55565919 - time (sec): 295.64 - samples/sec: 328.11 - lr: 0.000096 - momentum: 0.000000
2023-10-12 09:41:12,279 epoch 1 - iter 693/992 - loss 1.36235321 - time (sec): 344.79 - samples/sec: 331.68 - lr: 0.000112 - momentum: 0.000000
2023-10-12 09:41:59,986 epoch 1 - iter 792/992 - loss 1.22392122 - time (sec): 392.50 - samples/sec: 333.92 - lr: 0.000128 - momentum: 0.000000
2023-10-12 09:42:46,949 epoch 1 - iter 891/992 - loss 1.11828998 - time (sec): 439.46 - samples/sec: 335.27 - lr: 0.000144 - momentum: 0.000000
2023-10-12 09:43:36,228 epoch 1 - iter 990/992 - loss 1.02891630 - time (sec): 488.74 - samples/sec: 334.90 - lr: 0.000160 - momentum: 0.000000
2023-10-12 09:43:37,265 ----------------------------------------------------------------------------------------------------
2023-10-12 09:43:37,265 EPOCH 1 done: loss 1.0274 - lr: 0.000160
2023-10-12 09:44:01,623 DEV : loss 0.1744040548801422 - f1-score (micro avg) 0.3342
2023-10-12 09:44:01,663 saving best model
2023-10-12 09:44:02,545 ----------------------------------------------------------------------------------------------------
2023-10-12 09:44:50,545 epoch 2 - iter 99/992 - loss 0.22753274 - time (sec): 48.00 - samples/sec: 343.18 - lr: 0.000158 - momentum: 0.000000
2023-10-12 09:45:38,087 epoch 2 - iter 198/992 - loss 0.19608680 - time (sec): 95.54 - samples/sec: 342.17 - lr: 0.000156 - momentum: 0.000000
2023-10-12 09:46:26,577 epoch 2 - iter 297/992 - loss 0.18358912 - time (sec): 144.03 - samples/sec: 340.53 - lr: 0.000155 - momentum: 0.000000
2023-10-12 09:47:15,707 epoch 2 - iter 396/992 - loss 0.17888904 - time (sec): 193.16 - samples/sec: 340.25 - lr: 0.000153 - momentum: 0.000000
2023-10-12 09:48:02,749 epoch 2 - iter 495/992 - loss 0.17228318 - time (sec): 240.20 - samples/sec: 342.93 - lr: 0.000151 - momentum: 0.000000
2023-10-12 09:48:49,148 epoch 2 - iter 594/992 - loss 0.16801476 - time (sec): 286.60 - samples/sec: 343.83 - lr: 0.000149 - momentum: 0.000000
2023-10-12 09:49:36,285 epoch 2 - iter 693/992 - loss 0.16526240 - time (sec): 333.74 - samples/sec: 344.83 - lr: 0.000148 - momentum: 0.000000
2023-10-12 09:50:26,071 epoch 2 - iter 792/992 - loss 0.15957310 - time (sec): 383.52 - samples/sec: 341.06 - lr: 0.000146 - momentum: 0.000000
2023-10-12 09:51:20,596 epoch 2 - iter 891/992 - loss 0.15487382 - time (sec): 438.05 - samples/sec: 335.55 - lr: 0.000144 - momentum: 0.000000
2023-10-12 09:52:15,328 epoch 2 - iter 990/992 - loss 0.15093391 - time (sec): 492.78 - samples/sec: 331.80 - lr: 0.000142 - momentum: 0.000000
2023-10-12 09:52:16,520 ----------------------------------------------------------------------------------------------------
2023-10-12 09:52:16,520 EPOCH 2 done: loss 0.1506 - lr: 0.000142
2023-10-12 09:52:45,153 DEV : loss 0.09215505421161652 - f1-score (micro avg) 0.6997
2023-10-12 09:52:45,206 saving best model
2023-10-12 09:52:53,862 ----------------------------------------------------------------------------------------------------
2023-10-12 09:53:51,412 epoch 3 - iter 99/992 - loss 0.09122045 - time (sec): 57.54 - samples/sec: 273.51 - lr: 0.000140 - momentum: 0.000000
2023-10-12 09:54:44,654 epoch 3 - iter 198/992 - loss 0.09329663 - time (sec): 110.79 - samples/sec: 288.45 - lr: 0.000139 - momentum: 0.000000
2023-10-12 09:55:42,468 epoch 3 - iter 297/992 - loss 0.09230889 - time (sec): 168.60 - samples/sec: 289.46 - lr: 0.000137 - momentum: 0.000000
2023-10-12 09:56:37,292 epoch 3 - iter 396/992 - loss 0.09112678 - time (sec): 223.42 - samples/sec: 291.75 - lr: 0.000135 - momentum: 0.000000
2023-10-12 09:57:29,861 epoch 3 - iter 495/992 - loss 0.08931256 - time (sec): 275.99 - samples/sec: 294.90 - lr: 0.000133 - momentum: 0.000000
2023-10-12 09:58:26,690 epoch 3 - iter 594/992 - loss 0.08792155 - time (sec): 332.82 - samples/sec: 292.83 - lr: 0.000132 - momentum: 0.000000
2023-10-12 09:59:24,075 epoch 3 - iter 693/992 - loss 0.08786202 - time (sec): 390.21 - samples/sec: 290.12 - lr: 0.000130 - momentum: 0.000000
2023-10-12 10:00:15,500 epoch 3 - iter 792/992 - loss 0.08568713 - time (sec): 441.63 - samples/sec: 296.32 - lr: 0.000128 - momentum: 0.000000
2023-10-12 10:01:05,466 epoch 3 - iter 891/992 - loss 0.08435593 - time (sec): 491.60 - samples/sec: 300.53 - lr: 0.000126 - momentum: 0.000000
2023-10-12 10:01:52,852 epoch 3 - iter 990/992 - loss 0.08429677 - time (sec): 538.98 - samples/sec: 303.70 - lr: 0.000125 - momentum: 0.000000
2023-10-12 10:01:53,890 ----------------------------------------------------------------------------------------------------
2023-10-12 10:01:53,890 EPOCH 3 done: loss 0.0843 - lr: 0.000125
2023-10-12 10:02:21,183 DEV : loss 0.08918626606464386 - f1-score (micro avg) 0.7407
2023-10-12 10:02:21,233 saving best model
2023-10-12 10:02:23,914 ----------------------------------------------------------------------------------------------------
2023-10-12 10:03:16,084 epoch 4 - iter 99/992 - loss 0.05874315 - time (sec): 52.17 - samples/sec: 328.59 - lr: 0.000123 - momentum: 0.000000
2023-10-12 10:04:07,859 epoch 4 - iter 198/992 - loss 0.05797209 - time (sec): 103.94 - samples/sec: 327.68 - lr: 0.000121 - momentum: 0.000000
2023-10-12 10:04:56,031 epoch 4 - iter 297/992 - loss 0.06035790 - time (sec): 152.11 - samples/sec: 328.57 - lr: 0.000119 - momentum: 0.000000
2023-10-12 10:05:43,678 epoch 4 - iter 396/992 - loss 0.05838440 - time (sec): 199.76 - samples/sec: 328.96 - lr: 0.000117 - momentum: 0.000000
2023-10-12 10:06:31,630 epoch 4 - iter 495/992 - loss 0.05780847 - time (sec): 247.71 - samples/sec: 330.93 - lr: 0.000116 - momentum: 0.000000
2023-10-12 10:07:19,579 epoch 4 - iter 594/992 - loss 0.05754794 - time (sec): 295.66 - samples/sec: 331.61 - lr: 0.000114 - momentum: 0.000000
2023-10-12 10:08:06,828 epoch 4 - iter 693/992 - loss 0.05712259 - time (sec): 342.91 - samples/sec: 334.16 - lr: 0.000112 - momentum: 0.000000
2023-10-12 10:08:54,022 epoch 4 - iter 792/992 - loss 0.05741937 - time (sec): 390.10 - samples/sec: 335.15 - lr: 0.000110 - momentum: 0.000000
2023-10-12 10:09:42,447 epoch 4 - iter 891/992 - loss 0.05589883 - time (sec): 438.53 - samples/sec: 336.91 - lr: 0.000109 - momentum: 0.000000
2023-10-12 10:10:28,979 epoch 4 - iter 990/992 - loss 0.05609350 - time (sec): 485.06 - samples/sec: 337.58 - lr: 0.000107 - momentum: 0.000000
2023-10-12 10:10:29,864 ----------------------------------------------------------------------------------------------------
2023-10-12 10:10:29,865 EPOCH 4 done: loss 0.0560 - lr: 0.000107
2023-10-12 10:10:54,352 DEV : loss 0.10516904294490814 - f1-score (micro avg) 0.7686
2023-10-12 10:10:54,392 saving best model
2023-10-12 10:10:56,993 ----------------------------------------------------------------------------------------------------
2023-10-12 10:11:51,597 epoch 5 - iter 99/992 - loss 0.04339909 - time (sec): 54.60 - samples/sec: 295.88 - lr: 0.000105 - momentum: 0.000000
2023-10-12 10:12:41,528 epoch 5 - iter 198/992 - loss 0.03593089 - time (sec): 104.53 - samples/sec: 309.37 - lr: 0.000103 - momentum: 0.000000
2023-10-12 10:13:29,901 epoch 5 - iter 297/992 - loss 0.03787890 - time (sec): 152.90 - samples/sec: 318.18 - lr: 0.000101 - momentum: 0.000000
2023-10-12 10:14:18,612 epoch 5 - iter 396/992 - loss 0.03841376 - time (sec): 201.61 - samples/sec: 322.88 - lr: 0.000100 - momentum: 0.000000
2023-10-12 10:15:09,736 epoch 5 - iter 495/992 - loss 0.03858312 - time (sec): 252.74 - samples/sec: 321.68 - lr: 0.000098 - momentum: 0.000000
2023-10-12 10:16:02,866 epoch 5 - iter 594/992 - loss 0.04020965 - time (sec): 305.87 - samples/sec: 320.15 - lr: 0.000096 - momentum: 0.000000
2023-10-12 10:16:51,758 epoch 5 - iter 693/992 - loss 0.04044582 - time (sec): 354.76 - samples/sec: 322.58 - lr: 0.000094 - momentum: 0.000000
2023-10-12 10:17:41,098 epoch 5 - iter 792/992 - loss 0.04037223 - time (sec): 404.10 - samples/sec: 325.00 - lr: 0.000093 - momentum: 0.000000
2023-10-12 10:18:30,954 epoch 5 - iter 891/992 - loss 0.04070099 - time (sec): 453.96 - samples/sec: 325.68 - lr: 0.000091 - momentum: 0.000000
2023-10-12 10:19:19,136 epoch 5 - iter 990/992 - loss 0.04132693 - time (sec): 502.14 - samples/sec: 325.85 - lr: 0.000089 - momentum: 0.000000
2023-10-12 10:19:20,111 ----------------------------------------------------------------------------------------------------
2023-10-12 10:19:20,111 EPOCH 5 done: loss 0.0413 - lr: 0.000089
2023-10-12 10:19:46,676 DEV : loss 0.11871492117643356 - f1-score (micro avg) 0.7624
2023-10-12 10:19:46,717 ----------------------------------------------------------------------------------------------------
2023-10-12 10:20:35,147 epoch 6 - iter 99/992 - loss 0.02568349 - time (sec): 48.43 - samples/sec: 323.47 - lr: 0.000087 - momentum: 0.000000
2023-10-12 10:21:29,969 epoch 6 - iter 198/992 - loss 0.02791147 - time (sec): 103.25 - samples/sec: 310.13 - lr: 0.000085 - momentum: 0.000000
2023-10-12 10:22:20,524 epoch 6 - iter 297/992 - loss 0.02795523 - time (sec): 153.80 - samples/sec: 313.23 - lr: 0.000084 - momentum: 0.000000
2023-10-12 10:23:15,727 epoch 6 - iter 396/992 - loss 0.02853544 - time (sec): 209.01 - samples/sec: 311.81 - lr: 0.000082 - momentum: 0.000000
2023-10-12 10:24:10,797 epoch 6 - iter 495/992 - loss 0.02779838 - time (sec): 264.08 - samples/sec: 307.25 - lr: 0.000080 - momentum: 0.000000
2023-10-12 10:25:04,606 epoch 6 - iter 594/992 - loss 0.02783298 - time (sec): 317.89 - samples/sec: 307.75 - lr: 0.000078 - momentum: 0.000000
2023-10-12 10:25:57,328 epoch 6 - iter 693/992 - loss 0.02926439 - time (sec): 370.61 - samples/sec: 309.48 - lr: 0.000077 - momentum: 0.000000
2023-10-12 10:26:49,073 epoch 6 - iter 792/992 - loss 0.03085251 - time (sec): 422.35 - samples/sec: 309.52 - lr: 0.000075 - momentum: 0.000000
2023-10-12 10:27:40,834 epoch 6 - iter 891/992 - loss 0.03137567 - time (sec): 474.12 - samples/sec: 310.75 - lr: 0.000073 - momentum: 0.000000
2023-10-12 10:28:32,177 epoch 6 - iter 990/992 - loss 0.03128091 - time (sec): 525.46 - samples/sec: 311.37 - lr: 0.000071 - momentum: 0.000000
2023-10-12 10:28:33,245 ----------------------------------------------------------------------------------------------------
2023-10-12 10:28:33,245 EPOCH 6 done: loss 0.0313 - lr: 0.000071
2023-10-12 10:29:00,108 DEV : loss 0.13804303109645844 - f1-score (micro avg) 0.7576
2023-10-12 10:29:00,151 ----------------------------------------------------------------------------------------------------
2023-10-12 10:29:54,368 epoch 7 - iter 99/992 - loss 0.01668401 - time (sec): 54.21 - samples/sec: 300.78 - lr: 0.000069 - momentum: 0.000000
2023-10-12 10:30:54,123 epoch 7 - iter 198/992 - loss 0.02161418 - time (sec): 113.97 - samples/sec: 289.29 - lr: 0.000068 - momentum: 0.000000
2023-10-12 10:31:45,039 epoch 7 - iter 297/992 - loss 0.02234921 - time (sec): 164.89 - samples/sec: 296.81 - lr: 0.000066 - momentum: 0.000000
2023-10-12 10:32:41,367 epoch 7 - iter 396/992 - loss 0.02304129 - time (sec): 221.21 - samples/sec: 296.76 - lr: 0.000064 - momentum: 0.000000
2023-10-12 10:33:33,523 epoch 7 - iter 495/992 - loss 0.02267395 - time (sec): 273.37 - samples/sec: 298.35 - lr: 0.000062 - momentum: 0.000000
2023-10-12 10:34:22,325 epoch 7 - iter 594/992 - loss 0.02287761 - time (sec): 322.17 - samples/sec: 303.73 - lr: 0.000061 - momentum: 0.000000
2023-10-12 10:35:13,730 epoch 7 - iter 693/992 - loss 0.02354855 - time (sec): 373.58 - samples/sec: 307.18 - lr: 0.000059 - momentum: 0.000000
2023-10-12 10:36:05,992 epoch 7 - iter 792/992 - loss 0.02437092 - time (sec): 425.84 - samples/sec: 304.43 - lr: 0.000057 - momentum: 0.000000
2023-10-12 10:36:56,191 epoch 7 - iter 891/992 - loss 0.02374473 - time (sec): 476.04 - samples/sec: 307.35 - lr: 0.000055 - momentum: 0.000000
2023-10-12 10:37:49,617 epoch 7 - iter 990/992 - loss 0.02363874 - time (sec): 529.46 - samples/sec: 309.00 - lr: 0.000053 - momentum: 0.000000
2023-10-12 10:37:50,718 ----------------------------------------------------------------------------------------------------
2023-10-12 10:37:50,719 EPOCH 7 done: loss 0.0237 - lr: 0.000053
2023-10-12 10:38:22,287 DEV : loss 0.1811920553445816 - f1-score (micro avg) 0.749
2023-10-12 10:38:22,339 ----------------------------------------------------------------------------------------------------
2023-10-12 10:39:16,794 epoch 8 - iter 99/992 - loss 0.02146333 - time (sec): 54.45 - samples/sec: 310.86 - lr: 0.000052 - momentum: 0.000000
2023-10-12 10:40:06,997 epoch 8 - iter 198/992 - loss 0.01992038 - time (sec): 104.66 - samples/sec: 308.15 - lr: 0.000050 - momentum: 0.000000
2023-10-12 10:40:54,436 epoch 8 - iter 297/992 - loss 0.02020086 - time (sec): 152.09 - samples/sec: 313.96 - lr: 0.000048 - momentum: 0.000000
2023-10-12 10:41:44,381 epoch 8 - iter 396/992 - loss 0.01943430 - time (sec): 202.04 - samples/sec: 315.64 - lr: 0.000046 - momentum: 0.000000
2023-10-12 10:42:34,397 epoch 8 - iter 495/992 - loss 0.01928764 - time (sec): 252.06 - samples/sec: 319.49 - lr: 0.000045 - momentum: 0.000000
2023-10-12 10:43:25,292 epoch 8 - iter 594/992 - loss 0.01982911 - time (sec): 302.95 - samples/sec: 323.08 - lr: 0.000043 - momentum: 0.000000
2023-10-12 10:44:22,487 epoch 8 - iter 693/992 - loss 0.01964615 - time (sec): 360.15 - samples/sec: 316.37 - lr: 0.000041 - momentum: 0.000000
2023-10-12 10:45:13,212 epoch 8 - iter 792/992 - loss 0.01935565 - time (sec): 410.87 - samples/sec: 317.94 - lr: 0.000039 - momentum: 0.000000
2023-10-12 10:46:03,197 epoch 8 - iter 891/992 - loss 0.01971031 - time (sec): 460.86 - samples/sec: 318.23 - lr: 0.000037 - momentum: 0.000000
2023-10-12 10:47:03,208 epoch 8 - iter 990/992 - loss 0.01892606 - time (sec): 520.87 - samples/sec: 314.37 - lr: 0.000036 - momentum: 0.000000
2023-10-12 10:47:04,250 ----------------------------------------------------------------------------------------------------
2023-10-12 10:47:04,250 EPOCH 8 done: loss 0.0189 - lr: 0.000036
2023-10-12 10:47:29,935 DEV : loss 0.18165574967861176 - f1-score (micro avg) 0.7538
2023-10-12 10:47:29,980 ----------------------------------------------------------------------------------------------------
2023-10-12 10:48:24,334 epoch 9 - iter 99/992 - loss 0.02135843 - time (sec): 54.35 - samples/sec: 316.68 - lr: 0.000034 - momentum: 0.000000
2023-10-12 10:49:15,051 epoch 9 - iter 198/992 - loss 0.01908305 - time (sec): 105.07 - samples/sec: 320.27 - lr: 0.000032 - momentum: 0.000000
2023-10-12 10:50:05,328 epoch 9 - iter 297/992 - loss 0.01680344 - time (sec): 155.35 - samples/sec: 325.86 - lr: 0.000030 - momentum: 0.000000
2023-10-12 10:50:54,195 epoch 9 - iter 396/992 - loss 0.01730348 - time (sec): 204.21 - samples/sec: 324.06 - lr: 0.000029 - momentum: 0.000000
2023-10-12 10:51:45,215 epoch 9 - iter 495/992 - loss 0.01563250 - time (sec): 255.23 - samples/sec: 323.92 - lr: 0.000027 - momentum: 0.000000
2023-10-12 10:52:35,904 epoch 9 - iter 594/992 - loss 0.01487404 - time (sec): 305.92 - samples/sec: 323.84 - lr: 0.000025 - momentum: 0.000000
2023-10-12 10:53:26,780 epoch 9 - iter 693/992 - loss 0.01457903 - time (sec): 356.80 - samples/sec: 324.34 - lr: 0.000023 - momentum: 0.000000
2023-10-12 10:54:19,486 epoch 9 - iter 792/992 - loss 0.01480671 - time (sec): 409.50 - samples/sec: 320.45 - lr: 0.000022 - momentum: 0.000000
2023-10-12 10:55:09,652 epoch 9 - iter 891/992 - loss 0.01542691 - time (sec): 459.67 - samples/sec: 320.55 - lr: 0.000020 - momentum: 0.000000
2023-10-12 10:56:00,454 epoch 9 - iter 990/992 - loss 0.01512327 - time (sec): 510.47 - samples/sec: 320.57 - lr: 0.000018 - momentum: 0.000000
2023-10-12 10:56:01,434 ----------------------------------------------------------------------------------------------------
2023-10-12 10:56:01,434 EPOCH 9 done: loss 0.0152 - lr: 0.000018
2023-10-12 10:56:32,551 DEV : loss 0.19330915808677673 - f1-score (micro avg) 0.7607
2023-10-12 10:56:32,595 ----------------------------------------------------------------------------------------------------
2023-10-12 10:57:21,108 epoch 10 - iter 99/992 - loss 0.01035239 - time (sec): 48.51 - samples/sec: 344.03 - lr: 0.000016 - momentum: 0.000000
2023-10-12 10:58:11,337 epoch 10 - iter 198/992 - loss 0.01080075 - time (sec): 98.74 - samples/sec: 331.86 - lr: 0.000014 - momentum: 0.000000
2023-10-12 10:59:05,409 epoch 10 - iter 297/992 - loss 0.01162795 - time (sec): 152.81 - samples/sec: 323.25 - lr: 0.000013 - momentum: 0.000000
2023-10-12 10:59:56,254 epoch 10 - iter 396/992 - loss 0.01246594 - time (sec): 203.66 - samples/sec: 323.79 - lr: 0.000011 - momentum: 0.000000
2023-10-12 11:00:46,032 epoch 10 - iter 495/992 - loss 0.01213316 - time (sec): 253.43 - samples/sec: 325.65 - lr: 0.000009 - momentum: 0.000000
2023-10-12 11:01:34,559 epoch 10 - iter 594/992 - loss 0.01206470 - time (sec): 301.96 - samples/sec: 325.89 - lr: 0.000007 - momentum: 0.000000
2023-10-12 11:02:23,481 epoch 10 - iter 693/992 - loss 0.01218444 - time (sec): 350.88 - samples/sec: 325.35 - lr: 0.000006 - momentum: 0.000000
2023-10-12 11:03:16,546 epoch 10 - iter 792/992 - loss 0.01226512 - time (sec): 403.95 - samples/sec: 323.31 - lr: 0.000004 - momentum: 0.000000
2023-10-12 11:04:05,167 epoch 10 - iter 891/992 - loss 0.01227256 - time (sec): 452.57 - samples/sec: 325.33 - lr: 0.000002 - momentum: 0.000000
2023-10-12 11:04:54,115 epoch 10 - iter 990/992 - loss 0.01280459 - time (sec): 501.52 - samples/sec: 326.55 - lr: 0.000000 - momentum: 0.000000
2023-10-12 11:04:54,990 ----------------------------------------------------------------------------------------------------
2023-10-12 11:04:54,990 EPOCH 10 done: loss 0.0129 - lr: 0.000000
2023-10-12 11:05:20,207 DEV : loss 0.1981041431427002 - f1-score (micro avg) 0.7596
2023-10-12 11:05:21,142 ----------------------------------------------------------------------------------------------------
2023-10-12 11:05:21,144 Loading model from best epoch ...
2023-10-12 11:05:25,056 SequenceTagger predicts: Dictionary with 13 tags: O, S-PER, B-PER, E-PER, I-PER, S-LOC, B-LOC, E-LOC, I-LOC, S-ORG, B-ORG, E-ORG, I-ORG
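The 13-tag dictionary printed above is the O tag plus the four BIOES prefixes (S, B, E, I) for each of the three entity types, which also matches the classification head's `Linear(in_features=1472, out_features=13)`. A sketch rebuilding it:

```python
# Rebuild the BIOES tag dictionary the tagger predicts over.
entity_types = ["PER", "LOC", "ORG"]
tags = ["O"] + [f"{prefix}-{etype}"
                for etype in entity_types
                for prefix in ("S", "B", "E", "I")]
print(len(tags))  # 13, matching out_features=13 of the linear head
print(tags)       # ['O', 'S-PER', 'B-PER', 'E-PER', 'I-PER', 'S-LOC', ...]
```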
2023-10-12 11:05:50,281
Results:
- F-score (micro) 0.7433
- F-score (macro) 0.6567
- Accuracy 0.6209
By class:
              precision    recall  f1-score   support

         LOC     0.8058    0.8107    0.8082       655
         PER     0.7102    0.7803    0.7436       223
         ORG     0.4044    0.4331    0.4183       127

   micro avg     0.7308    0.7562    0.7433      1005
   macro avg     0.6401    0.6747    0.6567      1005
weighted avg     0.7338    0.7562    0.7446      1005
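The aggregate rows follow directly from the per-class rows: macro is the unweighted mean over classes, weighted is the support-weighted mean, and micro-F1 is the harmonic mean of micro precision and recall (the micro precision/recall themselves need raw TP/FP/FN counts, so they are taken as given here). A sketch verifying the table:

```python
# Per-class rows from the evaluation table: (precision, recall, f1, support)
by_class = {
    "LOC": (0.8058, 0.8107, 0.8082, 655),
    "PER": (0.7102, 0.7803, 0.7436, 223),
    "ORG": (0.4044, 0.4331, 0.4183, 127),
}
total_support = sum(s for *_, s in by_class.values())  # 1005 gold spans

macro_f1 = sum(f1 for _, _, f1, _ in by_class.values()) / len(by_class)
weighted_f1 = sum(f1 * s for _, _, f1, s in by_class.values()) / total_support
p_micro, r_micro = 0.7308, 0.7562  # micro precision/recall as logged
micro_f1 = 2 * p_micro * r_micro / (p_micro + r_micro)

print(f"{macro_f1:.4f} {weighted_f1:.4f} {micro_f1:.4f}")  # 0.6567 0.7446 0.7433
```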
2023-10-12 11:05:50,281 ----------------------------------------------------------------------------------------------------