2023-10-17 17:34:24,572 ----------------------------------------------------------------------------------------------------
2023-10-17 17:34:24,573 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): ElectraModel(
      (embeddings): ElectraEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): ElectraEncoder(
        (layer): ModuleList(
          (0-11): 12 x ElectraLayer(
            (attention): ElectraAttention(
              (self): ElectraSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): ElectraSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): ElectraIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): ElectraOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=13, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-17 17:34:24,573 ----------------------------------------------------------------------------------------------------
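The repr above is Flair's SequenceTagger built on TransformerWordEmbeddings over an ELECTRA encoder, with locked dropout and a plain Linear(768 -> 13) head trained with cross-entropy (no CRF). A minimal, hedged sketch of how the embedding stack could be instantiated in Flair is shown below; the checkpoint identifier and the pooling/layer arguments are assumptions inferred from the base path logged further down ("poolingfirst", "layers-1"), not the author's exact script.

```python
# Hedged sketch: the embedding stack corresponding to the repr above.
from flair.embeddings import TransformerWordEmbeddings

embeddings = TransformerWordEmbeddings(
    model="hmteams/teams-base-historic-multilingual-discriminator",  # assumed HF model id (from the base path)
    layers="-1",               # last transformer layer only ("layers-1" in the base path)
    subtoken_pooling="first",  # first-subtoken pooling ("poolingfirst" in the base path)
    fine_tune=True,            # encoder weights are updated during NER fine-tuning
)
```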
2023-10-17 17:34:24,573 MultiCorpus: 14465 train + 1392 dev + 2432 test sentences
- NER_HIPE_2022 Corpus: 14465 train + 1392 dev + 2432 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/letemps/fr/with_doc_seperator
2023-10-17 17:34:24,573 ----------------------------------------------------------------------------------------------------
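A hedged sketch of loading the same HIPE-2022 "letemps" French split with Flair's built-in dataset wrapper; the keyword arguments (in particular add_document_separator) are assumptions and may differ between Flair versions.

```python
from flair.datasets import NER_HIPE_2022

# Assumed arguments, taken from the corpus line above; the cache path
# ".../letemps/fr/with_doc_seperator" suggests document separators were kept.
corpus = NER_HIPE_2022(dataset_name="letemps", language="fr", add_document_separator=True)
print(corpus)  # expected: 14465 train + 1392 dev + 2432 test sentences

# 13-tag dictionary (O plus BIOES tags for loc/pers/org, as listed near the end of this log)
label_dict = corpus.make_label_dictionary(label_type="ner")
```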
2023-10-17 17:34:24,573 Train: 14465 sentences
2023-10-17 17:34:24,574 (train_with_dev=False, train_with_test=False)
2023-10-17 17:34:24,574 ----------------------------------------------------------------------------------------------------
2023-10-17 17:34:24,574 Training Params:
2023-10-17 17:34:24,574 - learning_rate: "3e-05"
2023-10-17 17:34:24,574 - mini_batch_size: "4"
2023-10-17 17:34:24,574 - max_epochs: "10"
2023-10-17 17:34:24,574 - shuffle: "True"
2023-10-17 17:34:24,574 ----------------------------------------------------------------------------------------------------
2023-10-17 17:34:24,574 Plugins:
2023-10-17 17:34:24,574 - TensorboardLogger
2023-10-17 17:34:24,574 - LinearScheduler | warmup_fraction: '0.1'
2023-10-17 17:34:24,574 ----------------------------------------------------------------------------------------------------
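The training parameters and plugins above map onto Flair's fine-tuning API roughly as sketched below. This assumes the embeddings, corpus, and label_dict objects from the earlier sketches; the TensorBoard and linear-warmup plugin wiring is version-dependent and therefore only noted in comments.

```python
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Plain linear head over the transformer embeddings, matching the repr above
# ("crfFalse" in the base path, CrossEntropyLoss, no RNN, no reprojection layer).
tagger = SequenceTagger(
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    hidden_size=256,             # unused with use_rnn=False; kept for older constructors
    use_crf=False,
    use_rnn=False,
    reproject_embeddings=False,
)

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "hmbench-letemps/fr-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5",
    learning_rate=3e-5,
    mini_batch_size=4,
    max_epochs=10,
    shuffle=True,
    embeddings_storage_mode="none",  # "embedding storage: none" in the Computation block below
    # fine_tune applies a linear warmup/decay schedule; the log reports a LinearScheduler
    # with warmup_fraction 0.1 and a TensorboardLogger plugin (wiring omitted here).
)
```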
2023-10-17 17:34:24,574 Final evaluation on model from best epoch (best-model.pt)
2023-10-17 17:34:24,574 - metric: "('micro avg', 'f1-score')"
2023-10-17 17:34:24,574 ----------------------------------------------------------------------------------------------------
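A hedged sketch of what this final-evaluation step amounts to: the best checkpoint is reloaded and scored on the test split, with micro-average F1 as the selection metric (corpus as in the loading sketch above; the relative checkpoint path is an assumption).

```python
from flair.models import SequenceTagger

best = SequenceTagger.load("best-model.pt")  # saved under the training base path logged below
result = best.evaluate(corpus.test, gold_label_type="ner", mini_batch_size=4)
print(result.main_score)        # micro avg f1-score, the metric named above
print(result.detailed_results)  # per-class precision/recall/F1, as printed at the end of this log
```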
2023-10-17 17:34:24,574 Computation:
2023-10-17 17:34:24,574 - compute on device: cuda:0
2023-10-17 17:34:24,574 - embedding storage: none
2023-10-17 17:34:24,574 ----------------------------------------------------------------------------------------------------
2023-10-17 17:34:24,574 Model training base path: "hmbench-letemps/fr-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-5"
2023-10-17 17:34:24,574 ----------------------------------------------------------------------------------------------------
2023-10-17 17:34:24,574 ----------------------------------------------------------------------------------------------------
2023-10-17 17:34:24,574 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-17 17:34:46,275 epoch 1 - iter 361/3617 - loss 1.76323200 - time (sec): 21.70 - samples/sec: 1673.46 - lr: 0.000003 - momentum: 0.000000
2023-10-17 17:35:08,329 epoch 1 - iter 722/3617 - loss 0.97719233 - time (sec): 43.75 - samples/sec: 1722.16 - lr: 0.000006 - momentum: 0.000000
2023-10-17 17:35:29,907 epoch 1 - iter 1083/3617 - loss 0.70773638 - time (sec): 65.33 - samples/sec: 1727.89 - lr: 0.000009 - momentum: 0.000000
2023-10-17 17:35:52,050 epoch 1 - iter 1444/3617 - loss 0.57489064 - time (sec): 87.47 - samples/sec: 1695.61 - lr: 0.000012 - momentum: 0.000000
2023-10-17 17:36:14,121 epoch 1 - iter 1805/3617 - loss 0.48505733 - time (sec): 109.55 - samples/sec: 1700.09 - lr: 0.000015 - momentum: 0.000000
2023-10-17 17:36:37,171 epoch 1 - iter 2166/3617 - loss 0.42060030 - time (sec): 132.60 - samples/sec: 1702.95 - lr: 0.000018 - momentum: 0.000000
2023-10-17 17:37:01,165 epoch 1 - iter 2527/3617 - loss 0.37530833 - time (sec): 156.59 - samples/sec: 1690.36 - lr: 0.000021 - momentum: 0.000000
2023-10-17 17:37:25,310 epoch 1 - iter 2888/3617 - loss 0.34369782 - time (sec): 180.73 - samples/sec: 1670.51 - lr: 0.000024 - momentum: 0.000000
2023-10-17 17:37:48,132 epoch 1 - iter 3249/3617 - loss 0.31710623 - time (sec): 203.56 - samples/sec: 1671.28 - lr: 0.000027 - momentum: 0.000000
2023-10-17 17:38:10,854 epoch 1 - iter 3610/3617 - loss 0.29710226 - time (sec): 226.28 - samples/sec: 1676.21 - lr: 0.000030 - momentum: 0.000000
2023-10-17 17:38:11,280 ----------------------------------------------------------------------------------------------------
2023-10-17 17:38:11,281 EPOCH 1 done: loss 0.2968 - lr: 0.000030
2023-10-17 17:38:16,764 DEV : loss 0.11850162595510483 - f1-score (micro avg) 0.6174
2023-10-17 17:38:16,803 saving best model
2023-10-17 17:38:17,296 ----------------------------------------------------------------------------------------------------
2023-10-17 17:38:40,328 epoch 2 - iter 361/3617 - loss 0.09918424 - time (sec): 23.03 - samples/sec: 1616.49 - lr: 0.000030 - momentum: 0.000000
2023-10-17 17:39:04,096 epoch 2 - iter 722/3617 - loss 0.09676193 - time (sec): 46.80 - samples/sec: 1648.48 - lr: 0.000029 - momentum: 0.000000
2023-10-17 17:39:27,692 epoch 2 - iter 1083/3617 - loss 0.09693761 - time (sec): 70.39 - samples/sec: 1636.68 - lr: 0.000029 - momentum: 0.000000
2023-10-17 17:39:49,271 epoch 2 - iter 1444/3617 - loss 0.09587484 - time (sec): 91.97 - samples/sec: 1657.05 - lr: 0.000029 - momentum: 0.000000
2023-10-17 17:40:10,888 epoch 2 - iter 1805/3617 - loss 0.09859513 - time (sec): 113.59 - samples/sec: 1677.81 - lr: 0.000028 - momentum: 0.000000
2023-10-17 17:40:32,627 epoch 2 - iter 2166/3617 - loss 0.09848297 - time (sec): 135.33 - samples/sec: 1691.64 - lr: 0.000028 - momentum: 0.000000
2023-10-17 17:40:54,465 epoch 2 - iter 2527/3617 - loss 0.09927284 - time (sec): 157.17 - samples/sec: 1706.12 - lr: 0.000028 - momentum: 0.000000
2023-10-17 17:41:16,183 epoch 2 - iter 2888/3617 - loss 0.10032862 - time (sec): 178.89 - samples/sec: 1700.71 - lr: 0.000027 - momentum: 0.000000
2023-10-17 17:41:38,079 epoch 2 - iter 3249/3617 - loss 0.09865403 - time (sec): 200.78 - samples/sec: 1696.74 - lr: 0.000027 - momentum: 0.000000
2023-10-17 17:42:00,005 epoch 2 - iter 3610/3617 - loss 0.09788190 - time (sec): 222.71 - samples/sec: 1702.38 - lr: 0.000027 - momentum: 0.000000
2023-10-17 17:42:00,414 ----------------------------------------------------------------------------------------------------
2023-10-17 17:42:00,414 EPOCH 2 done: loss 0.0979 - lr: 0.000027
2023-10-17 17:42:07,594 DEV : loss 0.17727647721767426 - f1-score (micro avg) 0.643
2023-10-17 17:42:07,634 saving best model
2023-10-17 17:42:08,219 ----------------------------------------------------------------------------------------------------
2023-10-17 17:42:30,667 epoch 3 - iter 361/3617 - loss 0.08733342 - time (sec): 22.45 - samples/sec: 1697.48 - lr: 0.000026 - momentum: 0.000000
2023-10-17 17:42:52,590 epoch 3 - iter 722/3617 - loss 0.07748900 - time (sec): 44.37 - samples/sec: 1700.86 - lr: 0.000026 - momentum: 0.000000
2023-10-17 17:43:14,458 epoch 3 - iter 1083/3617 - loss 0.07705997 - time (sec): 66.24 - samples/sec: 1715.97 - lr: 0.000026 - momentum: 0.000000
2023-10-17 17:43:36,400 epoch 3 - iter 1444/3617 - loss 0.08038163 - time (sec): 88.18 - samples/sec: 1709.07 - lr: 0.000025 - momentum: 0.000000
2023-10-17 17:43:58,644 epoch 3 - iter 1805/3617 - loss 0.08125813 - time (sec): 110.42 - samples/sec: 1713.75 - lr: 0.000025 - momentum: 0.000000
2023-10-17 17:44:20,982 epoch 3 - iter 2166/3617 - loss 0.08039951 - time (sec): 132.76 - samples/sec: 1706.36 - lr: 0.000025 - momentum: 0.000000
2023-10-17 17:44:43,036 epoch 3 - iter 2527/3617 - loss 0.07890125 - time (sec): 154.81 - samples/sec: 1724.23 - lr: 0.000024 - momentum: 0.000000
2023-10-17 17:45:04,833 epoch 3 - iter 2888/3617 - loss 0.07839290 - time (sec): 176.61 - samples/sec: 1728.39 - lr: 0.000024 - momentum: 0.000000
2023-10-17 17:45:26,509 epoch 3 - iter 3249/3617 - loss 0.07824675 - time (sec): 198.29 - samples/sec: 1726.88 - lr: 0.000024 - momentum: 0.000000
2023-10-17 17:45:48,660 epoch 3 - iter 3610/3617 - loss 0.07807987 - time (sec): 220.44 - samples/sec: 1720.66 - lr: 0.000023 - momentum: 0.000000
2023-10-17 17:45:49,123 ----------------------------------------------------------------------------------------------------
2023-10-17 17:45:49,124 EPOCH 3 done: loss 0.0781 - lr: 0.000023
2023-10-17 17:45:55,465 DEV : loss 0.1918492317199707 - f1-score (micro avg) 0.6362
2023-10-17 17:45:55,506 ----------------------------------------------------------------------------------------------------
2023-10-17 17:46:17,278 epoch 4 - iter 361/3617 - loss 0.05176746 - time (sec): 21.77 - samples/sec: 1799.54 - lr: 0.000023 - momentum: 0.000000
2023-10-17 17:46:39,033 epoch 4 - iter 722/3617 - loss 0.05224585 - time (sec): 43.53 - samples/sec: 1735.45 - lr: 0.000023 - momentum: 0.000000
2023-10-17 17:47:00,821 epoch 4 - iter 1083/3617 - loss 0.05378626 - time (sec): 65.31 - samples/sec: 1742.41 - lr: 0.000022 - momentum: 0.000000
2023-10-17 17:47:22,689 epoch 4 - iter 1444/3617 - loss 0.05274648 - time (sec): 87.18 - samples/sec: 1742.54 - lr: 0.000022 - momentum: 0.000000
2023-10-17 17:47:44,755 epoch 4 - iter 1805/3617 - loss 0.05592921 - time (sec): 109.25 - samples/sec: 1745.76 - lr: 0.000022 - momentum: 0.000000
2023-10-17 17:48:07,116 epoch 4 - iter 2166/3617 - loss 0.05602792 - time (sec): 131.61 - samples/sec: 1740.39 - lr: 0.000021 - momentum: 0.000000
2023-10-17 17:48:29,541 epoch 4 - iter 2527/3617 - loss 0.05664568 - time (sec): 154.03 - samples/sec: 1726.97 - lr: 0.000021 - momentum: 0.000000
2023-10-17 17:48:52,183 epoch 4 - iter 2888/3617 - loss 0.05768577 - time (sec): 176.68 - samples/sec: 1727.09 - lr: 0.000021 - momentum: 0.000000
2023-10-17 17:49:14,599 epoch 4 - iter 3249/3617 - loss 0.05774333 - time (sec): 199.09 - samples/sec: 1723.30 - lr: 0.000020 - momentum: 0.000000
2023-10-17 17:49:36,705 epoch 4 - iter 3610/3617 - loss 0.05929256 - time (sec): 221.20 - samples/sec: 1714.57 - lr: 0.000020 - momentum: 0.000000
2023-10-17 17:49:37,109 ----------------------------------------------------------------------------------------------------
2023-10-17 17:49:37,109 EPOCH 4 done: loss 0.0592 - lr: 0.000020
2023-10-17 17:49:44,205 DEV : loss 0.28038784861564636 - f1-score (micro avg) 0.6488
2023-10-17 17:49:44,246 saving best model
2023-10-17 17:49:44,837 ----------------------------------------------------------------------------------------------------
2023-10-17 17:50:07,317 epoch 5 - iter 361/3617 - loss 0.04250399 - time (sec): 22.48 - samples/sec: 1649.11 - lr: 0.000020 - momentum: 0.000000
2023-10-17 17:50:30,124 epoch 5 - iter 722/3617 - loss 0.03994872 - time (sec): 45.29 - samples/sec: 1656.90 - lr: 0.000019 - momentum: 0.000000
2023-10-17 17:50:47,769 epoch 5 - iter 1083/3617 - loss 0.04097508 - time (sec): 62.93 - samples/sec: 1792.79 - lr: 0.000019 - momentum: 0.000000
2023-10-17 17:51:05,557 epoch 5 - iter 1444/3617 - loss 0.04166533 - time (sec): 80.72 - samples/sec: 1875.53 - lr: 0.000019 - momentum: 0.000000
2023-10-17 17:51:23,292 epoch 5 - iter 1805/3617 - loss 0.04056126 - time (sec): 98.45 - samples/sec: 1921.77 - lr: 0.000018 - momentum: 0.000000
2023-10-17 17:51:41,012 epoch 5 - iter 2166/3617 - loss 0.04115083 - time (sec): 116.17 - samples/sec: 1961.30 - lr: 0.000018 - momentum: 0.000000
2023-10-17 17:51:58,679 epoch 5 - iter 2527/3617 - loss 0.04043008 - time (sec): 133.84 - samples/sec: 1982.69 - lr: 0.000018 - momentum: 0.000000
2023-10-17 17:52:18,914 epoch 5 - iter 2888/3617 - loss 0.04158883 - time (sec): 154.08 - samples/sec: 1968.29 - lr: 0.000017 - momentum: 0.000000
2023-10-17 17:52:40,783 epoch 5 - iter 3249/3617 - loss 0.04115229 - time (sec): 175.94 - samples/sec: 1934.98 - lr: 0.000017 - momentum: 0.000000
2023-10-17 17:53:02,556 epoch 5 - iter 3610/3617 - loss 0.04082194 - time (sec): 197.72 - samples/sec: 1917.98 - lr: 0.000017 - momentum: 0.000000
2023-10-17 17:53:02,991 ----------------------------------------------------------------------------------------------------
2023-10-17 17:53:02,991 EPOCH 5 done: loss 0.0408 - lr: 0.000017
2023-10-17 17:53:09,293 DEV : loss 0.2820379436016083 - f1-score (micro avg) 0.6496
2023-10-17 17:53:09,334 saving best model
2023-10-17 17:53:09,916 ----------------------------------------------------------------------------------------------------
2023-10-17 17:53:31,840 epoch 6 - iter 361/3617 - loss 0.02861178 - time (sec): 21.92 - samples/sec: 1722.24 - lr: 0.000016 - momentum: 0.000000
2023-10-17 17:53:53,637 epoch 6 - iter 722/3617 - loss 0.02877966 - time (sec): 43.72 - samples/sec: 1739.65 - lr: 0.000016 - momentum: 0.000000
2023-10-17 17:54:15,471 epoch 6 - iter 1083/3617 - loss 0.02687634 - time (sec): 65.55 - samples/sec: 1738.76 - lr: 0.000016 - momentum: 0.000000
2023-10-17 17:54:37,386 epoch 6 - iter 1444/3617 - loss 0.02735026 - time (sec): 87.47 - samples/sec: 1743.92 - lr: 0.000015 - momentum: 0.000000
2023-10-17 17:54:59,875 epoch 6 - iter 1805/3617 - loss 0.02863562 - time (sec): 109.96 - samples/sec: 1741.78 - lr: 0.000015 - momentum: 0.000000
2023-10-17 17:55:22,840 epoch 6 - iter 2166/3617 - loss 0.02936955 - time (sec): 132.92 - samples/sec: 1726.05 - lr: 0.000015 - momentum: 0.000000
2023-10-17 17:55:45,079 epoch 6 - iter 2527/3617 - loss 0.02920557 - time (sec): 155.16 - samples/sec: 1725.78 - lr: 0.000014 - momentum: 0.000000
2023-10-17 17:56:07,698 epoch 6 - iter 2888/3617 - loss 0.02871883 - time (sec): 177.78 - samples/sec: 1719.38 - lr: 0.000014 - momentum: 0.000000
2023-10-17 17:56:30,375 epoch 6 - iter 3249/3617 - loss 0.02854119 - time (sec): 200.46 - samples/sec: 1709.07 - lr: 0.000014 - momentum: 0.000000
2023-10-17 17:56:52,589 epoch 6 - iter 3610/3617 - loss 0.02909474 - time (sec): 222.67 - samples/sec: 1702.40 - lr: 0.000013 - momentum: 0.000000
2023-10-17 17:56:53,011 ----------------------------------------------------------------------------------------------------
2023-10-17 17:56:53,011 EPOCH 6 done: loss 0.0290 - lr: 0.000013
2023-10-17 17:57:00,078 DEV : loss 0.29600751399993896 - f1-score (micro avg) 0.6426
2023-10-17 17:57:00,126 ----------------------------------------------------------------------------------------------------
2023-10-17 17:57:22,403 epoch 7 - iter 361/3617 - loss 0.02256972 - time (sec): 22.28 - samples/sec: 1687.99 - lr: 0.000013 - momentum: 0.000000
2023-10-17 17:57:44,247 epoch 7 - iter 722/3617 - loss 0.01992930 - time (sec): 44.12 - samples/sec: 1709.63 - lr: 0.000013 - momentum: 0.000000
2023-10-17 17:58:06,222 epoch 7 - iter 1083/3617 - loss 0.01936052 - time (sec): 66.09 - samples/sec: 1688.77 - lr: 0.000012 - momentum: 0.000000
2023-10-17 17:58:27,958 epoch 7 - iter 1444/3617 - loss 0.02131053 - time (sec): 87.83 - samples/sec: 1716.41 - lr: 0.000012 - momentum: 0.000000
2023-10-17 17:58:49,581 epoch 7 - iter 1805/3617 - loss 0.02100541 - time (sec): 109.45 - samples/sec: 1713.74 - lr: 0.000012 - momentum: 0.000000
2023-10-17 17:59:11,493 epoch 7 - iter 2166/3617 - loss 0.02021921 - time (sec): 131.37 - samples/sec: 1720.34 - lr: 0.000011 - momentum: 0.000000
2023-10-17 17:59:33,198 epoch 7 - iter 2527/3617 - loss 0.02012324 - time (sec): 153.07 - samples/sec: 1729.52 - lr: 0.000011 - momentum: 0.000000
2023-10-17 17:59:55,204 epoch 7 - iter 2888/3617 - loss 0.01995187 - time (sec): 175.08 - samples/sec: 1726.12 - lr: 0.000011 - momentum: 0.000000
2023-10-17 18:00:17,089 epoch 7 - iter 3249/3617 - loss 0.02065452 - time (sec): 196.96 - samples/sec: 1729.41 - lr: 0.000010 - momentum: 0.000000
2023-10-17 18:00:38,909 epoch 7 - iter 3610/3617 - loss 0.02030197 - time (sec): 218.78 - samples/sec: 1733.29 - lr: 0.000010 - momentum: 0.000000
2023-10-17 18:00:39,320 ----------------------------------------------------------------------------------------------------
2023-10-17 18:00:39,321 EPOCH 7 done: loss 0.0203 - lr: 0.000010
2023-10-17 18:00:45,581 DEV : loss 0.36072632670402527 - f1-score (micro avg) 0.6493
2023-10-17 18:00:45,624 ----------------------------------------------------------------------------------------------------
2023-10-17 18:01:07,188 epoch 8 - iter 361/3617 - loss 0.01488249 - time (sec): 21.56 - samples/sec: 1705.12 - lr: 0.000010 - momentum: 0.000000
2023-10-17 18:01:28,983 epoch 8 - iter 722/3617 - loss 0.01173994 - time (sec): 43.36 - samples/sec: 1701.96 - lr: 0.000009 - momentum: 0.000000
2023-10-17 18:01:50,897 epoch 8 - iter 1083/3617 - loss 0.01172267 - time (sec): 65.27 - samples/sec: 1717.40 - lr: 0.000009 - momentum: 0.000000
2023-10-17 18:02:12,572 epoch 8 - iter 1444/3617 - loss 0.01239870 - time (sec): 86.95 - samples/sec: 1716.68 - lr: 0.000009 - momentum: 0.000000
2023-10-17 18:02:34,782 epoch 8 - iter 1805/3617 - loss 0.01203878 - time (sec): 109.16 - samples/sec: 1726.53 - lr: 0.000008 - momentum: 0.000000
2023-10-17 18:02:56,772 epoch 8 - iter 2166/3617 - loss 0.01229690 - time (sec): 131.15 - samples/sec: 1723.41 - lr: 0.000008 - momentum: 0.000000
2023-10-17 18:03:19,358 epoch 8 - iter 2527/3617 - loss 0.01220534 - time (sec): 153.73 - samples/sec: 1711.58 - lr: 0.000008 - momentum: 0.000000
2023-10-17 18:03:41,056 epoch 8 - iter 2888/3617 - loss 0.01250368 - time (sec): 175.43 - samples/sec: 1719.43 - lr: 0.000007 - momentum: 0.000000
2023-10-17 18:04:03,226 epoch 8 - iter 3249/3617 - loss 0.01250331 - time (sec): 197.60 - samples/sec: 1722.51 - lr: 0.000007 - momentum: 0.000000
2023-10-17 18:04:25,843 epoch 8 - iter 3610/3617 - loss 0.01215736 - time (sec): 220.22 - samples/sec: 1722.37 - lr: 0.000007 - momentum: 0.000000
2023-10-17 18:04:26,278 ----------------------------------------------------------------------------------------------------
2023-10-17 18:04:26,278 EPOCH 8 done: loss 0.0122 - lr: 0.000007
2023-10-17 18:04:33,346 DEV : loss 0.3847675919532776 - f1-score (micro avg) 0.6523
2023-10-17 18:04:33,386 saving best model
2023-10-17 18:04:33,971 ----------------------------------------------------------------------------------------------------
2023-10-17 18:04:55,982 epoch 9 - iter 361/3617 - loss 0.01548248 - time (sec): 22.01 - samples/sec: 1718.42 - lr: 0.000006 - momentum: 0.000000
2023-10-17 18:05:18,605 epoch 9 - iter 722/3617 - loss 0.01155917 - time (sec): 44.63 - samples/sec: 1687.13 - lr: 0.000006 - momentum: 0.000000
2023-10-17 18:05:41,248 epoch 9 - iter 1083/3617 - loss 0.01030921 - time (sec): 67.28 - samples/sec: 1668.56 - lr: 0.000006 - momentum: 0.000000
2023-10-17 18:06:04,188 epoch 9 - iter 1444/3617 - loss 0.00923199 - time (sec): 90.22 - samples/sec: 1647.00 - lr: 0.000005 - momentum: 0.000000
2023-10-17 18:06:27,599 epoch 9 - iter 1805/3617 - loss 0.00874837 - time (sec): 113.63 - samples/sec: 1645.30 - lr: 0.000005 - momentum: 0.000000
2023-10-17 18:06:50,980 epoch 9 - iter 2166/3617 - loss 0.00882996 - time (sec): 137.01 - samples/sec: 1645.83 - lr: 0.000005 - momentum: 0.000000
2023-10-17 18:07:14,442 epoch 9 - iter 2527/3617 - loss 0.00865169 - time (sec): 160.47 - samples/sec: 1643.59 - lr: 0.000004 - momentum: 0.000000
2023-10-17 18:07:37,588 epoch 9 - iter 2888/3617 - loss 0.00811406 - time (sec): 183.62 - samples/sec: 1642.69 - lr: 0.000004 - momentum: 0.000000
2023-10-17 18:08:00,984 epoch 9 - iter 3249/3617 - loss 0.00831608 - time (sec): 207.01 - samples/sec: 1648.34 - lr: 0.000004 - momentum: 0.000000
2023-10-17 18:08:24,144 epoch 9 - iter 3610/3617 - loss 0.00812836 - time (sec): 230.17 - samples/sec: 1647.70 - lr: 0.000003 - momentum: 0.000000
2023-10-17 18:08:24,588 ----------------------------------------------------------------------------------------------------
2023-10-17 18:08:24,588 EPOCH 9 done: loss 0.0082 - lr: 0.000003
2023-10-17 18:08:30,995 DEV : loss 0.3781038522720337 - f1-score (micro avg) 0.6514
2023-10-17 18:08:31,037 ----------------------------------------------------------------------------------------------------
2023-10-17 18:08:54,748 epoch 10 - iter 361/3617 - loss 0.00510416 - time (sec): 23.71 - samples/sec: 1590.70 - lr: 0.000003 - momentum: 0.000000
2023-10-17 18:09:17,569 epoch 10 - iter 722/3617 - loss 0.00489821 - time (sec): 46.53 - samples/sec: 1609.05 - lr: 0.000003 - momentum: 0.000000
2023-10-17 18:09:39,437 epoch 10 - iter 1083/3617 - loss 0.00501388 - time (sec): 68.40 - samples/sec: 1655.25 - lr: 0.000002 - momentum: 0.000000
2023-10-17 18:10:01,410 epoch 10 - iter 1444/3617 - loss 0.00452399 - time (sec): 90.37 - samples/sec: 1685.07 - lr: 0.000002 - momentum: 0.000000
2023-10-17 18:10:23,363 epoch 10 - iter 1805/3617 - loss 0.00468649 - time (sec): 112.32 - samples/sec: 1705.08 - lr: 0.000002 - momentum: 0.000000
2023-10-17 18:10:45,165 epoch 10 - iter 2166/3617 - loss 0.00503756 - time (sec): 134.13 - samples/sec: 1710.08 - lr: 0.000001 - momentum: 0.000000
2023-10-17 18:11:07,122 epoch 10 - iter 2527/3617 - loss 0.00499945 - time (sec): 156.08 - samples/sec: 1712.08 - lr: 0.000001 - momentum: 0.000000
2023-10-17 18:11:28,917 epoch 10 - iter 2888/3617 - loss 0.00532812 - time (sec): 177.88 - samples/sec: 1716.51 - lr: 0.000001 - momentum: 0.000000
2023-10-17 18:11:50,588 epoch 10 - iter 3249/3617 - loss 0.00527748 - time (sec): 199.55 - samples/sec: 1714.00 - lr: 0.000000 - momentum: 0.000000
2023-10-17 18:12:12,769 epoch 10 - iter 3610/3617 - loss 0.00530741 - time (sec): 221.73 - samples/sec: 1710.95 - lr: 0.000000 - momentum: 0.000000
2023-10-17 18:12:13,184 ----------------------------------------------------------------------------------------------------
2023-10-17 18:12:13,184 EPOCH 10 done: loss 0.0053 - lr: 0.000000
2023-10-17 18:12:20,278 DEV : loss 0.39976608753204346 - f1-score (micro avg) 0.6548
2023-10-17 18:12:20,319 saving best model
2023-10-17 18:12:21,394 ----------------------------------------------------------------------------------------------------
2023-10-17 18:12:21,395 Loading model from best epoch ...
2023-10-17 18:12:23,141 SequenceTagger predicts: Dictionary with 13 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org
2023-10-17 18:12:31,270
Results:
- F-score (micro) 0.6652
- F-score (macro) 0.5352
- Accuracy 0.5099
By class:
              precision    recall  f1-score   support

         loc     0.6652    0.7733    0.7152       591
        pers     0.5817    0.7675    0.6618       357
         org     0.2623    0.2025    0.2286        79

   micro avg     0.6128    0.7274    0.6652      1027
   macro avg     0.5031    0.5811    0.5352      1027
weighted avg     0.6052    0.7274    0.6592      1027
2023-10-17 18:12:31,270 ----------------------------------------------------------------------------------------------------
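For completeness, a hedged usage sketch for the trained tagger; the checkpoint path and the example sentence are illustrative assumptions, not part of the training run above.

```python
from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load("best-model.pt")  # adjust to wherever the uploaded model actually lives

sentence = Sentence("Le Temps est un quotidien publié à Genève .")
tagger.predict(sentence)

# Print detected spans with their loc/pers/org labels and confidence scores.
for span in sentence.get_spans("ner"):
    print(span.text, span.get_label("ner").value, round(span.get_label("ner").score, 3))
```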