2023-10-25 21:33:51,878 ----------------------------------------------------------------------------------------------------
2023-10-25 21:33:51,879 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(64001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=17, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-25 21:33:51,879 ----------------------------------------------------------------------------------------------------
2023-10-25 21:33:51,880 MultiCorpus: 1085 train + 148 dev + 364 test sentences
 - NER_HIPE_2022 Corpus: 1085 train + 148 dev + 364 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/newseye/sv/with_doc_seperator
2023-10-25 21:33:51,880 ----------------------------------------------------------------------------------------------------
2023-10-25 21:33:51,880 Train:  1085 sentences
2023-10-25 21:33:51,880         (train_with_dev=False, train_with_test=False)
2023-10-25 21:33:51,880 ----------------------------------------------------------------------------------------------------
2023-10-25 21:33:51,880 Training Params:
2023-10-25 21:33:51,880  - learning_rate: "5e-05"
2023-10-25 21:33:51,880  - mini_batch_size: "8"
2023-10-25 21:33:51,880  - max_epochs: "10"
2023-10-25 21:33:51,880  - shuffle: "True"
2023-10-25 21:33:51,880 ----------------------------------------------------------------------------------------------------
2023-10-25 21:33:51,880 Plugins:
2023-10-25 21:33:51,880  - TensorboardLogger
2023-10-25 21:33:51,880  - LinearScheduler | warmup_fraction: '0.1'
2023-10-25 21:33:51,880 ----------------------------------------------------------------------------------------------------
2023-10-25 21:33:51,880 Final evaluation on model from best epoch (best-model.pt)
2023-10-25 21:33:51,880  - metric: "('micro avg', 'f1-score')"
2023-10-25 21:33:51,880 ----------------------------------------------------------------------------------------------------
2023-10-25 21:33:51,880 Computation:
2023-10-25 21:33:51,880  - compute on device: cuda:0
2023-10-25 21:33:51,880  - embedding storage: none
2023-10-25 21:33:51,881 ----------------------------------------------------------------------------------------------------
2023-10-25 21:33:51,881 Model training base path: "hmbench-newseye/sv-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5"
2023-10-25 21:33:51,881 ----------------------------------------------------------------------------------------------------
2023-10-25 21:33:51,881 ----------------------------------------------------------------------------------------------------
2023-10-25 21:33:51,881 Logging anything other than scalars to
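The LinearScheduler plugin above (peak learning_rate 5e-05, warmup_fraction 0.1) explains the lr values in the per-iteration lines that follow: over 10 epochs of 136 batches (1360 steps), the rate climbs linearly for the first 10% of steps and then decays linearly to zero. A minimal sketch of that schedule, assuming a simple step-based formulation (the function name and exact step accounting are illustrative, not Flair's internal implementation):

```python
def linear_warmup_lr(step, total_steps=1360, peak_lr=5e-05, warmup_fraction=0.1):
    """Linear warmup to peak_lr, then linear decay to 0 (illustrative sketch)."""
    warmup_steps = int(total_steps * warmup_fraction)  # 136 steps = 1 epoch here
    if step < warmup_steps:
        return peak_lr * step / warmup_steps  # warmup phase
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)  # decay

# Peak is reached at the end of epoch 1 (lr: 0.000050 at epoch 2, iter 13),
# and the rate hits 0 on the final batches ("lr: 0.000000" at epoch 10, iter 130).
print(linear_warmup_lr(68), linear_warmup_lr(136), linear_warmup_lr(1360))
```

This matches the logged trajectory (e.g. ~2.5e-05 halfway through epoch 1, ~0 at the end of training).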
TensorBoard is currently not supported.
2023-10-25 21:33:52,790 epoch 1 - iter 13/136 - loss 2.59051122 - time (sec): 0.91 - samples/sec: 5504.78 - lr: 0.000004 - momentum: 0.000000
2023-10-25 21:33:53,806 epoch 1 - iter 26/136 - loss 2.00418505 - time (sec): 1.92 - samples/sec: 5261.02 - lr: 0.000009 - momentum: 0.000000
2023-10-25 21:33:54,806 epoch 1 - iter 39/136 - loss 1.53400120 - time (sec): 2.92 - samples/sec: 5235.21 - lr: 0.000014 - momentum: 0.000000
2023-10-25 21:33:55,907 epoch 1 - iter 52/136 - loss 1.25236188 - time (sec): 4.03 - samples/sec: 5213.17 - lr: 0.000019 - momentum: 0.000000
2023-10-25 21:33:56,961 epoch 1 - iter 65/136 - loss 1.09370900 - time (sec): 5.08 - samples/sec: 5086.47 - lr: 0.000024 - momentum: 0.000000
2023-10-25 21:33:58,003 epoch 1 - iter 78/136 - loss 0.96770702 - time (sec): 6.12 - samples/sec: 5037.96 - lr: 0.000028 - momentum: 0.000000
2023-10-25 21:33:59,051 epoch 1 - iter 91/136 - loss 0.86306761 - time (sec): 7.17 - samples/sec: 5068.05 - lr: 0.000033 - momentum: 0.000000
2023-10-25 21:34:00,036 epoch 1 - iter 104/136 - loss 0.78919472 - time (sec): 8.15 - samples/sec: 5034.91 - lr: 0.000038 - momentum: 0.000000
2023-10-25 21:34:01,075 epoch 1 - iter 117/136 - loss 0.72503737 - time (sec): 9.19 - samples/sec: 5003.90 - lr: 0.000043 - momentum: 0.000000
2023-10-25 21:34:01,983 epoch 1 - iter 130/136 - loss 0.68591323 - time (sec): 10.10 - samples/sec: 4944.79 - lr: 0.000047 - momentum: 0.000000
2023-10-25 21:34:02,385 ----------------------------------------------------------------------------------------------------
2023-10-25 21:34:02,385 EPOCH 1 done: loss 0.6651 - lr: 0.000047
2023-10-25 21:34:03,425 DEV : loss 0.12261621654033661 - f1-score (micro avg) 0.7132
2023-10-25 21:34:03,431 saving best model
2023-10-25 21:34:03,923 ----------------------------------------------------------------------------------------------------
2023-10-25 21:34:04,893 epoch 2 - iter 13/136 - loss 0.12096013 - time (sec): 0.97 - samples/sec: 5279.05 - lr: 0.000050 - momentum: 0.000000
2023-10-25 21:34:05,859 epoch 2 - iter 26/136 - loss 0.13795080 - time (sec): 1.93 - samples/sec: 5464.64 - lr: 0.000049 - momentum: 0.000000
2023-10-25 21:34:06,909 epoch 2 - iter 39/136 - loss 0.13603380 - time (sec): 2.98 - samples/sec: 4998.10 - lr: 0.000048 - momentum: 0.000000
2023-10-25 21:34:07,877 epoch 2 - iter 52/136 - loss 0.13411210 - time (sec): 3.95 - samples/sec: 4984.89 - lr: 0.000048 - momentum: 0.000000
2023-10-25 21:34:08,809 epoch 2 - iter 65/136 - loss 0.12883628 - time (sec): 4.88 - samples/sec: 5023.44 - lr: 0.000047 - momentum: 0.000000
2023-10-25 21:34:09,762 epoch 2 - iter 78/136 - loss 0.13332562 - time (sec): 5.84 - samples/sec: 5105.30 - lr: 0.000047 - momentum: 0.000000
2023-10-25 21:34:10,762 epoch 2 - iter 91/136 - loss 0.13183996 - time (sec): 6.84 - samples/sec: 5115.96 - lr: 0.000046 - momentum: 0.000000
2023-10-25 21:34:11,773 epoch 2 - iter 104/136 - loss 0.13051739 - time (sec): 7.85 - samples/sec: 5030.07 - lr: 0.000046 - momentum: 0.000000
2023-10-25 21:34:12,740 epoch 2 - iter 117/136 - loss 0.12896418 - time (sec): 8.82 - samples/sec: 5111.03 - lr: 0.000045 - momentum: 0.000000
2023-10-25 21:34:13,702 epoch 2 - iter 130/136 - loss 0.12638642 - time (sec): 9.78 - samples/sec: 5071.78 - lr: 0.000045 - momentum: 0.000000
2023-10-25 21:34:14,159 ----------------------------------------------------------------------------------------------------
2023-10-25 21:34:14,159 EPOCH 2 done: loss 0.1251 - lr: 0.000045
2023-10-25 21:34:15,382 DEV : loss 0.10230904072523117 - f1-score (micro avg) 0.769
2023-10-25 21:34:15,388 saving best model
2023-10-25 21:34:16,085 ----------------------------------------------------------------------------------------------------
2023-10-25 21:34:17,037 epoch 3 - iter 13/136 - loss 0.06535754 - time (sec): 0.95 - samples/sec: 4487.39 - lr: 0.000044 - momentum: 0.000000
2023-10-25 21:34:17,959 epoch 3 - iter 26/136 - loss 0.07728296 - time (sec): 1.87 - samples/sec: 4793.63 - lr: 0.000043 - momentum: 0.000000
2023-10-25 21:34:19,031 epoch 3 - iter 39/136 - loss 0.06353931 - time (sec): 2.94 - samples/sec: 4846.67 - lr: 0.000043 - momentum: 0.000000
2023-10-25 21:34:19,925 epoch 3 - iter 52/136 - loss 0.06837432 - time (sec): 3.84 - samples/sec: 4968.20 - lr: 0.000042 - momentum: 0.000000
2023-10-25 21:34:20,989 epoch 3 - iter 65/136 - loss 0.06431996 - time (sec): 4.90 - samples/sec: 4916.87 - lr: 0.000042 - momentum: 0.000000
2023-10-25 21:34:22,022 epoch 3 - iter 78/136 - loss 0.06167304 - time (sec): 5.93 - samples/sec: 5110.38 - lr: 0.000041 - momentum: 0.000000
2023-10-25 21:34:23,093 epoch 3 - iter 91/136 - loss 0.06277312 - time (sec): 7.01 - samples/sec: 5092.42 - lr: 0.000041 - momentum: 0.000000
2023-10-25 21:34:23,978 epoch 3 - iter 104/136 - loss 0.06304171 - time (sec): 7.89 - samples/sec: 5074.43 - lr: 0.000040 - momentum: 0.000000
2023-10-25 21:34:24,902 epoch 3 - iter 117/136 - loss 0.06234476 - time (sec): 8.81 - samples/sec: 5029.02 - lr: 0.000040 - momentum: 0.000000
2023-10-25 21:34:25,862 epoch 3 - iter 130/136 - loss 0.06152829 - time (sec): 9.77 - samples/sec: 5041.98 - lr: 0.000039 - momentum: 0.000000
2023-10-25 21:34:26,356 ----------------------------------------------------------------------------------------------------
2023-10-25 21:34:26,357 EPOCH 3 done: loss 0.0619 - lr: 0.000039
2023-10-25 21:34:27,513 DEV : loss 0.11677566170692444 - f1-score (micro avg) 0.7711
2023-10-25 21:34:27,519 saving best model
2023-10-25 21:34:28,211 ----------------------------------------------------------------------------------------------------
2023-10-25 21:34:29,551 epoch 4 - iter 13/136 - loss 0.03055324 - time (sec): 1.34 - samples/sec: 4020.86 - lr: 0.000038 - momentum: 0.000000
2023-10-25 21:34:30,638 epoch 4 - iter 26/136 - loss 0.03116300 - time (sec): 2.42 - samples/sec: 4639.52 - lr: 0.000038 - momentum: 0.000000
2023-10-25 21:34:31,721 epoch 4 - iter 39/136 - loss 0.03021750 - time (sec): 3.51 - samples/sec: 4715.23 - lr: 0.000037 - momentum: 0.000000
2023-10-25 21:34:32,625 epoch 4 - iter 52/136 - loss 0.02989708 - time (sec): 4.41 - samples/sec: 4771.58 - lr: 0.000037 - momentum: 0.000000
2023-10-25 21:34:33,518 epoch 4 - iter 65/136 - loss 0.03206761 - time (sec): 5.30 - samples/sec: 4780.00 - lr: 0.000036 - momentum: 0.000000
2023-10-25 21:34:34,528 epoch 4 - iter 78/136 - loss 0.03406822 - time (sec): 6.31 - samples/sec: 4756.18 - lr: 0.000036 - momentum: 0.000000
2023-10-25 21:34:35,598 epoch 4 - iter 91/136 - loss 0.03482818 - time (sec): 7.38 - samples/sec: 4768.44 - lr: 0.000035 - momentum: 0.000000
2023-10-25 21:34:36,625 epoch 4 - iter 104/136 - loss 0.03447137 - time (sec): 8.41 - samples/sec: 4839.62 - lr: 0.000035 - momentum: 0.000000
2023-10-25 21:34:37,523 epoch 4 - iter 117/136 - loss 0.03504448 - time (sec): 9.31 - samples/sec: 4838.91 - lr: 0.000034 - momentum: 0.000000
2023-10-25 21:34:38,577 epoch 4 - iter 130/136 - loss 0.03602045 - time (sec): 10.36 - samples/sec: 4814.68 - lr: 0.000034 - momentum: 0.000000
2023-10-25 21:34:38,986 ----------------------------------------------------------------------------------------------------
2023-10-25 21:34:38,987 EPOCH 4 done: loss 0.0358 - lr: 0.000034
2023-10-25 21:34:40,156 DEV : loss 0.11416536569595337 - f1-score (micro avg) 0.8133
2023-10-25 21:34:40,164 saving best model
2023-10-25 21:34:40,872 ----------------------------------------------------------------------------------------------------
2023-10-25 21:34:41,893 epoch 5 - iter 13/136 - loss 0.01946253 - time (sec): 1.02 - samples/sec: 4960.50 - lr: 0.000033 - momentum: 0.000000
2023-10-25 21:34:42,855 epoch 5 - iter 26/136 - loss 0.01547288 - time (sec): 1.98 - samples/sec: 4757.71 - lr: 0.000032 - momentum: 0.000000
2023-10-25 21:34:43,781 epoch 5 - iter 39/136 - loss 0.01856037 - time (sec): 2.91 - samples/sec: 4834.98 - lr: 0.000032 - momentum: 0.000000
2023-10-25 21:34:44,751 epoch 5 - iter 52/136 - loss 0.02123076 - time (sec): 3.88 - samples/sec: 4870.86 - lr: 0.000031 - momentum: 0.000000
2023-10-25 21:34:45,642 epoch 5 - iter 65/136 - loss 0.02421228 - time (sec): 4.77 - samples/sec: 4872.99 - lr: 0.000031 - momentum: 0.000000
2023-10-25 21:34:46,757 epoch 5 - iter 78/136 - loss 0.02165472 - time (sec): 5.88 - samples/sec: 4950.75 - lr: 0.000030 - momentum: 0.000000
2023-10-25 21:34:47,991 epoch 5 - iter 91/136 - loss 0.02043650 - time (sec): 7.12 - samples/sec: 4934.45 - lr: 0.000030 - momentum: 0.000000
2023-10-25 21:34:48,983 epoch 5 - iter 104/136 - loss 0.02120918 - time (sec): 8.11 - samples/sec: 4956.52 - lr: 0.000029 - momentum: 0.000000
2023-10-25 21:34:49,861 epoch 5 - iter 117/136 - loss 0.02482853 - time (sec): 8.99 - samples/sec: 4962.83 - lr: 0.000029 - momentum: 0.000000
2023-10-25 21:34:50,832 epoch 5 - iter 130/136 - loss 0.02472568 - time (sec): 9.96 - samples/sec: 4991.68 - lr: 0.000028 - momentum: 0.000000
2023-10-25 21:34:51,249 ----------------------------------------------------------------------------------------------------
2023-10-25 21:34:51,249 EPOCH 5 done: loss 0.0240 - lr: 0.000028
2023-10-25 21:34:52,460 DEV : loss 0.12781116366386414 - f1-score (micro avg) 0.8117
2023-10-25 21:34:52,467 ----------------------------------------------------------------------------------------------------
2023-10-25 21:34:53,956 epoch 6 - iter 13/136 - loss 0.00771416 - time (sec): 1.49 - samples/sec: 3794.62 - lr: 0.000027 - momentum: 0.000000
2023-10-25 21:34:55,019 epoch 6 - iter 26/136 - loss 0.01804796 - time (sec): 2.55 - samples/sec: 4164.42 - lr: 0.000027 - momentum: 0.000000
2023-10-25 21:34:55,948 epoch 6 - iter 39/136 - loss 0.01483424 - time (sec): 3.48 - samples/sec: 4473.84 - lr: 0.000026 - momentum: 0.000000
2023-10-25 21:34:56,965 epoch 6 - iter 52/136 - loss 0.01612002 - time (sec): 4.50 - samples/sec: 4502.50 - lr: 0.000026 - momentum: 0.000000
2023-10-25 21:34:57,931 epoch 6 - iter 65/136 - loss 0.01886579 - time (sec): 5.46 - samples/sec: 4495.53 - lr: 0.000025 - momentum: 0.000000
2023-10-25 21:34:58,949 epoch 6 - iter 78/136 - loss 0.01674238 - time (sec): 6.48 - samples/sec: 4628.11 - lr: 0.000025 - momentum: 0.000000
2023-10-25 21:34:59,937 epoch 6 - iter 91/136 - loss 0.01760742 - time (sec): 7.47 - samples/sec: 4692.38 - lr: 0.000024 - momentum: 0.000000
2023-10-25 21:35:00,995 epoch 6 - iter 104/136 - loss 0.01821699 - time (sec): 8.53 - samples/sec: 4795.24 - lr: 0.000024 - momentum: 0.000000
2023-10-25 21:35:02,018 epoch 6 - iter 117/136 - loss 0.02000563 - time (sec): 9.55 - samples/sec: 4754.41 - lr: 0.000023 - momentum: 0.000000
2023-10-25 21:35:02,921 epoch 6 - iter 130/136 - loss 0.01883848 - time (sec): 10.45 - samples/sec: 4821.33 - lr: 0.000023 - momentum: 0.000000
2023-10-25 21:35:03,286 ----------------------------------------------------------------------------------------------------
2023-10-25 21:35:03,286 EPOCH 6 done: loss 0.0184 - lr: 0.000023
2023-10-25 21:35:04,573 DEV : loss 0.14983585476875305 - f1-score (micro avg) 0.8152
2023-10-25 21:35:04,580 saving best model
2023-10-25 21:35:05,282 ----------------------------------------------------------------------------------------------------
2023-10-25 21:35:06,301 epoch 7 - iter 13/136 - loss 0.01240694 - time (sec): 1.02 - samples/sec: 4330.78 - lr: 0.000022 - momentum: 0.000000
2023-10-25 21:35:07,183 epoch 7 - iter 26/136 - loss 0.01223001 - time (sec): 1.90 - samples/sec: 4670.10 - lr: 0.000021 - momentum: 0.000000
2023-10-25 21:35:08,168 epoch 7 - iter 39/136 - loss 0.01307552 - time (sec): 2.88 - samples/sec: 4570.45 - lr: 0.000021 - momentum: 0.000000
2023-10-25 21:35:09,250 epoch 7 - iter 52/136 - loss 0.01143417 - time (sec): 3.97 - samples/sec: 4751.28 - lr: 0.000020 - momentum: 0.000000
2023-10-25 21:35:10,157 epoch 7 - iter 65/136 - loss 0.01047760 - time (sec): 4.87 - samples/sec: 4788.91 - lr: 0.000020 - momentum: 0.000000
2023-10-25 21:35:11,094 epoch 7 - iter 78/136 - loss 0.01355259 - time (sec): 5.81 - samples/sec: 4932.77 - lr: 0.000019 - momentum: 0.000000
2023-10-25 21:35:12,100 epoch 7 - iter 91/136 - loss 0.01486358 - time (sec): 6.82 - samples/sec: 4996.59 - lr: 0.000019 - momentum: 0.000000
2023-10-25 21:35:13,026 epoch 7 - iter 104/136 - loss 0.01535576 - time (sec): 7.74 - samples/sec: 5024.62 - lr: 0.000018 - momentum: 0.000000
2023-10-25 21:35:14,023 epoch 7 - iter 117/136 - loss 0.01407148 - time (sec): 8.74 - samples/sec: 5059.63 - lr: 0.000018 - momentum: 0.000000
2023-10-25 21:35:14,953 epoch 7 - iter 130/136 - loss 0.01377123 - time (sec): 9.67 - samples/sec: 5086.11 - lr: 0.000017 - momentum: 0.000000
2023-10-25 21:35:15,457 ----------------------------------------------------------------------------------------------------
2023-10-25 21:35:15,457 EPOCH 7 done: loss 0.0141 - lr: 0.000017
2023-10-25 21:35:16,727 DEV : loss 0.1700810343027115 - f1-score (micro avg) 0.8125
2023-10-25 21:35:16,733 ----------------------------------------------------------------------------------------------------
2023-10-25 21:35:17,638 epoch 8 - iter 13/136 - loss 0.00268544 - time (sec): 0.90 - samples/sec: 4952.01 - lr: 0.000016 - momentum: 0.000000
2023-10-25 21:35:19,021 epoch 8 - iter 26/136 - loss 0.00772798 - time (sec): 2.29 - samples/sec: 4415.85 - lr: 0.000016 - momentum: 0.000000
2023-10-25 21:35:20,065 epoch 8 - iter 39/136 - loss 0.00859103 - time (sec): 3.33 - samples/sec: 4731.91 - lr: 0.000015 - momentum: 0.000000
2023-10-25 21:35:21,066 epoch 8 - iter 52/136 - loss 0.01223108 - time (sec): 4.33 - samples/sec: 4749.17 - lr: 0.000015 - momentum: 0.000000
2023-10-25 21:35:22,005 epoch 8 - iter 65/136 - loss 0.01102516 - time (sec): 5.27 - samples/sec: 4730.92 - lr: 0.000014 - momentum: 0.000000
2023-10-25 21:35:22,995 epoch 8 - iter 78/136 - loss 0.01255129 - time (sec): 6.26 - samples/sec: 4887.61 - lr: 0.000014 - momentum: 0.000000
2023-10-25 21:35:23,945 epoch 8 - iter 91/136 - loss 0.01134563 - time (sec): 7.21 - samples/sec: 4920.30 - lr: 0.000013 - momentum: 0.000000
2023-10-25 21:35:24,996 epoch 8 - iter 104/136 - loss 0.01063784 - time (sec): 8.26 - samples/sec: 4886.69 - lr: 0.000013 - momentum: 0.000000
2023-10-25 21:35:25,876 epoch 8 - iter 117/136 - loss 0.01079658 - time (sec): 9.14 - samples/sec: 4920.33 - lr: 0.000012 - momentum: 0.000000
2023-10-25 21:35:26,908 epoch 8 - iter 130/136 - loss 0.00982405 - time (sec): 10.17 - samples/sec: 4907.66 - lr: 0.000012 - momentum: 0.000000
2023-10-25 21:35:27,362 ----------------------------------------------------------------------------------------------------
2023-10-25 21:35:27,362 EPOCH 8 done: loss 0.0104 - lr: 0.000012
2023-10-25 21:35:28,655 DEV : loss 0.17804576456546783 - f1-score (micro avg) 0.8116
2023-10-25 21:35:28,661 ----------------------------------------------------------------------------------------------------
2023-10-25 21:35:29,627 epoch 9 - iter 13/136 - loss 0.00055046 - time (sec): 0.96 - samples/sec: 4989.89 - lr: 0.000011 - momentum: 0.000000
2023-10-25 21:35:30,468 epoch 9 - iter 26/136 - loss 0.00298412 - time (sec): 1.81 - samples/sec: 4763.26 - lr: 0.000010 - momentum: 0.000000
2023-10-25 21:35:31,439 epoch 9 - iter 39/136 - loss 0.00647008 - time (sec): 2.78 - samples/sec: 4928.77 - lr: 0.000010 - momentum: 0.000000
2023-10-25 21:35:32,428 epoch 9 - iter 52/136 - loss 0.00565494 - time (sec): 3.77 - samples/sec: 4839.08 - lr: 0.000009 - momentum: 0.000000
2023-10-25 21:35:33,500 epoch 9 - iter 65/136 - loss 0.00505945 - time (sec): 4.84 - samples/sec: 4901.28 - lr: 0.000009 - momentum: 0.000000
2023-10-25 21:35:34,602 epoch 9 - iter 78/136 - loss 0.00449225 - time (sec): 5.94 - samples/sec: 4962.79 - lr: 0.000008 - momentum: 0.000000
2023-10-25 21:35:35,623 epoch 9 - iter 91/136 - loss 0.00512371 - time (sec): 6.96 - samples/sec: 5020.44 - lr: 0.000008 - momentum: 0.000000
2023-10-25 21:35:36,716 epoch 9 - iter 104/136 - loss 0.00515854 - time (sec): 8.05 - samples/sec: 5076.11 - lr: 0.000007 - momentum: 0.000000
2023-10-25 21:35:37,644 epoch 9 - iter 117/136 - loss 0.00609712 - time (sec): 8.98 - samples/sec: 5112.01 - lr: 0.000007 - momentum: 0.000000
2023-10-25 21:35:38,560 epoch 9 - iter 130/136 - loss 0.00628290 - time (sec): 9.90 - samples/sec: 5080.59 - lr: 0.000006 - momentum: 0.000000
2023-10-25 21:35:38,954 ----------------------------------------------------------------------------------------------------
2023-10-25 21:35:38,954 EPOCH 9 done: loss 0.0064 - lr: 0.000006
2023-10-25 21:35:40,225 DEV : loss 0.18354582786560059 - f1-score (micro avg) 0.8175
2023-10-25 21:35:40,231 saving best model
2023-10-25 21:35:40,903 ----------------------------------------------------------------------------------------------------
2023-10-25 21:35:41,884 epoch 10 - iter 13/136 - loss 0.00268180 - time (sec): 0.98 - samples/sec: 4610.51 - lr: 0.000005 - momentum: 0.000000
2023-10-25 21:35:43,155 epoch 10 - iter 26/136 - loss 0.00574221 - time (sec): 2.25 - samples/sec: 4107.44 - lr: 0.000005 - momentum: 0.000000
2023-10-25 21:35:44,144 epoch 10 - iter 39/136 - loss 0.00437948 - time (sec): 3.24 - samples/sec: 4659.58 - lr: 0.000004 - momentum: 0.000000
2023-10-25 21:35:45,004 epoch 10 - iter 52/136 - loss 0.00591017 - time (sec): 4.10 - samples/sec: 4701.14 - lr: 0.000004 - momentum: 0.000000
2023-10-25 21:35:45,924 epoch 10 - iter 65/136 - loss 0.00477376 - time (sec): 5.02 - samples/sec: 4776.88 - lr: 0.000003 - momentum: 0.000000
2023-10-25 21:35:47,009 epoch 10 - iter 78/136 - loss 0.00422044 - time (sec): 6.10 - samples/sec: 4768.42 - lr: 0.000003 - momentum: 0.000000
2023-10-25 21:35:48,051 epoch 10 - iter 91/136 - loss 0.00414085 - time (sec): 7.15 - samples/sec: 4778.06 - lr: 0.000002 - momentum: 0.000000
2023-10-25 21:35:48,971 epoch 10 - iter 104/136 - loss 0.00387906 - time (sec): 8.07 - samples/sec: 4878.30 - lr: 0.000002 - momentum: 0.000000
2023-10-25 21:35:49,893 epoch 10 - iter 117/136 - loss 0.00431272 - time (sec): 8.99 - samples/sec: 4954.92 - lr: 0.000001 - momentum: 0.000000
2023-10-25 21:35:50,958 epoch 10 - iter 130/136 - loss 0.00472815 - time (sec): 10.05 - samples/sec: 4952.02 - lr: 0.000000 - momentum: 0.000000
2023-10-25 21:35:51,450 ----------------------------------------------------------------------------------------------------
2023-10-25 21:35:51,451 EPOCH 10 done: loss 0.0052 - lr: 0.000000
2023-10-25 21:35:52,725 DEV : loss 0.18221181631088257 - f1-score (micro avg) 0.825
2023-10-25 21:35:52,731 saving best model
2023-10-25 21:35:53,934 ----------------------------------------------------------------------------------------------------
2023-10-25 21:35:53,935 Loading model from best epoch ...
2023-10-25 21:35:55,878 SequenceTagger predicts: Dictionary with 17 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-PER, B-PER, E-PER, I-PER, S-HumanProd, B-HumanProd, E-HumanProd, I-HumanProd, S-ORG, B-ORG, E-ORG, I-ORG
2023-10-25 21:35:57,804 Results:
- F-score (micro) 0.7849
- F-score (macro) 0.7239
- Accuracy 0.6605

By class:
              precision    recall  f1-score   support

         LOC     0.8349    0.8750    0.8545       312
         PER     0.6842    0.8750    0.7679       208
         ORG     0.4259    0.4182    0.4220        55
   HumanProd     0.8000    0.9091    0.8511        22

   micro avg     0.7411    0.8342    0.7849       597
   macro avg     0.6862    0.7693    0.7239       597
weighted avg     0.7434    0.8342    0.7843       597

2023-10-25 21:35:57,804 ----------------------------------------------------------------------------------------------------
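As a sanity check, the micro and macro averages in the final table can be reproduced from per-class counts. The counts below are reconstructed from the reported precision, recall and support (e.g. LOC: recall 0.8750 × 312 gold spans = 273 true positives; predicted = 273 / 0.8349 ≈ 327), so they are derived values, not numbers taken directly from the log:

```python
# (true positives, predicted spans, gold spans) per entity type,
# reconstructed from the precision/recall/support columns above
counts = {
    "LOC":       (273, 327, 312),
    "PER":       (182, 266, 208),
    "ORG":       ( 23,  54,  55),
    "HumanProd": ( 20,  25,  22),
}

def f1(tp, pred, gold):
    # 2*tp/(pred+gold) equals the harmonic mean of precision and recall
    return 2 * tp / (pred + gold)

tp, pred, gold = (sum(c[i] for c in counts.values()) for i in range(3))
micro_f1 = f1(tp, pred, gold)                            # pool counts, then score
macro_f1 = sum(f1(*c) for c in counts.values()) / len(counts)  # average of scores
print(round(micro_f1, 4), round(macro_f1, 4))  # 0.7849 0.7239
```

Micro averaging pools the counts first (so frequent classes like LOC dominate), while macro averaging weights each entity type equally, which is why the weak ORG class pulls the macro F-score down to 0.7239.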