2023-09-04 15:07:57,477 ----------------------------------------------------------------------------------------------------
2023-09-04 15:07:57,478 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=21, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-09-04 15:07:57,478 ----------------------------------------------------------------------------------------------------
2023-09-04 15:07:57,478 MultiCorpus: 5901 train + 1287 dev + 1505 test sentences
 - NER_HIPE_2022 Corpus: 5901 train + 1287 dev + 1505 test sentences - /app/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/fr/with_doc_seperator
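The 21 output features of the linear head correspond to the BIOES tagset this run uses: five entity types (loc, pers, org, time, prod), each with S/B/E/I variants, plus the O tag. A minimal sanity check of that arithmetic (not part of the training code):

```python
# The linear head's out_features (21) equals the size of the BIOES tag
# dictionary: four prefixed tags per entity type, plus the single O tag.
entity_types = ["loc", "pers", "org", "time", "prod"]
tags = ["O"] + [f"{p}-{t}" for t in entity_types for p in ("S", "B", "E", "I")]
print(len(tags))  # 21
```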
2023-09-04 15:07:57,478 ----------------------------------------------------------------------------------------------------
2023-09-04 15:07:57,478 Train: 5901 sentences
2023-09-04 15:07:57,478 (train_with_dev=False, train_with_test=False)
2023-09-04 15:07:57,478 ----------------------------------------------------------------------------------------------------
2023-09-04 15:07:57,478 Training Params:
2023-09-04 15:07:57,478 - learning_rate: "3e-05"
2023-09-04 15:07:57,478 - mini_batch_size: "8"
2023-09-04 15:07:57,479 - max_epochs: "10"
2023-09-04 15:07:57,479 - shuffle: "True"
2023-09-04 15:07:57,479 ----------------------------------------------------------------------------------------------------
2023-09-04 15:07:57,479 Plugins:
2023-09-04 15:07:57,479 - LinearScheduler | warmup_fraction: '0.1'
2023-09-04 15:07:57,479 ----------------------------------------------------------------------------------------------------
2023-09-04 15:07:57,479 Final evaluation on model from best epoch (best-model.pt)
2023-09-04 15:07:57,479 - metric: "('micro avg', 'f1-score')"
2023-09-04 15:07:57,479 ----------------------------------------------------------------------------------------------------
2023-09-04 15:07:57,479 Computation:
2023-09-04 15:07:57,479 - compute on device: cuda:0
2023-09-04 15:07:57,479 - embedding storage: none
2023-09-04 15:07:57,479 ----------------------------------------------------------------------------------------------------
2023-09-04 15:07:57,479 Model training base path: "hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4"
2023-09-04 15:07:57,479 ----------------------------------------------------------------------------------------------------
2023-09-04 15:07:57,479 ----------------------------------------------------------------------------------------------------
2023-09-04 15:08:11,872 epoch 1 - iter 73/738 - loss 2.86248977 - time (sec): 14.39 - samples/sec: 1188.37 - lr: 0.000003 - momentum: 0.000000
2023-09-04 15:08:28,062 epoch 1 - iter 146/738 - loss 1.87378558 - time (sec): 30.58 - samples/sec: 1176.56 - lr: 0.000006 - momentum: 0.000000
2023-09-04 15:08:41,224 epoch 1 - iter 219/738 - loss 1.44076320 - time (sec): 43.74 - samples/sec: 1190.68 - lr: 0.000009 - momentum: 0.000000
2023-09-04 15:08:53,992 epoch 1 - iter 292/738 - loss 1.19510265 - time (sec): 56.51 - samples/sec: 1200.16 - lr: 0.000012 - momentum: 0.000000
2023-09-04 15:09:06,658 epoch 1 - iter 365/738 - loss 1.02747532 - time (sec): 69.18 - samples/sec: 1206.88 - lr: 0.000015 - momentum: 0.000000
2023-09-04 15:09:20,422 epoch 1 - iter 438/738 - loss 0.90817250 - time (sec): 82.94 - samples/sec: 1205.48 - lr: 0.000018 - momentum: 0.000000
2023-09-04 15:09:32,092 epoch 1 - iter 511/738 - loss 0.82740367 - time (sec): 94.61 - samples/sec: 1209.87 - lr: 0.000021 - momentum: 0.000000
2023-09-04 15:09:45,830 epoch 1 - iter 584/738 - loss 0.75485949 - time (sec): 108.35 - samples/sec: 1204.68 - lr: 0.000024 - momentum: 0.000000
2023-09-04 15:09:59,433 epoch 1 - iter 657/738 - loss 0.69355197 - time (sec): 121.95 - samples/sec: 1203.67 - lr: 0.000027 - momentum: 0.000000
2023-09-04 15:10:14,975 epoch 1 - iter 730/738 - loss 0.63642854 - time (sec): 137.49 - samples/sec: 1199.07 - lr: 0.000030 - momentum: 0.000000
2023-09-04 15:10:16,267 ----------------------------------------------------------------------------------------------------
2023-09-04 15:10:16,267 EPOCH 1 done: loss 0.6324 - lr: 0.000030
2023-09-04 15:10:30,427 DEV : loss 0.13814392685890198 - f1-score (micro avg) 0.7202
2023-09-04 15:10:30,455 saving best model
2023-09-04 15:10:30,930 ----------------------------------------------------------------------------------------------------
2023-09-04 15:10:43,191 epoch 2 - iter 73/738 - loss 0.15642957 - time (sec): 12.26 - samples/sec: 1196.70 - lr: 0.000030 - momentum: 0.000000
2023-09-04 15:10:55,459 epoch 2 - iter 146/738 - loss 0.15187407 - time (sec): 24.53 - samples/sec: 1213.71 - lr: 0.000029 - momentum: 0.000000
2023-09-04 15:11:10,504 epoch 2 - iter 219/738 - loss 0.14818061 - time (sec): 39.57 - samples/sec: 1210.82 - lr: 0.000029 - momentum: 0.000000
2023-09-04 15:11:23,998 epoch 2 - iter 292/738 - loss 0.14577470 - time (sec): 53.07 - samples/sec: 1207.19 - lr: 0.000029 - momentum: 0.000000
2023-09-04 15:11:37,517 epoch 2 - iter 365/738 - loss 0.14070710 - time (sec): 66.59 - samples/sec: 1204.25 - lr: 0.000028 - momentum: 0.000000
2023-09-04 15:11:51,786 epoch 2 - iter 438/738 - loss 0.13613518 - time (sec): 80.85 - samples/sec: 1204.78 - lr: 0.000028 - momentum: 0.000000
2023-09-04 15:12:05,952 epoch 2 - iter 511/738 - loss 0.13285248 - time (sec): 95.02 - samples/sec: 1194.66 - lr: 0.000028 - momentum: 0.000000
2023-09-04 15:12:19,375 epoch 2 - iter 584/738 - loss 0.13200454 - time (sec): 108.44 - samples/sec: 1196.77 - lr: 0.000027 - momentum: 0.000000
2023-09-04 15:12:35,244 epoch 2 - iter 657/738 - loss 0.12861740 - time (sec): 124.31 - samples/sec: 1191.28 - lr: 0.000027 - momentum: 0.000000
2023-09-04 15:12:49,248 epoch 2 - iter 730/738 - loss 0.12694763 - time (sec): 138.32 - samples/sec: 1190.88 - lr: 0.000027 - momentum: 0.000000
2023-09-04 15:12:50,559 ----------------------------------------------------------------------------------------------------
2023-09-04 15:12:50,559 EPOCH 2 done: loss 0.1269 - lr: 0.000027
2023-09-04 15:13:08,319 DEV : loss 0.11461225152015686 - f1-score (micro avg) 0.7643
2023-09-04 15:13:08,347 saving best model
2023-09-04 15:13:09,723 ----------------------------------------------------------------------------------------------------
2023-09-04 15:13:23,207 epoch 3 - iter 73/738 - loss 0.08262600 - time (sec): 13.48 - samples/sec: 1206.96 - lr: 0.000026 - momentum: 0.000000
2023-09-04 15:13:37,922 epoch 3 - iter 146/738 - loss 0.07783046 - time (sec): 28.20 - samples/sec: 1199.25 - lr: 0.000026 - momentum: 0.000000
2023-09-04 15:13:50,649 epoch 3 - iter 219/738 - loss 0.07457843 - time (sec): 40.92 - samples/sec: 1202.63 - lr: 0.000026 - momentum: 0.000000
2023-09-04 15:14:05,710 epoch 3 - iter 292/738 - loss 0.08068010 - time (sec): 55.99 - samples/sec: 1198.49 - lr: 0.000025 - momentum: 0.000000
2023-09-04 15:14:18,785 epoch 3 - iter 365/738 - loss 0.07733871 - time (sec): 69.06 - samples/sec: 1199.55 - lr: 0.000025 - momentum: 0.000000
2023-09-04 15:14:32,483 epoch 3 - iter 438/738 - loss 0.07510957 - time (sec): 82.76 - samples/sec: 1193.29 - lr: 0.000025 - momentum: 0.000000
2023-09-04 15:14:45,867 epoch 3 - iter 511/738 - loss 0.07428255 - time (sec): 96.14 - samples/sec: 1198.61 - lr: 0.000024 - momentum: 0.000000
2023-09-04 15:15:00,677 epoch 3 - iter 584/738 - loss 0.07258361 - time (sec): 110.95 - samples/sec: 1195.74 - lr: 0.000024 - momentum: 0.000000
2023-09-04 15:15:13,673 epoch 3 - iter 657/738 - loss 0.07018245 - time (sec): 123.95 - samples/sec: 1196.07 - lr: 0.000024 - momentum: 0.000000
2023-09-04 15:15:27,981 epoch 3 - iter 730/738 - loss 0.07051097 - time (sec): 138.26 - samples/sec: 1193.67 - lr: 0.000023 - momentum: 0.000000
2023-09-04 15:15:29,057 ----------------------------------------------------------------------------------------------------
2023-09-04 15:15:29,058 EPOCH 3 done: loss 0.0706 - lr: 0.000023
2023-09-04 15:15:46,860 DEV : loss 0.12629321217536926 - f1-score (micro avg) 0.7847
2023-09-04 15:15:46,888 saving best model
2023-09-04 15:15:48,249 ----------------------------------------------------------------------------------------------------
2023-09-04 15:16:02,993 epoch 4 - iter 73/738 - loss 0.04289321 - time (sec): 14.74 - samples/sec: 1210.88 - lr: 0.000023 - momentum: 0.000000
2023-09-04 15:16:15,752 epoch 4 - iter 146/738 - loss 0.04495463 - time (sec): 27.50 - samples/sec: 1204.09 - lr: 0.000023 - momentum: 0.000000
2023-09-04 15:16:32,912 epoch 4 - iter 219/738 - loss 0.04311179 - time (sec): 44.66 - samples/sec: 1184.67 - lr: 0.000022 - momentum: 0.000000
2023-09-04 15:16:47,592 epoch 4 - iter 292/738 - loss 0.04838265 - time (sec): 59.34 - samples/sec: 1173.84 - lr: 0.000022 - momentum: 0.000000
2023-09-04 15:17:00,520 epoch 4 - iter 365/738 - loss 0.04798678 - time (sec): 72.27 - samples/sec: 1181.69 - lr: 0.000022 - momentum: 0.000000
2023-09-04 15:17:15,753 epoch 4 - iter 438/738 - loss 0.04663458 - time (sec): 87.50 - samples/sec: 1182.61 - lr: 0.000021 - momentum: 0.000000
2023-09-04 15:17:28,483 epoch 4 - iter 511/738 - loss 0.04610997 - time (sec): 100.23 - samples/sec: 1187.93 - lr: 0.000021 - momentum: 0.000000
2023-09-04 15:17:41,349 epoch 4 - iter 584/738 - loss 0.04766936 - time (sec): 113.10 - samples/sec: 1185.21 - lr: 0.000021 - momentum: 0.000000
2023-09-04 15:17:53,035 epoch 4 - iter 657/738 - loss 0.04878602 - time (sec): 124.78 - samples/sec: 1191.55 - lr: 0.000020 - momentum: 0.000000
2023-09-04 15:18:06,125 epoch 4 - iter 730/738 - loss 0.04813588 - time (sec): 137.87 - samples/sec: 1195.65 - lr: 0.000020 - momentum: 0.000000
2023-09-04 15:18:07,361 ----------------------------------------------------------------------------------------------------
2023-09-04 15:18:07,361 EPOCH 4 done: loss 0.0478 - lr: 0.000020
2023-09-04 15:18:25,148 DEV : loss 0.1449851542711258 - f1-score (micro avg) 0.8246
2023-09-04 15:18:25,176 saving best model
2023-09-04 15:18:26,491 ----------------------------------------------------------------------------------------------------
2023-09-04 15:18:40,829 epoch 5 - iter 73/738 - loss 0.03757261 - time (sec): 14.34 - samples/sec: 1175.64 - lr: 0.000020 - momentum: 0.000000
2023-09-04 15:18:53,223 epoch 5 - iter 146/738 - loss 0.03398262 - time (sec): 26.73 - samples/sec: 1202.24 - lr: 0.000019 - momentum: 0.000000
2023-09-04 15:19:05,893 epoch 5 - iter 219/738 - loss 0.03453452 - time (sec): 39.40 - samples/sec: 1221.49 - lr: 0.000019 - momentum: 0.000000
2023-09-04 15:19:20,376 epoch 5 - iter 292/738 - loss 0.03535832 - time (sec): 53.88 - samples/sec: 1214.70 - lr: 0.000019 - momentum: 0.000000
2023-09-04 15:19:34,656 epoch 5 - iter 365/738 - loss 0.03277805 - time (sec): 68.16 - samples/sec: 1196.75 - lr: 0.000018 - momentum: 0.000000
2023-09-04 15:19:48,355 epoch 5 - iter 438/738 - loss 0.03204077 - time (sec): 81.86 - samples/sec: 1192.57 - lr: 0.000018 - momentum: 0.000000
2023-09-04 15:20:05,121 epoch 5 - iter 511/738 - loss 0.03323374 - time (sec): 98.63 - samples/sec: 1183.15 - lr: 0.000018 - momentum: 0.000000
2023-09-04 15:20:16,779 epoch 5 - iter 584/738 - loss 0.03395649 - time (sec): 110.29 - samples/sec: 1193.38 - lr: 0.000017 - momentum: 0.000000
2023-09-04 15:20:31,468 epoch 5 - iter 657/738 - loss 0.03392424 - time (sec): 124.98 - samples/sec: 1189.35 - lr: 0.000017 - momentum: 0.000000
2023-09-04 15:20:44,868 epoch 5 - iter 730/738 - loss 0.03386304 - time (sec): 138.38 - samples/sec: 1191.00 - lr: 0.000017 - momentum: 0.000000
2023-09-04 15:20:46,072 ----------------------------------------------------------------------------------------------------
2023-09-04 15:20:46,072 EPOCH 5 done: loss 0.0336 - lr: 0.000017
2023-09-04 15:21:03,892 DEV : loss 0.16297703981399536 - f1-score (micro avg) 0.8206
2023-09-04 15:21:03,921 ----------------------------------------------------------------------------------------------------
2023-09-04 15:21:18,215 epoch 6 - iter 73/738 - loss 0.02179470 - time (sec): 14.29 - samples/sec: 1178.36 - lr: 0.000016 - momentum: 0.000000
2023-09-04 15:21:32,516 epoch 6 - iter 146/738 - loss 0.02530039 - time (sec): 28.59 - samples/sec: 1155.41 - lr: 0.000016 - momentum: 0.000000
2023-09-04 15:21:44,583 epoch 6 - iter 219/738 - loss 0.02335614 - time (sec): 40.66 - samples/sec: 1171.33 - lr: 0.000016 - momentum: 0.000000
2023-09-04 15:21:57,035 epoch 6 - iter 292/738 - loss 0.02558086 - time (sec): 53.11 - samples/sec: 1182.77 - lr: 0.000015 - momentum: 0.000000
2023-09-04 15:22:11,708 epoch 6 - iter 365/738 - loss 0.02563490 - time (sec): 67.79 - samples/sec: 1178.91 - lr: 0.000015 - momentum: 0.000000
2023-09-04 15:22:22,999 epoch 6 - iter 438/738 - loss 0.02483109 - time (sec): 79.08 - samples/sec: 1188.60 - lr: 0.000015 - momentum: 0.000000
2023-09-04 15:22:37,280 epoch 6 - iter 511/738 - loss 0.02354249 - time (sec): 93.36 - samples/sec: 1190.62 - lr: 0.000014 - momentum: 0.000000
2023-09-04 15:22:52,561 epoch 6 - iter 584/738 - loss 0.02407573 - time (sec): 108.64 - samples/sec: 1191.50 - lr: 0.000014 - momentum: 0.000000
2023-09-04 15:23:09,298 epoch 6 - iter 657/738 - loss 0.02428604 - time (sec): 125.37 - samples/sec: 1189.00 - lr: 0.000014 - momentum: 0.000000
2023-09-04 15:23:22,753 epoch 6 - iter 730/738 - loss 0.02498198 - time (sec): 138.83 - samples/sec: 1189.25 - lr: 0.000013 - momentum: 0.000000
2023-09-04 15:23:23,779 ----------------------------------------------------------------------------------------------------
2023-09-04 15:23:23,779 EPOCH 6 done: loss 0.0251 - lr: 0.000013
2023-09-04 15:23:41,509 DEV : loss 0.1827543079853058 - f1-score (micro avg) 0.8092
2023-09-04 15:23:41,538 ----------------------------------------------------------------------------------------------------
2023-09-04 15:23:54,237 epoch 7 - iter 73/738 - loss 0.02181177 - time (sec): 12.70 - samples/sec: 1212.83 - lr: 0.000013 - momentum: 0.000000
2023-09-04 15:24:06,417 epoch 7 - iter 146/738 - loss 0.02084610 - time (sec): 24.88 - samples/sec: 1200.09 - lr: 0.000013 - momentum: 0.000000
2023-09-04 15:24:20,568 epoch 7 - iter 219/738 - loss 0.02110233 - time (sec): 39.03 - samples/sec: 1209.11 - lr: 0.000012 - momentum: 0.000000
2023-09-04 15:24:33,733 epoch 7 - iter 292/738 - loss 0.01901752 - time (sec): 52.19 - samples/sec: 1204.34 - lr: 0.000012 - momentum: 0.000000
2023-09-04 15:24:48,456 epoch 7 - iter 365/738 - loss 0.02024199 - time (sec): 66.92 - samples/sec: 1188.39 - lr: 0.000012 - momentum: 0.000000
2023-09-04 15:25:02,191 epoch 7 - iter 438/738 - loss 0.02027023 - time (sec): 80.65 - samples/sec: 1189.40 - lr: 0.000011 - momentum: 0.000000
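The lr values threading through these entries follow the LinearScheduler declared in the training params: with warmup_fraction 0.1 over 10 epochs of 738 iterations (7380 steps total), the rate ramps linearly from 0 to the peak 3e-05 during epoch 1, then decays linearly back to 0. A minimal sketch of that schedule (my reconstruction from the logged values, not Flair's actual implementation):

```python
def linear_schedule_lr(step, total_steps=7380, warmup_fraction=0.1, peak_lr=3e-05):
    """Linear warmup to peak_lr over the first warmup_fraction of training,
    then linear decay to zero (sketch inferred from the logged lr values)."""
    warmup_steps = warmup_fraction * total_steps
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

# End of epoch 1 (step 738): peak reached, matching "lr: 0.000030" in the log.
print(round(linear_schedule_lr(738), 6))   # 3e-05
# Epoch 5, iter 584 (step 3536): matches the logged "lr: 0.000017".
print(round(linear_schedule_lr(3536), 6))  # 1.7e-05
```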
2023-09-04 15:25:15,277 epoch 7 - iter 511/738 - loss 0.02061434 - time (sec): 93.74 - samples/sec: 1196.34 - lr: 0.000011 - momentum: 0.000000
2023-09-04 15:25:29,003 epoch 7 - iter 584/738 - loss 0.02034981 - time (sec): 107.46 - samples/sec: 1196.32 - lr: 0.000011 - momentum: 0.000000
2023-09-04 15:25:45,325 epoch 7 - iter 657/738 - loss 0.02041326 - time (sec): 123.79 - samples/sec: 1195.65 - lr: 0.000010 - momentum: 0.000000
2023-09-04 15:25:59,500 epoch 7 - iter 730/738 - loss 0.01986392 - time (sec): 137.96 - samples/sec: 1191.73 - lr: 0.000010 - momentum: 0.000000
2023-09-04 15:26:01,204 ----------------------------------------------------------------------------------------------------
2023-09-04 15:26:01,204 EPOCH 7 done: loss 0.0198 - lr: 0.000010
2023-09-04 15:26:18,945 DEV : loss 0.19946980476379395 - f1-score (micro avg) 0.8101
2023-09-04 15:26:18,974 ----------------------------------------------------------------------------------------------------
2023-09-04 15:26:33,611 epoch 8 - iter 73/738 - loss 0.01320934 - time (sec): 14.63 - samples/sec: 1199.27 - lr: 0.000010 - momentum: 0.000000
2023-09-04 15:26:46,284 epoch 8 - iter 146/738 - loss 0.01087234 - time (sec): 27.31 - samples/sec: 1190.62 - lr: 0.000009 - momentum: 0.000000
2023-09-04 15:27:00,406 epoch 8 - iter 219/738 - loss 0.01118795 - time (sec): 41.43 - samples/sec: 1193.36 - lr: 0.000009 - momentum: 0.000000
2023-09-04 15:27:13,072 epoch 8 - iter 292/738 - loss 0.01173749 - time (sec): 54.10 - samples/sec: 1196.64 - lr: 0.000009 - momentum: 0.000000
2023-09-04 15:27:27,562 epoch 8 - iter 365/738 - loss 0.01497626 - time (sec): 68.59 - samples/sec: 1182.80 - lr: 0.000008 - momentum: 0.000000
2023-09-04 15:27:43,186 epoch 8 - iter 438/738 - loss 0.01439217 - time (sec): 84.21 - samples/sec: 1175.93 - lr: 0.000008 - momentum: 0.000000
2023-09-04 15:27:54,652 epoch 8 - iter 511/738 - loss 0.01407249 - time (sec): 95.68 - samples/sec: 1189.87 - lr: 0.000008 - momentum: 0.000000
2023-09-04 15:28:09,602 epoch 8 - iter 584/738 - loss 0.01366217 - time (sec): 110.63 - samples/sec: 1183.80 - lr: 0.000007 - momentum: 0.000000
2023-09-04 15:28:22,514 epoch 8 - iter 657/738 - loss 0.01343997 - time (sec): 123.54 - samples/sec: 1186.85 - lr: 0.000007 - momentum: 0.000000
2023-09-04 15:28:37,442 epoch 8 - iter 730/738 - loss 0.01419631 - time (sec): 138.47 - samples/sec: 1190.92 - lr: 0.000007 - momentum: 0.000000
2023-09-04 15:28:38,663 ----------------------------------------------------------------------------------------------------
2023-09-04 15:28:38,664 EPOCH 8 done: loss 0.0142 - lr: 0.000007
2023-09-04 15:28:56,470 DEV : loss 0.1939600259065628 - f1-score (micro avg) 0.8261
2023-09-04 15:28:56,500 saving best model
2023-09-04 15:28:57,845 ----------------------------------------------------------------------------------------------------
2023-09-04 15:29:11,878 epoch 9 - iter 73/738 - loss 0.01087719 - time (sec): 14.03 - samples/sec: 1194.75 - lr: 0.000006 - momentum: 0.000000
2023-09-04 15:29:26,380 epoch 9 - iter 146/738 - loss 0.01176628 - time (sec): 28.53 - samples/sec: 1173.05 - lr: 0.000006 - momentum: 0.000000
2023-09-04 15:29:38,093 epoch 9 - iter 219/738 - loss 0.01045319 - time (sec): 40.25 - samples/sec: 1199.96 - lr: 0.000006 - momentum: 0.000000
2023-09-04 15:29:50,788 epoch 9 - iter 292/738 - loss 0.01059609 - time (sec): 52.94 - samples/sec: 1200.20 - lr: 0.000005 - momentum: 0.000000
2023-09-04 15:30:04,953 epoch 9 - iter 365/738 - loss 0.01182372 - time (sec): 67.11 - samples/sec: 1182.59 - lr: 0.000005 - momentum: 0.000000
2023-09-04 15:30:20,455 epoch 9 - iter 438/738 - loss 0.01097159 - time (sec): 82.61 - samples/sec: 1174.63 - lr: 0.000005 - momentum: 0.000000
2023-09-04 15:30:35,578 epoch 9 - iter 511/738 - loss 0.01034011 - time (sec): 97.73 - samples/sec: 1173.99 - lr: 0.000004 - momentum: 0.000000
2023-09-04 15:30:48,048 epoch 9 - iter 584/738 - loss 0.01093906 - time (sec): 110.20 - samples/sec: 1182.16 - lr: 0.000004 - momentum: 0.000000
2023-09-04 15:31:01,277 epoch 9 - iter 657/738 - loss 0.01060596 - time (sec): 123.43 - samples/sec: 1182.92 - lr: 0.000004 - momentum: 0.000000
2023-09-04 15:31:15,726 epoch 9 - iter 730/738 - loss 0.01058406 - time (sec): 137.88 - samples/sec: 1193.37 - lr: 0.000003 - momentum: 0.000000
2023-09-04 15:31:17,063 ----------------------------------------------------------------------------------------------------
2023-09-04 15:31:17,064 EPOCH 9 done: loss 0.0105 - lr: 0.000003
2023-09-04 15:31:34,810 DEV : loss 0.20513701438903809 - f1-score (micro avg) 0.8227
2023-09-04 15:31:34,839 ----------------------------------------------------------------------------------------------------
2023-09-04 15:31:48,240 epoch 10 - iter 73/738 - loss 0.00152471 - time (sec): 13.40 - samples/sec: 1191.48 - lr: 0.000003 - momentum: 0.000000
2023-09-04 15:32:01,756 epoch 10 - iter 146/738 - loss 0.00902703 - time (sec): 26.92 - samples/sec: 1210.66 - lr: 0.000003 - momentum: 0.000000
2023-09-04 15:32:13,703 epoch 10 - iter 219/738 - loss 0.00963469 - time (sec): 38.86 - samples/sec: 1221.31 - lr: 0.000002 - momentum: 0.000000
2023-09-04 15:32:28,695 epoch 10 - iter 292/738 - loss 0.01078399 - time (sec): 53.85 - samples/sec: 1218.41 - lr: 0.000002 - momentum: 0.000000
2023-09-04 15:32:44,713 epoch 10 - iter 365/738 - loss 0.01129022 - time (sec): 69.87 - samples/sec: 1201.46 - lr: 0.000002 - momentum: 0.000000
2023-09-04 15:32:57,832 epoch 10 - iter 438/738 - loss 0.01051066 - time (sec): 82.99 - samples/sec: 1200.25 - lr: 0.000001 - momentum: 0.000000
2023-09-04 15:33:11,789 epoch 10 - iter 511/738 - loss 0.00970689 - time (sec): 96.95 - samples/sec: 1203.46 - lr: 0.000001 - momentum: 0.000000
2023-09-04 15:33:27,378 epoch 10 - iter 584/738 - loss 0.00932251 - time (sec): 112.54 - samples/sec: 1194.23 - lr: 0.000001 - momentum: 0.000000
2023-09-04 15:33:40,681 epoch 10 - iter 657/738 - loss 0.00881038 - time (sec): 125.84 - samples/sec: 1193.25 - lr: 0.000000 - momentum: 0.000000
2023-09-04 15:33:52,640 epoch 10 - iter 730/738 - loss 0.00838896 - time (sec): 137.80 - samples/sec: 1195.44 - lr: 0.000000 - momentum: 0.000000
2023-09-04 15:33:53,938 ----------------------------------------------------------------------------------------------------
2023-09-04 15:33:53,938 EPOCH 10 done: loss 0.0083 - lr: 0.000000
2023-09-04 15:34:11,671 DEV : loss 0.20588278770446777 - f1-score (micro avg) 0.8215
2023-09-04 15:34:12,180 ----------------------------------------------------------------------------------------------------
2023-09-04 15:34:12,181 Loading model from best epoch ...
2023-09-04 15:34:14,139 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-time, B-time, E-time, I-time, S-prod, B-prod, E-prod, I-prod
2023-09-04 15:34:29,363 Results:
- F-score (micro) 0.7885
- F-score (macro) 0.688
- Accuracy 0.6729

By class:
              precision    recall  f1-score   support

         loc     0.8694    0.8613    0.8653       858
        pers     0.7423    0.8045    0.7721       537
         org     0.4765    0.6136    0.5364       132
        time     0.5231    0.6296    0.5714        54
        prod     0.7193    0.6721    0.6949        61

   micro avg     0.7697    0.8082    0.7885      1642
   macro avg     0.6661    0.7162    0.6880      1642
weighted avg     0.7793    0.8082    0.7924      1642
2023-09-04 15:34:29,363 ----------------------------------------------------------------------------------------------------
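The averaged scores in the final evaluation can be reproduced from the per-class rows: micro F1 is the harmonic mean of the pooled (micro) precision and recall, macro F1 is the unweighted mean of the class F1 scores, and the weighted average weights each class F1 by its support. A quick stdlib-only check using the numbers from the table above:

```python
# Per-class (precision, recall, f1, support) from the final evaluation table.
by_class = {
    "loc":  (0.8694, 0.8613, 0.8653, 858),
    "pers": (0.7423, 0.8045, 0.7721, 537),
    "org":  (0.4765, 0.6136, 0.5364, 132),
    "time": (0.5231, 0.6296, 0.5714, 54),
    "prod": (0.7193, 0.6721, 0.6949, 61),
}

# Micro F1: harmonic mean of the pooled precision (0.7697) and recall (0.8082).
micro_p, micro_r = 0.7697, 0.8082
micro_f1 = 2 * micro_p * micro_r / (micro_p + micro_r)
print(round(micro_f1, 4))  # 0.7885

# Macro F1: unweighted mean of the per-class F1 scores.
macro_f1 = sum(f1 for _, _, f1, _ in by_class.values()) / len(by_class)
print(round(macro_f1, 4))  # 0.688

# Weighted F1: per-class F1 weighted by support (1642 entities total).
total = sum(s for _, _, _, s in by_class.values())
weighted_f1 = sum(f1 * s for _, _, f1, s in by_class.values()) / total
print(round(weighted_f1, 4))  # 0.7924
```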