2023-10-25 11:39:23,291 ----------------------------------------------------------------------------------------------------
2023-10-25 11:39:23,292 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(64001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=17, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-25 11:39:23,292 ----------------------------------------------------------------------------------------------------
2023-10-25 11:39:23,292 MultiCorpus: 20847 train + 1123 dev + 3350 test sentences
- NER_HIPE_2022 Corpus: 20847 train + 1123 dev + 3350 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/newseye/de/with_doc_seperator
2023-10-25 11:39:23,292 ----------------------------------------------------------------------------------------------------
2023-10-25 11:39:23,292 Train: 20847 sentences
2023-10-25 11:39:23,292 (train_with_dev=False, train_with_test=False)
2023-10-25 11:39:23,292 ----------------------------------------------------------------------------------------------------
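
For orientation, the corpus and tagger summarized above can be assembled in Flair roughly as follows. This is a minimal sketch rather than the exact training script: the checkpoint name is inferred from the model base path logged further down, and constructor keywords (in particular for NER_HIPE_2022) may differ slightly across Flair versions.

from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger

# German "newseye" subset of HIPE-2022 (assumed constructor arguments)
corpus = NER_HIPE_2022(dataset_name="newseye", language="de")
label_dict = corpus.make_label_dictionary(label_type="ner")

# Historic multilingual BERT; last layer only, first-subtoken pooling,
# as encoded in the run name (poolingfirst, layers-1)
embeddings = TransformerWordEmbeddings(
    model="dbmdz/bert-base-historic-multilingual-64k-td-cased",
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
)

# Matches the printed architecture: locked dropout plus a linear head
# over 17 tags, with no RNN and no CRF (crfFalse in the run name)
tagger = SequenceTagger(
    hidden_size=256,  # unused when use_rnn=False
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_rnn=False,
    use_crf=False,
    reproject_embeddings=False,
)
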
2023-10-25 11:39:23,292 Training Params:
2023-10-25 11:39:23,292 - learning_rate: "3e-05"
2023-10-25 11:39:23,292 - mini_batch_size: "8"
2023-10-25 11:39:23,292 - max_epochs: "10"
2023-10-25 11:39:23,292 - shuffle: "True"
2023-10-25 11:39:23,292 ----------------------------------------------------------------------------------------------------
2023-10-25 11:39:23,292 Plugins:
2023-10-25 11:39:23,292 - TensorboardLogger
2023-10-25 11:39:23,292 - LinearScheduler | warmup_fraction: '0.1'
2023-10-25 11:39:23,292 ----------------------------------------------------------------------------------------------------
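
The hyperparameters and plugins above map onto a Flair fine-tuning call along these lines; a sketch, assuming ModelTrainer.fine_tune, whose default linear warmup schedule already uses the logged warmup_fraction of 0.1 (the TensorBoard plugin setup is version-dependent and omitted here):

from flair.trainers import ModelTrainer

# `tagger` and `corpus` as constructed in the sketch above
trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "hmbench-newseye/de-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2",
    learning_rate=3e-05,
    mini_batch_size=8,
    max_epochs=10,
)
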
2023-10-25 11:39:23,292 Final evaluation on model from best epoch (best-model.pt)
2023-10-25 11:39:23,293 - metric: "('micro avg', 'f1-score')"
2023-10-25 11:39:23,293 ----------------------------------------------------------------------------------------------------
2023-10-25 11:39:23,293 Computation:
2023-10-25 11:39:23,293 - compute on device: cuda:0
2023-10-25 11:39:23,293 - embedding storage: none
2023-10-25 11:39:23,293 ----------------------------------------------------------------------------------------------------
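
The device line corresponds to Flair's global device setting; a one-line sketch, assuming the standard flair.device mechanism ("embedding storage: none" means transformer embeddings are recomputed on the fly each epoch rather than cached in memory):

import torch
import flair

# Pin Flair to the GPU reported as "compute on device: cuda:0"
flair.device = torch.device("cuda:0")
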
2023-10-25 11:39:23,293 Model training base path: "hmbench-newseye/de-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2"
2023-10-25 11:39:23,293 ----------------------------------------------------------------------------------------------------
2023-10-25 11:39:23,293 ----------------------------------------------------------------------------------------------------
2023-10-25 11:39:23,293 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-25 11:39:37,851 epoch 1 - iter 260/2606 - loss 1.57021077 - time (sec): 14.56 - samples/sec: 2556.73 - lr: 0.000003 - momentum: 0.000000
2023-10-25 11:39:52,142 epoch 1 - iter 520/2606 - loss 0.95405161 - time (sec): 28.85 - samples/sec: 2664.38 - lr: 0.000006 - momentum: 0.000000
2023-10-25 11:40:06,273 epoch 1 - iter 780/2606 - loss 0.73729341 - time (sec): 42.98 - samples/sec: 2655.02 - lr: 0.000009 - momentum: 0.000000
2023-10-25 11:40:19,835 epoch 1 - iter 1040/2606 - loss 0.62532777 - time (sec): 56.54 - samples/sec: 2629.25 - lr: 0.000012 - momentum: 0.000000
2023-10-25 11:40:33,679 epoch 1 - iter 1300/2606 - loss 0.54757678 - time (sec): 70.39 - samples/sec: 2616.46 - lr: 0.000015 - momentum: 0.000000
2023-10-25 11:40:47,653 epoch 1 - iter 1560/2606 - loss 0.49066763 - time (sec): 84.36 - samples/sec: 2635.35 - lr: 0.000018 - momentum: 0.000000
2023-10-25 11:41:01,175 epoch 1 - iter 1820/2606 - loss 0.45026054 - time (sec): 97.88 - samples/sec: 2620.86 - lr: 0.000021 - momentum: 0.000000
2023-10-25 11:41:15,148 epoch 1 - iter 2080/2606 - loss 0.41942467 - time (sec): 111.85 - samples/sec: 2611.91 - lr: 0.000024 - momentum: 0.000000
2023-10-25 11:41:29,102 epoch 1 - iter 2340/2606 - loss 0.39408020 - time (sec): 125.81 - samples/sec: 2607.23 - lr: 0.000027 - momentum: 0.000000
2023-10-25 11:41:43,261 epoch 1 - iter 2600/2606 - loss 0.36990188 - time (sec): 139.97 - samples/sec: 2620.90 - lr: 0.000030 - momentum: 0.000000
2023-10-25 11:41:43,543 ----------------------------------------------------------------------------------------------------
2023-10-25 11:41:43,544 EPOCH 1 done: loss 0.3696 - lr: 0.000030
2023-10-25 11:41:47,256 DEV : loss 0.16545310616493225 - f1-score (micro avg) 0.3369
2023-10-25 11:41:47,281 saving best model
2023-10-25 11:41:47,676 ----------------------------------------------------------------------------------------------------
2023-10-25 11:42:02,046 epoch 2 - iter 260/2606 - loss 0.16955557 - time (sec): 14.37 - samples/sec: 2689.29 - lr: 0.000030 - momentum: 0.000000
2023-10-25 11:42:16,238 epoch 2 - iter 520/2606 - loss 0.16249568 - time (sec): 28.56 - samples/sec: 2660.42 - lr: 0.000029 - momentum: 0.000000
2023-10-25 11:42:30,467 epoch 2 - iter 780/2606 - loss 0.16067408 - time (sec): 42.79 - samples/sec: 2662.51 - lr: 0.000029 - momentum: 0.000000
2023-10-25 11:42:43,942 epoch 2 - iter 1040/2606 - loss 0.15460242 - time (sec): 56.26 - samples/sec: 2624.53 - lr: 0.000029 - momentum: 0.000000
2023-10-25 11:42:57,089 epoch 2 - iter 1300/2606 - loss 0.15720241 - time (sec): 69.41 - samples/sec: 2628.78 - lr: 0.000028 - momentum: 0.000000
2023-10-25 11:43:10,939 epoch 2 - iter 1560/2606 - loss 0.15583926 - time (sec): 83.26 - samples/sec: 2624.56 - lr: 0.000028 - momentum: 0.000000
2023-10-25 11:43:25,093 epoch 2 - iter 1820/2606 - loss 0.15515707 - time (sec): 97.42 - samples/sec: 2626.00 - lr: 0.000028 - momentum: 0.000000
2023-10-25 11:43:38,804 epoch 2 - iter 2080/2606 - loss 0.15504235 - time (sec): 111.13 - samples/sec: 2628.05 - lr: 0.000027 - momentum: 0.000000
2023-10-25 11:43:52,956 epoch 2 - iter 2340/2606 - loss 0.15212685 - time (sec): 125.28 - samples/sec: 2629.04 - lr: 0.000027 - momentum: 0.000000
2023-10-25 11:44:07,071 epoch 2 - iter 2600/2606 - loss 0.15026039 - time (sec): 139.39 - samples/sec: 2630.97 - lr: 0.000027 - momentum: 0.000000
2023-10-25 11:44:07,353 ----------------------------------------------------------------------------------------------------
2023-10-25 11:44:07,354 EPOCH 2 done: loss 0.1505 - lr: 0.000027
2023-10-25 11:44:14,184 DEV : loss 0.1197943240404129 - f1-score (micro avg) 0.3209
2023-10-25 11:44:14,209 ----------------------------------------------------------------------------------------------------
2023-10-25 11:44:27,824 epoch 3 - iter 260/2606 - loss 0.10481529 - time (sec): 13.61 - samples/sec: 2470.68 - lr: 0.000026 - momentum: 0.000000
2023-10-25 11:44:41,595 epoch 3 - iter 520/2606 - loss 0.10487908 - time (sec): 27.38 - samples/sec: 2473.13 - lr: 0.000026 - momentum: 0.000000
2023-10-25 11:44:55,771 epoch 3 - iter 780/2606 - loss 0.09618585 - time (sec): 41.56 - samples/sec: 2593.18 - lr: 0.000026 - momentum: 0.000000
2023-10-25 11:45:09,666 epoch 3 - iter 1040/2606 - loss 0.09795426 - time (sec): 55.46 - samples/sec: 2580.25 - lr: 0.000025 - momentum: 0.000000
2023-10-25 11:45:23,599 epoch 3 - iter 1300/2606 - loss 0.09869583 - time (sec): 69.39 - samples/sec: 2596.10 - lr: 0.000025 - momentum: 0.000000
2023-10-25 11:45:37,541 epoch 3 - iter 1560/2606 - loss 0.09966936 - time (sec): 83.33 - samples/sec: 2604.12 - lr: 0.000025 - momentum: 0.000000
2023-10-25 11:45:51,947 epoch 3 - iter 1820/2606 - loss 0.10055515 - time (sec): 97.74 - samples/sec: 2630.23 - lr: 0.000024 - momentum: 0.000000
2023-10-25 11:46:05,429 epoch 3 - iter 2080/2606 - loss 0.09842991 - time (sec): 111.22 - samples/sec: 2645.66 - lr: 0.000024 - momentum: 0.000000
2023-10-25 11:46:19,397 epoch 3 - iter 2340/2606 - loss 0.09950876 - time (sec): 125.19 - samples/sec: 2654.12 - lr: 0.000024 - momentum: 0.000000
2023-10-25 11:46:32,626 epoch 3 - iter 2600/2606 - loss 0.09902161 - time (sec): 138.42 - samples/sec: 2648.89 - lr: 0.000023 - momentum: 0.000000
2023-10-25 11:46:32,909 ----------------------------------------------------------------------------------------------------
2023-10-25 11:46:32,909 EPOCH 3 done: loss 0.0991 - lr: 0.000023
2023-10-25 11:46:39,755 DEV : loss 0.17354419827461243 - f1-score (micro avg) 0.3626
2023-10-25 11:46:39,781 saving best model
2023-10-25 11:46:40,474 ----------------------------------------------------------------------------------------------------
2023-10-25 11:46:54,527 epoch 4 - iter 260/2606 - loss 0.06887745 - time (sec): 14.05 - samples/sec: 2623.65 - lr: 0.000023 - momentum: 0.000000
2023-10-25 11:47:08,284 epoch 4 - iter 520/2606 - loss 0.06746623 - time (sec): 27.81 - samples/sec: 2633.87 - lr: 0.000023 - momentum: 0.000000
2023-10-25 11:47:21,844 epoch 4 - iter 780/2606 - loss 0.06724212 - time (sec): 41.37 - samples/sec: 2648.90 - lr: 0.000022 - momentum: 0.000000
2023-10-25 11:47:35,478 epoch 4 - iter 1040/2606 - loss 0.06775242 - time (sec): 55.00 - samples/sec: 2592.82 - lr: 0.000022 - momentum: 0.000000
2023-10-25 11:47:49,494 epoch 4 - iter 1300/2606 - loss 0.06853685 - time (sec): 69.02 - samples/sec: 2618.05 - lr: 0.000022 - momentum: 0.000000
2023-10-25 11:48:03,215 epoch 4 - iter 1560/2606 - loss 0.06832555 - time (sec): 82.74 - samples/sec: 2619.28 - lr: 0.000021 - momentum: 0.000000
2023-10-25 11:48:17,331 epoch 4 - iter 1820/2606 - loss 0.06715608 - time (sec): 96.85 - samples/sec: 2620.90 - lr: 0.000021 - momentum: 0.000000
2023-10-25 11:48:31,507 epoch 4 - iter 2080/2606 - loss 0.06797360 - time (sec): 111.03 - samples/sec: 2642.22 - lr: 0.000021 - momentum: 0.000000
2023-10-25 11:48:45,148 epoch 4 - iter 2340/2606 - loss 0.06732721 - time (sec): 124.67 - samples/sec: 2628.22 - lr: 0.000020 - momentum: 0.000000
2023-10-25 11:48:59,012 epoch 4 - iter 2600/2606 - loss 0.06658621 - time (sec): 138.53 - samples/sec: 2644.43 - lr: 0.000020 - momentum: 0.000000
2023-10-25 11:48:59,328 ----------------------------------------------------------------------------------------------------
2023-10-25 11:48:59,328 EPOCH 4 done: loss 0.0666 - lr: 0.000020
2023-10-25 11:49:05,589 DEV : loss 0.30783411860466003 - f1-score (micro avg) 0.3413
2023-10-25 11:49:05,614 ----------------------------------------------------------------------------------------------------
2023-10-25 11:49:20,140 epoch 5 - iter 260/2606 - loss 0.04604953 - time (sec): 14.52 - samples/sec: 2436.13 - lr: 0.000020 - momentum: 0.000000
2023-10-25 11:49:33,623 epoch 5 - iter 520/2606 - loss 0.04697664 - time (sec): 28.01 - samples/sec: 2540.14 - lr: 0.000019 - momentum: 0.000000
2023-10-25 11:49:47,511 epoch 5 - iter 780/2606 - loss 0.04775602 - time (sec): 41.90 - samples/sec: 2623.00 - lr: 0.000019 - momentum: 0.000000
2023-10-25 11:50:01,505 epoch 5 - iter 1040/2606 - loss 0.04621368 - time (sec): 55.89 - samples/sec: 2641.15 - lr: 0.000019 - momentum: 0.000000
2023-10-25 11:50:15,262 epoch 5 - iter 1300/2606 - loss 0.04643873 - time (sec): 69.65 - samples/sec: 2665.18 - lr: 0.000018 - momentum: 0.000000
2023-10-25 11:50:29,597 epoch 5 - iter 1560/2606 - loss 0.04688888 - time (sec): 83.98 - samples/sec: 2672.97 - lr: 0.000018 - momentum: 0.000000
2023-10-25 11:50:43,060 epoch 5 - iter 1820/2606 - loss 0.04703940 - time (sec): 97.44 - samples/sec: 2673.71 - lr: 0.000018 - momentum: 0.000000
2023-10-25 11:50:56,340 epoch 5 - iter 2080/2606 - loss 0.04639153 - time (sec): 110.72 - samples/sec: 2664.01 - lr: 0.000017 - momentum: 0.000000
2023-10-25 11:51:09,859 epoch 5 - iter 2340/2606 - loss 0.04516071 - time (sec): 124.24 - samples/sec: 2649.51 - lr: 0.000017 - momentum: 0.000000
2023-10-25 11:51:24,025 epoch 5 - iter 2600/2606 - loss 0.04552710 - time (sec): 138.41 - samples/sec: 2647.47 - lr: 0.000017 - momentum: 0.000000
2023-10-25 11:51:24,357 ----------------------------------------------------------------------------------------------------
2023-10-25 11:51:24,357 EPOCH 5 done: loss 0.0455 - lr: 0.000017
2023-10-25 11:51:30,590 DEV : loss 0.3278255760669708 - f1-score (micro avg) 0.4015
2023-10-25 11:51:30,616 saving best model
2023-10-25 11:51:31,314 ----------------------------------------------------------------------------------------------------
2023-10-25 11:51:45,515 epoch 6 - iter 260/2606 - loss 0.03373944 - time (sec): 14.20 - samples/sec: 2672.10 - lr: 0.000016 - momentum: 0.000000
2023-10-25 11:51:59,610 epoch 6 - iter 520/2606 - loss 0.03199852 - time (sec): 28.29 - samples/sec: 2723.33 - lr: 0.000016 - momentum: 0.000000
2023-10-25 11:52:13,794 epoch 6 - iter 780/2606 - loss 0.03393438 - time (sec): 42.48 - samples/sec: 2694.52 - lr: 0.000016 - momentum: 0.000000
2023-10-25 11:52:28,592 epoch 6 - iter 1040/2606 - loss 0.03480030 - time (sec): 57.28 - samples/sec: 2659.23 - lr: 0.000015 - momentum: 0.000000
2023-10-25 11:52:42,170 epoch 6 - iter 1300/2606 - loss 0.03442460 - time (sec): 70.85 - samples/sec: 2627.20 - lr: 0.000015 - momentum: 0.000000
2023-10-25 11:52:55,763 epoch 6 - iter 1560/2606 - loss 0.03670962 - time (sec): 84.45 - samples/sec: 2631.63 - lr: 0.000015 - momentum: 0.000000
2023-10-25 11:53:09,342 epoch 6 - iter 1820/2606 - loss 0.03715112 - time (sec): 98.03 - samples/sec: 2618.16 - lr: 0.000014 - momentum: 0.000000
2023-10-25 11:53:23,179 epoch 6 - iter 2080/2606 - loss 0.03633915 - time (sec): 111.86 - samples/sec: 2616.67 - lr: 0.000014 - momentum: 0.000000
2023-10-25 11:53:37,063 epoch 6 - iter 2340/2606 - loss 0.03702732 - time (sec): 125.75 - samples/sec: 2617.58 - lr: 0.000014 - momentum: 0.000000
2023-10-25 11:53:51,255 epoch 6 - iter 2600/2606 - loss 0.03728123 - time (sec): 139.94 - samples/sec: 2619.81 - lr: 0.000013 - momentum: 0.000000
2023-10-25 11:53:51,569 ----------------------------------------------------------------------------------------------------
2023-10-25 11:53:51,569 EPOCH 6 done: loss 0.0373 - lr: 0.000013
2023-10-25 11:53:57,772 DEV : loss 0.34462153911590576 - f1-score (micro avg) 0.3964
2023-10-25 11:53:57,797 ----------------------------------------------------------------------------------------------------
2023-10-25 11:54:11,682 epoch 7 - iter 260/2606 - loss 0.01979562 - time (sec): 13.88 - samples/sec: 2671.15 - lr: 0.000013 - momentum: 0.000000
2023-10-25 11:54:25,692 epoch 7 - iter 520/2606 - loss 0.02087209 - time (sec): 27.89 - samples/sec: 2702.05 - lr: 0.000013 - momentum: 0.000000
2023-10-25 11:54:39,839 epoch 7 - iter 780/2606 - loss 0.02290501 - time (sec): 42.04 - samples/sec: 2625.40 - lr: 0.000012 - momentum: 0.000000
2023-10-25 11:54:54,087 epoch 7 - iter 1040/2606 - loss 0.02416505 - time (sec): 56.29 - samples/sec: 2625.41 - lr: 0.000012 - momentum: 0.000000
2023-10-25 11:55:08,140 epoch 7 - iter 1300/2606 - loss 0.02420180 - time (sec): 70.34 - samples/sec: 2626.94 - lr: 0.000012 - momentum: 0.000000
2023-10-25 11:55:23,022 epoch 7 - iter 1560/2606 - loss 0.02360409 - time (sec): 85.22 - samples/sec: 2626.26 - lr: 0.000011 - momentum: 0.000000
2023-10-25 11:55:38,226 epoch 7 - iter 1820/2606 - loss 0.02405417 - time (sec): 100.43 - samples/sec: 2591.22 - lr: 0.000011 - momentum: 0.000000
2023-10-25 11:55:51,845 epoch 7 - iter 2080/2606 - loss 0.02451150 - time (sec): 114.05 - samples/sec: 2587.49 - lr: 0.000011 - momentum: 0.000000
2023-10-25 11:56:06,068 epoch 7 - iter 2340/2606 - loss 0.02449885 - time (sec): 128.27 - samples/sec: 2578.50 - lr: 0.000010 - momentum: 0.000000
2023-10-25 11:56:19,845 epoch 7 - iter 2600/2606 - loss 0.02439320 - time (sec): 142.05 - samples/sec: 2578.34 - lr: 0.000010 - momentum: 0.000000
2023-10-25 11:56:20,236 ----------------------------------------------------------------------------------------------------
2023-10-25 11:56:20,236 EPOCH 7 done: loss 0.0244 - lr: 0.000010
2023-10-25 11:56:26,581 DEV : loss 0.32091233134269714 - f1-score (micro avg) 0.4248
2023-10-25 11:56:26,606 saving best model
2023-10-25 11:56:27,151 ----------------------------------------------------------------------------------------------------
2023-10-25 11:56:41,384 epoch 8 - iter 260/2606 - loss 0.02252562 - time (sec): 14.23 - samples/sec: 2609.58 - lr: 0.000010 - momentum: 0.000000
2023-10-25 11:56:55,898 epoch 8 - iter 520/2606 - loss 0.01930306 - time (sec): 28.74 - samples/sec: 2625.60 - lr: 0.000009 - momentum: 0.000000
2023-10-25 11:57:10,032 epoch 8 - iter 780/2606 - loss 0.01870884 - time (sec): 42.88 - samples/sec: 2618.88 - lr: 0.000009 - momentum: 0.000000
2023-10-25 11:57:24,578 epoch 8 - iter 1040/2606 - loss 0.01915372 - time (sec): 57.42 - samples/sec: 2612.96 - lr: 0.000009 - momentum: 0.000000
2023-10-25 11:57:40,398 epoch 8 - iter 1300/2606 - loss 0.01860075 - time (sec): 73.24 - samples/sec: 2595.58 - lr: 0.000008 - momentum: 0.000000
2023-10-25 11:57:54,134 epoch 8 - iter 1560/2606 - loss 0.01949918 - time (sec): 86.98 - samples/sec: 2586.84 - lr: 0.000008 - momentum: 0.000000
2023-10-25 11:58:08,121 epoch 8 - iter 1820/2606 - loss 0.01962360 - time (sec): 100.97 - samples/sec: 2558.62 - lr: 0.000008 - momentum: 0.000000
2023-10-25 11:58:23,529 epoch 8 - iter 2080/2606 - loss 0.01974142 - time (sec): 116.37 - samples/sec: 2547.11 - lr: 0.000007 - momentum: 0.000000
2023-10-25 11:58:37,973 epoch 8 - iter 2340/2606 - loss 0.01965665 - time (sec): 130.82 - samples/sec: 2541.92 - lr: 0.000007 - momentum: 0.000000
2023-10-25 11:58:52,332 epoch 8 - iter 2600/2606 - loss 0.01992311 - time (sec): 145.18 - samples/sec: 2525.82 - lr: 0.000007 - momentum: 0.000000
2023-10-25 11:58:52,624 ----------------------------------------------------------------------------------------------------
2023-10-25 11:58:52,625 EPOCH 8 done: loss 0.0199 - lr: 0.000007
2023-10-25 11:58:59,774 DEV : loss 0.4752597510814667 - f1-score (micro avg) 0.3824
2023-10-25 11:58:59,801 ----------------------------------------------------------------------------------------------------
2023-10-25 11:59:13,984 epoch 9 - iter 260/2606 - loss 0.01615083 - time (sec): 14.18 - samples/sec: 2541.15 - lr: 0.000006 - momentum: 0.000000
2023-10-25 11:59:27,976 epoch 9 - iter 520/2606 - loss 0.01461464 - time (sec): 28.17 - samples/sec: 2585.33 - lr: 0.000006 - momentum: 0.000000
2023-10-25 11:59:42,170 epoch 9 - iter 780/2606 - loss 0.01350285 - time (sec): 42.37 - samples/sec: 2579.29 - lr: 0.000006 - momentum: 0.000000
2023-10-25 11:59:56,028 epoch 9 - iter 1040/2606 - loss 0.01384109 - time (sec): 56.23 - samples/sec: 2587.65 - lr: 0.000005 - momentum: 0.000000
2023-10-25 12:00:10,480 epoch 9 - iter 1300/2606 - loss 0.01413513 - time (sec): 70.68 - samples/sec: 2595.62 - lr: 0.000005 - momentum: 0.000000
2023-10-25 12:00:24,644 epoch 9 - iter 1560/2606 - loss 0.01406109 - time (sec): 84.84 - samples/sec: 2600.34 - lr: 0.000005 - momentum: 0.000000
2023-10-25 12:00:38,961 epoch 9 - iter 1820/2606 - loss 0.01421979 - time (sec): 99.16 - samples/sec: 2603.67 - lr: 0.000004 - momentum: 0.000000
2023-10-25 12:00:53,372 epoch 9 - iter 2080/2606 - loss 0.01397783 - time (sec): 113.57 - samples/sec: 2582.73 - lr: 0.000004 - momentum: 0.000000
2023-10-25 12:01:08,292 epoch 9 - iter 2340/2606 - loss 0.01361878 - time (sec): 128.49 - samples/sec: 2573.71 - lr: 0.000004 - momentum: 0.000000
2023-10-25 12:01:22,562 epoch 9 - iter 2600/2606 - loss 0.01331532 - time (sec): 142.76 - samples/sec: 2566.49 - lr: 0.000003 - momentum: 0.000000
2023-10-25 12:01:22,891 ----------------------------------------------------------------------------------------------------
2023-10-25 12:01:22,891 EPOCH 9 done: loss 0.0133 - lr: 0.000003
2023-10-25 12:01:29,986 DEV : loss 0.4919085204601288 - f1-score (micro avg) 0.3897
2023-10-25 12:01:30,013 ----------------------------------------------------------------------------------------------------
2023-10-25 12:01:44,440 epoch 10 - iter 260/2606 - loss 0.00534808 - time (sec): 14.43 - samples/sec: 2579.04 - lr: 0.000003 - momentum: 0.000000
2023-10-25 12:01:58,548 epoch 10 - iter 520/2606 - loss 0.00657158 - time (sec): 28.53 - samples/sec: 2522.87 - lr: 0.000003 - momentum: 0.000000
2023-10-25 12:02:14,114 epoch 10 - iter 780/2606 - loss 0.00801464 - time (sec): 44.10 - samples/sec: 2529.83 - lr: 0.000002 - momentum: 0.000000
2023-10-25 12:02:28,519 epoch 10 - iter 1040/2606 - loss 0.00927196 - time (sec): 58.50 - samples/sec: 2551.25 - lr: 0.000002 - momentum: 0.000000
2023-10-25 12:02:42,788 epoch 10 - iter 1300/2606 - loss 0.00938321 - time (sec): 72.77 - samples/sec: 2518.72 - lr: 0.000002 - momentum: 0.000000
2023-10-25 12:02:57,574 epoch 10 - iter 1560/2606 - loss 0.01027436 - time (sec): 87.56 - samples/sec: 2506.66 - lr: 0.000001 - momentum: 0.000000
2023-10-25 12:03:11,726 epoch 10 - iter 1820/2606 - loss 0.01005900 - time (sec): 101.71 - samples/sec: 2509.51 - lr: 0.000001 - momentum: 0.000000
2023-10-25 12:03:25,658 epoch 10 - iter 2080/2606 - loss 0.00995170 - time (sec): 115.64 - samples/sec: 2529.85 - lr: 0.000001 - momentum: 0.000000
2023-10-25 12:03:39,660 epoch 10 - iter 2340/2606 - loss 0.00964683 - time (sec): 129.65 - samples/sec: 2546.93 - lr: 0.000000 - momentum: 0.000000
2023-10-25 12:03:53,974 epoch 10 - iter 2600/2606 - loss 0.00939974 - time (sec): 143.96 - samples/sec: 2548.58 - lr: 0.000000 - momentum: 0.000000
2023-10-25 12:03:54,247 ----------------------------------------------------------------------------------------------------
2023-10-25 12:03:54,247 EPOCH 10 done: loss 0.0094 - lr: 0.000000
2023-10-25 12:04:01,161 DEV : loss 0.4912854731082916 - f1-score (micro avg) 0.3849
2023-10-25 12:04:01,808 ----------------------------------------------------------------------------------------------------
2023-10-25 12:04:01,809 Loading model from best epoch ...
2023-10-25 12:04:03,671 SequenceTagger predicts: Dictionary with 17 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-PER, B-PER, E-PER, I-PER, S-ORG, B-ORG, E-ORG, I-ORG, S-HumanProd, B-HumanProd, E-HumanProd, I-HumanProd
2023-10-25 12:04:13,558
Results:
- F-score (micro) 0.4363
- F-score (macro) 0.2969
- Accuracy 0.2841
By class:
              precision    recall  f1-score   support

         LOC     0.4646    0.4967    0.4801      1214
         PER     0.4357    0.4567    0.4459       808
         ORG     0.2788    0.2465    0.2617       353
   HumanProd     0.0000    0.0000    0.0000        15

   micro avg     0.4298    0.4431    0.4363      2390
   macro avg     0.2948    0.3000    0.2969      2390
weighted avg     0.4244    0.4431    0.4333      2390
2023-10-25 12:04:13,558 ----------------------------------------------------------------------------------------------------
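
Once training has finished, the checkpoint written at the "saving best model" steps above can be loaded for inference. A minimal sketch; the example sentence is made up:

from flair.data import Sentence
from flair.models import SequenceTagger

# Load the best checkpoint from the base path logged above
tagger = SequenceTagger.load(
    "hmbench-newseye/de-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2/best-model.pt"
)

# Tag a (hypothetical) historic German sentence
sentence = Sentence("Die Redaktion der Wiener Zeitung berichtete aus Paris .")
tagger.predict(sentence)

for span in sentence.get_spans("ner"):
    print(span)  # spans labeled LOC / PER / ORG / HumanProd (BIOES-decoded)
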