2024-03-26 12:12:19,300 ----------------------------------------------------------------------------------------------------
2024-03-26 12:12:19,300 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(30001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=17, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2024-03-26 12:12:19,300 ----------------------------------------------------------------------------------------------------
2024-03-26 12:12:19,300 Corpus: 758 train + 94 dev + 96 test sentences
2024-03-26 12:12:19,300 ----------------------------------------------------------------------------------------------------
2024-03-26 12:12:19,300 Train: 758 sentences
2024-03-26 12:12:19,300 (train_with_dev=False, train_with_test=False)
2024-03-26 12:12:19,300 ----------------------------------------------------------------------------------------------------
2024-03-26 12:12:19,300 Training Params:
2024-03-26 12:12:19,300 - learning_rate: "3e-05"
2024-03-26 12:12:19,300 - mini_batch_size: "8"
2024-03-26 12:12:19,300 - max_epochs: "10"
2024-03-26 12:12:19,300 - shuffle: "True"
2024-03-26 12:12:19,300 ----------------------------------------------------------------------------------------------------
2024-03-26 12:12:19,301 Plugins:
2024-03-26 12:12:19,301 - TensorboardLogger
2024-03-26 12:12:19,301 - LinearScheduler | warmup_fraction: '0.1'
2024-03-26 12:12:19,301 ----------------------------------------------------------------------------------------------------
2024-03-26 12:12:19,301 Final evaluation on model from best epoch (best-model.pt)
2024-03-26 12:12:19,301 - metric: "('micro avg', 'f1-score')"
2024-03-26 12:12:19,301 ----------------------------------------------------------------------------------------------------
2024-03-26 12:12:19,301 Computation:
2024-03-26 12:12:19,301 - compute on device: cuda:0
2024-03-26 12:12:19,301 - embedding storage: none
2024-03-26 12:12:19,301 ----------------------------------------------------------------------------------------------------
2024-03-26 12:12:19,301 Model training base path: "flair-co-funer-german_bert_base-bs8-e10-lr3e-05-5"
2024-03-26 12:12:19,301 ----------------------------------------------------------------------------------------------------
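
Note: the parameters logged above map onto Flair's fine-tuning entry point; in recent Flair versions, fine_tune applies a linear schedule with warmup fraction 0.1 by default, which matches the LinearScheduler plugin listed here. A hedged sketch of the trainer call (the TensorboardLogger plugin is omitted because its exact attachment is not shown in this log):

# Hedged training sketch matching the logged run: lr 3e-05, batch size 8, 10 epochs,
# base path taken from the log line above.
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "flair-co-funer-german_bert_base-bs8-e10-lr3e-05-5",
    learning_rate=3e-05,
    mini_batch_size=8,
    max_epochs=10,
)
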
2024-03-26 12:12:19,301 ----------------------------------------------------------------------------------------------------
2024-03-26 12:12:19,301 Logging anything other than scalars to TensorBoard is currently not supported.
2024-03-26 12:12:21,208 epoch 1 - iter 9/95 - loss 3.13025724 - time (sec): 1.91 - samples/sec: 1644.18 - lr: 0.000003 - momentum: 0.000000
2024-03-26 12:12:23,119 epoch 1 - iter 18/95 - loss 3.02143589 - time (sec): 3.82 - samples/sec: 1737.47 - lr: 0.000005 - momentum: 0.000000
2024-03-26 12:12:25,540 epoch 1 - iter 27/95 - loss 2.85748883 - time (sec): 6.24 - samples/sec: 1662.40 - lr: 0.000008 - momentum: 0.000000
2024-03-26 12:12:27,055 epoch 1 - iter 36/95 - loss 2.68721258 - time (sec): 7.75 - samples/sec: 1742.22 - lr: 0.000011 - momentum: 0.000000
2024-03-26 12:12:29,242 epoch 1 - iter 45/95 - loss 2.53500844 - time (sec): 9.94 - samples/sec: 1728.71 - lr: 0.000014 - momentum: 0.000000
2024-03-26 12:12:30,855 epoch 1 - iter 54/95 - loss 2.38005317 - time (sec): 11.55 - samples/sec: 1751.24 - lr: 0.000017 - momentum: 0.000000
2024-03-26 12:12:32,532 epoch 1 - iter 63/95 - loss 2.24884194 - time (sec): 13.23 - samples/sec: 1768.75 - lr: 0.000020 - momentum: 0.000000
2024-03-26 12:12:34,462 epoch 1 - iter 72/95 - loss 2.11810715 - time (sec): 15.16 - samples/sec: 1761.01 - lr: 0.000022 - momentum: 0.000000
2024-03-26 12:12:36,555 epoch 1 - iter 81/95 - loss 1.97458181 - time (sec): 17.25 - samples/sec: 1747.37 - lr: 0.000025 - momentum: 0.000000
2024-03-26 12:12:38,201 epoch 1 - iter 90/95 - loss 1.86443011 - time (sec): 18.90 - samples/sec: 1742.15 - lr: 0.000028 - momentum: 0.000000
2024-03-26 12:12:38,984 ----------------------------------------------------------------------------------------------------
2024-03-26 12:12:38,984 EPOCH 1 done: loss 1.8102 - lr: 0.000028
2024-03-26 12:12:39,981 DEV : loss 0.4937804043292999 - f1-score (micro avg) 0.6384
2024-03-26 12:12:39,982 saving best model
2024-03-26 12:12:40,246 ----------------------------------------------------------------------------------------------------
2024-03-26 12:12:42,553 epoch 2 - iter 9/95 - loss 0.58742447 - time (sec): 2.31 - samples/sec: 1654.76 - lr: 0.000030 - momentum: 0.000000
2024-03-26 12:12:44,506 epoch 2 - iter 18/95 - loss 0.53837974 - time (sec): 4.26 - samples/sec: 1642.13 - lr: 0.000029 - momentum: 0.000000
2024-03-26 12:12:46,904 epoch 2 - iter 27/95 - loss 0.49843502 - time (sec): 6.66 - samples/sec: 1606.77 - lr: 0.000029 - momentum: 0.000000
2024-03-26 12:12:48,273 epoch 2 - iter 36/95 - loss 0.48166426 - time (sec): 8.03 - samples/sec: 1721.68 - lr: 0.000029 - momentum: 0.000000
2024-03-26 12:12:50,276 epoch 2 - iter 45/95 - loss 0.45221242 - time (sec): 10.03 - samples/sec: 1682.88 - lr: 0.000028 - momentum: 0.000000
2024-03-26 12:12:51,625 epoch 2 - iter 54/95 - loss 0.44817256 - time (sec): 11.38 - samples/sec: 1727.43 - lr: 0.000028 - momentum: 0.000000
2024-03-26 12:12:53,236 epoch 2 - iter 63/95 - loss 0.43139087 - time (sec): 12.99 - samples/sec: 1742.36 - lr: 0.000028 - momentum: 0.000000
2024-03-26 12:12:55,342 epoch 2 - iter 72/95 - loss 0.42449041 - time (sec): 15.10 - samples/sec: 1736.33 - lr: 0.000028 - momentum: 0.000000
2024-03-26 12:12:57,273 epoch 2 - iter 81/95 - loss 0.43031127 - time (sec): 17.03 - samples/sec: 1738.10 - lr: 0.000027 - momentum: 0.000000
2024-03-26 12:12:59,245 epoch 2 - iter 90/95 - loss 0.41306546 - time (sec): 19.00 - samples/sec: 1740.98 - lr: 0.000027 - momentum: 0.000000
2024-03-26 12:12:59,832 ----------------------------------------------------------------------------------------------------
2024-03-26 12:12:59,832 EPOCH 2 done: loss 0.4122 - lr: 0.000027
2024-03-26 12:13:00,762 DEV : loss 0.3092578947544098 - f1-score (micro avg) 0.8154
2024-03-26 12:13:00,765 saving best model
2024-03-26 12:13:01,193 ----------------------------------------------------------------------------------------------------
2024-03-26 12:13:02,422 epoch 3 - iter 9/95 - loss 0.34113790 - time (sec): 1.23 - samples/sec: 2112.58 - lr: 0.000026 - momentum: 0.000000
2024-03-26 12:13:04,696 epoch 3 - iter 18/95 - loss 0.28031594 - time (sec): 3.50 - samples/sec: 1833.74 - lr: 0.000026 - momentum: 0.000000
2024-03-26 12:13:06,415 epoch 3 - iter 27/95 - loss 0.27279972 - time (sec): 5.22 - samples/sec: 1869.20 - lr: 0.000026 - momentum: 0.000000
2024-03-26 12:13:08,180 epoch 3 - iter 36/95 - loss 0.26278997 - time (sec): 6.98 - samples/sec: 1885.44 - lr: 0.000025 - momentum: 0.000000
2024-03-26 12:13:09,639 epoch 3 - iter 45/95 - loss 0.24443392 - time (sec): 8.44 - samples/sec: 1880.76 - lr: 0.000025 - momentum: 0.000000
2024-03-26 12:13:11,835 epoch 3 - iter 54/95 - loss 0.23529485 - time (sec): 10.64 - samples/sec: 1815.61 - lr: 0.000025 - momentum: 0.000000
2024-03-26 12:13:13,566 epoch 3 - iter 63/95 - loss 0.23725847 - time (sec): 12.37 - samples/sec: 1799.13 - lr: 0.000025 - momentum: 0.000000
2024-03-26 12:13:15,909 epoch 3 - iter 72/95 - loss 0.22499834 - time (sec): 14.71 - samples/sec: 1762.87 - lr: 0.000024 - momentum: 0.000000
2024-03-26 12:13:18,163 epoch 3 - iter 81/95 - loss 0.22402785 - time (sec): 16.97 - samples/sec: 1754.21 - lr: 0.000024 - momentum: 0.000000
2024-03-26 12:13:19,920 epoch 3 - iter 90/95 - loss 0.22009592 - time (sec): 18.73 - samples/sec: 1747.41 - lr: 0.000024 - momentum: 0.000000
2024-03-26 12:13:20,816 ----------------------------------------------------------------------------------------------------
2024-03-26 12:13:20,816 EPOCH 3 done: loss 0.2181 - lr: 0.000024
2024-03-26 12:13:21,744 DEV : loss 0.24916645884513855 - f1-score (micro avg) 0.8601
2024-03-26 12:13:21,745 saving best model
2024-03-26 12:13:22,175 ----------------------------------------------------------------------------------------------------
2024-03-26 12:13:25,101 epoch 4 - iter 9/95 - loss 0.11820412 - time (sec): 2.92 - samples/sec: 1459.63 - lr: 0.000023 - momentum: 0.000000
2024-03-26 12:13:26,164 epoch 4 - iter 18/95 - loss 0.15462492 - time (sec): 3.99 - samples/sec: 1669.39 - lr: 0.000023 - momentum: 0.000000
2024-03-26 12:13:28,776 epoch 4 - iter 27/95 - loss 0.14114747 - time (sec): 6.60 - samples/sec: 1612.13 - lr: 0.000022 - momentum: 0.000000
2024-03-26 12:13:31,431 epoch 4 - iter 36/95 - loss 0.13929534 - time (sec): 9.25 - samples/sec: 1568.22 - lr: 0.000022 - momentum: 0.000000
2024-03-26 12:13:33,158 epoch 4 - iter 45/95 - loss 0.13082601 - time (sec): 10.98 - samples/sec: 1603.96 - lr: 0.000022 - momentum: 0.000000
2024-03-26 12:13:34,890 epoch 4 - iter 54/95 - loss 0.13322136 - time (sec): 12.71 - samples/sec: 1620.46 - lr: 0.000022 - momentum: 0.000000
2024-03-26 12:13:36,852 epoch 4 - iter 63/95 - loss 0.13291566 - time (sec): 14.67 - samples/sec: 1646.11 - lr: 0.000021 - momentum: 0.000000
2024-03-26 12:13:38,612 epoch 4 - iter 72/95 - loss 0.13863198 - time (sec): 16.43 - samples/sec: 1688.91 - lr: 0.000021 - momentum: 0.000000
2024-03-26 12:13:39,651 epoch 4 - iter 81/95 - loss 0.14000391 - time (sec): 17.47 - samples/sec: 1729.15 - lr: 0.000021 - momentum: 0.000000
2024-03-26 12:13:41,104 epoch 4 - iter 90/95 - loss 0.13896442 - time (sec): 18.93 - samples/sec: 1753.54 - lr: 0.000020 - momentum: 0.000000
2024-03-26 12:13:41,661 ----------------------------------------------------------------------------------------------------
2024-03-26 12:13:41,661 EPOCH 4 done: loss 0.1403 - lr: 0.000020
2024-03-26 12:13:42,613 DEV : loss 0.20587997138500214 - f1-score (micro avg) 0.8791
2024-03-26 12:13:42,614 saving best model
2024-03-26 12:13:43,044 ----------------------------------------------------------------------------------------------------
2024-03-26 12:13:44,716 epoch 5 - iter 9/95 - loss 0.12833967 - time (sec): 1.67 - samples/sec: 1960.98 - lr: 0.000020 - momentum: 0.000000
2024-03-26 12:13:46,724 epoch 5 - iter 18/95 - loss 0.10224678 - time (sec): 3.68 - samples/sec: 1935.84 - lr: 0.000019 - momentum: 0.000000
2024-03-26 12:13:48,896 epoch 5 - iter 27/95 - loss 0.08891684 - time (sec): 5.85 - samples/sec: 1809.69 - lr: 0.000019 - momentum: 0.000000
2024-03-26 12:13:50,265 epoch 5 - iter 36/95 - loss 0.10021714 - time (sec): 7.22 - samples/sec: 1862.38 - lr: 0.000019 - momentum: 0.000000
2024-03-26 12:13:52,392 epoch 5 - iter 45/95 - loss 0.09830224 - time (sec): 9.35 - samples/sec: 1823.04 - lr: 0.000019 - momentum: 0.000000
2024-03-26 12:13:53,589 epoch 5 - iter 54/95 - loss 0.09884655 - time (sec): 10.54 - samples/sec: 1857.10 - lr: 0.000018 - momentum: 0.000000
2024-03-26 12:13:55,110 epoch 5 - iter 63/95 - loss 0.10262264 - time (sec): 12.06 - samples/sec: 1868.69 - lr: 0.000018 - momentum: 0.000000
2024-03-26 12:13:57,140 epoch 5 - iter 72/95 - loss 0.10403952 - time (sec): 14.09 - samples/sec: 1832.27 - lr: 0.000018 - momentum: 0.000000
2024-03-26 12:13:58,948 epoch 5 - iter 81/95 - loss 0.10088906 - time (sec): 15.90 - samples/sec: 1821.49 - lr: 0.000017 - momentum: 0.000000
2024-03-26 12:14:01,414 epoch 5 - iter 90/95 - loss 0.10022864 - time (sec): 18.37 - samples/sec: 1789.96 - lr: 0.000017 - momentum: 0.000000
2024-03-26 12:14:02,413 ----------------------------------------------------------------------------------------------------
2024-03-26 12:14:02,413 EPOCH 5 done: loss 0.0980 - lr: 0.000017
2024-03-26 12:14:03,349 DEV : loss 0.2219552993774414 - f1-score (micro avg) 0.8969
2024-03-26 12:14:03,350 saving best model
2024-03-26 12:14:03,771 ----------------------------------------------------------------------------------------------------
2024-03-26 12:14:05,854 epoch 6 - iter 9/95 - loss 0.08154132 - time (sec): 2.08 - samples/sec: 1567.72 - lr: 0.000016 - momentum: 0.000000
2024-03-26 12:14:08,363 epoch 6 - iter 18/95 - loss 0.07889917 - time (sec): 4.59 - samples/sec: 1615.59 - lr: 0.000016 - momentum: 0.000000
2024-03-26 12:14:09,530 epoch 6 - iter 27/95 - loss 0.09804581 - time (sec): 5.76 - samples/sec: 1717.55 - lr: 0.000016 - momentum: 0.000000
2024-03-26 12:14:11,239 epoch 6 - iter 36/95 - loss 0.08941652 - time (sec): 7.47 - samples/sec: 1728.34 - lr: 0.000016 - momentum: 0.000000
2024-03-26 12:14:13,231 epoch 6 - iter 45/95 - loss 0.08535015 - time (sec): 9.46 - samples/sec: 1727.49 - lr: 0.000015 - momentum: 0.000000
2024-03-26 12:14:15,437 epoch 6 - iter 54/95 - loss 0.07909457 - time (sec): 11.66 - samples/sec: 1696.57 - lr: 0.000015 - momentum: 0.000000
2024-03-26 12:14:17,136 epoch 6 - iter 63/95 - loss 0.08119408 - time (sec): 13.36 - samples/sec: 1718.81 - lr: 0.000015 - momentum: 0.000000
2024-03-26 12:14:18,741 epoch 6 - iter 72/95 - loss 0.08171248 - time (sec): 14.97 - samples/sec: 1739.95 - lr: 0.000014 - momentum: 0.000000
2024-03-26 12:14:20,009 epoch 6 - iter 81/95 - loss 0.08051873 - time (sec): 16.24 - samples/sec: 1770.49 - lr: 0.000014 - momentum: 0.000000
2024-03-26 12:14:21,915 epoch 6 - iter 90/95 - loss 0.07683396 - time (sec): 18.14 - samples/sec: 1771.12 - lr: 0.000014 - momentum: 0.000000
2024-03-26 12:14:23,478 ----------------------------------------------------------------------------------------------------
2024-03-26 12:14:23,478 EPOCH 6 done: loss 0.0741 - lr: 0.000014
2024-03-26 12:14:24,413 DEV : loss 0.20766964554786682 - f1-score (micro avg) 0.9088
2024-03-26 12:14:24,414 saving best model
2024-03-26 12:14:24,842 ----------------------------------------------------------------------------------------------------
2024-03-26 12:14:26,546 epoch 7 - iter 9/95 - loss 0.03701075 - time (sec): 1.70 - samples/sec: 1850.51 - lr: 0.000013 - momentum: 0.000000
2024-03-26 12:14:28,062 epoch 7 - iter 18/95 - loss 0.05482186 - time (sec): 3.22 - samples/sec: 1828.40 - lr: 0.000013 - momentum: 0.000000
2024-03-26 12:14:29,376 epoch 7 - iter 27/95 - loss 0.07014500 - time (sec): 4.53 - samples/sec: 1868.91 - lr: 0.000013 - momentum: 0.000000
2024-03-26 12:14:31,662 epoch 7 - iter 36/95 - loss 0.06250571 - time (sec): 6.82 - samples/sec: 1864.31 - lr: 0.000012 - momentum: 0.000000
2024-03-26 12:14:33,631 epoch 7 - iter 45/95 - loss 0.06506855 - time (sec): 8.79 - samples/sec: 1853.20 - lr: 0.000012 - momentum: 0.000000
2024-03-26 12:14:35,345 epoch 7 - iter 54/95 - loss 0.06315221 - time (sec): 10.50 - samples/sec: 1846.73 - lr: 0.000012 - momentum: 0.000000
2024-03-26 12:14:36,942 epoch 7 - iter 63/95 - loss 0.06212348 - time (sec): 12.10 - samples/sec: 1863.11 - lr: 0.000011 - momentum: 0.000000
2024-03-26 12:14:38,488 epoch 7 - iter 72/95 - loss 0.06190864 - time (sec): 13.64 - samples/sec: 1852.74 - lr: 0.000011 - momentum: 0.000000
2024-03-26 12:14:41,298 epoch 7 - iter 81/95 - loss 0.05880913 - time (sec): 16.45 - samples/sec: 1785.58 - lr: 0.000011 - momentum: 0.000000
2024-03-26 12:14:42,960 epoch 7 - iter 90/95 - loss 0.05908614 - time (sec): 18.11 - samples/sec: 1792.78 - lr: 0.000010 - momentum: 0.000000
2024-03-26 12:14:44,132 ----------------------------------------------------------------------------------------------------
2024-03-26 12:14:44,132 EPOCH 7 done: loss 0.0586 - lr: 0.000010
2024-03-26 12:14:45,067 DEV : loss 0.19811469316482544 - f1-score (micro avg) 0.9226
2024-03-26 12:14:45,068 saving best model
2024-03-26 12:14:45,497 ----------------------------------------------------------------------------------------------------
2024-03-26 12:14:47,696 epoch 8 - iter 9/95 - loss 0.06801757 - time (sec): 2.20 - samples/sec: 1538.06 - lr: 0.000010 - momentum: 0.000000
2024-03-26 12:14:49,260 epoch 8 - iter 18/95 - loss 0.05022268 - time (sec): 3.76 - samples/sec: 1622.38 - lr: 0.000010 - momentum: 0.000000
2024-03-26 12:14:51,319 epoch 8 - iter 27/95 - loss 0.04794309 - time (sec): 5.82 - samples/sec: 1685.88 - lr: 0.000009 - momentum: 0.000000
2024-03-26 12:14:53,325 epoch 8 - iter 36/95 - loss 0.04331981 - time (sec): 7.83 - samples/sec: 1720.28 - lr: 0.000009 - momentum: 0.000000
2024-03-26 12:14:54,748 epoch 8 - iter 45/95 - loss 0.04240732 - time (sec): 9.25 - samples/sec: 1779.36 - lr: 0.000009 - momentum: 0.000000
2024-03-26 12:14:56,234 epoch 8 - iter 54/95 - loss 0.04200843 - time (sec): 10.74 - samples/sec: 1851.18 - lr: 0.000008 - momentum: 0.000000
2024-03-26 12:14:57,883 epoch 8 - iter 63/95 - loss 0.04390145 - time (sec): 12.38 - samples/sec: 1836.31 - lr: 0.000008 - momentum: 0.000000
2024-03-26 12:15:00,025 epoch 8 - iter 72/95 - loss 0.04192389 - time (sec): 14.53 - samples/sec: 1802.44 - lr: 0.000008 - momentum: 0.000000
2024-03-26 12:15:01,629 epoch 8 - iter 81/95 - loss 0.04350733 - time (sec): 16.13 - samples/sec: 1825.49 - lr: 0.000007 - momentum: 0.000000
2024-03-26 12:15:03,720 epoch 8 - iter 90/95 - loss 0.04499955 - time (sec): 18.22 - samples/sec: 1804.50 - lr: 0.000007 - momentum: 0.000000
2024-03-26 12:15:04,366 ----------------------------------------------------------------------------------------------------
2024-03-26 12:15:04,366 EPOCH 8 done: loss 0.0462 - lr: 0.000007
2024-03-26 12:15:05,325 DEV : loss 0.20118741691112518 - f1-score (micro avg) 0.9394
2024-03-26 12:15:05,326 saving best model
2024-03-26 12:15:05,774 ----------------------------------------------------------------------------------------------------
2024-03-26 12:15:08,384 epoch 9 - iter 9/95 - loss 0.02615153 - time (sec): 2.61 - samples/sec: 1653.42 - lr: 0.000007 - momentum: 0.000000
2024-03-26 12:15:09,978 epoch 9 - iter 18/95 - loss 0.03326910 - time (sec): 4.20 - samples/sec: 1720.59 - lr: 0.000006 - momentum: 0.000000
2024-03-26 12:15:12,592 epoch 9 - iter 27/95 - loss 0.03556908 - time (sec): 6.82 - samples/sec: 1658.17 - lr: 0.000006 - momentum: 0.000000
2024-03-26 12:15:14,456 epoch 9 - iter 36/95 - loss 0.04028057 - time (sec): 8.68 - samples/sec: 1669.48 - lr: 0.000006 - momentum: 0.000000
2024-03-26 12:15:15,637 epoch 9 - iter 45/95 - loss 0.03852358 - time (sec): 9.86 - samples/sec: 1729.66 - lr: 0.000005 - momentum: 0.000000
2024-03-26 12:15:17,406 epoch 9 - iter 54/95 - loss 0.03538469 - time (sec): 11.63 - samples/sec: 1725.02 - lr: 0.000005 - momentum: 0.000000
2024-03-26 12:15:18,828 epoch 9 - iter 63/95 - loss 0.04012157 - time (sec): 13.05 - samples/sec: 1769.75 - lr: 0.000005 - momentum: 0.000000
2024-03-26 12:15:20,027 epoch 9 - iter 72/95 - loss 0.03841923 - time (sec): 14.25 - samples/sec: 1817.85 - lr: 0.000004 - momentum: 0.000000
2024-03-26 12:15:21,570 epoch 9 - iter 81/95 - loss 0.03652668 - time (sec): 15.80 - samples/sec: 1818.78 - lr: 0.000004 - momentum: 0.000000
2024-03-26 12:15:24,370 epoch 9 - iter 90/95 - loss 0.03907133 - time (sec): 18.60 - samples/sec: 1773.31 - lr: 0.000004 - momentum: 0.000000
2024-03-26 12:15:25,165 ----------------------------------------------------------------------------------------------------
2024-03-26 12:15:25,165 EPOCH 9 done: loss 0.0383 - lr: 0.000004
2024-03-26 12:15:26,112 DEV : loss 0.19695152342319489 - f1-score (micro avg) 0.9206
2024-03-26 12:15:26,114 ----------------------------------------------------------------------------------------------------
2024-03-26 12:15:28,623 epoch 10 - iter 9/95 - loss 0.02975017 - time (sec): 2.51 - samples/sec: 1608.89 - lr: 0.000003 - momentum: 0.000000
2024-03-26 12:15:30,201 epoch 10 - iter 18/95 - loss 0.02892990 - time (sec): 4.09 - samples/sec: 1706.28 - lr: 0.000003 - momentum: 0.000000
2024-03-26 12:15:32,208 epoch 10 - iter 27/95 - loss 0.02863351 - time (sec): 6.09 - samples/sec: 1653.77 - lr: 0.000003 - momentum: 0.000000
2024-03-26 12:15:34,382 epoch 10 - iter 36/95 - loss 0.02798792 - time (sec): 8.27 - samples/sec: 1650.65 - lr: 0.000002 - momentum: 0.000000
2024-03-26 12:15:36,278 epoch 10 - iter 45/95 - loss 0.02736004 - time (sec): 10.16 - samples/sec: 1669.06 - lr: 0.000002 - momentum: 0.000000
2024-03-26 12:15:37,439 epoch 10 - iter 54/95 - loss 0.02848430 - time (sec): 11.32 - samples/sec: 1730.11 - lr: 0.000002 - momentum: 0.000000
2024-03-26 12:15:39,106 epoch 10 - iter 63/95 - loss 0.03460360 - time (sec): 12.99 - samples/sec: 1751.63 - lr: 0.000001 - momentum: 0.000000
2024-03-26 12:15:40,958 epoch 10 - iter 72/95 - loss 0.03291993 - time (sec): 14.84 - samples/sec: 1742.85 - lr: 0.000001 - momentum: 0.000000
2024-03-26 12:15:42,680 epoch 10 - iter 81/95 - loss 0.03440719 - time (sec): 16.57 - samples/sec: 1752.77 - lr: 0.000001 - momentum: 0.000000
2024-03-26 12:15:45,525 epoch 10 - iter 90/95 - loss 0.03165072 - time (sec): 19.41 - samples/sec: 1717.54 - lr: 0.000000 - momentum: 0.000000
2024-03-26 12:15:46,076 ----------------------------------------------------------------------------------------------------
2024-03-26 12:15:46,076 EPOCH 10 done: loss 0.0315 - lr: 0.000000
2024-03-26 12:15:47,013 DEV : loss 0.20343400537967682 - f1-score (micro avg) 0.9299
2024-03-26 12:15:47,277 ----------------------------------------------------------------------------------------------------
2024-03-26 12:15:47,277 Loading model from best epoch ...
2024-03-26 12:15:48,132 SequenceTagger predicts: Dictionary with 17 tags: O, S-Unternehmen, B-Unternehmen, E-Unternehmen, I-Unternehmen, S-Auslagerung, B-Auslagerung, E-Auslagerung, I-Auslagerung, S-Ort, B-Ort, E-Ort, I-Ort, S-Software, B-Software, E-Software, I-Software
2024-03-26 12:15:48,895
Results:
- F-score (micro) 0.903
- F-score (macro) 0.6866
- Accuracy 0.8254
By class:
              precision    recall  f1-score   support

 Unternehmen     0.8893    0.8759    0.8826       266
 Auslagerung     0.8726    0.9076    0.8898       249
         Ort     0.9635    0.9851    0.9742       134
    Software     0.0000    0.0000    0.0000         0

   micro avg     0.8955    0.9106    0.9030       649
   macro avg     0.6814    0.6922    0.6866       649
weighted avg     0.8982    0.9106    0.9042       649
2024-03-26 12:15:48,895 ----------------------------------------------------------------------------------------------------
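
Note: the best checkpoint saved during this run (best-model.pt under the base path above) can be reloaded for inference. A minimal usage sketch; the example sentence is invented and the "ner" label type is assumed.

# Hedged inference sketch: load the best checkpoint from this run and tag a sentence
# with the BIOES label set listed above (Unternehmen, Auslagerung, Ort, Software).
from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load("flair-co-funer-german_bert_base-bs8-e10-lr3e-05-5/best-model.pt")

sentence = Sentence("Die Verwahrung der Wertpapiere wird an die Beispielbank AG in Frankfurt ausgelagert.")
tagger.predict(sentence)

for span in sentence.get_spans("ner"):   # "ner" is the assumed label type
    print(span.text, span.tag, span.score)
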