stefan-it committed on
Commit
bf767d2
1 Parent(s): c80f502

Upload ./training.log with huggingface_hub

Files changed (1)
  1. training.log +264 -0
training.log ADDED
@@ -0,0 +1,264 @@
+ 2024-03-26 11:12:07,458 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:12:07,459 Model: "SequenceTagger(
+   (embeddings): TransformerWordEmbeddings(
+     (model): BertModel(
+       (embeddings): BertEmbeddings(
+         (word_embeddings): Embedding(30001, 768)
+         (position_embeddings): Embedding(512, 768)
+         (token_type_embeddings): Embedding(2, 768)
+         (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+         (dropout): Dropout(p=0.1, inplace=False)
+       )
+       (encoder): BertEncoder(
+         (layer): ModuleList(
+           (0-11): 12 x BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+         )
+       )
+       (pooler): BertPooler(
+         (dense): Linear(in_features=768, out_features=768, bias=True)
+         (activation): Tanh()
+       )
+     )
+   )
+   (locked_dropout): LockedDropout(p=0.5)
+   (linear): Linear(in_features=768, out_features=17, bias=True)
+   (loss_function): CrossEntropyLoss()
+ )"
+ 2024-03-26 11:12:07,459 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:12:07,459 Corpus: 758 train + 94 dev + 96 test sentences
+ 2024-03-26 11:12:07,459 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:12:07,459 Train: 758 sentences
+ 2024-03-26 11:12:07,459 (train_with_dev=False, train_with_test=False)
+ 2024-03-26 11:12:07,459 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:12:07,459 Training Params:
+ 2024-03-26 11:12:07,459 - learning_rate: "3e-05"
+ 2024-03-26 11:12:07,459 - mini_batch_size: "16"
+ 2024-03-26 11:12:07,459 - max_epochs: "10"
+ 2024-03-26 11:12:07,459 - shuffle: "True"
+ 2024-03-26 11:12:07,459 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:12:07,459 Plugins:
+ 2024-03-26 11:12:07,459 - TensorboardLogger
+ 2024-03-26 11:12:07,459 - LinearScheduler | warmup_fraction: '0.1'
+ 2024-03-26 11:12:07,459 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:12:07,459 Final evaluation on model from best epoch (best-model.pt)
+ 2024-03-26 11:12:07,459 - metric: "('micro avg', 'f1-score')"
+ 2024-03-26 11:12:07,459 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:12:07,459 Computation:
+ 2024-03-26 11:12:07,459 - compute on device: cuda:0
+ 2024-03-26 11:12:07,459 - embedding storage: none
+ 2024-03-26 11:12:07,459 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:12:07,459 Model training base path: "flair-co-funer-german_bert_base-bs16-e10-lr3e-05-2"
+ 2024-03-26 11:12:07,459 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:12:07,459 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:12:07,459 Logging anything other than scalars to TensorBoard is currently not supported.
+ 2024-03-26 11:12:09,254 epoch 1 - iter 4/48 - loss 3.16306784 - time (sec): 1.79 - samples/sec: 1683.21 - lr: 0.000002 - momentum: 0.000000
+ 2024-03-26 11:12:11,484 epoch 1 - iter 8/48 - loss 3.08728552 - time (sec): 4.02 - samples/sec: 1542.55 - lr: 0.000004 - momentum: 0.000000
+ 2024-03-26 11:12:13,424 epoch 1 - iter 12/48 - loss 3.02503918 - time (sec): 5.96 - samples/sec: 1494.59 - lr: 0.000007 - momentum: 0.000000
+ 2024-03-26 11:12:15,448 epoch 1 - iter 16/48 - loss 2.90887267 - time (sec): 7.99 - samples/sec: 1518.99 - lr: 0.000009 - momentum: 0.000000
+ 2024-03-26 11:12:17,757 epoch 1 - iter 20/48 - loss 2.78404911 - time (sec): 10.30 - samples/sec: 1483.97 - lr: 0.000012 - momentum: 0.000000
+ 2024-03-26 11:12:20,883 epoch 1 - iter 24/48 - loss 2.66287611 - time (sec): 13.42 - samples/sec: 1354.23 - lr: 0.000014 - momentum: 0.000000
+ 2024-03-26 11:12:23,406 epoch 1 - iter 28/48 - loss 2.53932089 - time (sec): 15.95 - samples/sec: 1336.49 - lr: 0.000017 - momentum: 0.000000
+ 2024-03-26 11:12:24,241 epoch 1 - iter 32/48 - loss 2.45998768 - time (sec): 16.78 - samples/sec: 1391.37 - lr: 0.000019 - momentum: 0.000000
+ 2024-03-26 11:12:25,574 epoch 1 - iter 36/48 - loss 2.36289832 - time (sec): 18.11 - samples/sec: 1443.65 - lr: 0.000022 - momentum: 0.000000
+ 2024-03-26 11:12:27,525 epoch 1 - iter 40/48 - loss 2.28678685 - time (sec): 20.07 - samples/sec: 1449.80 - lr: 0.000024 - momentum: 0.000000
+ 2024-03-26 11:12:29,505 epoch 1 - iter 44/48 - loss 2.18927422 - time (sec): 22.05 - samples/sec: 1449.30 - lr: 0.000027 - momentum: 0.000000
+ 2024-03-26 11:12:30,921 epoch 1 - iter 48/48 - loss 2.11069614 - time (sec): 23.46 - samples/sec: 1469.26 - lr: 0.000029 - momentum: 0.000000
+ 2024-03-26 11:12:30,922 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:12:30,922 EPOCH 1 done: loss 2.1107 - lr: 0.000029
+ 2024-03-26 11:12:31,865 DEV : loss 0.7406538128852844 - f1-score (micro avg) 0.4904
+ 2024-03-26 11:12:31,866 saving best model
+ 2024-03-26 11:12:32,158 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:12:33,483 epoch 2 - iter 4/48 - loss 1.05138817 - time (sec): 1.32 - samples/sec: 2191.17 - lr: 0.000030 - momentum: 0.000000
+ 2024-03-26 11:12:35,353 epoch 2 - iter 8/48 - loss 0.87906687 - time (sec): 3.19 - samples/sec: 1909.15 - lr: 0.000030 - momentum: 0.000000
+ 2024-03-26 11:12:38,859 epoch 2 - iter 12/48 - loss 0.75980156 - time (sec): 6.70 - samples/sec: 1519.05 - lr: 0.000029 - momentum: 0.000000
+ 2024-03-26 11:12:41,406 epoch 2 - iter 16/48 - loss 0.70656249 - time (sec): 9.25 - samples/sec: 1440.33 - lr: 0.000029 - momentum: 0.000000
+ 2024-03-26 11:12:44,203 epoch 2 - iter 20/48 - loss 0.65983774 - time (sec): 12.04 - samples/sec: 1379.31 - lr: 0.000029 - momentum: 0.000000
+ 2024-03-26 11:12:46,198 epoch 2 - iter 24/48 - loss 0.62246019 - time (sec): 14.04 - samples/sec: 1373.20 - lr: 0.000028 - momentum: 0.000000
+ 2024-03-26 11:12:48,002 epoch 2 - iter 28/48 - loss 0.61743183 - time (sec): 15.84 - samples/sec: 1384.37 - lr: 0.000028 - momentum: 0.000000
+ 2024-03-26 11:12:49,808 epoch 2 - iter 32/48 - loss 0.60027304 - time (sec): 17.65 - samples/sec: 1394.07 - lr: 0.000028 - momentum: 0.000000
+ 2024-03-26 11:12:51,737 epoch 2 - iter 36/48 - loss 0.58144132 - time (sec): 19.58 - samples/sec: 1401.19 - lr: 0.000028 - momentum: 0.000000
+ 2024-03-26 11:12:52,763 epoch 2 - iter 40/48 - loss 0.56884272 - time (sec): 20.60 - samples/sec: 1448.84 - lr: 0.000027 - momentum: 0.000000
+ 2024-03-26 11:12:54,236 epoch 2 - iter 44/48 - loss 0.56205915 - time (sec): 22.08 - samples/sec: 1468.31 - lr: 0.000027 - momentum: 0.000000
+ 2024-03-26 11:12:55,816 epoch 2 - iter 48/48 - loss 0.54402847 - time (sec): 23.66 - samples/sec: 1457.17 - lr: 0.000027 - momentum: 0.000000
+ 2024-03-26 11:12:55,816 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:12:55,816 EPOCH 2 done: loss 0.5440 - lr: 0.000027
+ 2024-03-26 11:12:56,749 DEV : loss 0.3272944986820221 - f1-score (micro avg) 0.7565
+ 2024-03-26 11:12:56,750 saving best model
+ 2024-03-26 11:12:57,213 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:12:59,771 epoch 3 - iter 4/48 - loss 0.30544675 - time (sec): 2.56 - samples/sec: 1176.92 - lr: 0.000026 - momentum: 0.000000
+ 2024-03-26 11:13:01,960 epoch 3 - iter 8/48 - loss 0.30379025 - time (sec): 4.75 - samples/sec: 1338.00 - lr: 0.000026 - momentum: 0.000000
+ 2024-03-26 11:13:03,549 epoch 3 - iter 12/48 - loss 0.31891615 - time (sec): 6.33 - samples/sec: 1400.54 - lr: 0.000026 - momentum: 0.000000
+ 2024-03-26 11:13:05,320 epoch 3 - iter 16/48 - loss 0.29901581 - time (sec): 8.11 - samples/sec: 1402.16 - lr: 0.000026 - momentum: 0.000000
+ 2024-03-26 11:13:06,511 epoch 3 - iter 20/48 - loss 0.30884041 - time (sec): 9.30 - samples/sec: 1471.74 - lr: 0.000025 - momentum: 0.000000
+ 2024-03-26 11:13:08,379 epoch 3 - iter 24/48 - loss 0.31810372 - time (sec): 11.16 - samples/sec: 1473.90 - lr: 0.000025 - momentum: 0.000000
+ 2024-03-26 11:13:10,884 epoch 3 - iter 28/48 - loss 0.31428333 - time (sec): 13.67 - samples/sec: 1415.34 - lr: 0.000025 - momentum: 0.000000
+ 2024-03-26 11:13:12,794 epoch 3 - iter 32/48 - loss 0.31267095 - time (sec): 15.58 - samples/sec: 1420.98 - lr: 0.000025 - momentum: 0.000000
+ 2024-03-26 11:13:14,278 epoch 3 - iter 36/48 - loss 0.30305540 - time (sec): 17.06 - samples/sec: 1452.18 - lr: 0.000024 - momentum: 0.000000
+ 2024-03-26 11:13:16,599 epoch 3 - iter 40/48 - loss 0.29326992 - time (sec): 19.38 - samples/sec: 1424.28 - lr: 0.000024 - momentum: 0.000000
+ 2024-03-26 11:13:19,982 epoch 3 - iter 44/48 - loss 0.27254472 - time (sec): 22.77 - samples/sec: 1415.27 - lr: 0.000024 - momentum: 0.000000
+ 2024-03-26 11:13:21,332 epoch 3 - iter 48/48 - loss 0.26917818 - time (sec): 24.12 - samples/sec: 1429.30 - lr: 0.000023 - momentum: 0.000000
+ 2024-03-26 11:13:21,333 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:13:21,333 EPOCH 3 done: loss 0.2692 - lr: 0.000023
+ 2024-03-26 11:13:22,276 DEV : loss 0.2584502100944519 - f1-score (micro avg) 0.8292
+ 2024-03-26 11:13:22,279 saving best model
+ 2024-03-26 11:13:22,739 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:13:24,351 epoch 4 - iter 4/48 - loss 0.29781570 - time (sec): 1.61 - samples/sec: 1582.51 - lr: 0.000023 - momentum: 0.000000
+ 2024-03-26 11:13:26,710 epoch 4 - iter 8/48 - loss 0.22897059 - time (sec): 3.97 - samples/sec: 1509.70 - lr: 0.000023 - momentum: 0.000000
+ 2024-03-26 11:13:27,969 epoch 4 - iter 12/48 - loss 0.21408042 - time (sec): 5.23 - samples/sec: 1598.39 - lr: 0.000023 - momentum: 0.000000
+ 2024-03-26 11:13:30,257 epoch 4 - iter 16/48 - loss 0.20453825 - time (sec): 7.52 - samples/sec: 1499.82 - lr: 0.000022 - momentum: 0.000000
+ 2024-03-26 11:13:32,884 epoch 4 - iter 20/48 - loss 0.19298299 - time (sec): 10.14 - samples/sec: 1378.36 - lr: 0.000022 - momentum: 0.000000
+ 2024-03-26 11:13:35,007 epoch 4 - iter 24/48 - loss 0.20089208 - time (sec): 12.27 - samples/sec: 1372.28 - lr: 0.000022 - momentum: 0.000000
+ 2024-03-26 11:13:37,157 epoch 4 - iter 28/48 - loss 0.19731722 - time (sec): 14.42 - samples/sec: 1379.88 - lr: 0.000022 - momentum: 0.000000
+ 2024-03-26 11:13:39,826 epoch 4 - iter 32/48 - loss 0.19273634 - time (sec): 17.09 - samples/sec: 1349.67 - lr: 0.000021 - momentum: 0.000000
+ 2024-03-26 11:13:42,677 epoch 4 - iter 36/48 - loss 0.18444787 - time (sec): 19.94 - samples/sec: 1341.70 - lr: 0.000021 - momentum: 0.000000
+ 2024-03-26 11:13:44,459 epoch 4 - iter 40/48 - loss 0.17908396 - time (sec): 21.72 - samples/sec: 1339.53 - lr: 0.000021 - momentum: 0.000000
+ 2024-03-26 11:13:46,546 epoch 4 - iter 44/48 - loss 0.17774704 - time (sec): 23.81 - samples/sec: 1340.95 - lr: 0.000020 - momentum: 0.000000
+ 2024-03-26 11:13:48,279 epoch 4 - iter 48/48 - loss 0.17546508 - time (sec): 25.54 - samples/sec: 1349.77 - lr: 0.000020 - momentum: 0.000000
+ 2024-03-26 11:13:48,279 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:13:48,279 EPOCH 4 done: loss 0.1755 - lr: 0.000020
+ 2024-03-26 11:13:49,244 DEV : loss 0.2244909107685089 - f1-score (micro avg) 0.8723
+ 2024-03-26 11:13:49,246 saving best model
+ 2024-03-26 11:13:49,699 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:13:50,536 epoch 5 - iter 4/48 - loss 0.09331887 - time (sec): 0.84 - samples/sec: 2191.66 - lr: 0.000020 - momentum: 0.000000
+ 2024-03-26 11:13:51,957 epoch 5 - iter 8/48 - loss 0.11095802 - time (sec): 2.26 - samples/sec: 1970.19 - lr: 0.000020 - momentum: 0.000000
+ 2024-03-26 11:13:54,821 epoch 5 - iter 12/48 - loss 0.10642550 - time (sec): 5.12 - samples/sec: 1558.10 - lr: 0.000019 - momentum: 0.000000
+ 2024-03-26 11:13:57,934 epoch 5 - iter 16/48 - loss 0.10476513 - time (sec): 8.23 - samples/sec: 1370.39 - lr: 0.000019 - momentum: 0.000000
+ 2024-03-26 11:13:59,365 epoch 5 - iter 20/48 - loss 0.11438732 - time (sec): 9.67 - samples/sec: 1420.32 - lr: 0.000019 - momentum: 0.000000
+ 2024-03-26 11:14:02,022 epoch 5 - iter 24/48 - loss 0.11257857 - time (sec): 12.32 - samples/sec: 1359.65 - lr: 0.000018 - momentum: 0.000000
+ 2024-03-26 11:14:04,162 epoch 5 - iter 28/48 - loss 0.11104460 - time (sec): 14.46 - samples/sec: 1351.01 - lr: 0.000018 - momentum: 0.000000
+ 2024-03-26 11:14:06,530 epoch 5 - iter 32/48 - loss 0.11618625 - time (sec): 16.83 - samples/sec: 1376.21 - lr: 0.000018 - momentum: 0.000000
+ 2024-03-26 11:14:08,037 epoch 5 - iter 36/48 - loss 0.12098463 - time (sec): 18.34 - samples/sec: 1400.80 - lr: 0.000018 - momentum: 0.000000
+ 2024-03-26 11:14:10,603 epoch 5 - iter 40/48 - loss 0.11599874 - time (sec): 20.90 - samples/sec: 1359.09 - lr: 0.000017 - momentum: 0.000000
+ 2024-03-26 11:14:12,737 epoch 5 - iter 44/48 - loss 0.11697148 - time (sec): 23.04 - samples/sec: 1373.23 - lr: 0.000017 - momentum: 0.000000
+ 2024-03-26 11:14:14,708 epoch 5 - iter 48/48 - loss 0.11878602 - time (sec): 25.01 - samples/sec: 1378.39 - lr: 0.000017 - momentum: 0.000000
+ 2024-03-26 11:14:14,708 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:14:14,709 EPOCH 5 done: loss 0.1188 - lr: 0.000017
+ 2024-03-26 11:14:15,653 DEV : loss 0.20735225081443787 - f1-score (micro avg) 0.8886
+ 2024-03-26 11:14:15,654 saving best model
+ 2024-03-26 11:14:16,137 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:14:17,840 epoch 6 - iter 4/48 - loss 0.11428082 - time (sec): 1.70 - samples/sec: 1463.60 - lr: 0.000017 - momentum: 0.000000
+ 2024-03-26 11:14:20,285 epoch 6 - iter 8/48 - loss 0.10615835 - time (sec): 4.15 - samples/sec: 1543.50 - lr: 0.000016 - momentum: 0.000000
+ 2024-03-26 11:14:22,266 epoch 6 - iter 12/48 - loss 0.09809695 - time (sec): 6.13 - samples/sec: 1478.37 - lr: 0.000016 - momentum: 0.000000
+ 2024-03-26 11:14:24,383 epoch 6 - iter 16/48 - loss 0.09716603 - time (sec): 8.24 - samples/sec: 1470.82 - lr: 0.000016 - momentum: 0.000000
+ 2024-03-26 11:14:27,150 epoch 6 - iter 20/48 - loss 0.09527628 - time (sec): 11.01 - samples/sec: 1450.83 - lr: 0.000015 - momentum: 0.000000
+ 2024-03-26 11:14:28,723 epoch 6 - iter 24/48 - loss 0.10550915 - time (sec): 12.58 - samples/sec: 1470.85 - lr: 0.000015 - momentum: 0.000000
+ 2024-03-26 11:14:30,147 epoch 6 - iter 28/48 - loss 0.10560312 - time (sec): 14.01 - samples/sec: 1475.45 - lr: 0.000015 - momentum: 0.000000
+ 2024-03-26 11:14:31,356 epoch 6 - iter 32/48 - loss 0.10260877 - time (sec): 15.22 - samples/sec: 1495.04 - lr: 0.000015 - momentum: 0.000000
+ 2024-03-26 11:14:32,882 epoch 6 - iter 36/48 - loss 0.09749042 - time (sec): 16.74 - samples/sec: 1524.96 - lr: 0.000014 - momentum: 0.000000
+ 2024-03-26 11:14:34,858 epoch 6 - iter 40/48 - loss 0.09905047 - time (sec): 18.72 - samples/sec: 1513.84 - lr: 0.000014 - momentum: 0.000000
+ 2024-03-26 11:14:37,140 epoch 6 - iter 44/48 - loss 0.09659837 - time (sec): 21.00 - samples/sec: 1531.03 - lr: 0.000014 - momentum: 0.000000
+ 2024-03-26 11:14:38,883 epoch 6 - iter 48/48 - loss 0.09620050 - time (sec): 22.74 - samples/sec: 1515.63 - lr: 0.000014 - momentum: 0.000000
+ 2024-03-26 11:14:38,883 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:14:38,883 EPOCH 6 done: loss 0.0962 - lr: 0.000014
+ 2024-03-26 11:14:39,836 DEV : loss 0.18259809911251068 - f1-score (micro avg) 0.9103
+ 2024-03-26 11:14:39,837 saving best model
+ 2024-03-26 11:14:40,303 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:14:41,940 epoch 7 - iter 4/48 - loss 0.06204275 - time (sec): 1.64 - samples/sec: 1488.75 - lr: 0.000013 - momentum: 0.000000
+ 2024-03-26 11:14:43,601 epoch 7 - iter 8/48 - loss 0.08162162 - time (sec): 3.30 - samples/sec: 1502.42 - lr: 0.000013 - momentum: 0.000000
+ 2024-03-26 11:14:45,773 epoch 7 - iter 12/48 - loss 0.07584555 - time (sec): 5.47 - samples/sec: 1439.06 - lr: 0.000013 - momentum: 0.000000
+ 2024-03-26 11:14:47,847 epoch 7 - iter 16/48 - loss 0.07316749 - time (sec): 7.54 - samples/sec: 1477.06 - lr: 0.000012 - momentum: 0.000000
+ 2024-03-26 11:14:48,507 epoch 7 - iter 20/48 - loss 0.06894237 - time (sec): 8.20 - samples/sec: 1579.88 - lr: 0.000012 - momentum: 0.000000
+ 2024-03-26 11:14:50,098 epoch 7 - iter 24/48 - loss 0.06789884 - time (sec): 9.79 - samples/sec: 1564.39 - lr: 0.000012 - momentum: 0.000000
+ 2024-03-26 11:14:53,003 epoch 7 - iter 28/48 - loss 0.06686264 - time (sec): 12.70 - samples/sec: 1466.57 - lr: 0.000012 - momentum: 0.000000
+ 2024-03-26 11:14:55,810 epoch 7 - iter 32/48 - loss 0.06577245 - time (sec): 15.51 - samples/sec: 1397.17 - lr: 0.000011 - momentum: 0.000000
+ 2024-03-26 11:14:58,649 epoch 7 - iter 36/48 - loss 0.07040216 - time (sec): 18.34 - samples/sec: 1405.31 - lr: 0.000011 - momentum: 0.000000
+ 2024-03-26 11:15:00,626 epoch 7 - iter 40/48 - loss 0.07466776 - time (sec): 20.32 - samples/sec: 1414.61 - lr: 0.000011 - momentum: 0.000000
+ 2024-03-26 11:15:03,204 epoch 7 - iter 44/48 - loss 0.07480391 - time (sec): 22.90 - samples/sec: 1390.99 - lr: 0.000010 - momentum: 0.000000
+ 2024-03-26 11:15:05,048 epoch 7 - iter 48/48 - loss 0.07371206 - time (sec): 24.74 - samples/sec: 1393.15 - lr: 0.000010 - momentum: 0.000000
+ 2024-03-26 11:15:05,048 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:15:05,048 EPOCH 7 done: loss 0.0737 - lr: 0.000010
+ 2024-03-26 11:15:06,019 DEV : loss 0.1877364218235016 - f1-score (micro avg) 0.9073
+ 2024-03-26 11:15:06,020 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:15:08,718 epoch 8 - iter 4/48 - loss 0.07579462 - time (sec): 2.70 - samples/sec: 1224.41 - lr: 0.000010 - momentum: 0.000000
+ 2024-03-26 11:15:10,839 epoch 8 - iter 8/48 - loss 0.05991518 - time (sec): 4.82 - samples/sec: 1217.82 - lr: 0.000010 - momentum: 0.000000
+ 2024-03-26 11:15:14,025 epoch 8 - iter 12/48 - loss 0.05723449 - time (sec): 8.00 - samples/sec: 1210.78 - lr: 0.000009 - momentum: 0.000000
+ 2024-03-26 11:15:16,001 epoch 8 - iter 16/48 - loss 0.06624845 - time (sec): 9.98 - samples/sec: 1236.69 - lr: 0.000009 - momentum: 0.000000
+ 2024-03-26 11:15:17,487 epoch 8 - iter 20/48 - loss 0.06264195 - time (sec): 11.47 - samples/sec: 1280.67 - lr: 0.000009 - momentum: 0.000000
+ 2024-03-26 11:15:20,048 epoch 8 - iter 24/48 - loss 0.06002540 - time (sec): 14.03 - samples/sec: 1272.13 - lr: 0.000009 - momentum: 0.000000
+ 2024-03-26 11:15:21,828 epoch 8 - iter 28/48 - loss 0.06409926 - time (sec): 15.81 - samples/sec: 1308.14 - lr: 0.000008 - momentum: 0.000000
+ 2024-03-26 11:15:23,457 epoch 8 - iter 32/48 - loss 0.06212462 - time (sec): 17.44 - samples/sec: 1334.23 - lr: 0.000008 - momentum: 0.000000
+ 2024-03-26 11:15:24,748 epoch 8 - iter 36/48 - loss 0.06077673 - time (sec): 18.73 - samples/sec: 1366.08 - lr: 0.000008 - momentum: 0.000000
+ 2024-03-26 11:15:27,154 epoch 8 - iter 40/48 - loss 0.06101414 - time (sec): 21.13 - samples/sec: 1371.91 - lr: 0.000007 - momentum: 0.000000
+ 2024-03-26 11:15:30,041 epoch 8 - iter 44/48 - loss 0.05819661 - time (sec): 24.02 - samples/sec: 1341.24 - lr: 0.000007 - momentum: 0.000000
+ 2024-03-26 11:15:32,117 epoch 8 - iter 48/48 - loss 0.05850858 - time (sec): 26.10 - samples/sec: 1320.96 - lr: 0.000007 - momentum: 0.000000
+ 2024-03-26 11:15:32,117 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:15:32,117 EPOCH 8 done: loss 0.0585 - lr: 0.000007
+ 2024-03-26 11:15:33,074 DEV : loss 0.18553169071674347 - f1-score (micro avg) 0.9269
+ 2024-03-26 11:15:33,075 saving best model
+ 2024-03-26 11:15:33,555 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:15:35,421 epoch 9 - iter 4/48 - loss 0.05732311 - time (sec): 1.86 - samples/sec: 1525.50 - lr: 0.000007 - momentum: 0.000000
+ 2024-03-26 11:15:37,946 epoch 9 - iter 8/48 - loss 0.04919745 - time (sec): 4.39 - samples/sec: 1397.05 - lr: 0.000006 - momentum: 0.000000
+ 2024-03-26 11:15:40,357 epoch 9 - iter 12/48 - loss 0.06047389 - time (sec): 6.80 - samples/sec: 1357.42 - lr: 0.000006 - momentum: 0.000000
+ 2024-03-26 11:15:42,457 epoch 9 - iter 16/48 - loss 0.06071694 - time (sec): 8.90 - samples/sec: 1358.99 - lr: 0.000006 - momentum: 0.000000
+ 2024-03-26 11:15:43,964 epoch 9 - iter 20/48 - loss 0.05316476 - time (sec): 10.41 - samples/sec: 1416.10 - lr: 0.000006 - momentum: 0.000000
+ 2024-03-26 11:15:45,205 epoch 9 - iter 24/48 - loss 0.05009929 - time (sec): 11.65 - samples/sec: 1462.49 - lr: 0.000005 - momentum: 0.000000
+ 2024-03-26 11:15:46,906 epoch 9 - iter 28/48 - loss 0.04927559 - time (sec): 13.35 - samples/sec: 1481.52 - lr: 0.000005 - momentum: 0.000000
+ 2024-03-26 11:15:49,258 epoch 9 - iter 32/48 - loss 0.05509981 - time (sec): 15.70 - samples/sec: 1464.55 - lr: 0.000005 - momentum: 0.000000
+ 2024-03-26 11:15:51,996 epoch 9 - iter 36/48 - loss 0.05508145 - time (sec): 18.44 - samples/sec: 1416.65 - lr: 0.000004 - momentum: 0.000000
+ 2024-03-26 11:15:54,978 epoch 9 - iter 40/48 - loss 0.05449908 - time (sec): 21.42 - samples/sec: 1375.91 - lr: 0.000004 - momentum: 0.000000
+ 2024-03-26 11:15:56,846 epoch 9 - iter 44/48 - loss 0.05400296 - time (sec): 23.29 - samples/sec: 1390.32 - lr: 0.000004 - momentum: 0.000000
+ 2024-03-26 11:15:57,905 epoch 9 - iter 48/48 - loss 0.05389343 - time (sec): 24.35 - samples/sec: 1415.78 - lr: 0.000004 - momentum: 0.000000
+ 2024-03-26 11:15:57,905 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:15:57,905 EPOCH 9 done: loss 0.0539 - lr: 0.000004
+ 2024-03-26 11:15:58,855 DEV : loss 0.1756318360567093 - f1-score (micro avg) 0.9235
+ 2024-03-26 11:15:58,857 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:16:01,220 epoch 10 - iter 4/48 - loss 0.02756730 - time (sec): 2.36 - samples/sec: 1397.98 - lr: 0.000003 - momentum: 0.000000
+ 2024-03-26 11:16:03,396 epoch 10 - iter 8/48 - loss 0.03411980 - time (sec): 4.54 - samples/sec: 1361.41 - lr: 0.000003 - momentum: 0.000000
+ 2024-03-26 11:16:05,325 epoch 10 - iter 12/48 - loss 0.03366891 - time (sec): 6.47 - samples/sec: 1364.60 - lr: 0.000003 - momentum: 0.000000
+ 2024-03-26 11:16:06,556 epoch 10 - iter 16/48 - loss 0.03593148 - time (sec): 7.70 - samples/sec: 1431.57 - lr: 0.000002 - momentum: 0.000000
+ 2024-03-26 11:16:08,558 epoch 10 - iter 20/48 - loss 0.04270731 - time (sec): 9.70 - samples/sec: 1413.16 - lr: 0.000002 - momentum: 0.000000
+ 2024-03-26 11:16:10,903 epoch 10 - iter 24/48 - loss 0.05028433 - time (sec): 12.04 - samples/sec: 1378.76 - lr: 0.000002 - momentum: 0.000000
+ 2024-03-26 11:16:11,805 epoch 10 - iter 28/48 - loss 0.05128601 - time (sec): 12.95 - samples/sec: 1451.20 - lr: 0.000002 - momentum: 0.000000
+ 2024-03-26 11:16:13,125 epoch 10 - iter 32/48 - loss 0.04953866 - time (sec): 14.27 - samples/sec: 1488.71 - lr: 0.000001 - momentum: 0.000000
+ 2024-03-26 11:16:15,943 epoch 10 - iter 36/48 - loss 0.04689467 - time (sec): 17.08 - samples/sec: 1445.36 - lr: 0.000001 - momentum: 0.000000
+ 2024-03-26 11:16:18,324 epoch 10 - iter 40/48 - loss 0.04744759 - time (sec): 19.47 - samples/sec: 1477.13 - lr: 0.000001 - momentum: 0.000000
+ 2024-03-26 11:16:20,948 epoch 10 - iter 44/48 - loss 0.04670001 - time (sec): 22.09 - samples/sec: 1451.98 - lr: 0.000001 - momentum: 0.000000
+ 2024-03-26 11:16:22,941 epoch 10 - iter 48/48 - loss 0.04566351 - time (sec): 24.08 - samples/sec: 1431.38 - lr: 0.000000 - momentum: 0.000000
+ 2024-03-26 11:16:22,942 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:16:22,942 EPOCH 10 done: loss 0.0457 - lr: 0.000000
+ 2024-03-26 11:16:23,896 DEV : loss 0.18053248524665833 - f1-score (micro avg) 0.9227
+ 2024-03-26 11:16:24,184 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 11:16:24,185 Loading model from best epoch ...
+ 2024-03-26 11:16:25,052 SequenceTagger predicts: Dictionary with 17 tags: O, S-Unternehmen, B-Unternehmen, E-Unternehmen, I-Unternehmen, S-Auslagerung, B-Auslagerung, E-Auslagerung, I-Auslagerung, S-Ort, B-Ort, E-Ort, I-Ort, S-Software, B-Software, E-Software, I-Software
+ 2024-03-26 11:16:25,801
+ Results:
+ - F-score (micro) 0.8963
+ - F-score (macro) 0.6819
+ - Accuracy 0.8144
+
+ By class:
+                precision    recall  f1-score   support
+
+   Unternehmen     0.8859    0.8759    0.8809       266
+   Auslagerung     0.8577    0.8956    0.8762       249
+           Ort     0.9565    0.9851    0.9706       134
+      Software     0.0000    0.0000    0.0000         0
+
+     micro avg     0.8869    0.9060    0.8963       649
+     macro avg     0.6750    0.6891    0.6819       649
+  weighted avg     0.8897    0.9060    0.8976       649
+
+ 2024-03-26 11:16:25,801 ----------------------------------------------------------------------------------------------------
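For reference, the configuration recorded in this log (a 768-dim German BERT encoder, LockedDropout(0.5) plus a linear head over 17 BIOES tags, no RNN/CRF, batch size 16, 10 epochs, peak learning rate 3e-05 with a 0.1 warmup fraction) corresponds to a standard Flair fine-tuning run. The script below is a minimal sketch reconstructed only from the hyperparameters visible above: the corpus path, column format, and the exact German BERT checkpoint are assumptions rather than values taken from this repository, and the TensorboardLogger plugin is omitted.

```python
# Minimal sketch of a comparable Flair fine-tuning run, reconstructed from the
# hyperparameters printed in this training.log. Corpus location/columns and the
# exact German BERT checkpoint are assumptions, not taken from this repository.
from flair.datasets import ColumnCorpus
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Hypothetical CoNLL-style corpus layout (the log shows 758 train / 94 dev / 96 test sentences).
corpus = ColumnCorpus(
    data_folder="data/co-funer",          # assumption
    column_format={0: "text", 1: "ner"},  # assumption
)
label_dict = corpus.make_label_dictionary(label_type="ner")

# German BERT base encoder; the log shows a 768-dim BERT with a 30001-token vocabulary.
embeddings = TransformerWordEmbeddings(
    model="bert-base-german-cased",  # assumption: some German BERT base checkpoint
    fine_tune=True,
)

# Linear tag head on top of the transformer, matching the printed model:
# LockedDropout(p=0.5) + Linear(768 -> 17) + CrossEntropyLoss, no RNN or CRF.
tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_rnn=False,
    use_crf=False,
    reproject_embeddings=False,
)

# fine_tune() trains with AdamW and a linear learning-rate schedule with warmup,
# which matches the LinearScheduler plugin (warmup_fraction 0.1) and the
# Training Params block logged above.
trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "flair-co-funer-german_bert_base-bs16-e10-lr3e-05-2",
    learning_rate=3e-5,
    mini_batch_size=16,
    max_epochs=10,
)
```

The decaying lr values in the per-iteration lines (rising to ~3e-05 during epoch 1, then falling to 0 by epoch 10) are consistent with such a warmup-then-linear-decay schedule.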