2024-03-26 09:39:45,010 ----------------------------------------------------------------------------------------------------
2024-03-26 09:39:45,010 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(31103, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=17, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2024-03-26 09:39:45,010 ----------------------------------------------------------------------------------------------------
2024-03-26 09:39:45,011 Corpus: 758 train + 94 dev + 96 test sentences
2024-03-26 09:39:45,011 ----------------------------------------------------------------------------------------------------
2024-03-26 09:39:45,011 Train:  758 sentences
2024-03-26 09:39:45,011         (train_with_dev=False, train_with_test=False)
2024-03-26 09:39:45,011 ----------------------------------------------------------------------------------------------------
2024-03-26 09:39:45,011 Training Params:
2024-03-26 09:39:45,011  - learning_rate: "5e-05" 
2024-03-26 09:39:45,011  - mini_batch_size: "8"
2024-03-26 09:39:45,011  - max_epochs: "10"
2024-03-26 09:39:45,011  - shuffle: "True"
2024-03-26 09:39:45,011 ----------------------------------------------------------------------------------------------------
2024-03-26 09:39:45,011 Plugins:
2024-03-26 09:39:45,011  - TensorboardLogger
2024-03-26 09:39:45,011  - LinearScheduler | warmup_fraction: '0.1'
2024-03-26 09:39:45,011 ----------------------------------------------------------------------------------------------------
2024-03-26 09:39:45,011 Final evaluation on model from best epoch (best-model.pt)
2024-03-26 09:39:45,011  - metric: "('micro avg', 'f1-score')"
2024-03-26 09:39:45,011 ----------------------------------------------------------------------------------------------------
2024-03-26 09:39:45,011 Computation:
2024-03-26 09:39:45,011  - compute on device: cuda:0
2024-03-26 09:39:45,011  - embedding storage: none
2024-03-26 09:39:45,011 ----------------------------------------------------------------------------------------------------
2024-03-26 09:39:45,011 Model training base path: "flair-co-funer-gbert_base-bs8-e10-lr5e-05-1"
2024-03-26 09:39:45,011 ----------------------------------------------------------------------------------------------------
2024-03-26 09:39:45,011 ----------------------------------------------------------------------------------------------------
2024-03-26 09:39:45,011 Logging anything other than scalars to TensorBoard is currently not supported.
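
The header above fully specifies the run: a Flair SequenceTagger with fine-tuned transformer word embeddings (a German BERT base model with a ~31k-token vocabulary), a single linear layer projecting 768 features onto the 17 BIOES tags, no CRF and no RNN, trained for 10 epochs at lr 5e-05 with mini-batch size 8 and a linear warmup schedule (warmup fraction 0.1). A minimal sketch of a Flair script that would produce such a configuration is given below; the corpus location, column format, and the exact checkpoint name ("deepset/gbert-base") are assumptions, and keyword arguments may differ slightly between Flair versions.

    from flair.datasets import ColumnCorpus
    from flair.embeddings import TransformerWordEmbeddings
    from flair.models import SequenceTagger
    from flair.trainers import ModelTrainer

    # Assumed corpus layout: CoNLL-style column files (token, NER tag).
    corpus = ColumnCorpus("data/co-funer", column_format={0: "text", 1: "ner"})
    label_dict = corpus.make_label_dictionary(label_type="ner")  # 17 BIOES tags

    # Vocabulary size 31103 in the model dump matches a German BERT base model;
    # "deepset/gbert-base" is an assumption based on the run name "gbert_base".
    embeddings = TransformerWordEmbeddings("deepset/gbert-base", fine_tune=True)

    # Linear(768 -> 17) + CrossEntropyLoss as in the model dump above:
    # no CRF, no RNN, no reprojection layer.
    tagger = SequenceTagger(
        hidden_size=256,  # unused when use_rnn=False, but required by the constructor
        embeddings=embeddings,
        tag_dictionary=label_dict,
        tag_type="ner",
        use_crf=False,
        use_rnn=False,
        reproject_embeddings=False,
    )

    trainer = ModelTrainer(tagger, corpus)
    trainer.fine_tune(
        "flair-co-funer-gbert_base-bs8-e10-lr5e-05-1",
        learning_rate=5e-05,
        mini_batch_size=8,
        max_epochs=10,
        warmup_fraction=0.1,  # LinearScheduler warmup as reported in the Plugins block
    )
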
2024-03-26 09:39:46,596 epoch 1 - iter 9/95 - loss 3.05326256 - time (sec): 1.59 - samples/sec: 1942.31 - lr: 0.000004 - momentum: 0.000000
2024-03-26 09:39:48,126 epoch 1 - iter 18/95 - loss 2.83895391 - time (sec): 3.12 - samples/sec: 2006.72 - lr: 0.000009 - momentum: 0.000000
2024-03-26 09:39:50,518 epoch 1 - iter 27/95 - loss 2.59331548 - time (sec): 5.51 - samples/sec: 1859.51 - lr: 0.000014 - momentum: 0.000000
2024-03-26 09:39:52,748 epoch 1 - iter 36/95 - loss 2.43003112 - time (sec): 7.74 - samples/sec: 1806.93 - lr: 0.000018 - momentum: 0.000000
2024-03-26 09:39:54,635 epoch 1 - iter 45/95 - loss 2.28997757 - time (sec): 9.62 - samples/sec: 1814.48 - lr: 0.000023 - momentum: 0.000000
2024-03-26 09:39:55,860 epoch 1 - iter 54/95 - loss 2.17102423 - time (sec): 10.85 - samples/sec: 1856.38 - lr: 0.000028 - momentum: 0.000000
2024-03-26 09:39:57,564 epoch 1 - iter 63/95 - loss 2.05061689 - time (sec): 12.55 - samples/sec: 1853.40 - lr: 0.000033 - momentum: 0.000000
2024-03-26 09:39:58,845 epoch 1 - iter 72/95 - loss 1.94125898 - time (sec): 13.83 - samples/sec: 1883.06 - lr: 0.000037 - momentum: 0.000000
2024-03-26 09:40:00,817 epoch 1 - iter 81/95 - loss 1.81009489 - time (sec): 15.81 - samples/sec: 1873.87 - lr: 0.000042 - momentum: 0.000000
2024-03-26 09:40:02,140 epoch 1 - iter 90/95 - loss 1.71143912 - time (sec): 17.13 - samples/sec: 1893.86 - lr: 0.000047 - momentum: 0.000000
2024-03-26 09:40:03,361 ----------------------------------------------------------------------------------------------------
2024-03-26 09:40:03,361 EPOCH 1 done: loss 1.6404 - lr: 0.000047
2024-03-26 09:40:04,256 DEV : loss 0.47643521428108215 - f1-score (micro avg)  0.6785
2024-03-26 09:40:04,257 saving best model
2024-03-26 09:40:04,516 ----------------------------------------------------------------------------------------------------
2024-03-26 09:40:06,572 epoch 2 - iter 9/95 - loss 0.50078791 - time (sec): 2.06 - samples/sec: 1796.30 - lr: 0.000050 - momentum: 0.000000
2024-03-26 09:40:08,253 epoch 2 - iter 18/95 - loss 0.51331330 - time (sec): 3.74 - samples/sec: 1941.68 - lr: 0.000049 - momentum: 0.000000
2024-03-26 09:40:10,067 epoch 2 - iter 27/95 - loss 0.48020903 - time (sec): 5.55 - samples/sec: 1857.13 - lr: 0.000048 - momentum: 0.000000
2024-03-26 09:40:11,830 epoch 2 - iter 36/95 - loss 0.45498260 - time (sec): 7.31 - samples/sec: 1828.20 - lr: 0.000048 - momentum: 0.000000
2024-03-26 09:40:13,733 epoch 2 - iter 45/95 - loss 0.42599735 - time (sec): 9.22 - samples/sec: 1837.36 - lr: 0.000047 - momentum: 0.000000
2024-03-26 09:40:15,932 epoch 2 - iter 54/95 - loss 0.39935025 - time (sec): 11.42 - samples/sec: 1808.62 - lr: 0.000047 - momentum: 0.000000
2024-03-26 09:40:17,254 epoch 2 - iter 63/95 - loss 0.39798422 - time (sec): 12.74 - samples/sec: 1849.41 - lr: 0.000046 - momentum: 0.000000
2024-03-26 09:40:18,582 epoch 2 - iter 72/95 - loss 0.38729757 - time (sec): 14.07 - samples/sec: 1880.73 - lr: 0.000046 - momentum: 0.000000
2024-03-26 09:40:20,377 epoch 2 - iter 81/95 - loss 0.37731322 - time (sec): 15.86 - samples/sec: 1866.35 - lr: 0.000045 - momentum: 0.000000
2024-03-26 09:40:22,028 epoch 2 - iter 90/95 - loss 0.36835332 - time (sec): 17.51 - samples/sec: 1863.65 - lr: 0.000045 - momentum: 0.000000
2024-03-26 09:40:22,959 ----------------------------------------------------------------------------------------------------
2024-03-26 09:40:22,959 EPOCH 2 done: loss 0.3625 - lr: 0.000045
2024-03-26 09:40:23,850 DEV : loss 0.2613222301006317 - f1-score (micro avg)  0.8448
2024-03-26 09:40:23,851 saving best model
2024-03-26 09:40:24,277 ----------------------------------------------------------------------------------------------------
2024-03-26 09:40:26,224 epoch 3 - iter 9/95 - loss 0.29508313 - time (sec): 1.95 - samples/sec: 1724.83 - lr: 0.000044 - momentum: 0.000000
2024-03-26 09:40:28,144 epoch 3 - iter 18/95 - loss 0.25482220 - time (sec): 3.87 - samples/sec: 1740.63 - lr: 0.000043 - momentum: 0.000000
2024-03-26 09:40:29,490 epoch 3 - iter 27/95 - loss 0.23523110 - time (sec): 5.21 - samples/sec: 1835.40 - lr: 0.000043 - momentum: 0.000000
2024-03-26 09:40:31,951 epoch 3 - iter 36/95 - loss 0.22688565 - time (sec): 7.67 - samples/sec: 1762.04 - lr: 0.000042 - momentum: 0.000000
2024-03-26 09:40:34,173 epoch 3 - iter 45/95 - loss 0.21492865 - time (sec): 9.89 - samples/sec: 1794.15 - lr: 0.000042 - momentum: 0.000000
2024-03-26 09:40:35,332 epoch 3 - iter 54/95 - loss 0.21043158 - time (sec): 11.05 - samples/sec: 1853.67 - lr: 0.000041 - momentum: 0.000000
2024-03-26 09:40:37,247 epoch 3 - iter 63/95 - loss 0.20029173 - time (sec): 12.97 - samples/sec: 1836.73 - lr: 0.000041 - momentum: 0.000000
2024-03-26 09:40:38,856 epoch 3 - iter 72/95 - loss 0.19106908 - time (sec): 14.58 - samples/sec: 1842.42 - lr: 0.000040 - momentum: 0.000000
2024-03-26 09:40:40,595 epoch 3 - iter 81/95 - loss 0.19348835 - time (sec): 16.32 - samples/sec: 1833.29 - lr: 0.000040 - momentum: 0.000000
2024-03-26 09:40:42,760 epoch 3 - iter 90/95 - loss 0.18565234 - time (sec): 18.48 - samples/sec: 1802.35 - lr: 0.000039 - momentum: 0.000000
2024-03-26 09:40:43,234 ----------------------------------------------------------------------------------------------------
2024-03-26 09:40:43,234 EPOCH 3 done: loss 0.1856 - lr: 0.000039
2024-03-26 09:40:44,130 DEV : loss 0.23596186935901642 - f1-score (micro avg)  0.8698
2024-03-26 09:40:44,131 saving best model
2024-03-26 09:40:44,555 ----------------------------------------------------------------------------------------------------
2024-03-26 09:40:46,145 epoch 4 - iter 9/95 - loss 0.14874311 - time (sec): 1.59 - samples/sec: 2028.73 - lr: 0.000039 - momentum: 0.000000
2024-03-26 09:40:48,154 epoch 4 - iter 18/95 - loss 0.12660212 - time (sec): 3.60 - samples/sec: 1792.97 - lr: 0.000038 - momentum: 0.000000
2024-03-26 09:40:49,928 epoch 4 - iter 27/95 - loss 0.13552995 - time (sec): 5.37 - samples/sec: 1813.99 - lr: 0.000037 - momentum: 0.000000
2024-03-26 09:40:52,465 epoch 4 - iter 36/95 - loss 0.11480824 - time (sec): 7.91 - samples/sec: 1742.21 - lr: 0.000037 - momentum: 0.000000
2024-03-26 09:40:54,141 epoch 4 - iter 45/95 - loss 0.12251085 - time (sec): 9.58 - samples/sec: 1761.77 - lr: 0.000036 - momentum: 0.000000
2024-03-26 09:40:55,663 epoch 4 - iter 54/95 - loss 0.12355944 - time (sec): 11.11 - samples/sec: 1816.10 - lr: 0.000036 - momentum: 0.000000
2024-03-26 09:40:57,506 epoch 4 - iter 63/95 - loss 0.12445641 - time (sec): 12.95 - samples/sec: 1838.45 - lr: 0.000035 - momentum: 0.000000
2024-03-26 09:40:58,769 epoch 4 - iter 72/95 - loss 0.12555656 - time (sec): 14.21 - samples/sec: 1869.52 - lr: 0.000035 - momentum: 0.000000
2024-03-26 09:41:00,477 epoch 4 - iter 81/95 - loss 0.12386474 - time (sec): 15.92 - samples/sec: 1858.83 - lr: 0.000034 - momentum: 0.000000
2024-03-26 09:41:01,961 epoch 4 - iter 90/95 - loss 0.12057520 - time (sec): 17.40 - samples/sec: 1879.79 - lr: 0.000034 - momentum: 0.000000
2024-03-26 09:41:02,859 ----------------------------------------------------------------------------------------------------
2024-03-26 09:41:02,859 EPOCH 4 done: loss 0.1194 - lr: 0.000034
2024-03-26 09:41:03,820 DEV : loss 0.19999347627162933 - f1-score (micro avg)  0.901
2024-03-26 09:41:03,822 saving best model
2024-03-26 09:41:04,250 ----------------------------------------------------------------------------------------------------
2024-03-26 09:41:05,898 epoch 5 - iter 9/95 - loss 0.07573403 - time (sec): 1.65 - samples/sec: 1922.66 - lr: 0.000033 - momentum: 0.000000
2024-03-26 09:41:08,022 epoch 5 - iter 18/95 - loss 0.07869165 - time (sec): 3.77 - samples/sec: 1777.86 - lr: 0.000032 - momentum: 0.000000
2024-03-26 09:41:09,581 epoch 5 - iter 27/95 - loss 0.07792090 - time (sec): 5.33 - samples/sec: 1820.04 - lr: 0.000032 - momentum: 0.000000
2024-03-26 09:41:11,245 epoch 5 - iter 36/95 - loss 0.08045788 - time (sec): 6.99 - samples/sec: 1803.86 - lr: 0.000031 - momentum: 0.000000
2024-03-26 09:41:12,916 epoch 5 - iter 45/95 - loss 0.08891161 - time (sec): 8.66 - samples/sec: 1851.54 - lr: 0.000031 - momentum: 0.000000
2024-03-26 09:41:14,512 epoch 5 - iter 54/95 - loss 0.09411735 - time (sec): 10.26 - samples/sec: 1895.46 - lr: 0.000030 - momentum: 0.000000
2024-03-26 09:41:16,339 epoch 5 - iter 63/95 - loss 0.09161241 - time (sec): 12.09 - samples/sec: 1874.42 - lr: 0.000030 - momentum: 0.000000
2024-03-26 09:41:18,555 epoch 5 - iter 72/95 - loss 0.08398116 - time (sec): 14.30 - samples/sec: 1897.22 - lr: 0.000029 - momentum: 0.000000
2024-03-26 09:41:19,793 epoch 5 - iter 81/95 - loss 0.08584221 - time (sec): 15.54 - samples/sec: 1916.06 - lr: 0.000029 - momentum: 0.000000
2024-03-26 09:41:21,926 epoch 5 - iter 90/95 - loss 0.08331687 - time (sec): 17.67 - samples/sec: 1873.64 - lr: 0.000028 - momentum: 0.000000
2024-03-26 09:41:22,547 ----------------------------------------------------------------------------------------------------
2024-03-26 09:41:22,547 EPOCH 5 done: loss 0.0837 - lr: 0.000028
2024-03-26 09:41:23,531 DEV : loss 0.18806229531764984 - f1-score (micro avg)  0.911
2024-03-26 09:41:23,532 saving best model
2024-03-26 09:41:23,936 ----------------------------------------------------------------------------------------------------
2024-03-26 09:41:25,491 epoch 6 - iter 9/95 - loss 0.04652808 - time (sec): 1.55 - samples/sec: 1859.82 - lr: 0.000027 - momentum: 0.000000
2024-03-26 09:41:27,487 epoch 6 - iter 18/95 - loss 0.06784961 - time (sec): 3.55 - samples/sec: 1847.95 - lr: 0.000027 - momentum: 0.000000
2024-03-26 09:41:29,145 epoch 6 - iter 27/95 - loss 0.07010724 - time (sec): 5.21 - samples/sec: 1887.21 - lr: 0.000026 - momentum: 0.000000
2024-03-26 09:41:30,788 epoch 6 - iter 36/95 - loss 0.06725345 - time (sec): 6.85 - samples/sec: 1849.66 - lr: 0.000026 - momentum: 0.000000
2024-03-26 09:41:32,367 epoch 6 - iter 45/95 - loss 0.06885703 - time (sec): 8.43 - samples/sec: 1865.39 - lr: 0.000025 - momentum: 0.000000
2024-03-26 09:41:34,350 epoch 6 - iter 54/95 - loss 0.06740445 - time (sec): 10.41 - samples/sec: 1846.29 - lr: 0.000025 - momentum: 0.000000
2024-03-26 09:41:35,908 epoch 6 - iter 63/95 - loss 0.07108303 - time (sec): 11.97 - samples/sec: 1846.54 - lr: 0.000024 - momentum: 0.000000
2024-03-26 09:41:38,697 epoch 6 - iter 72/95 - loss 0.06547846 - time (sec): 14.76 - samples/sec: 1806.63 - lr: 0.000024 - momentum: 0.000000
2024-03-26 09:41:40,524 epoch 6 - iter 81/95 - loss 0.06482397 - time (sec): 16.59 - samples/sec: 1815.43 - lr: 0.000023 - momentum: 0.000000
2024-03-26 09:41:42,179 epoch 6 - iter 90/95 - loss 0.06574631 - time (sec): 18.24 - samples/sec: 1809.61 - lr: 0.000023 - momentum: 0.000000
2024-03-26 09:41:42,794 ----------------------------------------------------------------------------------------------------
2024-03-26 09:41:42,795 EPOCH 6 done: loss 0.0667 - lr: 0.000023
2024-03-26 09:41:43,692 DEV : loss 0.174924835562706 - f1-score (micro avg)  0.9185
2024-03-26 09:41:43,693 saving best model
2024-03-26 09:41:44,116 ----------------------------------------------------------------------------------------------------
2024-03-26 09:41:45,420 epoch 7 - iter 9/95 - loss 0.06767963 - time (sec): 1.30 - samples/sec: 2270.48 - lr: 0.000022 - momentum: 0.000000
2024-03-26 09:41:47,027 epoch 7 - iter 18/95 - loss 0.06888470 - time (sec): 2.91 - samples/sec: 2018.29 - lr: 0.000021 - momentum: 0.000000
2024-03-26 09:41:48,805 epoch 7 - iter 27/95 - loss 0.06592699 - time (sec): 4.69 - samples/sec: 1949.91 - lr: 0.000021 - momentum: 0.000000
2024-03-26 09:41:50,655 epoch 7 - iter 36/95 - loss 0.05818934 - time (sec): 6.54 - samples/sec: 1913.50 - lr: 0.000020 - momentum: 0.000000
2024-03-26 09:41:52,926 epoch 7 - iter 45/95 - loss 0.05334698 - time (sec): 8.81 - samples/sec: 1860.40 - lr: 0.000020 - momentum: 0.000000
2024-03-26 09:41:53,898 epoch 7 - iter 54/95 - loss 0.05494572 - time (sec): 9.78 - samples/sec: 1936.94 - lr: 0.000019 - momentum: 0.000000
2024-03-26 09:41:55,732 epoch 7 - iter 63/95 - loss 0.05116780 - time (sec): 11.61 - samples/sec: 1936.77 - lr: 0.000019 - momentum: 0.000000
2024-03-26 09:41:57,627 epoch 7 - iter 72/95 - loss 0.04795444 - time (sec): 13.51 - samples/sec: 1895.98 - lr: 0.000018 - momentum: 0.000000
2024-03-26 09:41:59,541 epoch 7 - iter 81/95 - loss 0.04872493 - time (sec): 15.42 - samples/sec: 1893.01 - lr: 0.000018 - momentum: 0.000000
2024-03-26 09:42:01,463 epoch 7 - iter 90/95 - loss 0.04880589 - time (sec): 17.35 - samples/sec: 1895.49 - lr: 0.000017 - momentum: 0.000000
2024-03-26 09:42:02,287 ----------------------------------------------------------------------------------------------------
2024-03-26 09:42:02,287 EPOCH 7 done: loss 0.0481 - lr: 0.000017
2024-03-26 09:42:03,187 DEV : loss 0.1872955858707428 - f1-score (micro avg)  0.92
2024-03-26 09:42:03,188 saving best model
2024-03-26 09:42:03,612 ----------------------------------------------------------------------------------------------------
2024-03-26 09:42:05,211 epoch 8 - iter 9/95 - loss 0.04073955 - time (sec): 1.60 - samples/sec: 1872.84 - lr: 0.000016 - momentum: 0.000000
2024-03-26 09:42:07,208 epoch 8 - iter 18/95 - loss 0.03528090 - time (sec): 3.59 - samples/sec: 1692.16 - lr: 0.000016 - momentum: 0.000000
2024-03-26 09:42:08,761 epoch 8 - iter 27/95 - loss 0.04052889 - time (sec): 5.15 - samples/sec: 1788.69 - lr: 0.000015 - momentum: 0.000000
2024-03-26 09:42:10,474 epoch 8 - iter 36/95 - loss 0.04517583 - time (sec): 6.86 - samples/sec: 1835.26 - lr: 0.000015 - momentum: 0.000000
2024-03-26 09:42:12,770 epoch 8 - iter 45/95 - loss 0.03767857 - time (sec): 9.16 - samples/sec: 1815.74 - lr: 0.000014 - momentum: 0.000000
2024-03-26 09:42:15,060 epoch 8 - iter 54/95 - loss 0.03986219 - time (sec): 11.45 - samples/sec: 1819.38 - lr: 0.000014 - momentum: 0.000000
2024-03-26 09:42:17,011 epoch 8 - iter 63/95 - loss 0.04051201 - time (sec): 13.40 - samples/sec: 1822.54 - lr: 0.000013 - momentum: 0.000000
2024-03-26 09:42:18,089 epoch 8 - iter 72/95 - loss 0.03988279 - time (sec): 14.47 - samples/sec: 1855.07 - lr: 0.000013 - momentum: 0.000000
2024-03-26 09:42:19,748 epoch 8 - iter 81/95 - loss 0.03860408 - time (sec): 16.13 - samples/sec: 1839.72 - lr: 0.000012 - momentum: 0.000000
2024-03-26 09:42:21,104 epoch 8 - iter 90/95 - loss 0.03816824 - time (sec): 17.49 - samples/sec: 1855.88 - lr: 0.000012 - momentum: 0.000000
2024-03-26 09:42:22,312 ----------------------------------------------------------------------------------------------------
2024-03-26 09:42:22,312 EPOCH 8 done: loss 0.0396 - lr: 0.000012
2024-03-26 09:42:23,209 DEV : loss 0.18396545946598053 - f1-score (micro avg)  0.9319
2024-03-26 09:42:23,210 saving best model
2024-03-26 09:42:23,634 ----------------------------------------------------------------------------------------------------
2024-03-26 09:42:25,375 epoch 9 - iter 9/95 - loss 0.01845985 - time (sec): 1.74 - samples/sec: 1997.59 - lr: 0.000011 - momentum: 0.000000
2024-03-26 09:42:27,288 epoch 9 - iter 18/95 - loss 0.01833515 - time (sec): 3.65 - samples/sec: 1849.94 - lr: 0.000010 - momentum: 0.000000
2024-03-26 09:42:29,101 epoch 9 - iter 27/95 - loss 0.02202731 - time (sec): 5.47 - samples/sec: 1797.96 - lr: 0.000010 - momentum: 0.000000
2024-03-26 09:42:30,945 epoch 9 - iter 36/95 - loss 0.03266732 - time (sec): 7.31 - samples/sec: 1842.06 - lr: 0.000009 - momentum: 0.000000
2024-03-26 09:42:32,805 epoch 9 - iter 45/95 - loss 0.02971071 - time (sec): 9.17 - samples/sec: 1818.40 - lr: 0.000009 - momentum: 0.000000
2024-03-26 09:42:34,629 epoch 9 - iter 54/95 - loss 0.02917467 - time (sec): 10.99 - samples/sec: 1850.50 - lr: 0.000008 - momentum: 0.000000
2024-03-26 09:42:36,484 epoch 9 - iter 63/95 - loss 0.02881615 - time (sec): 12.85 - samples/sec: 1848.20 - lr: 0.000008 - momentum: 0.000000
2024-03-26 09:42:38,047 epoch 9 - iter 72/95 - loss 0.03226438 - time (sec): 14.41 - samples/sec: 1857.67 - lr: 0.000007 - momentum: 0.000000
2024-03-26 09:42:39,732 epoch 9 - iter 81/95 - loss 0.03501505 - time (sec): 16.10 - samples/sec: 1847.40 - lr: 0.000007 - momentum: 0.000000
2024-03-26 09:42:41,468 epoch 9 - iter 90/95 - loss 0.03249197 - time (sec): 17.83 - samples/sec: 1864.18 - lr: 0.000006 - momentum: 0.000000
2024-03-26 09:42:41,959 ----------------------------------------------------------------------------------------------------
2024-03-26 09:42:41,959 EPOCH 9 done: loss 0.0336 - lr: 0.000006
2024-03-26 09:42:42,855 DEV : loss 0.17702238261699677 - f1-score (micro avg)  0.9415
2024-03-26 09:42:42,856 saving best model
2024-03-26 09:42:43,284 ----------------------------------------------------------------------------------------------------
2024-03-26 09:42:44,746 epoch 10 - iter 9/95 - loss 0.00570096 - time (sec): 1.46 - samples/sec: 1901.64 - lr: 0.000005 - momentum: 0.000000
2024-03-26 09:42:46,561 epoch 10 - iter 18/95 - loss 0.01306014 - time (sec): 3.28 - samples/sec: 1845.46 - lr: 0.000005 - momentum: 0.000000
2024-03-26 09:42:48,768 epoch 10 - iter 27/95 - loss 0.02043505 - time (sec): 5.48 - samples/sec: 1763.51 - lr: 0.000004 - momentum: 0.000000
2024-03-26 09:42:50,611 epoch 10 - iter 36/95 - loss 0.02853352 - time (sec): 7.33 - samples/sec: 1790.46 - lr: 0.000004 - momentum: 0.000000
2024-03-26 09:42:51,766 epoch 10 - iter 45/95 - loss 0.02757904 - time (sec): 8.48 - samples/sec: 1848.38 - lr: 0.000003 - momentum: 0.000000
2024-03-26 09:42:53,667 epoch 10 - iter 54/95 - loss 0.03013318 - time (sec): 10.38 - samples/sec: 1834.01 - lr: 0.000003 - momentum: 0.000000
2024-03-26 09:42:55,048 epoch 10 - iter 63/95 - loss 0.03142502 - time (sec): 11.76 - samples/sec: 1847.45 - lr: 0.000002 - momentum: 0.000000
2024-03-26 09:42:57,280 epoch 10 - iter 72/95 - loss 0.02735320 - time (sec): 13.99 - samples/sec: 1830.31 - lr: 0.000002 - momentum: 0.000000
2024-03-26 09:42:59,572 epoch 10 - iter 81/95 - loss 0.03121748 - time (sec): 16.29 - samples/sec: 1813.03 - lr: 0.000001 - momentum: 0.000000
2024-03-26 09:43:01,405 epoch 10 - iter 90/95 - loss 0.02889432 - time (sec): 18.12 - samples/sec: 1806.66 - lr: 0.000001 - momentum: 0.000000
2024-03-26 09:43:02,414 ----------------------------------------------------------------------------------------------------
2024-03-26 09:43:02,414 EPOCH 10 done: loss 0.0278 - lr: 0.000001
2024-03-26 09:43:03,313 DEV : loss 0.18273191154003143 - f1-score (micro avg)  0.9477
2024-03-26 09:43:03,314 saving best model
2024-03-26 09:43:04,053 ----------------------------------------------------------------------------------------------------
2024-03-26 09:43:04,053 Loading model from best epoch ...
2024-03-26 09:43:04,938 SequenceTagger predicts: Dictionary with 17 tags: O, S-Unternehmen, B-Unternehmen, E-Unternehmen, I-Unternehmen, S-Auslagerung, B-Auslagerung, E-Auslagerung, I-Auslagerung, S-Ort, B-Ort, E-Ort, I-Ort, S-Software, B-Software, E-Software, I-Software
2024-03-26 09:43:05,688 
Results:
- F-score (micro) 0.918
- F-score (macro) 0.696
- Accuracy 0.8509

By class:
              precision    recall  f1-score   support

 Unternehmen     0.9294    0.8910    0.9098       266
 Auslagerung     0.8779    0.9237    0.9002       249
         Ort     0.9635    0.9851    0.9742       134
    Software     0.0000    0.0000    0.0000         0

   micro avg     0.9131    0.9230    0.9180       649
   macro avg     0.6927    0.6999    0.6960       649
weighted avg     0.9167    0.9230    0.9194       649

2024-03-26 09:43:05,688 ----------------------------------------------------------------------------------------------------
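
Once training finishes, the best checkpoint (best-model.pt) under the training base path can be loaded for inference. A minimal usage sketch, assuming the label type is "ner" and using an invented example sentence:

    from flair.data import Sentence
    from flair.models import SequenceTagger

    # Load the best model saved under the training base path shown above.
    tagger = SequenceTagger.load(
        "flair-co-funer-gbert_base-bs8-e10-lr5e-05-1/best-model.pt"
    )

    # Hypothetical German sentence containing entity types from the log
    # (Unternehmen, Auslagerung, Ort).
    sentence = Sentence(
        "Die Verwaltung der IT-Infrastruktur wurde an die Musterfirma GmbH "
        "in Frankfurt ausgelagert."
    )
    tagger.predict(sentence)

    # Print every predicted entity span with its label and confidence.
    for entity in sentence.get_spans("ner"):
        print(entity)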