stefan-it committed
Commit ec3e9d1
1 Parent(s): 9013245

Upload ./training.log with huggingface_hub

Files changed (1):
  1. training.log +267 -0
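
Below is a minimal sketch of how a file like this log can be pushed to a Hub repository with the huggingface_hub client, since the commit message references it. The repo_id is a placeholder (not taken from this commit), and authentication, e.g. via `huggingface-cli login`, is assumed to have happened beforehand.

```python
# Sketch: uploading a local training log to a Hugging Face Hub repo.
# "your-username/your-model" is a placeholder repo_id; authenticate first.
from huggingface_hub import HfApi

api = HfApi()
api.upload_file(
    path_or_fileobj="./training.log",    # local log produced by the training run
    path_in_repo="training.log",         # destination path inside the repo
    repo_id="your-username/your-model",  # placeholder
    commit_message="Upload ./training.log with huggingface_hub",
)
```
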
training.log ADDED
@@ -0,0 +1,267 @@
+ 2024-03-26 09:59:31,395 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 09:59:31,395 Model: "SequenceTagger(
+   (embeddings): TransformerWordEmbeddings(
+     (model): BertModel(
+       (embeddings): BertEmbeddings(
+         (word_embeddings): Embedding(31103, 768)
+         (position_embeddings): Embedding(512, 768)
+         (token_type_embeddings): Embedding(2, 768)
+         (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+         (dropout): Dropout(p=0.1, inplace=False)
+       )
+       (encoder): BertEncoder(
+         (layer): ModuleList(
+           (0-11): 12 x BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+         )
+       )
+       (pooler): BertPooler(
+         (dense): Linear(in_features=768, out_features=768, bias=True)
+         (activation): Tanh()
+       )
+     )
+   )
+   (locked_dropout): LockedDropout(p=0.5)
+   (linear): Linear(in_features=768, out_features=17, bias=True)
+   (loss_function): CrossEntropyLoss()
+ )"
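
The architecture printed above is a Flair SequenceTagger in its fine-tuning configuration: transformer word embeddings feed a locked-dropout layer and a single linear classifier over 17 BIOES tags, with no RNN or CRF in between. A minimal sketch of how a comparable tagger could be assembled follows; the checkpoint name "deepset/gbert-base", the pre-loaded `corpus` object, and the "ner" label type are assumptions (the log itself only shows a gbert_base backbone), while the structural flags mirror the printout.

```python
# Sketch: assembling a Flair SequenceTagger matching the printed architecture.
# Assumptions: "deepset/gbert-base" as the backbone checkpoint, a pre-loaded
# `corpus` object, and "ner" as the label type.
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger

embeddings = TransformerWordEmbeddings(
    "deepset/gbert-base",  # assumed German BERT checkpoint (768-dim, 12 layers)
    layers="-1",           # use only the last transformer layer
    fine_tune=True,        # update the backbone weights during training
)

label_dict = corpus.make_label_dictionary(label_type="ner")  # 17 tags in this run

tagger = SequenceTagger(
    hidden_size=256,             # unused when use_rnn=False, but required by the signature
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,               # printout shows a plain CrossEntropyLoss head, no CRF
    use_rnn=False,               # no LSTM between embeddings and the linear layer
    reproject_embeddings=False,  # the 17-way linear layer sits directly on the 768-dim embeddings
)
```
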
+ 2024-03-26 09:59:31,395 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 09:59:31,395 Corpus: 758 train + 94 dev + 96 test sentences
+ 2024-03-26 09:59:31,395 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 09:59:31,395 Train: 758 sentences
+ 2024-03-26 09:59:31,395 (train_with_dev=False, train_with_test=False)
+ 2024-03-26 09:59:31,396 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 09:59:31,396 Training Params:
+ 2024-03-26 09:59:31,396 - learning_rate: "3e-05"
+ 2024-03-26 09:59:31,396 - mini_batch_size: "16"
+ 2024-03-26 09:59:31,396 - max_epochs: "10"
+ 2024-03-26 09:59:31,396 - shuffle: "True"
+ 2024-03-26 09:59:31,396 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 09:59:31,396 Plugins:
+ 2024-03-26 09:59:31,396 - TensorboardLogger
+ 2024-03-26 09:59:31,396 - LinearScheduler | warmup_fraction: '0.1'
+ 2024-03-26 09:59:31,396 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 09:59:31,396 Final evaluation on model from best epoch (best-model.pt)
+ 2024-03-26 09:59:31,396 - metric: "('micro avg', 'f1-score')"
+ 2024-03-26 09:59:31,396 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 09:59:31,396 Computation:
+ 2024-03-26 09:59:31,396 - compute on device: cuda:0
+ 2024-03-26 09:59:31,396 - embedding storage: none
+ 2024-03-26 09:59:31,396 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 09:59:31,396 Model training base path: "flair-co-funer-gbert_base-bs16-e10-lr3e-05-3"
+ 2024-03-26 09:59:31,396 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 09:59:31,396 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 09:59:31,396 Logging anything other than scalars to TensorBoard is currently not supported.
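
The parameter and plugin block above maps directly onto Flair's ModelTrainer.fine_tune call, which applies a linear learning-rate schedule with warmup by default (the "LinearScheduler | warmup_fraction: '0.1'" entry). A minimal sketch with the logged values follows; `tagger` and `corpus` are the objects sketched earlier, and the exact wiring of the TensorboardLogger plugin depends on the Flair version, so it is omitted here.

```python
# Sketch: fine-tuning with the hyperparameters listed in the log.
# fine_tune() uses a linear schedule with warmup by default, which is
# what the plugin list above reports.
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus)

trainer.fine_tune(
    "flair-co-funer-gbert_base-bs16-e10-lr3e-05-3",  # base path from the log
    learning_rate=3e-05,
    mini_batch_size=16,
    max_epochs=10,
    shuffle=True,
    embeddings_storage_mode="none",  # matches "embedding storage: none" above
)
```

With 48 mini-batches per epoch and 10 epochs, the run takes 480 optimizer steps, so a warmup fraction of 0.1 corresponds to roughly 48 warmup steps. That is consistent with the per-iteration lines below, where the learning rate climbs to about 2.9e-05 by the end of epoch 1 and then decays linearly towards 0 by the end of epoch 10.
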
+ 2024-03-26 09:59:32,643 epoch 1 - iter 4/48 - loss 3.43646522 - time (sec): 1.25 - samples/sec: 2208.74 - lr: 0.000002 - momentum: 0.000000
+ 2024-03-26 09:59:34,614 epoch 1 - iter 8/48 - loss 3.36082406 - time (sec): 3.22 - samples/sec: 1810.00 - lr: 0.000004 - momentum: 0.000000
+ 2024-03-26 09:59:36,149 epoch 1 - iter 12/48 - loss 3.27049571 - time (sec): 4.75 - samples/sec: 1761.72 - lr: 0.000007 - momentum: 0.000000
+ 2024-03-26 09:59:39,034 epoch 1 - iter 16/48 - loss 3.11812662 - time (sec): 7.64 - samples/sec: 1518.44 - lr: 0.000009 - momentum: 0.000000
+ 2024-03-26 09:59:40,677 epoch 1 - iter 20/48 - loss 2.96663653 - time (sec): 9.28 - samples/sec: 1551.98 - lr: 0.000012 - momentum: 0.000000
+ 2024-03-26 09:59:42,103 epoch 1 - iter 24/48 - loss 2.83877052 - time (sec): 10.71 - samples/sec: 1603.40 - lr: 0.000014 - momentum: 0.000000
+ 2024-03-26 09:59:43,412 epoch 1 - iter 28/48 - loss 2.71878471 - time (sec): 12.02 - samples/sec: 1625.16 - lr: 0.000017 - momentum: 0.000000
+ 2024-03-26 09:59:45,484 epoch 1 - iter 32/48 - loss 2.59293993 - time (sec): 14.09 - samples/sec: 1613.37 - lr: 0.000019 - momentum: 0.000000
+ 2024-03-26 09:59:47,021 epoch 1 - iter 36/48 - loss 2.47846593 - time (sec): 15.63 - samples/sec: 1633.21 - lr: 0.000022 - momentum: 0.000000
+ 2024-03-26 09:59:49,236 epoch 1 - iter 40/48 - loss 2.36097982 - time (sec): 17.84 - samples/sec: 1624.25 - lr: 0.000024 - momentum: 0.000000
+ 2024-03-26 09:59:51,141 epoch 1 - iter 44/48 - loss 2.25907270 - time (sec): 19.75 - samples/sec: 1624.00 - lr: 0.000027 - momentum: 0.000000
+ 2024-03-26 09:59:52,744 epoch 1 - iter 48/48 - loss 2.17647950 - time (sec): 21.35 - samples/sec: 1614.77 - lr: 0.000029 - momentum: 0.000000
+ 2024-03-26 09:59:52,744 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 09:59:52,744 EPOCH 1 done: loss 2.1765 - lr: 0.000029
+ 2024-03-26 09:59:53,553 DEV : loss 0.7939577102661133 - f1-score (micro avg) 0.4569
+ 2024-03-26 09:59:53,554 saving best model
+ 2024-03-26 09:59:53,834 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 09:59:55,240 epoch 2 - iter 4/48 - loss 0.98641542 - time (sec): 1.41 - samples/sec: 1775.70 - lr: 0.000030 - momentum: 0.000000
+ 2024-03-26 09:59:56,693 epoch 2 - iter 8/48 - loss 0.85278575 - time (sec): 2.86 - samples/sec: 1707.91 - lr: 0.000030 - momentum: 0.000000
+ 2024-03-26 09:59:58,106 epoch 2 - iter 12/48 - loss 0.81842829 - time (sec): 4.27 - samples/sec: 1799.84 - lr: 0.000029 - momentum: 0.000000
+ 2024-03-26 09:59:59,970 epoch 2 - iter 16/48 - loss 0.75652823 - time (sec): 6.14 - samples/sec: 1753.35 - lr: 0.000029 - momentum: 0.000000
+ 2024-03-26 10:00:02,304 epoch 2 - iter 20/48 - loss 0.72523901 - time (sec): 8.47 - samples/sec: 1674.24 - lr: 0.000029 - momentum: 0.000000
+ 2024-03-26 10:00:04,313 epoch 2 - iter 24/48 - loss 0.67602021 - time (sec): 10.48 - samples/sec: 1655.27 - lr: 0.000028 - momentum: 0.000000
+ 2024-03-26 10:00:07,019 epoch 2 - iter 28/48 - loss 0.65009793 - time (sec): 13.18 - samples/sec: 1587.16 - lr: 0.000028 - momentum: 0.000000
+ 2024-03-26 10:00:09,208 epoch 2 - iter 32/48 - loss 0.62484745 - time (sec): 15.37 - samples/sec: 1552.58 - lr: 0.000028 - momentum: 0.000000
+ 2024-03-26 10:00:10,913 epoch 2 - iter 36/48 - loss 0.61261076 - time (sec): 17.08 - samples/sec: 1546.83 - lr: 0.000028 - momentum: 0.000000
+ 2024-03-26 10:00:12,600 epoch 2 - iter 40/48 - loss 0.60825046 - time (sec): 18.77 - samples/sec: 1554.54 - lr: 0.000027 - momentum: 0.000000
+ 2024-03-26 10:00:14,733 epoch 2 - iter 44/48 - loss 0.59375280 - time (sec): 20.90 - samples/sec: 1550.50 - lr: 0.000027 - momentum: 0.000000
+ 2024-03-26 10:00:16,229 epoch 2 - iter 48/48 - loss 0.57760153 - time (sec): 22.39 - samples/sec: 1539.34 - lr: 0.000027 - momentum: 0.000000
+ 2024-03-26 10:00:16,229 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 10:00:16,229 EPOCH 2 done: loss 0.5776 - lr: 0.000027
+ 2024-03-26 10:00:17,119 DEV : loss 0.32868492603302 - f1-score (micro avg) 0.7997
+ 2024-03-26 10:00:17,120 saving best model
+ 2024-03-26 10:00:17,588 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 10:00:19,074 epoch 3 - iter 4/48 - loss 0.38161214 - time (sec): 1.48 - samples/sec: 1650.47 - lr: 0.000026 - momentum: 0.000000
+ 2024-03-26 10:00:21,853 epoch 3 - iter 8/48 - loss 0.33926201 - time (sec): 4.26 - samples/sec: 1343.95 - lr: 0.000026 - momentum: 0.000000
+ 2024-03-26 10:00:23,088 epoch 3 - iter 12/48 - loss 0.34656835 - time (sec): 5.50 - samples/sec: 1481.70 - lr: 0.000026 - momentum: 0.000000
+ 2024-03-26 10:00:24,439 epoch 3 - iter 16/48 - loss 0.31664455 - time (sec): 6.85 - samples/sec: 1611.63 - lr: 0.000026 - momentum: 0.000000
+ 2024-03-26 10:00:25,877 epoch 3 - iter 20/48 - loss 0.32091794 - time (sec): 8.29 - samples/sec: 1630.39 - lr: 0.000025 - momentum: 0.000000
+ 2024-03-26 10:00:28,556 epoch 3 - iter 24/48 - loss 0.31277796 - time (sec): 10.97 - samples/sec: 1523.16 - lr: 0.000025 - momentum: 0.000000
+ 2024-03-26 10:00:30,448 epoch 3 - iter 28/48 - loss 0.30764724 - time (sec): 12.86 - samples/sec: 1541.44 - lr: 0.000025 - momentum: 0.000000
+ 2024-03-26 10:00:32,937 epoch 3 - iter 32/48 - loss 0.29367022 - time (sec): 15.35 - samples/sec: 1485.72 - lr: 0.000025 - momentum: 0.000000
+ 2024-03-26 10:00:34,844 epoch 3 - iter 36/48 - loss 0.29104830 - time (sec): 17.25 - samples/sec: 1482.42 - lr: 0.000024 - momentum: 0.000000
+ 2024-03-26 10:00:37,177 epoch 3 - iter 40/48 - loss 0.28398890 - time (sec): 19.59 - samples/sec: 1458.33 - lr: 0.000024 - momentum: 0.000000
+ 2024-03-26 10:00:39,592 epoch 3 - iter 44/48 - loss 0.29123866 - time (sec): 22.00 - samples/sec: 1446.29 - lr: 0.000024 - momentum: 0.000000
+ 2024-03-26 10:00:41,895 epoch 3 - iter 48/48 - loss 0.28152527 - time (sec): 24.31 - samples/sec: 1418.30 - lr: 0.000023 - momentum: 0.000000
+ 2024-03-26 10:00:41,895 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 10:00:41,895 EPOCH 3 done: loss 0.2815 - lr: 0.000023
+ 2024-03-26 10:00:42,794 DEV : loss 0.25886547565460205 - f1-score (micro avg) 0.8309
+ 2024-03-26 10:00:42,796 saving best model
+ 2024-03-26 10:00:43,250 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 10:00:44,615 epoch 4 - iter 4/48 - loss 0.22042310 - time (sec): 1.36 - samples/sec: 1839.59 - lr: 0.000023 - momentum: 0.000000
+ 2024-03-26 10:00:46,517 epoch 4 - iter 8/48 - loss 0.20257439 - time (sec): 3.26 - samples/sec: 1641.50 - lr: 0.000023 - momentum: 0.000000
+ 2024-03-26 10:00:49,029 epoch 4 - iter 12/48 - loss 0.18510721 - time (sec): 5.78 - samples/sec: 1461.04 - lr: 0.000023 - momentum: 0.000000
+ 2024-03-26 10:00:50,904 epoch 4 - iter 16/48 - loss 0.18860193 - time (sec): 7.65 - samples/sec: 1479.96 - lr: 0.000022 - momentum: 0.000000
+ 2024-03-26 10:00:53,270 epoch 4 - iter 20/48 - loss 0.17922849 - time (sec): 10.02 - samples/sec: 1466.72 - lr: 0.000022 - momentum: 0.000000
+ 2024-03-26 10:00:56,134 epoch 4 - iter 24/48 - loss 0.16967835 - time (sec): 12.88 - samples/sec: 1415.56 - lr: 0.000022 - momentum: 0.000000
+ 2024-03-26 10:00:57,249 epoch 4 - iter 28/48 - loss 0.16942022 - time (sec): 14.00 - samples/sec: 1451.42 - lr: 0.000022 - momentum: 0.000000
+ 2024-03-26 10:01:00,237 epoch 4 - iter 32/48 - loss 0.16567032 - time (sec): 16.98 - samples/sec: 1390.60 - lr: 0.000021 - momentum: 0.000000
+ 2024-03-26 10:01:01,951 epoch 4 - iter 36/48 - loss 0.17214452 - time (sec): 18.70 - samples/sec: 1422.89 - lr: 0.000021 - momentum: 0.000000
+ 2024-03-26 10:01:04,761 epoch 4 - iter 40/48 - loss 0.17727138 - time (sec): 21.51 - samples/sec: 1389.46 - lr: 0.000021 - momentum: 0.000000
+ 2024-03-26 10:01:05,665 epoch 4 - iter 44/48 - loss 0.18046888 - time (sec): 22.41 - samples/sec: 1435.74 - lr: 0.000020 - momentum: 0.000000
+ 2024-03-26 10:01:07,142 epoch 4 - iter 48/48 - loss 0.18200329 - time (sec): 23.89 - samples/sec: 1443.02 - lr: 0.000020 - momentum: 0.000000
+ 2024-03-26 10:01:07,142 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 10:01:07,142 EPOCH 4 done: loss 0.1820 - lr: 0.000020
+ 2024-03-26 10:01:08,036 DEV : loss 0.1913670003414154 - f1-score (micro avg) 0.8849
+ 2024-03-26 10:01:08,037 saving best model
+ 2024-03-26 10:01:08,487 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 10:01:10,926 epoch 5 - iter 4/48 - loss 0.12163748 - time (sec): 2.44 - samples/sec: 1304.60 - lr: 0.000020 - momentum: 0.000000
+ 2024-03-26 10:01:12,343 epoch 5 - iter 8/48 - loss 0.14118800 - time (sec): 3.85 - samples/sec: 1477.15 - lr: 0.000020 - momentum: 0.000000
+ 2024-03-26 10:01:13,798 epoch 5 - iter 12/48 - loss 0.14109407 - time (sec): 5.31 - samples/sec: 1552.37 - lr: 0.000019 - momentum: 0.000000
+ 2024-03-26 10:01:15,969 epoch 5 - iter 16/48 - loss 0.13294762 - time (sec): 7.48 - samples/sec: 1470.95 - lr: 0.000019 - momentum: 0.000000
+ 2024-03-26 10:01:18,018 epoch 5 - iter 20/48 - loss 0.14325906 - time (sec): 9.53 - samples/sec: 1477.67 - lr: 0.000019 - momentum: 0.000000
+ 2024-03-26 10:01:20,481 epoch 5 - iter 24/48 - loss 0.13499390 - time (sec): 11.99 - samples/sec: 1468.01 - lr: 0.000018 - momentum: 0.000000
+ 2024-03-26 10:01:23,022 epoch 5 - iter 28/48 - loss 0.12862412 - time (sec): 14.53 - samples/sec: 1448.39 - lr: 0.000018 - momentum: 0.000000
+ 2024-03-26 10:01:24,885 epoch 5 - iter 32/48 - loss 0.13148701 - time (sec): 16.40 - samples/sec: 1452.41 - lr: 0.000018 - momentum: 0.000000
+ 2024-03-26 10:01:26,694 epoch 5 - iter 36/48 - loss 0.12883528 - time (sec): 18.20 - samples/sec: 1452.72 - lr: 0.000018 - momentum: 0.000000
+ 2024-03-26 10:01:28,975 epoch 5 - iter 40/48 - loss 0.12796493 - time (sec): 20.49 - samples/sec: 1441.69 - lr: 0.000017 - momentum: 0.000000
+ 2024-03-26 10:01:30,915 epoch 5 - iter 44/48 - loss 0.13048640 - time (sec): 22.43 - samples/sec: 1440.37 - lr: 0.000017 - momentum: 0.000000
+ 2024-03-26 10:01:31,959 epoch 5 - iter 48/48 - loss 0.12997086 - time (sec): 23.47 - samples/sec: 1468.75 - lr: 0.000017 - momentum: 0.000000
+ 2024-03-26 10:01:31,960 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 10:01:31,960 EPOCH 5 done: loss 0.1300 - lr: 0.000017
+ 2024-03-26 10:01:32,853 DEV : loss 0.17436812818050385 - f1-score (micro avg) 0.8919
+ 2024-03-26 10:01:32,854 saving best model
+ 2024-03-26 10:01:33,303 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 10:01:35,888 epoch 6 - iter 4/48 - loss 0.09782980 - time (sec): 2.58 - samples/sec: 1230.90 - lr: 0.000017 - momentum: 0.000000
+ 2024-03-26 10:01:37,873 epoch 6 - iter 8/48 - loss 0.09889715 - time (sec): 4.57 - samples/sec: 1285.37 - lr: 0.000016 - momentum: 0.000000
+ 2024-03-26 10:01:39,433 epoch 6 - iter 12/48 - loss 0.10167303 - time (sec): 6.13 - samples/sec: 1439.69 - lr: 0.000016 - momentum: 0.000000
+ 2024-03-26 10:01:41,380 epoch 6 - iter 16/48 - loss 0.09465287 - time (sec): 8.07 - samples/sec: 1440.18 - lr: 0.000016 - momentum: 0.000000
+ 2024-03-26 10:01:42,448 epoch 6 - iter 20/48 - loss 0.09812166 - time (sec): 9.14 - samples/sec: 1528.27 - lr: 0.000015 - momentum: 0.000000
+ 2024-03-26 10:01:44,365 epoch 6 - iter 24/48 - loss 0.09748869 - time (sec): 11.06 - samples/sec: 1511.24 - lr: 0.000015 - momentum: 0.000000
+ 2024-03-26 10:01:45,502 epoch 6 - iter 28/48 - loss 0.09823753 - time (sec): 12.20 - samples/sec: 1559.55 - lr: 0.000015 - momentum: 0.000000
+ 2024-03-26 10:01:47,264 epoch 6 - iter 32/48 - loss 0.09442643 - time (sec): 13.96 - samples/sec: 1578.15 - lr: 0.000015 - momentum: 0.000000
+ 2024-03-26 10:01:49,666 epoch 6 - iter 36/48 - loss 0.10523613 - time (sec): 16.36 - samples/sec: 1552.08 - lr: 0.000014 - momentum: 0.000000
+ 2024-03-26 10:01:51,714 epoch 6 - iter 40/48 - loss 0.10211317 - time (sec): 18.41 - samples/sec: 1543.49 - lr: 0.000014 - momentum: 0.000000
+ 2024-03-26 10:01:53,581 epoch 6 - iter 44/48 - loss 0.10480042 - time (sec): 20.28 - samples/sec: 1551.43 - lr: 0.000014 - momentum: 0.000000
+ 2024-03-26 10:01:55,114 epoch 6 - iter 48/48 - loss 0.10675454 - time (sec): 21.81 - samples/sec: 1580.61 - lr: 0.000014 - momentum: 0.000000
+ 2024-03-26 10:01:55,115 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 10:01:55,115 EPOCH 6 done: loss 0.1068 - lr: 0.000014
+ 2024-03-26 10:01:56,016 DEV : loss 0.16683053970336914 - f1-score (micro avg) 0.9087
+ 2024-03-26 10:01:56,017 saving best model
+ 2024-03-26 10:01:56,467 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 10:01:58,637 epoch 7 - iter 4/48 - loss 0.09254774 - time (sec): 2.17 - samples/sec: 1274.49 - lr: 0.000013 - momentum: 0.000000
+ 2024-03-26 10:02:00,336 epoch 7 - iter 8/48 - loss 0.08838913 - time (sec): 3.87 - samples/sec: 1485.65 - lr: 0.000013 - momentum: 0.000000
+ 2024-03-26 10:02:02,391 epoch 7 - iter 12/48 - loss 0.07238019 - time (sec): 5.92 - samples/sec: 1446.76 - lr: 0.000013 - momentum: 0.000000
+ 2024-03-26 10:02:04,944 epoch 7 - iter 16/48 - loss 0.07196467 - time (sec): 8.48 - samples/sec: 1394.49 - lr: 0.000012 - momentum: 0.000000
+ 2024-03-26 10:02:07,622 epoch 7 - iter 20/48 - loss 0.07551862 - time (sec): 11.15 - samples/sec: 1401.35 - lr: 0.000012 - momentum: 0.000000
+ 2024-03-26 10:02:09,130 epoch 7 - iter 24/48 - loss 0.07740105 - time (sec): 12.66 - samples/sec: 1423.50 - lr: 0.000012 - momentum: 0.000000
+ 2024-03-26 10:02:11,213 epoch 7 - iter 28/48 - loss 0.07286143 - time (sec): 14.74 - samples/sec: 1442.91 - lr: 0.000012 - momentum: 0.000000
+ 2024-03-26 10:02:13,347 epoch 7 - iter 32/48 - loss 0.07581369 - time (sec): 16.88 - samples/sec: 1448.48 - lr: 0.000011 - momentum: 0.000000
+ 2024-03-26 10:02:15,550 epoch 7 - iter 36/48 - loss 0.08004892 - time (sec): 19.08 - samples/sec: 1434.63 - lr: 0.000011 - momentum: 0.000000
+ 2024-03-26 10:02:17,121 epoch 7 - iter 40/48 - loss 0.07736339 - time (sec): 20.65 - samples/sec: 1443.71 - lr: 0.000011 - momentum: 0.000000
+ 2024-03-26 10:02:18,764 epoch 7 - iter 44/48 - loss 0.08124466 - time (sec): 22.29 - samples/sec: 1463.02 - lr: 0.000010 - momentum: 0.000000
+ 2024-03-26 10:02:20,081 epoch 7 - iter 48/48 - loss 0.08186639 - time (sec): 23.61 - samples/sec: 1459.91 - lr: 0.000010 - momentum: 0.000000
+ 2024-03-26 10:02:20,081 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 10:02:20,081 EPOCH 7 done: loss 0.0819 - lr: 0.000010
+ 2024-03-26 10:02:20,975 DEV : loss 0.16401655972003937 - f1-score (micro avg) 0.9157
+ 2024-03-26 10:02:20,976 saving best model
+ 2024-03-26 10:02:21,421 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 10:02:23,755 epoch 8 - iter 4/48 - loss 0.06079097 - time (sec): 2.33 - samples/sec: 1260.57 - lr: 0.000010 - momentum: 0.000000
+ 2024-03-26 10:02:26,287 epoch 8 - iter 8/48 - loss 0.05516021 - time (sec): 4.86 - samples/sec: 1359.71 - lr: 0.000010 - momentum: 0.000000
+ 2024-03-26 10:02:28,279 epoch 8 - iter 12/48 - loss 0.05565549 - time (sec): 6.86 - samples/sec: 1342.24 - lr: 0.000009 - momentum: 0.000000
+ 2024-03-26 10:02:30,262 epoch 8 - iter 16/48 - loss 0.05522776 - time (sec): 8.84 - samples/sec: 1356.58 - lr: 0.000009 - momentum: 0.000000
+ 2024-03-26 10:02:31,782 epoch 8 - iter 20/48 - loss 0.05660406 - time (sec): 10.36 - samples/sec: 1380.14 - lr: 0.000009 - momentum: 0.000000
+ 2024-03-26 10:02:34,115 epoch 8 - iter 24/48 - loss 0.05700863 - time (sec): 12.69 - samples/sec: 1366.34 - lr: 0.000009 - momentum: 0.000000
+ 2024-03-26 10:02:36,248 epoch 8 - iter 28/48 - loss 0.05738097 - time (sec): 14.82 - samples/sec: 1359.64 - lr: 0.000008 - momentum: 0.000000
+ 2024-03-26 10:02:38,546 epoch 8 - iter 32/48 - loss 0.06585629 - time (sec): 17.12 - samples/sec: 1371.04 - lr: 0.000008 - momentum: 0.000000
+ 2024-03-26 10:02:41,710 epoch 8 - iter 36/48 - loss 0.06769677 - time (sec): 20.29 - samples/sec: 1322.88 - lr: 0.000008 - momentum: 0.000000
+ 2024-03-26 10:02:43,685 epoch 8 - iter 40/48 - loss 0.07192573 - time (sec): 22.26 - samples/sec: 1329.43 - lr: 0.000007 - momentum: 0.000000
+ 2024-03-26 10:02:44,478 epoch 8 - iter 44/48 - loss 0.07055192 - time (sec): 23.06 - samples/sec: 1378.00 - lr: 0.000007 - momentum: 0.000000
+ 2024-03-26 10:02:46,279 epoch 8 - iter 48/48 - loss 0.07020865 - time (sec): 24.86 - samples/sec: 1386.85 - lr: 0.000007 - momentum: 0.000000
+ 2024-03-26 10:02:46,280 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 10:02:46,280 EPOCH 8 done: loss 0.0702 - lr: 0.000007
+ 2024-03-26 10:02:47,182 DEV : loss 0.15253271162509918 - f1-score (micro avg) 0.9208
+ 2024-03-26 10:02:47,183 saving best model
+ 2024-03-26 10:02:47,629 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 10:02:50,314 epoch 9 - iter 4/48 - loss 0.04359022 - time (sec): 2.68 - samples/sec: 1228.92 - lr: 0.000007 - momentum: 0.000000
+ 2024-03-26 10:02:51,983 epoch 9 - iter 8/48 - loss 0.05473787 - time (sec): 4.35 - samples/sec: 1318.52 - lr: 0.000006 - momentum: 0.000000
+ 2024-03-26 10:02:54,086 epoch 9 - iter 12/48 - loss 0.06362136 - time (sec): 6.46 - samples/sec: 1390.53 - lr: 0.000006 - momentum: 0.000000
+ 2024-03-26 10:02:56,156 epoch 9 - iter 16/48 - loss 0.06414554 - time (sec): 8.52 - samples/sec: 1420.55 - lr: 0.000006 - momentum: 0.000000
+ 2024-03-26 10:02:58,469 epoch 9 - iter 20/48 - loss 0.05654201 - time (sec): 10.84 - samples/sec: 1395.91 - lr: 0.000006 - momentum: 0.000000
+ 2024-03-26 10:03:00,379 epoch 9 - iter 24/48 - loss 0.05696336 - time (sec): 12.75 - samples/sec: 1389.10 - lr: 0.000005 - momentum: 0.000000
+ 2024-03-26 10:03:03,558 epoch 9 - iter 28/48 - loss 0.05821121 - time (sec): 15.93 - samples/sec: 1341.28 - lr: 0.000005 - momentum: 0.000000
+ 2024-03-26 10:03:04,910 epoch 9 - iter 32/48 - loss 0.05796211 - time (sec): 17.28 - samples/sec: 1382.56 - lr: 0.000005 - momentum: 0.000000
+ 2024-03-26 10:03:07,265 epoch 9 - iter 36/48 - loss 0.05834651 - time (sec): 19.63 - samples/sec: 1372.55 - lr: 0.000004 - momentum: 0.000000
+ 2024-03-26 10:03:08,719 epoch 9 - iter 40/48 - loss 0.05878344 - time (sec): 21.09 - samples/sec: 1390.36 - lr: 0.000004 - momentum: 0.000000
+ 2024-03-26 10:03:10,215 epoch 9 - iter 44/48 - loss 0.06224717 - time (sec): 22.58 - samples/sec: 1408.57 - lr: 0.000004 - momentum: 0.000000
+ 2024-03-26 10:03:11,606 epoch 9 - iter 48/48 - loss 0.06229583 - time (sec): 23.98 - samples/sec: 1437.80 - lr: 0.000004 - momentum: 0.000000
+ 2024-03-26 10:03:11,606 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 10:03:11,606 EPOCH 9 done: loss 0.0623 - lr: 0.000004
+ 2024-03-26 10:03:12,500 DEV : loss 0.15111036598682404 - f1-score (micro avg) 0.9286
+ 2024-03-26 10:03:12,501 saving best model
+ 2024-03-26 10:03:12,953 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 10:03:15,317 epoch 10 - iter 4/48 - loss 0.03673501 - time (sec): 2.36 - samples/sec: 1393.43 - lr: 0.000003 - momentum: 0.000000
+ 2024-03-26 10:03:17,222 epoch 10 - iter 8/48 - loss 0.03572186 - time (sec): 4.27 - samples/sec: 1369.74 - lr: 0.000003 - momentum: 0.000000
+ 2024-03-26 10:03:18,396 epoch 10 - iter 12/48 - loss 0.05193565 - time (sec): 5.44 - samples/sec: 1533.06 - lr: 0.000003 - momentum: 0.000000
+ 2024-03-26 10:03:19,907 epoch 10 - iter 16/48 - loss 0.05828219 - time (sec): 6.95 - samples/sec: 1613.93 - lr: 0.000002 - momentum: 0.000000
+ 2024-03-26 10:03:21,592 epoch 10 - iter 20/48 - loss 0.05891099 - time (sec): 8.64 - samples/sec: 1652.86 - lr: 0.000002 - momentum: 0.000000
+ 2024-03-26 10:03:23,599 epoch 10 - iter 24/48 - loss 0.05669250 - time (sec): 10.64 - samples/sec: 1601.21 - lr: 0.000002 - momentum: 0.000000
+ 2024-03-26 10:03:25,706 epoch 10 - iter 28/48 - loss 0.05313062 - time (sec): 12.75 - samples/sec: 1564.46 - lr: 0.000002 - momentum: 0.000000
+ 2024-03-26 10:03:27,803 epoch 10 - iter 32/48 - loss 0.05458190 - time (sec): 14.85 - samples/sec: 1571.27 - lr: 0.000001 - momentum: 0.000000
+ 2024-03-26 10:03:29,185 epoch 10 - iter 36/48 - loss 0.05515192 - time (sec): 16.23 - samples/sec: 1571.16 - lr: 0.000001 - momentum: 0.000000
+ 2024-03-26 10:03:31,786 epoch 10 - iter 40/48 - loss 0.05263161 - time (sec): 18.83 - samples/sec: 1532.43 - lr: 0.000001 - momentum: 0.000000
+ 2024-03-26 10:03:34,283 epoch 10 - iter 44/48 - loss 0.05567045 - time (sec): 21.33 - samples/sec: 1507.63 - lr: 0.000001 - momentum: 0.000000
+ 2024-03-26 10:03:35,860 epoch 10 - iter 48/48 - loss 0.05630277 - time (sec): 22.91 - samples/sec: 1504.98 - lr: 0.000000 - momentum: 0.000000
+ 2024-03-26 10:03:35,860 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 10:03:35,860 EPOCH 10 done: loss 0.0563 - lr: 0.000000
+ 2024-03-26 10:03:36,754 DEV : loss 0.1511855125427246 - f1-score (micro avg) 0.9331
+ 2024-03-26 10:03:36,755 saving best model
+ 2024-03-26 10:03:37,466 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 10:03:37,466 Loading model from best epoch ...
+ 2024-03-26 10:03:38,398 SequenceTagger predicts: Dictionary with 17 tags: O, S-Unternehmen, B-Unternehmen, E-Unternehmen, I-Unternehmen, S-Auslagerung, B-Auslagerung, E-Auslagerung, I-Auslagerung, S-Ort, B-Ort, E-Ort, I-Ort, S-Software, B-Software, E-Software, I-Software
+ 2024-03-26 10:03:39,146
+ Results:
+ - F-score (micro) 0.8995
+ - F-score (macro) 0.6833
+ - Accuracy 0.8219
+
+ By class:
+                precision    recall  f1-score   support
+
+  Unternehmen      0.9137    0.8759    0.8944       266
+  Auslagerung      0.8566    0.8876    0.8718       249
+          Ort      0.9496    0.9851    0.9670       134
+     Software      0.0000    0.0000    0.0000         0
+
+    micro avg      0.8960    0.9029    0.8995       649
+    macro avg      0.6800    0.6871    0.6833       649
+ weighted avg      0.8992    0.9029    0.9007       649
+
+ 2024-03-26 10:03:39,146 ----------------------------------------------------------------------------------------------------
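
The final block shows the best checkpoint (best-model.pt) being reloaded for the final evaluation (the corpus line above lists 96 test sentences). Below is a minimal sketch of how that saved model could be loaded and applied to new text afterwards; the example sentence is a placeholder and "ner" as the label type is an assumption.

```python
# Sketch: loading the saved best checkpoint and tagging a new sentence.
# The path combines the base path from the log with best-model.pt; the
# sentence and the "ner" label type are placeholders/assumptions.
from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load("flair-co-funer-gbert_base-bs16-e10-lr3e-05-3/best-model.pt")

sentence = Sentence("Die Bank lagert ihre IT-Systeme an eine Firma in Frankfurt aus.")
tagger.predict(sentence)

for span in sentence.get_spans("ner"):
    print(span)  # prints each detected span with its predicted tag (Unternehmen, Auslagerung, Ort, Software)
```
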