stefan-it committed on
Commit 710c26b
1 Parent(s): 52b1b99

Upload ./training.log with huggingface_hub

Files changed (1)
  1. training.log +266 -0
training.log ADDED
@@ -0,0 +1,266 @@
+ 2024-03-26 12:02:56,881 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:02:56,881 Model: "SequenceTagger(
+   (embeddings): TransformerWordEmbeddings(
+     (model): BertModel(
+       (embeddings): BertEmbeddings(
+         (word_embeddings): Embedding(30001, 768)
+         (position_embeddings): Embedding(512, 768)
+         (token_type_embeddings): Embedding(2, 768)
+         (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+         (dropout): Dropout(p=0.1, inplace=False)
+       )
+       (encoder): BertEncoder(
+         (layer): ModuleList(
+           (0-11): 12 x BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+         )
+       )
+       (pooler): BertPooler(
+         (dense): Linear(in_features=768, out_features=768, bias=True)
+         (activation): Tanh()
+       )
+     )
+   )
+   (locked_dropout): LockedDropout(p=0.5)
+   (linear): Linear(in_features=768, out_features=17, bias=True)
+   (loss_function): CrossEntropyLoss()
+ )"
+ 2024-03-26 12:02:56,881 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:02:56,881 Corpus: 758 train + 94 dev + 96 test sentences
+ 2024-03-26 12:02:56,881 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:02:56,881 Train: 758 sentences
+ 2024-03-26 12:02:56,881 (train_with_dev=False, train_with_test=False)
+ 2024-03-26 12:02:56,881 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:02:56,881 Training Params:
+ 2024-03-26 12:02:56,881 - learning_rate: "3e-05"
+ 2024-03-26 12:02:56,881 - mini_batch_size: "16"
+ 2024-03-26 12:02:56,881 - max_epochs: "10"
+ 2024-03-26 12:02:56,881 - shuffle: "True"
+ 2024-03-26 12:02:56,881 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:02:56,881 Plugins:
+ 2024-03-26 12:02:56,881 - TensorboardLogger
+ 2024-03-26 12:02:56,881 - LinearScheduler | warmup_fraction: '0.1'
+ 2024-03-26 12:02:56,881 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:02:56,881 Final evaluation on model from best epoch (best-model.pt)
+ 2024-03-26 12:02:56,881 - metric: "('micro avg', 'f1-score')"
+ 2024-03-26 12:02:56,881 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:02:56,881 Computation:
+ 2024-03-26 12:02:56,881 - compute on device: cuda:0
+ 2024-03-26 12:02:56,881 - embedding storage: none
+ 2024-03-26 12:02:56,881 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:02:56,881 Model training base path: "flair-co-funer-german_bert_base-bs16-e10-lr3e-05-5"
+ 2024-03-26 12:02:56,881 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:02:56,881 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:02:56,881 Logging anything other than scalars to TensorBoard is currently not supported.
+ 2024-03-26 12:02:58,395 epoch 1 - iter 4/48 - loss 3.12129509 - time (sec): 1.51 - samples/sec: 1732.52 - lr: 0.000002 - momentum: 0.000000
+ 2024-03-26 12:03:01,185 epoch 1 - iter 8/48 - loss 3.09794484 - time (sec): 4.30 - samples/sec: 1414.35 - lr: 0.000004 - momentum: 0.000000
+ 2024-03-26 12:03:03,091 epoch 1 - iter 12/48 - loss 3.10138069 - time (sec): 6.21 - samples/sec: 1434.38 - lr: 0.000007 - momentum: 0.000000
+ 2024-03-26 12:03:04,753 epoch 1 - iter 16/48 - loss 3.01723077 - time (sec): 7.87 - samples/sec: 1528.46 - lr: 0.000009 - momentum: 0.000000
+ 2024-03-26 12:03:06,948 epoch 1 - iter 20/48 - loss 2.88339031 - time (sec): 10.07 - samples/sec: 1497.84 - lr: 0.000012 - momentum: 0.000000
+ 2024-03-26 12:03:09,697 epoch 1 - iter 24/48 - loss 2.73555277 - time (sec): 12.82 - samples/sec: 1439.57 - lr: 0.000014 - momentum: 0.000000
+ 2024-03-26 12:03:11,383 epoch 1 - iter 28/48 - loss 2.62931161 - time (sec): 14.50 - samples/sec: 1447.29 - lr: 0.000017 - momentum: 0.000000
+ 2024-03-26 12:03:13,547 epoch 1 - iter 32/48 - loss 2.51120225 - time (sec): 16.67 - samples/sec: 1442.48 - lr: 0.000019 - momentum: 0.000000
+ 2024-03-26 12:03:15,134 epoch 1 - iter 36/48 - loss 2.42560844 - time (sec): 18.25 - samples/sec: 1462.79 - lr: 0.000022 - momentum: 0.000000
+ 2024-03-26 12:03:17,965 epoch 1 - iter 40/48 - loss 2.31439325 - time (sec): 21.08 - samples/sec: 1418.37 - lr: 0.000024 - momentum: 0.000000
+ 2024-03-26 12:03:19,175 epoch 1 - iter 44/48 - loss 2.22836577 - time (sec): 22.29 - samples/sec: 1441.81 - lr: 0.000027 - momentum: 0.000000
+ 2024-03-26 12:03:21,051 epoch 1 - iter 48/48 - loss 2.15578626 - time (sec): 24.17 - samples/sec: 1426.28 - lr: 0.000029 - momentum: 0.000000
+ 2024-03-26 12:03:21,051 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:03:21,051 EPOCH 1 done: loss 2.1558 - lr: 0.000029
+ 2024-03-26 12:03:21,894 DEV : loss 0.7538434863090515 - f1-score (micro avg) 0.5155
+ 2024-03-26 12:03:21,895 saving best model
+ 2024-03-26 12:03:22,152 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:03:24,936 epoch 2 - iter 4/48 - loss 0.89958872 - time (sec): 2.78 - samples/sec: 1240.73 - lr: 0.000030 - momentum: 0.000000
+ 2024-03-26 12:03:26,870 epoch 2 - iter 8/48 - loss 0.83054446 - time (sec): 4.72 - samples/sec: 1299.11 - lr: 0.000030 - momentum: 0.000000
+ 2024-03-26 12:03:28,843 epoch 2 - iter 12/48 - loss 0.78361854 - time (sec): 6.69 - samples/sec: 1333.57 - lr: 0.000029 - momentum: 0.000000
+ 2024-03-26 12:03:31,588 epoch 2 - iter 16/48 - loss 0.70998971 - time (sec): 9.44 - samples/sec: 1341.21 - lr: 0.000029 - momentum: 0.000000
+ 2024-03-26 12:03:33,023 epoch 2 - iter 20/48 - loss 0.67967096 - time (sec): 10.87 - samples/sec: 1378.25 - lr: 0.000029 - momentum: 0.000000
+ 2024-03-26 12:03:35,936 epoch 2 - iter 24/48 - loss 0.63814992 - time (sec): 13.78 - samples/sec: 1298.51 - lr: 0.000028 - momentum: 0.000000
+ 2024-03-26 12:03:37,606 epoch 2 - iter 28/48 - loss 0.62425146 - time (sec): 15.45 - samples/sec: 1330.52 - lr: 0.000028 - momentum: 0.000000
+ 2024-03-26 12:03:39,660 epoch 2 - iter 32/48 - loss 0.59545641 - time (sec): 17.51 - samples/sec: 1326.21 - lr: 0.000028 - momentum: 0.000000
+ 2024-03-26 12:03:41,485 epoch 2 - iter 36/48 - loss 0.58132540 - time (sec): 19.33 - samples/sec: 1355.74 - lr: 0.000028 - momentum: 0.000000
+ 2024-03-26 12:03:43,881 epoch 2 - iter 40/48 - loss 0.57243950 - time (sec): 21.73 - samples/sec: 1347.17 - lr: 0.000027 - momentum: 0.000000
+ 2024-03-26 12:03:46,129 epoch 2 - iter 44/48 - loss 0.55195339 - time (sec): 23.98 - samples/sec: 1352.77 - lr: 0.000027 - momentum: 0.000000
+ 2024-03-26 12:03:47,408 epoch 2 - iter 48/48 - loss 0.54352570 - time (sec): 25.26 - samples/sec: 1364.94 - lr: 0.000027 - momentum: 0.000000
+ 2024-03-26 12:03:47,408 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:03:47,408 EPOCH 2 done: loss 0.5435 - lr: 0.000027
+ 2024-03-26 12:03:48,330 DEV : loss 0.33131173253059387 - f1-score (micro avg) 0.785
+ 2024-03-26 12:03:48,331 saving best model
+ 2024-03-26 12:03:48,763 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:03:49,867 epoch 3 - iter 4/48 - loss 0.42482105 - time (sec): 1.10 - samples/sec: 2024.56 - lr: 0.000026 - momentum: 0.000000
+ 2024-03-26 12:03:51,807 epoch 3 - iter 8/48 - loss 0.40870155 - time (sec): 3.04 - samples/sec: 1619.65 - lr: 0.000026 - momentum: 0.000000
+ 2024-03-26 12:03:54,064 epoch 3 - iter 12/48 - loss 0.35237639 - time (sec): 5.30 - samples/sec: 1618.21 - lr: 0.000026 - momentum: 0.000000
+ 2024-03-26 12:03:56,038 epoch 3 - iter 16/48 - loss 0.34486620 - time (sec): 7.27 - samples/sec: 1563.96 - lr: 0.000026 - momentum: 0.000000
+ 2024-03-26 12:03:57,966 epoch 3 - iter 20/48 - loss 0.33509474 - time (sec): 9.20 - samples/sec: 1540.83 - lr: 0.000025 - momentum: 0.000000
+ 2024-03-26 12:03:59,958 epoch 3 - iter 24/48 - loss 0.31801538 - time (sec): 11.19 - samples/sec: 1497.60 - lr: 0.000025 - momentum: 0.000000
+ 2024-03-26 12:04:03,262 epoch 3 - iter 28/48 - loss 0.30314619 - time (sec): 14.50 - samples/sec: 1380.74 - lr: 0.000025 - momentum: 0.000000
+ 2024-03-26 12:04:04,816 epoch 3 - iter 32/48 - loss 0.30246473 - time (sec): 16.05 - samples/sec: 1403.32 - lr: 0.000025 - momentum: 0.000000
+ 2024-03-26 12:04:08,198 epoch 3 - iter 36/48 - loss 0.29146006 - time (sec): 19.43 - samples/sec: 1334.78 - lr: 0.000024 - momentum: 0.000000
+ 2024-03-26 12:04:10,657 epoch 3 - iter 40/48 - loss 0.28968058 - time (sec): 21.89 - samples/sec: 1336.63 - lr: 0.000024 - momentum: 0.000000
+ 2024-03-26 12:04:12,775 epoch 3 - iter 44/48 - loss 0.28024638 - time (sec): 24.01 - samples/sec: 1336.44 - lr: 0.000024 - momentum: 0.000000
+ 2024-03-26 12:04:14,398 epoch 3 - iter 48/48 - loss 0.27767314 - time (sec): 25.63 - samples/sec: 1344.83 - lr: 0.000023 - momentum: 0.000000
+ 2024-03-26 12:04:14,398 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:04:14,398 EPOCH 3 done: loss 0.2777 - lr: 0.000023
+ 2024-03-26 12:04:15,322 DEV : loss 0.2567653954029083 - f1-score (micro avg) 0.8532
+ 2024-03-26 12:04:15,323 saving best model
+ 2024-03-26 12:04:15,739 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:04:18,888 epoch 4 - iter 4/48 - loss 0.15456626 - time (sec): 3.15 - samples/sec: 1184.99 - lr: 0.000023 - momentum: 0.000000
+ 2024-03-26 12:04:20,372 epoch 4 - iter 8/48 - loss 0.19317901 - time (sec): 4.63 - samples/sec: 1343.11 - lr: 0.000023 - momentum: 0.000000
+ 2024-03-26 12:04:22,985 epoch 4 - iter 12/48 - loss 0.18252728 - time (sec): 7.24 - samples/sec: 1283.90 - lr: 0.000023 - momentum: 0.000000
+ 2024-03-26 12:04:25,682 epoch 4 - iter 16/48 - loss 0.17118891 - time (sec): 9.94 - samples/sec: 1277.14 - lr: 0.000022 - momentum: 0.000000
+ 2024-03-26 12:04:27,947 epoch 4 - iter 20/48 - loss 0.16696842 - time (sec): 12.21 - samples/sec: 1292.89 - lr: 0.000022 - momentum: 0.000000
+ 2024-03-26 12:04:29,478 epoch 4 - iter 24/48 - loss 0.16439449 - time (sec): 13.74 - samples/sec: 1327.14 - lr: 0.000022 - momentum: 0.000000
+ 2024-03-26 12:04:31,901 epoch 4 - iter 28/48 - loss 0.16802806 - time (sec): 16.16 - samples/sec: 1314.54 - lr: 0.000022 - momentum: 0.000000
+ 2024-03-26 12:04:34,935 epoch 4 - iter 32/48 - loss 0.16616945 - time (sec): 19.19 - samples/sec: 1305.61 - lr: 0.000021 - momentum: 0.000000
+ 2024-03-26 12:04:36,640 epoch 4 - iter 36/48 - loss 0.17149344 - time (sec): 20.90 - samples/sec: 1328.15 - lr: 0.000021 - momentum: 0.000000
+ 2024-03-26 12:04:37,626 epoch 4 - iter 40/48 - loss 0.17449164 - time (sec): 21.88 - samples/sec: 1372.17 - lr: 0.000021 - momentum: 0.000000
+ 2024-03-26 12:04:39,157 epoch 4 - iter 44/48 - loss 0.17147288 - time (sec): 23.42 - samples/sec: 1389.35 - lr: 0.000020 - momentum: 0.000000
+ 2024-03-26 12:04:40,044 epoch 4 - iter 48/48 - loss 0.17402778 - time (sec): 24.30 - samples/sec: 1418.47 - lr: 0.000020 - momentum: 0.000000
+ 2024-03-26 12:04:40,044 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:04:40,044 EPOCH 4 done: loss 0.1740 - lr: 0.000020
+ 2024-03-26 12:04:40,972 DEV : loss 0.23388732969760895 - f1-score (micro avg) 0.8688
+ 2024-03-26 12:04:40,973 saving best model
+ 2024-03-26 12:04:41,384 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:04:43,222 epoch 5 - iter 4/48 - loss 0.16130966 - time (sec): 1.84 - samples/sec: 1564.18 - lr: 0.000020 - momentum: 0.000000
+ 2024-03-26 12:04:45,120 epoch 5 - iter 8/48 - loss 0.13106811 - time (sec): 3.73 - samples/sec: 1661.67 - lr: 0.000020 - momentum: 0.000000
+ 2024-03-26 12:04:48,241 epoch 5 - iter 12/48 - loss 0.12117448 - time (sec): 6.86 - samples/sec: 1403.40 - lr: 0.000019 - momentum: 0.000000
+ 2024-03-26 12:04:49,616 epoch 5 - iter 16/48 - loss 0.11428761 - time (sec): 8.23 - samples/sec: 1444.70 - lr: 0.000019 - momentum: 0.000000
+ 2024-03-26 12:04:52,012 epoch 5 - iter 20/48 - loss 0.12659685 - time (sec): 10.63 - samples/sec: 1418.01 - lr: 0.000019 - momentum: 0.000000
+ 2024-03-26 12:04:54,173 epoch 5 - iter 24/48 - loss 0.12691480 - time (sec): 12.79 - samples/sec: 1390.07 - lr: 0.000018 - momentum: 0.000000
+ 2024-03-26 12:04:55,597 epoch 5 - iter 28/48 - loss 0.13411938 - time (sec): 14.21 - samples/sec: 1427.90 - lr: 0.000018 - momentum: 0.000000
+ 2024-03-26 12:04:57,007 epoch 5 - iter 32/48 - loss 0.13634792 - time (sec): 15.62 - samples/sec: 1459.36 - lr: 0.000018 - momentum: 0.000000
+ 2024-03-26 12:04:59,293 epoch 5 - iter 36/48 - loss 0.13690522 - time (sec): 17.91 - samples/sec: 1442.07 - lr: 0.000018 - momentum: 0.000000
+ 2024-03-26 12:05:01,169 epoch 5 - iter 40/48 - loss 0.13384191 - time (sec): 19.78 - samples/sec: 1441.02 - lr: 0.000017 - momentum: 0.000000
+ 2024-03-26 12:05:03,269 epoch 5 - iter 44/48 - loss 0.13274461 - time (sec): 21.88 - samples/sec: 1450.66 - lr: 0.000017 - momentum: 0.000000
+ 2024-03-26 12:05:05,441 epoch 5 - iter 48/48 - loss 0.12821407 - time (sec): 24.06 - samples/sec: 1433.05 - lr: 0.000017 - momentum: 0.000000
+ 2024-03-26 12:05:05,441 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:05:05,441 EPOCH 5 done: loss 0.1282 - lr: 0.000017
+ 2024-03-26 12:05:06,374 DEV : loss 0.2178356647491455 - f1-score (micro avg) 0.893
+ 2024-03-26 12:05:06,375 saving best model
+ 2024-03-26 12:05:06,798 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:05:08,730 epoch 6 - iter 4/48 - loss 0.09708101 - time (sec): 1.93 - samples/sec: 1421.89 - lr: 0.000017 - momentum: 0.000000
+ 2024-03-26 12:05:11,610 epoch 6 - iter 8/48 - loss 0.11059477 - time (sec): 4.81 - samples/sec: 1321.14 - lr: 0.000016 - momentum: 0.000000
+ 2024-03-26 12:05:13,501 epoch 6 - iter 12/48 - loss 0.11422747 - time (sec): 6.70 - samples/sec: 1347.27 - lr: 0.000016 - momentum: 0.000000
+ 2024-03-26 12:05:15,088 epoch 6 - iter 16/48 - loss 0.11763326 - time (sec): 8.29 - samples/sec: 1395.80 - lr: 0.000016 - momentum: 0.000000
+ 2024-03-26 12:05:17,827 epoch 6 - iter 20/48 - loss 0.11242158 - time (sec): 11.03 - samples/sec: 1322.13 - lr: 0.000015 - momentum: 0.000000
+ 2024-03-26 12:05:20,563 epoch 6 - iter 24/48 - loss 0.10277800 - time (sec): 13.76 - samples/sec: 1298.04 - lr: 0.000015 - momentum: 0.000000
+ 2024-03-26 12:05:23,079 epoch 6 - iter 28/48 - loss 0.10001828 - time (sec): 16.28 - samples/sec: 1274.53 - lr: 0.000015 - momentum: 0.000000
+ 2024-03-26 12:05:24,485 epoch 6 - iter 32/48 - loss 0.10667881 - time (sec): 17.69 - samples/sec: 1318.22 - lr: 0.000015 - momentum: 0.000000
+ 2024-03-26 12:05:26,389 epoch 6 - iter 36/48 - loss 0.10414212 - time (sec): 19.59 - samples/sec: 1329.32 - lr: 0.000014 - momentum: 0.000000
+ 2024-03-26 12:05:27,410 epoch 6 - iter 40/48 - loss 0.10365325 - time (sec): 20.61 - samples/sec: 1368.68 - lr: 0.000014 - momentum: 0.000000
+ 2024-03-26 12:05:29,930 epoch 6 - iter 44/48 - loss 0.10035451 - time (sec): 23.13 - samples/sec: 1345.14 - lr: 0.000014 - momentum: 0.000000
+ 2024-03-26 12:05:32,791 epoch 6 - iter 48/48 - loss 0.09524719 - time (sec): 25.99 - samples/sec: 1326.28 - lr: 0.000014 - momentum: 0.000000
+ 2024-03-26 12:05:32,791 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:05:32,791 EPOCH 6 done: loss 0.0952 - lr: 0.000014
+ 2024-03-26 12:05:33,740 DEV : loss 0.19964341819286346 - f1-score (micro avg) 0.8942
+ 2024-03-26 12:05:33,741 saving best model
+ 2024-03-26 12:05:34,172 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:05:36,357 epoch 7 - iter 4/48 - loss 0.05253045 - time (sec): 2.18 - samples/sec: 1333.19 - lr: 0.000013 - momentum: 0.000000
+ 2024-03-26 12:05:38,086 epoch 7 - iter 8/48 - loss 0.05348429 - time (sec): 3.91 - samples/sec: 1362.53 - lr: 0.000013 - momentum: 0.000000
+ 2024-03-26 12:05:39,541 epoch 7 - iter 12/48 - loss 0.08415717 - time (sec): 5.37 - samples/sec: 1414.39 - lr: 0.000013 - momentum: 0.000000
+ 2024-03-26 12:05:41,498 epoch 7 - iter 16/48 - loss 0.07694373 - time (sec): 7.32 - samples/sec: 1449.66 - lr: 0.000012 - momentum: 0.000000
+ 2024-03-26 12:05:43,817 epoch 7 - iter 20/48 - loss 0.09023838 - time (sec): 9.64 - samples/sec: 1502.91 - lr: 0.000012 - momentum: 0.000000
+ 2024-03-26 12:05:45,166 epoch 7 - iter 24/48 - loss 0.08683075 - time (sec): 10.99 - samples/sec: 1550.09 - lr: 0.000012 - momentum: 0.000000
+ 2024-03-26 12:05:47,411 epoch 7 - iter 28/48 - loss 0.08520615 - time (sec): 13.24 - samples/sec: 1506.85 - lr: 0.000012 - momentum: 0.000000
+ 2024-03-26 12:05:49,331 epoch 7 - iter 32/48 - loss 0.08645376 - time (sec): 15.16 - samples/sec: 1502.21 - lr: 0.000011 - momentum: 0.000000
+ 2024-03-26 12:05:51,355 epoch 7 - iter 36/48 - loss 0.08423135 - time (sec): 17.18 - samples/sec: 1471.25 - lr: 0.000011 - momentum: 0.000000
+ 2024-03-26 12:05:54,206 epoch 7 - iter 40/48 - loss 0.08031771 - time (sec): 20.03 - samples/sec: 1453.99 - lr: 0.000011 - momentum: 0.000000
+ 2024-03-26 12:05:55,736 epoch 7 - iter 44/48 - loss 0.08069773 - time (sec): 21.56 - samples/sec: 1468.81 - lr: 0.000010 - momentum: 0.000000
+ 2024-03-26 12:05:57,912 epoch 7 - iter 48/48 - loss 0.07961125 - time (sec): 23.74 - samples/sec: 1452.24 - lr: 0.000010 - momentum: 0.000000
+ 2024-03-26 12:05:57,912 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:05:57,912 EPOCH 7 done: loss 0.0796 - lr: 0.000010
+ 2024-03-26 12:05:58,929 DEV : loss 0.2036362737417221 - f1-score (micro avg) 0.9066
+ 2024-03-26 12:05:58,930 saving best model
+ 2024-03-26 12:05:59,354 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:06:01,640 epoch 8 - iter 4/48 - loss 0.10170390 - time (sec): 2.28 - samples/sec: 1221.01 - lr: 0.000010 - momentum: 0.000000
+ 2024-03-26 12:06:03,208 epoch 8 - iter 8/48 - loss 0.06787866 - time (sec): 3.85 - samples/sec: 1411.19 - lr: 0.000010 - momentum: 0.000000
+ 2024-03-26 12:06:06,239 epoch 8 - iter 12/48 - loss 0.05875894 - time (sec): 6.88 - samples/sec: 1307.39 - lr: 0.000009 - momentum: 0.000000
+ 2024-03-26 12:06:08,665 epoch 8 - iter 16/48 - loss 0.05794682 - time (sec): 9.31 - samples/sec: 1319.75 - lr: 0.000009 - momentum: 0.000000
+ 2024-03-26 12:06:10,144 epoch 8 - iter 20/48 - loss 0.05562347 - time (sec): 10.79 - samples/sec: 1377.00 - lr: 0.000009 - momentum: 0.000000
+ 2024-03-26 12:06:11,562 epoch 8 - iter 24/48 - loss 0.05757691 - time (sec): 12.21 - samples/sec: 1449.59 - lr: 0.000009 - momentum: 0.000000
+ 2024-03-26 12:06:12,869 epoch 8 - iter 28/48 - loss 0.05949732 - time (sec): 13.51 - samples/sec: 1513.12 - lr: 0.000008 - momentum: 0.000000
+ 2024-03-26 12:06:15,170 epoch 8 - iter 32/48 - loss 0.06077559 - time (sec): 15.81 - samples/sec: 1464.99 - lr: 0.000008 - momentum: 0.000000
+ 2024-03-26 12:06:17,741 epoch 8 - iter 36/48 - loss 0.05962745 - time (sec): 18.38 - samples/sec: 1424.12 - lr: 0.000008 - momentum: 0.000000
+ 2024-03-26 12:06:19,780 epoch 8 - iter 40/48 - loss 0.06048251 - time (sec): 20.42 - samples/sec: 1428.96 - lr: 0.000007 - momentum: 0.000000
+ 2024-03-26 12:06:21,921 epoch 8 - iter 44/48 - loss 0.06214127 - time (sec): 22.56 - samples/sec: 1415.58 - lr: 0.000007 - momentum: 0.000000
+ 2024-03-26 12:06:23,545 epoch 8 - iter 48/48 - loss 0.06199253 - time (sec): 24.19 - samples/sec: 1425.12 - lr: 0.000007 - momentum: 0.000000
+ 2024-03-26 12:06:23,545 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:06:23,545 EPOCH 8 done: loss 0.0620 - lr: 0.000007
+ 2024-03-26 12:06:24,474 DEV : loss 0.18240593373775482 - f1-score (micro avg) 0.9167
+ 2024-03-26 12:06:24,475 saving best model
+ 2024-03-26 12:06:24,909 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:06:27,685 epoch 9 - iter 4/48 - loss 0.05441554 - time (sec): 2.77 - samples/sec: 1262.13 - lr: 0.000007 - momentum: 0.000000
+ 2024-03-26 12:06:29,758 epoch 9 - iter 8/48 - loss 0.04400094 - time (sec): 4.85 - samples/sec: 1317.94 - lr: 0.000006 - momentum: 0.000000
+ 2024-03-26 12:06:32,758 epoch 9 - iter 12/48 - loss 0.04559467 - time (sec): 7.85 - samples/sec: 1239.88 - lr: 0.000006 - momentum: 0.000000
+ 2024-03-26 12:06:35,906 epoch 9 - iter 16/48 - loss 0.05720080 - time (sec): 10.99 - samples/sec: 1222.91 - lr: 0.000006 - momentum: 0.000000
+ 2024-03-26 12:06:36,775 epoch 9 - iter 20/48 - loss 0.05506132 - time (sec): 11.86 - samples/sec: 1313.05 - lr: 0.000006 - momentum: 0.000000
+ 2024-03-26 12:06:38,705 epoch 9 - iter 24/48 - loss 0.05366432 - time (sec): 13.79 - samples/sec: 1306.91 - lr: 0.000005 - momentum: 0.000000
+ 2024-03-26 12:06:40,743 epoch 9 - iter 28/48 - loss 0.05227759 - time (sec): 15.83 - samples/sec: 1322.70 - lr: 0.000005 - momentum: 0.000000
+ 2024-03-26 12:06:41,781 epoch 9 - iter 32/48 - loss 0.05472457 - time (sec): 16.87 - samples/sec: 1384.85 - lr: 0.000005 - momentum: 0.000000
+ 2024-03-26 12:06:42,962 epoch 9 - iter 36/48 - loss 0.05406804 - time (sec): 18.05 - samples/sec: 1435.35 - lr: 0.000004 - momentum: 0.000000
+ 2024-03-26 12:06:44,289 epoch 9 - iter 40/48 - loss 0.05402247 - time (sec): 19.38 - samples/sec: 1463.70 - lr: 0.000004 - momentum: 0.000000
+ 2024-03-26 12:06:47,367 epoch 9 - iter 44/48 - loss 0.05348785 - time (sec): 22.46 - samples/sec: 1436.47 - lr: 0.000004 - momentum: 0.000000
+ 2024-03-26 12:06:48,928 epoch 9 - iter 48/48 - loss 0.05219946 - time (sec): 24.02 - samples/sec: 1435.34 - lr: 0.000004 - momentum: 0.000000
+ 2024-03-26 12:06:48,928 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:06:48,928 EPOCH 9 done: loss 0.0522 - lr: 0.000004
+ 2024-03-26 12:06:49,863 DEV : loss 0.18798069655895233 - f1-score (micro avg) 0.9247
+ 2024-03-26 12:06:49,864 saving best model
+ 2024-03-26 12:06:50,308 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:06:53,223 epoch 10 - iter 4/48 - loss 0.04554635 - time (sec): 2.91 - samples/sec: 1274.34 - lr: 0.000003 - momentum: 0.000000
+ 2024-03-26 12:06:55,200 epoch 10 - iter 8/48 - loss 0.04254462 - time (sec): 4.89 - samples/sec: 1321.40 - lr: 0.000003 - momentum: 0.000000
+ 2024-03-26 12:06:57,403 epoch 10 - iter 12/48 - loss 0.04412043 - time (sec): 7.09 - samples/sec: 1282.08 - lr: 0.000003 - momentum: 0.000000
+ 2024-03-26 12:06:59,990 epoch 10 - iter 16/48 - loss 0.04292351 - time (sec): 9.68 - samples/sec: 1233.56 - lr: 0.000002 - momentum: 0.000000
+ 2024-03-26 12:07:02,611 epoch 10 - iter 20/48 - loss 0.04280469 - time (sec): 12.30 - samples/sec: 1237.58 - lr: 0.000002 - momentum: 0.000000
+ 2024-03-26 12:07:04,033 epoch 10 - iter 24/48 - loss 0.04266044 - time (sec): 13.72 - samples/sec: 1299.71 - lr: 0.000002 - momentum: 0.000000
+ 2024-03-26 12:07:04,956 epoch 10 - iter 28/48 - loss 0.04409514 - time (sec): 14.65 - samples/sec: 1367.63 - lr: 0.000002 - momentum: 0.000000
+ 2024-03-26 12:07:06,939 epoch 10 - iter 32/48 - loss 0.04788591 - time (sec): 16.63 - samples/sec: 1386.99 - lr: 0.000001 - momentum: 0.000000
+ 2024-03-26 12:07:09,238 epoch 10 - iter 36/48 - loss 0.04807467 - time (sec): 18.93 - samples/sec: 1366.69 - lr: 0.000001 - momentum: 0.000000
+ 2024-03-26 12:07:10,954 epoch 10 - iter 40/48 - loss 0.04909381 - time (sec): 20.64 - samples/sec: 1391.12 - lr: 0.000001 - momentum: 0.000000
+ 2024-03-26 12:07:14,181 epoch 10 - iter 44/48 - loss 0.04792634 - time (sec): 23.87 - samples/sec: 1372.93 - lr: 0.000001 - momentum: 0.000000
+ 2024-03-26 12:07:14,899 epoch 10 - iter 48/48 - loss 0.04830699 - time (sec): 24.59 - samples/sec: 1401.87 - lr: 0.000000 - momentum: 0.000000
+ 2024-03-26 12:07:14,900 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:07:14,900 EPOCH 10 done: loss 0.0483 - lr: 0.000000
+ 2024-03-26 12:07:15,835 DEV : loss 0.18821127712726593 - f1-score (micro avg) 0.9204
+ 2024-03-26 12:07:16,096 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 12:07:16,097 Loading model from best epoch ...
+ 2024-03-26 12:07:16,803 SequenceTagger predicts: Dictionary with 17 tags: O, S-Unternehmen, B-Unternehmen, E-Unternehmen, I-Unternehmen, S-Auslagerung, B-Auslagerung, E-Auslagerung, I-Auslagerung, S-Ort, B-Ort, E-Ort, I-Ort, S-Software, B-Software, E-Software, I-Software
+ 2024-03-26 12:07:17,653
+ Results:
+ - F-score (micro) 0.9094
+ - F-score (macro) 0.6907
+ - Accuracy 0.8362
+
+ By class:
+               precision    recall  f1-score   support
+
+  Unternehmen     0.8923    0.8722    0.8821       266
+  Auslagerung     0.8976    0.9157    0.9066       249
+          Ort     0.9635    0.9851    0.9742       134
+     Software     0.0000    0.0000    0.0000         0
+
+    micro avg     0.9066    0.9122    0.9094       649
+    macro avg     0.6884    0.6932    0.6907       649
+ weighted avg     0.9091    0.9122    0.9105       649
+
+ 2024-03-26 12:07:17,653 ----------------------------------------------------------------------------------------------------
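
For reference, a run like the one logged above can be set up in a few lines of Flair. The sketch below is a hedged reconstruction from the log alone: the mini-batch size (16), learning rate (3e-05), epoch count (10) and the CRF-free, RNN-free tagging head come straight from the log, while the corpus path, the column format and the exact bert-base-german-cased checkpoint are assumptions that would need adjusting for an actual rerun.

```python
# Hedged sketch of the logged fine-tuning run; hyperparameters are taken from
# the log, everything marked "assumption" below is not recorded in it.
from flair.datasets import ColumnCorpus
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Corpus with 758 train / 94 dev / 96 test sentences according to the log.
corpus = ColumnCorpus(
    data_folder="data/co-funer",          # assumption: data location is not in the log
    column_format={0: "text", 1: "ner"},  # assumption: typical two-column CoNLL layout
)
label_dict = corpus.make_label_dictionary(label_type="ner")

embeddings = TransformerWordEmbeddings(
    model="bert-base-german-cased",  # guess based on "german_bert_base" in the base path
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
    use_context=True,  # would explain the 30001-token embedding matrix (+1 context marker)
)

tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,              # the logged head is LockedDropout + a 17-way Linear layer
    use_rnn=False,
    reproject_embeddings=False,
)

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "flair-co-funer-german_bert_base-bs16-e10-lr3e-05-5",
    learning_rate=3e-05,
    mini_batch_size=16,
    max_epochs=10,
    # fine_tune() applies a linear LR schedule with 10% warmup by default,
    # matching the "LinearScheduler | warmup_fraction: '0.1'" plugin in the log.
)
```

The 17-tag dictionary reported after loading best-model.pt corresponds to the four entity types (Unternehmen, Auslagerung, Ort, Software) in BIOES encoding plus the O tag.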