stefan-it committed on
Commit
9b55224
1 Parent(s): def1694

Upload ./training.log with huggingface_hub

Files changed (1)
training.log +263 -0
training.log ADDED
@@ -0,0 +1,263 @@
+ 2024-03-26 16:08:22,645 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:08:22,645 Model: "SequenceTagger(
+ (embeddings): TransformerWordEmbeddings(
+ (model): BertModel(
+ (embeddings): BertEmbeddings(
+ (word_embeddings): Embedding(31103, 768)
+ (position_embeddings): Embedding(512, 768)
+ (token_type_embeddings): Embedding(2, 768)
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ )
+ (encoder): BertEncoder(
+ (layer): ModuleList(
+ (0-11): 12 x BertLayer(
+ (attention): BertAttention(
+ (self): BertSelfAttention(
+ (query): Linear(in_features=768, out_features=768, bias=True)
+ (key): Linear(in_features=768, out_features=768, bias=True)
+ (value): Linear(in_features=768, out_features=768, bias=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ )
+ (output): BertSelfOutput(
+ (dense): Linear(in_features=768, out_features=768, bias=True)
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ )
+ )
+ (intermediate): BertIntermediate(
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
+ (intermediate_act_fn): GELUActivation()
+ )
+ (output): BertOutput(
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ )
+ )
+ )
+ )
+ (pooler): BertPooler(
+ (dense): Linear(in_features=768, out_features=768, bias=True)
+ (activation): Tanh()
+ )
+ )
+ )
+ (locked_dropout): LockedDropout(p=0.5)
+ (linear): Linear(in_features=768, out_features=17, bias=True)
+ (loss_function): CrossEntropyLoss()
+ )"
+ 2024-03-26 16:08:22,646 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:08:22,646 Corpus: 758 train + 94 dev + 96 test sentences
+ 2024-03-26 16:08:22,646 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:08:22,646 Train: 758 sentences
+ 2024-03-26 16:08:22,646 (train_with_dev=False, train_with_test=False)
+ 2024-03-26 16:08:22,646 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:08:22,646 Training Params:
+ 2024-03-26 16:08:22,646 - learning_rate: "5e-05"
+ 2024-03-26 16:08:22,646 - mini_batch_size: "16"
+ 2024-03-26 16:08:22,646 - max_epochs: "10"
+ 2024-03-26 16:08:22,646 - shuffle: "True"
+ 2024-03-26 16:08:22,646 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:08:22,646 Plugins:
+ 2024-03-26 16:08:22,646 - TensorboardLogger
+ 2024-03-26 16:08:22,646 - LinearScheduler | warmup_fraction: '0.1'
+ 2024-03-26 16:08:22,646 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:08:22,646 Final evaluation on model from best epoch (best-model.pt)
+ 2024-03-26 16:08:22,646 - metric: "('micro avg', 'f1-score')"
+ 2024-03-26 16:08:22,646 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:08:22,646 Computation:
+ 2024-03-26 16:08:22,646 - compute on device: cuda:0
+ 2024-03-26 16:08:22,646 - embedding storage: none
+ 2024-03-26 16:08:22,646 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:08:22,646 Model training base path: "flair-co-funer-german_dbmdz_bert_base-bs16-e10-lr5e-05-4"
+ 2024-03-26 16:08:22,646 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:08:22,646 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:08:22,646 Logging anything other than scalars to TensorBoard is currently not supported.
+ 2024-03-26 16:08:24,078 epoch 1 - iter 4/48 - loss 3.01262885 - time (sec): 1.43 - samples/sec: 1822.58 - lr: 0.000003 - momentum: 0.000000
+ 2024-03-26 16:08:25,865 epoch 1 - iter 8/48 - loss 2.93779010 - time (sec): 3.22 - samples/sec: 1591.43 - lr: 0.000007 - momentum: 0.000000
+ 2024-03-26 16:08:27,186 epoch 1 - iter 12/48 - loss 2.84117781 - time (sec): 4.54 - samples/sec: 1608.59 - lr: 0.000011 - momentum: 0.000000
+ 2024-03-26 16:08:29,699 epoch 1 - iter 16/48 - loss 2.67114193 - time (sec): 7.05 - samples/sec: 1517.00 - lr: 0.000016 - momentum: 0.000000
+ 2024-03-26 16:08:31,782 epoch 1 - iter 20/48 - loss 2.52522104 - time (sec): 9.14 - samples/sec: 1499.44 - lr: 0.000020 - momentum: 0.000000
+ 2024-03-26 16:08:34,428 epoch 1 - iter 24/48 - loss 2.38547459 - time (sec): 11.78 - samples/sec: 1436.60 - lr: 0.000024 - momentum: 0.000000
+ 2024-03-26 16:08:36,903 epoch 1 - iter 28/48 - loss 2.25744819 - time (sec): 14.26 - samples/sec: 1422.94 - lr: 0.000028 - momentum: 0.000000
+ 2024-03-26 16:08:38,768 epoch 1 - iter 32/48 - loss 2.16271842 - time (sec): 16.12 - samples/sec: 1418.81 - lr: 0.000032 - momentum: 0.000000
+ 2024-03-26 16:08:39,648 epoch 1 - iter 36/48 - loss 2.09134758 - time (sec): 17.00 - samples/sec: 1469.24 - lr: 0.000036 - momentum: 0.000000
+ 2024-03-26 16:08:41,504 epoch 1 - iter 40/48 - loss 1.97889116 - time (sec): 18.86 - samples/sec: 1476.93 - lr: 0.000041 - momentum: 0.000000
+ 2024-03-26 16:08:43,518 epoch 1 - iter 44/48 - loss 1.85096110 - time (sec): 20.87 - samples/sec: 1496.25 - lr: 0.000045 - momentum: 0.000000
+ 2024-03-26 16:08:45,225 epoch 1 - iter 48/48 - loss 1.75575348 - time (sec): 22.58 - samples/sec: 1526.72 - lr: 0.000049 - momentum: 0.000000
+ 2024-03-26 16:08:45,226 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:08:45,226 EPOCH 1 done: loss 1.7558 - lr: 0.000049
+ 2024-03-26 16:08:46,127 DEV : loss 0.5416426658630371 - f1-score (micro avg) 0.6062
+ 2024-03-26 16:08:46,128 saving best model
+ 2024-03-26 16:08:46,395 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:08:47,624 epoch 2 - iter 4/48 - loss 0.77185346 - time (sec): 1.23 - samples/sec: 1926.84 - lr: 0.000050 - momentum: 0.000000
+ 2024-03-26 16:08:49,867 epoch 2 - iter 8/48 - loss 0.57038653 - time (sec): 3.47 - samples/sec: 1571.80 - lr: 0.000049 - momentum: 0.000000
+ 2024-03-26 16:08:51,649 epoch 2 - iter 12/48 - loss 0.54365800 - time (sec): 5.25 - samples/sec: 1622.30 - lr: 0.000049 - momentum: 0.000000
+ 2024-03-26 16:08:54,035 epoch 2 - iter 16/48 - loss 0.48384659 - time (sec): 7.64 - samples/sec: 1476.77 - lr: 0.000048 - momentum: 0.000000
+ 2024-03-26 16:08:57,445 epoch 2 - iter 20/48 - loss 0.43932906 - time (sec): 11.05 - samples/sec: 1335.42 - lr: 0.000048 - momentum: 0.000000
+ 2024-03-26 16:08:58,924 epoch 2 - iter 24/48 - loss 0.44234508 - time (sec): 12.53 - samples/sec: 1392.30 - lr: 0.000047 - momentum: 0.000000
+ 2024-03-26 16:09:01,574 epoch 2 - iter 28/48 - loss 0.42993359 - time (sec): 15.18 - samples/sec: 1364.10 - lr: 0.000047 - momentum: 0.000000
+ 2024-03-26 16:09:04,268 epoch 2 - iter 32/48 - loss 0.41106428 - time (sec): 17.87 - samples/sec: 1365.74 - lr: 0.000046 - momentum: 0.000000
+ 2024-03-26 16:09:06,350 epoch 2 - iter 36/48 - loss 0.40862728 - time (sec): 19.95 - samples/sec: 1355.21 - lr: 0.000046 - momentum: 0.000000
+ 2024-03-26 16:09:08,824 epoch 2 - iter 40/48 - loss 0.39796027 - time (sec): 22.43 - samples/sec: 1345.81 - lr: 0.000046 - momentum: 0.000000
+ 2024-03-26 16:09:09,882 epoch 2 - iter 44/48 - loss 0.39281994 - time (sec): 23.49 - samples/sec: 1380.57 - lr: 0.000045 - momentum: 0.000000
+ 2024-03-26 16:09:11,038 epoch 2 - iter 48/48 - loss 0.38932787 - time (sec): 24.64 - samples/sec: 1398.88 - lr: 0.000045 - momentum: 0.000000
+ 2024-03-26 16:09:11,038 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:09:11,038 EPOCH 2 done: loss 0.3893 - lr: 0.000045
+ 2024-03-26 16:09:11,956 DEV : loss 0.2764347195625305 - f1-score (micro avg) 0.8381
+ 2024-03-26 16:09:11,957 saving best model
+ 2024-03-26 16:09:12,423 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:09:14,415 epoch 3 - iter 4/48 - loss 0.24386437 - time (sec): 1.99 - samples/sec: 1232.85 - lr: 0.000044 - momentum: 0.000000
+ 2024-03-26 16:09:15,968 epoch 3 - iter 8/48 - loss 0.20474886 - time (sec): 3.55 - samples/sec: 1351.18 - lr: 0.000044 - momentum: 0.000000
+ 2024-03-26 16:09:18,550 epoch 3 - iter 12/48 - loss 0.21783118 - time (sec): 6.13 - samples/sec: 1269.82 - lr: 0.000043 - momentum: 0.000000
+ 2024-03-26 16:09:20,565 epoch 3 - iter 16/48 - loss 0.22185263 - time (sec): 8.14 - samples/sec: 1309.79 - lr: 0.000043 - momentum: 0.000000
+ 2024-03-26 16:09:22,454 epoch 3 - iter 20/48 - loss 0.21419391 - time (sec): 10.03 - samples/sec: 1380.34 - lr: 0.000042 - momentum: 0.000000
+ 2024-03-26 16:09:24,671 epoch 3 - iter 24/48 - loss 0.20404993 - time (sec): 12.25 - samples/sec: 1395.87 - lr: 0.000042 - momentum: 0.000000
+ 2024-03-26 16:09:27,147 epoch 3 - iter 28/48 - loss 0.19865318 - time (sec): 14.72 - samples/sec: 1354.68 - lr: 0.000041 - momentum: 0.000000
+ 2024-03-26 16:09:29,704 epoch 3 - iter 32/48 - loss 0.19439264 - time (sec): 17.28 - samples/sec: 1330.55 - lr: 0.000041 - momentum: 0.000000
+ 2024-03-26 16:09:31,824 epoch 3 - iter 36/48 - loss 0.19188968 - time (sec): 19.40 - samples/sec: 1334.36 - lr: 0.000040 - momentum: 0.000000
+ 2024-03-26 16:09:34,132 epoch 3 - iter 40/48 - loss 0.19897681 - time (sec): 21.71 - samples/sec: 1349.97 - lr: 0.000040 - momentum: 0.000000
+ 2024-03-26 16:09:36,662 epoch 3 - iter 44/48 - loss 0.19355398 - time (sec): 24.24 - samples/sec: 1332.94 - lr: 0.000040 - momentum: 0.000000
+ 2024-03-26 16:09:38,164 epoch 3 - iter 48/48 - loss 0.19348732 - time (sec): 25.74 - samples/sec: 1339.20 - lr: 0.000039 - momentum: 0.000000
+ 2024-03-26 16:09:38,164 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:09:38,164 EPOCH 3 done: loss 0.1935 - lr: 0.000039
+ 2024-03-26 16:09:39,074 DEV : loss 0.20766779780387878 - f1-score (micro avg) 0.8816
+ 2024-03-26 16:09:39,075 saving best model
+ 2024-03-26 16:09:39,542 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:09:42,527 epoch 4 - iter 4/48 - loss 0.08032683 - time (sec): 2.98 - samples/sec: 1221.45 - lr: 0.000039 - momentum: 0.000000
+ 2024-03-26 16:09:43,828 epoch 4 - iter 8/48 - loss 0.10812663 - time (sec): 4.29 - samples/sec: 1372.40 - lr: 0.000038 - momentum: 0.000000
+ 2024-03-26 16:09:45,899 epoch 4 - iter 12/48 - loss 0.11534098 - time (sec): 6.36 - samples/sec: 1451.14 - lr: 0.000038 - momentum: 0.000000
+ 2024-03-26 16:09:48,436 epoch 4 - iter 16/48 - loss 0.12087435 - time (sec): 8.89 - samples/sec: 1369.92 - lr: 0.000037 - momentum: 0.000000
+ 2024-03-26 16:09:49,422 epoch 4 - iter 20/48 - loss 0.12131514 - time (sec): 9.88 - samples/sec: 1454.05 - lr: 0.000037 - momentum: 0.000000
+ 2024-03-26 16:09:50,812 epoch 4 - iter 24/48 - loss 0.12433997 - time (sec): 11.27 - samples/sec: 1499.92 - lr: 0.000036 - momentum: 0.000000
+ 2024-03-26 16:09:53,909 epoch 4 - iter 28/48 - loss 0.11878673 - time (sec): 14.37 - samples/sec: 1404.90 - lr: 0.000036 - momentum: 0.000000
+ 2024-03-26 16:09:56,376 epoch 4 - iter 32/48 - loss 0.13063775 - time (sec): 16.83 - samples/sec: 1397.04 - lr: 0.000035 - momentum: 0.000000
+ 2024-03-26 16:09:57,889 epoch 4 - iter 36/48 - loss 0.13267124 - time (sec): 18.35 - samples/sec: 1432.32 - lr: 0.000035 - momentum: 0.000000
+ 2024-03-26 16:09:59,872 epoch 4 - iter 40/48 - loss 0.13049658 - time (sec): 20.33 - samples/sec: 1446.19 - lr: 0.000034 - momentum: 0.000000
+ 2024-03-26 16:10:01,757 epoch 4 - iter 44/48 - loss 0.13040292 - time (sec): 22.21 - samples/sec: 1460.05 - lr: 0.000034 - momentum: 0.000000
+ 2024-03-26 16:10:02,795 epoch 4 - iter 48/48 - loss 0.13154702 - time (sec): 23.25 - samples/sec: 1482.56 - lr: 0.000034 - momentum: 0.000000
+ 2024-03-26 16:10:02,795 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:10:02,795 EPOCH 4 done: loss 0.1315 - lr: 0.000034
+ 2024-03-26 16:10:03,696 DEV : loss 0.19951747357845306 - f1-score (micro avg) 0.8896
+ 2024-03-26 16:10:03,697 saving best model
+ 2024-03-26 16:10:04,129 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:10:05,252 epoch 5 - iter 4/48 - loss 0.14415653 - time (sec): 1.12 - samples/sec: 2264.36 - lr: 0.000033 - momentum: 0.000000
+ 2024-03-26 16:10:07,124 epoch 5 - iter 8/48 - loss 0.12247090 - time (sec): 2.99 - samples/sec: 1730.13 - lr: 0.000033 - momentum: 0.000000
+ 2024-03-26 16:10:09,227 epoch 5 - iter 12/48 - loss 0.11573894 - time (sec): 5.10 - samples/sec: 1569.88 - lr: 0.000032 - momentum: 0.000000
+ 2024-03-26 16:10:11,487 epoch 5 - iter 16/48 - loss 0.11183223 - time (sec): 7.36 - samples/sec: 1507.43 - lr: 0.000032 - momentum: 0.000000
+ 2024-03-26 16:10:13,739 epoch 5 - iter 20/48 - loss 0.11225416 - time (sec): 9.61 - samples/sec: 1423.95 - lr: 0.000031 - momentum: 0.000000
+ 2024-03-26 16:10:15,904 epoch 5 - iter 24/48 - loss 0.10842710 - time (sec): 11.78 - samples/sec: 1442.87 - lr: 0.000031 - momentum: 0.000000
+ 2024-03-26 16:10:17,505 epoch 5 - iter 28/48 - loss 0.10753626 - time (sec): 13.38 - samples/sec: 1471.51 - lr: 0.000030 - momentum: 0.000000
+ 2024-03-26 16:10:19,595 epoch 5 - iter 32/48 - loss 0.09913212 - time (sec): 15.47 - samples/sec: 1493.75 - lr: 0.000030 - momentum: 0.000000
+ 2024-03-26 16:10:20,989 epoch 5 - iter 36/48 - loss 0.09830247 - time (sec): 16.86 - samples/sec: 1518.37 - lr: 0.000029 - momentum: 0.000000
+ 2024-03-26 16:10:23,543 epoch 5 - iter 40/48 - loss 0.09324862 - time (sec): 19.41 - samples/sec: 1484.92 - lr: 0.000029 - momentum: 0.000000
+ 2024-03-26 16:10:26,471 epoch 5 - iter 44/48 - loss 0.09129787 - time (sec): 22.34 - samples/sec: 1433.54 - lr: 0.000029 - momentum: 0.000000
+ 2024-03-26 16:10:27,976 epoch 5 - iter 48/48 - loss 0.09261795 - time (sec): 23.85 - samples/sec: 1445.56 - lr: 0.000028 - momentum: 0.000000
+ 2024-03-26 16:10:27,976 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:10:27,976 EPOCH 5 done: loss 0.0926 - lr: 0.000028
+ 2024-03-26 16:10:28,892 DEV : loss 0.14850705862045288 - f1-score (micro avg) 0.9164
+ 2024-03-26 16:10:28,893 saving best model
+ 2024-03-26 16:10:29,355 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:10:31,222 epoch 6 - iter 4/48 - loss 0.13258076 - time (sec): 1.87 - samples/sec: 1575.07 - lr: 0.000028 - momentum: 0.000000
+ 2024-03-26 16:10:32,940 epoch 6 - iter 8/48 - loss 0.09926176 - time (sec): 3.58 - samples/sec: 1617.46 - lr: 0.000027 - momentum: 0.000000
+ 2024-03-26 16:10:35,242 epoch 6 - iter 12/48 - loss 0.08551746 - time (sec): 5.89 - samples/sec: 1497.95 - lr: 0.000027 - momentum: 0.000000
+ 2024-03-26 16:10:36,805 epoch 6 - iter 16/48 - loss 0.07810042 - time (sec): 7.45 - samples/sec: 1521.11 - lr: 0.000026 - momentum: 0.000000
+ 2024-03-26 16:10:39,357 epoch 6 - iter 20/48 - loss 0.06921398 - time (sec): 10.00 - samples/sec: 1436.37 - lr: 0.000026 - momentum: 0.000000
+ 2024-03-26 16:10:41,411 epoch 6 - iter 24/48 - loss 0.07353045 - time (sec): 12.06 - samples/sec: 1450.81 - lr: 0.000025 - momentum: 0.000000
+ 2024-03-26 16:10:44,022 epoch 6 - iter 28/48 - loss 0.07578696 - time (sec): 14.67 - samples/sec: 1427.31 - lr: 0.000025 - momentum: 0.000000
+ 2024-03-26 16:10:46,062 epoch 6 - iter 32/48 - loss 0.07287692 - time (sec): 16.71 - samples/sec: 1406.71 - lr: 0.000024 - momentum: 0.000000
+ 2024-03-26 16:10:47,166 epoch 6 - iter 36/48 - loss 0.07368053 - time (sec): 17.81 - samples/sec: 1456.47 - lr: 0.000024 - momentum: 0.000000
+ 2024-03-26 16:10:49,354 epoch 6 - iter 40/48 - loss 0.07384585 - time (sec): 20.00 - samples/sec: 1445.89 - lr: 0.000023 - momentum: 0.000000
+ 2024-03-26 16:10:50,961 epoch 6 - iter 44/48 - loss 0.07766855 - time (sec): 21.61 - samples/sec: 1469.58 - lr: 0.000023 - momentum: 0.000000
+ 2024-03-26 16:10:52,736 epoch 6 - iter 48/48 - loss 0.07515371 - time (sec): 23.38 - samples/sec: 1474.37 - lr: 0.000023 - momentum: 0.000000
+ 2024-03-26 16:10:52,737 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:10:52,737 EPOCH 6 done: loss 0.0752 - lr: 0.000023
+ 2024-03-26 16:10:53,670 DEV : loss 0.1686367392539978 - f1-score (micro avg) 0.9132
+ 2024-03-26 16:10:53,671 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:10:55,214 epoch 7 - iter 4/48 - loss 0.06489205 - time (sec): 1.54 - samples/sec: 1813.65 - lr: 0.000022 - momentum: 0.000000
+ 2024-03-26 16:10:57,331 epoch 7 - iter 8/48 - loss 0.05200996 - time (sec): 3.66 - samples/sec: 1672.13 - lr: 0.000022 - momentum: 0.000000
+ 2024-03-26 16:10:59,589 epoch 7 - iter 12/48 - loss 0.04830219 - time (sec): 5.92 - samples/sec: 1488.57 - lr: 0.000021 - momentum: 0.000000
+ 2024-03-26 16:11:00,762 epoch 7 - iter 16/48 - loss 0.05594283 - time (sec): 7.09 - samples/sec: 1587.29 - lr: 0.000021 - momentum: 0.000000
+ 2024-03-26 16:11:02,874 epoch 7 - iter 20/48 - loss 0.05610748 - time (sec): 9.20 - samples/sec: 1558.74 - lr: 0.000020 - momentum: 0.000000
+ 2024-03-26 16:11:04,378 epoch 7 - iter 24/48 - loss 0.05230434 - time (sec): 10.71 - samples/sec: 1607.68 - lr: 0.000020 - momentum: 0.000000
+ 2024-03-26 16:11:06,473 epoch 7 - iter 28/48 - loss 0.05244246 - time (sec): 12.80 - samples/sec: 1568.08 - lr: 0.000019 - momentum: 0.000000
+ 2024-03-26 16:11:09,234 epoch 7 - iter 32/48 - loss 0.05443414 - time (sec): 15.56 - samples/sec: 1496.51 - lr: 0.000019 - momentum: 0.000000
+ 2024-03-26 16:11:11,178 epoch 7 - iter 36/48 - loss 0.05325156 - time (sec): 17.51 - samples/sec: 1498.95 - lr: 0.000018 - momentum: 0.000000
+ 2024-03-26 16:11:12,289 epoch 7 - iter 40/48 - loss 0.05598614 - time (sec): 18.62 - samples/sec: 1530.78 - lr: 0.000018 - momentum: 0.000000
+ 2024-03-26 16:11:14,877 epoch 7 - iter 44/48 - loss 0.05697896 - time (sec): 21.21 - samples/sec: 1511.93 - lr: 0.000017 - momentum: 0.000000
+ 2024-03-26 16:11:15,981 epoch 7 - iter 48/48 - loss 0.05793243 - time (sec): 22.31 - samples/sec: 1545.19 - lr: 0.000017 - momentum: 0.000000
+ 2024-03-26 16:11:15,981 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:11:15,981 EPOCH 7 done: loss 0.0579 - lr: 0.000017
+ 2024-03-26 16:11:16,917 DEV : loss 0.17538714408874512 - f1-score (micro avg) 0.9283
+ 2024-03-26 16:11:16,919 saving best model
+ 2024-03-26 16:11:17,406 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:11:19,517 epoch 8 - iter 4/48 - loss 0.03432799 - time (sec): 2.11 - samples/sec: 1314.03 - lr: 0.000017 - momentum: 0.000000
+ 2024-03-26 16:11:22,126 epoch 8 - iter 8/48 - loss 0.02884183 - time (sec): 4.72 - samples/sec: 1279.85 - lr: 0.000016 - momentum: 0.000000
+ 2024-03-26 16:11:23,791 epoch 8 - iter 12/48 - loss 0.02928206 - time (sec): 6.38 - samples/sec: 1328.50 - lr: 0.000016 - momentum: 0.000000
+ 2024-03-26 16:11:26,403 epoch 8 - iter 16/48 - loss 0.03796476 - time (sec): 8.99 - samples/sec: 1279.87 - lr: 0.000015 - momentum: 0.000000
+ 2024-03-26 16:11:28,042 epoch 8 - iter 20/48 - loss 0.03819963 - time (sec): 10.63 - samples/sec: 1336.03 - lr: 0.000015 - momentum: 0.000000
+ 2024-03-26 16:11:29,501 epoch 8 - iter 24/48 - loss 0.04424570 - time (sec): 12.09 - samples/sec: 1405.57 - lr: 0.000014 - momentum: 0.000000
+ 2024-03-26 16:11:31,359 epoch 8 - iter 28/48 - loss 0.04679237 - time (sec): 13.95 - samples/sec: 1429.37 - lr: 0.000014 - momentum: 0.000000
+ 2024-03-26 16:11:34,000 epoch 8 - iter 32/48 - loss 0.04614638 - time (sec): 16.59 - samples/sec: 1415.89 - lr: 0.000013 - momentum: 0.000000
+ 2024-03-26 16:11:36,402 epoch 8 - iter 36/48 - loss 0.04645476 - time (sec): 18.99 - samples/sec: 1407.37 - lr: 0.000013 - momentum: 0.000000
+ 2024-03-26 16:11:38,597 epoch 8 - iter 40/48 - loss 0.04547388 - time (sec): 21.19 - samples/sec: 1388.60 - lr: 0.000012 - momentum: 0.000000
+ 2024-03-26 16:11:40,837 epoch 8 - iter 44/48 - loss 0.04403015 - time (sec): 23.43 - samples/sec: 1378.78 - lr: 0.000012 - momentum: 0.000000
+ 2024-03-26 16:11:42,394 epoch 8 - iter 48/48 - loss 0.04472061 - time (sec): 24.99 - samples/sec: 1379.63 - lr: 0.000011 - momentum: 0.000000
+ 2024-03-26 16:11:42,394 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:11:42,394 EPOCH 8 done: loss 0.0447 - lr: 0.000011
+ 2024-03-26 16:11:43,318 DEV : loss 0.17525753378868103 - f1-score (micro avg) 0.9262
+ 2024-03-26 16:11:43,319 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:11:45,267 epoch 9 - iter 4/48 - loss 0.04784010 - time (sec): 1.95 - samples/sec: 1483.07 - lr: 0.000011 - momentum: 0.000000
+ 2024-03-26 16:11:48,416 epoch 9 - iter 8/48 - loss 0.04131757 - time (sec): 5.10 - samples/sec: 1235.19 - lr: 0.000011 - momentum: 0.000000
+ 2024-03-26 16:11:50,055 epoch 9 - iter 12/48 - loss 0.03445261 - time (sec): 6.74 - samples/sec: 1284.06 - lr: 0.000010 - momentum: 0.000000
+ 2024-03-26 16:11:51,929 epoch 9 - iter 16/48 - loss 0.03873747 - time (sec): 8.61 - samples/sec: 1327.88 - lr: 0.000010 - momentum: 0.000000
+ 2024-03-26 16:11:54,798 epoch 9 - iter 20/48 - loss 0.03519583 - time (sec): 11.48 - samples/sec: 1293.89 - lr: 0.000009 - momentum: 0.000000
+ 2024-03-26 16:11:56,315 epoch 9 - iter 24/48 - loss 0.03423279 - time (sec): 13.00 - samples/sec: 1342.66 - lr: 0.000009 - momentum: 0.000000
+ 2024-03-26 16:11:58,252 epoch 9 - iter 28/48 - loss 0.03557512 - time (sec): 14.93 - samples/sec: 1368.03 - lr: 0.000008 - momentum: 0.000000
+ 2024-03-26 16:12:00,560 epoch 9 - iter 32/48 - loss 0.03504241 - time (sec): 17.24 - samples/sec: 1347.42 - lr: 0.000008 - momentum: 0.000000
+ 2024-03-26 16:12:01,859 epoch 9 - iter 36/48 - loss 0.03774454 - time (sec): 18.54 - samples/sec: 1378.84 - lr: 0.000007 - momentum: 0.000000
+ 2024-03-26 16:12:05,037 epoch 9 - iter 40/48 - loss 0.03723664 - time (sec): 21.72 - samples/sec: 1332.48 - lr: 0.000007 - momentum: 0.000000
+ 2024-03-26 16:12:07,153 epoch 9 - iter 44/48 - loss 0.03449321 - time (sec): 23.83 - samples/sec: 1355.22 - lr: 0.000006 - momentum: 0.000000
+ 2024-03-26 16:12:08,137 epoch 9 - iter 48/48 - loss 0.03640978 - time (sec): 24.82 - samples/sec: 1389.01 - lr: 0.000006 - momentum: 0.000000
+ 2024-03-26 16:12:08,138 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:12:08,138 EPOCH 9 done: loss 0.0364 - lr: 0.000006
+ 2024-03-26 16:12:09,067 DEV : loss 0.18227042257785797 - f1-score (micro avg) 0.9233
+ 2024-03-26 16:12:09,069 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:12:10,940 epoch 10 - iter 4/48 - loss 0.04465021 - time (sec): 1.87 - samples/sec: 1382.64 - lr: 0.000006 - momentum: 0.000000
+ 2024-03-26 16:12:13,722 epoch 10 - iter 8/48 - loss 0.03046547 - time (sec): 4.65 - samples/sec: 1243.44 - lr: 0.000005 - momentum: 0.000000
+ 2024-03-26 16:12:15,750 epoch 10 - iter 12/48 - loss 0.03486064 - time (sec): 6.68 - samples/sec: 1304.29 - lr: 0.000005 - momentum: 0.000000
+ 2024-03-26 16:12:17,777 epoch 10 - iter 16/48 - loss 0.02941260 - time (sec): 8.71 - samples/sec: 1397.19 - lr: 0.000004 - momentum: 0.000000
+ 2024-03-26 16:12:18,649 epoch 10 - iter 20/48 - loss 0.02796427 - time (sec): 9.58 - samples/sec: 1473.68 - lr: 0.000004 - momentum: 0.000000
+ 2024-03-26 16:12:20,332 epoch 10 - iter 24/48 - loss 0.02677033 - time (sec): 11.26 - samples/sec: 1501.88 - lr: 0.000003 - momentum: 0.000000
+ 2024-03-26 16:12:21,273 epoch 10 - iter 28/48 - loss 0.02641570 - time (sec): 12.20 - samples/sec: 1565.87 - lr: 0.000003 - momentum: 0.000000
+ 2024-03-26 16:12:23,596 epoch 10 - iter 32/48 - loss 0.02539590 - time (sec): 14.53 - samples/sec: 1531.59 - lr: 0.000002 - momentum: 0.000000
+ 2024-03-26 16:12:26,100 epoch 10 - iter 36/48 - loss 0.02979074 - time (sec): 17.03 - samples/sec: 1497.01 - lr: 0.000002 - momentum: 0.000000
+ 2024-03-26 16:12:27,989 epoch 10 - iter 40/48 - loss 0.03110433 - time (sec): 18.92 - samples/sec: 1491.39 - lr: 0.000001 - momentum: 0.000000
+ 2024-03-26 16:12:30,564 epoch 10 - iter 44/48 - loss 0.03045518 - time (sec): 21.49 - samples/sec: 1479.44 - lr: 0.000001 - momentum: 0.000000
+ 2024-03-26 16:12:32,168 epoch 10 - iter 48/48 - loss 0.03060311 - time (sec): 23.10 - samples/sec: 1492.44 - lr: 0.000000 - momentum: 0.000000
+ 2024-03-26 16:12:32,168 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:12:32,168 EPOCH 10 done: loss 0.0306 - lr: 0.000000
+ 2024-03-26 16:12:33,117 DEV : loss 0.18582072854042053 - f1-score (micro avg) 0.9242
+ 2024-03-26 16:12:33,406 ----------------------------------------------------------------------------------------------------
+ 2024-03-26 16:12:33,406 Loading model from best epoch ...
+ 2024-03-26 16:12:34,300 SequenceTagger predicts: Dictionary with 17 tags: O, S-Unternehmen, B-Unternehmen, E-Unternehmen, I-Unternehmen, S-Auslagerung, B-Auslagerung, E-Auslagerung, I-Auslagerung, S-Ort, B-Ort, E-Ort, I-Ort, S-Software, B-Software, E-Software, I-Software
+ 2024-03-26 16:12:35,055
+ Results:
+ - F-score (micro) 0.8902
+ - F-score (macro) 0.6786
+ - Accuracy 0.8055
+
+ By class:
+ precision recall f1-score support
+
+ Unternehmen 0.8750 0.8684 0.8717 266
+ Auslagerung 0.8500 0.8876 0.8684 249
+ Ort 0.9635 0.9851 0.9742 134
+ Software 0.0000 0.0000 0.0000 0
+
+ micro avg 0.8808 0.8998 0.8902 649
+ macro avg 0.6721 0.6853 0.6786 649
+ weighted avg 0.8837 0.8998 0.8916 649
+
+ 2024-03-26 16:12:35,055 ----------------------------------------------------------------------------------------------------
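For reference, the following is a minimal sketch of how a comparable Flair fine-tuning run could be set up. The training script itself is not part of this commit, so the corpus loading, the column format, and the dbmdz/bert-base-german-cased checkpoint name are assumptions inferred from the log above; only the hyperparameters (learning rate 5e-05, mini-batch size 16, 10 epochs) and the model structure (transformer embeddings with a plain linear tagging head, no CRF, no RNN) are taken from the "Training Params" section and the model dump. The TensorboardLogger plugin shown under "Plugins" is omitted here, and the linear warmup schedule is left to the ModelTrainer.fine_tune defaults.

from flair.datasets import ColumnCorpus
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Hypothetical data folder and CoNLL-style column layout for the CO-Fun NER data:
# token in column 0, BIOES tag in column 1.
corpus = ColumnCorpus(
    "data/co-funer",
    {0: "text", 1: "ner"},
    train_file="train.txt",
    dev_file="dev.txt",
    test_file="test.txt",
)
label_dict = corpus.make_label_dictionary(label_type="ner")

# Transformer embeddings fine-tuned together with the tagging head; the checkpoint
# name is an assumption matching "german_dbmdz_bert_base" in the base path above.
embeddings = TransformerWordEmbeddings(
    model="dbmdz/bert-base-german-cased",
    fine_tune=True,
)

# Plain token-classification head as shown in the model dump:
# LockedDropout(0.5) -> Linear(768, 17) -> CrossEntropyLoss, no CRF, no RNN.
tagger = SequenceTagger(
    hidden_size=256,  # required argument; effectively unused because use_rnn=False
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,
    use_rnn=False,
    reproject_embeddings=False,
)

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "flair-co-funer-german_dbmdz_bert_base-bs16-e10-lr5e-05-4",
    learning_rate=5e-05,
    mini_batch_size=16,
    max_epochs=10,
)

After training, the best checkpoint written by the trainer can be reloaded for inference with SequenceTagger.load("flair-co-funer-german_dbmdz_bert_base-bs16-e10-lr5e-05-4/best-model.pt") and applied to a flair.data.Sentence via tagger.predict(sentence).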