Commit 7fd2927 by stefan-it
1 parent: e7da946

Upload folder using huggingface_hub
best-model.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2bda6a8ed42c6d3407c151c39c95ad53f3f08c354e1c8e44c10e765a4e21967b
+ size 19050210
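The three `+` lines above are not the model weights themselves but a Git LFS pointer stub: the real 19 MB `best-model.pt` lives on the LFS server, addressed by its SHA-256 digest. As a minimal illustration (plain Python; `parse_lfs_pointer` is a hypothetical helper, not part of any library), such a pointer splits into its fields like this:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer stub into its key/value fields.

    Pointer files are three-line text stubs: `version` (spec URL),
    `oid` (algorithm:hex-digest of the stored blob) and `size`
    (byte size of the real object on the LFS server).
    """
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    algo, _, digest = fields["oid"].partition(":")
    return {
        "version": fields["version"],
        "hash_algo": algo,            # e.g. "sha256"
        "oid": digest,                # hex digest of the stored blob
        "size": int(fields["size"]),  # bytes of the real file
    }

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:2bda6a8ed42c6d3407c151c39c95ad53f3f08c354e1c8e44c10e765a4e21967b
size 19050210
"""
info = parse_lfs_pointer(pointer)
print(info["size"])  # 19050210 (about 19 MB)
```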
dev.tsv ADDED
The diff for this file is too large to render.
 
loss.tsv ADDED
@@ -0,0 +1,11 @@
+ EPOCH	TIMESTAMP	LEARNING_RATE	TRAIN_LOSS	DEV_LOSS	DEV_PRECISION	DEV_RECALL	DEV_F1	DEV_ACCURACY
+ 1	17:55:18	0.0000	1.5580	0.4794	0.0000	0.0000	0.0000	0.0000
+ 2	17:55:34	0.0000	0.5351	0.3752	0.1622	0.0047	0.0091	0.0046
+ 3	17:55:50	0.0000	0.4632	0.3417	0.2674	0.0782	0.1210	0.0653
+ 4	17:56:06	0.0000	0.4114	0.3258	0.3429	0.1689	0.2263	0.1312
+ 5	17:56:21	0.0000	0.3805	0.3155	0.3617	0.2291	0.2805	0.1699
+ 6	17:56:37	0.0000	0.3704	0.3080	0.3702	0.2721	0.3137	0.1946
+ 7	17:56:52	0.0000	0.3549	0.3086	0.3854	0.2838	0.3269	0.2038
+ 8	17:57:08	0.0000	0.3436	0.3109	0.3906	0.2807	0.3267	0.2034
+ 9	17:57:23	0.0000	0.3423	0.3028	0.3801	0.3010	0.3360	0.2115
+ 10	17:57:39	0.0000	0.3344	0.3014	0.3772	0.3026	0.3358	0.2118
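`loss.tsv` records one row per epoch; the dev F1 column is what decides which checkpoint is kept as `best-model.pt`. A small standard-library sketch that picks the best epoch from these rows (the string literal below re-types the table above, whitespace-separated for brevity; the real file is tab-separated):

```python
LOSS_TSV = """\
EPOCH TIMESTAMP LEARNING_RATE TRAIN_LOSS DEV_LOSS DEV_PRECISION DEV_RECALL DEV_F1 DEV_ACCURACY
1 17:55:18 0.0000 1.5580 0.4794 0.0000 0.0000 0.0000 0.0000
2 17:55:34 0.0000 0.5351 0.3752 0.1622 0.0047 0.0091 0.0046
3 17:55:50 0.0000 0.4632 0.3417 0.2674 0.0782 0.1210 0.0653
4 17:56:06 0.0000 0.4114 0.3258 0.3429 0.1689 0.2263 0.1312
5 17:56:21 0.0000 0.3805 0.3155 0.3617 0.2291 0.2805 0.1699
6 17:56:37 0.0000 0.3704 0.3080 0.3702 0.2721 0.3137 0.1946
7 17:56:52 0.0000 0.3549 0.3086 0.3854 0.2838 0.3269 0.2038
8 17:57:08 0.0000 0.3436 0.3109 0.3906 0.2807 0.3267 0.2034
9 17:57:23 0.0000 0.3423 0.3028 0.3801 0.3010 0.3360 0.2115
10 17:57:39 0.0000 0.3344 0.3014 0.3772 0.3026 0.3358 0.2118
"""

rows = [line.split() for line in LOSS_TSV.splitlines()]
header, data = rows[0], rows[1:]
f1_idx = header.index("DEV_F1")

# The checkpoint with the highest dev F1 is the one saved as best-model.pt.
best = max(data, key=lambda r: float(r[f1_idx]))
print(best[0], best[f1_idx])  # 9 0.3360 — epoch 9 narrowly beats epoch 10
```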
runs/events.out.tfevents.1697651706.46dc0c540dd0.2878.6 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:42ef199019750597885dbb7b7473510f7017a1aa1757fc02f808a568c590c94d
+ size 253592
test.tsv ADDED
The diff for this file is too large to render.
 
training.log ADDED
@@ -0,0 +1,246 @@
+ 2023-10-18 17:55:06,196 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:55:06,196 Model: "SequenceTagger(
+ (embeddings): TransformerWordEmbeddings(
+ (model): BertModel(
+ (embeddings): BertEmbeddings(
+ (word_embeddings): Embedding(32001, 128)
+ (position_embeddings): Embedding(512, 128)
+ (token_type_embeddings): Embedding(2, 128)
+ (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ )
+ (encoder): BertEncoder(
+ (layer): ModuleList(
+ (0-1): 2 x BertLayer(
+ (attention): BertAttention(
+ (self): BertSelfAttention(
+ (query): Linear(in_features=128, out_features=128, bias=True)
+ (key): Linear(in_features=128, out_features=128, bias=True)
+ (value): Linear(in_features=128, out_features=128, bias=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ )
+ (output): BertSelfOutput(
+ (dense): Linear(in_features=128, out_features=128, bias=True)
+ (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ )
+ )
+ (intermediate): BertIntermediate(
+ (dense): Linear(in_features=128, out_features=512, bias=True)
+ (intermediate_act_fn): GELUActivation()
+ )
+ (output): BertOutput(
+ (dense): Linear(in_features=512, out_features=128, bias=True)
+ (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ )
+ )
+ )
+ )
+ (pooler): BertPooler(
+ (dense): Linear(in_features=128, out_features=128, bias=True)
+ (activation): Tanh()
+ )
+ )
+ )
+ (locked_dropout): LockedDropout(p=0.5)
+ (linear): Linear(in_features=128, out_features=21, bias=True)
+ (loss_function): CrossEntropyLoss()
+ )"
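The architecture printout pins down every layer shape (hidden size 128, 2 encoder layers, vocabulary 32001, intermediate 512, 21 output tags), so a rough fp32 parameter count can be derived by hand. The arithmetic below is a sketch based only on the shapes printed above; the 19 MB checkpoint additionally carries state-dict metadata, so sizes match only approximately:

```python
# Shapes taken from the SequenceTagger printout above.
hidden, vocab, layers, intermediate, tags = 128, 32001, 2, 512, 21

embeddings = (vocab + 512 + 2) * hidden + 2 * hidden  # word/pos/type tables + LayerNorm
per_layer = (
    4 * (hidden * hidden + hidden)             # Q, K, V and attention-output dense
    + 2 * hidden                               # attention LayerNorm
    + hidden * intermediate + intermediate     # intermediate dense
    + intermediate * hidden + hidden           # output dense
    + 2 * hidden                               # output LayerNorm
)
pooler = hidden * hidden + hidden
head = hidden * tags + tags                    # the (linear) tag classifier

total = embeddings + layers * per_layer + pooler + head
print(total)  # 4577941 parameters, ~18.3 MB at 4 bytes each
```

At four bytes per float32 parameter that is about 18.3 MB, in line with the 19 050 210-byte `best-model.pt` pointer above once serialization overhead is included.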
+ 2023-10-18 17:55:06,196 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:55:06,196 MultiCorpus: 3575 train + 1235 dev + 1266 test sentences
+ - NER_HIPE_2022 Corpus: 3575 train + 1235 dev + 1266 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/de/with_doc_seperator
+ 2023-10-18 17:55:06,196 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:55:06,196 Train: 3575 sentences
+ 2023-10-18 17:55:06,196 (train_with_dev=False, train_with_test=False)
+ 2023-10-18 17:55:06,196 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:55:06,196 Training Params:
+ 2023-10-18 17:55:06,196 - learning_rate: "3e-05"
+ 2023-10-18 17:55:06,196 - mini_batch_size: "8"
+ 2023-10-18 17:55:06,196 - max_epochs: "10"
+ 2023-10-18 17:55:06,196 - shuffle: "True"
+ 2023-10-18 17:55:06,196 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:55:06,196 Plugins:
+ 2023-10-18 17:55:06,196 - TensorboardLogger
+ 2023-10-18 17:55:06,196 - LinearScheduler | warmup_fraction: '0.1'
+ 2023-10-18 17:55:06,196 ----------------------------------------------------------------------------------------------------
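The `LinearScheduler | warmup_fraction: '0.1'` plugin ramps the learning rate up over the first 10% of all mini-batch steps and then decays it linearly to zero, which is exactly the shape the `lr:` column traces in the per-iteration log lines (e.g. `lr: 0.000003` at iter 44/447 of epoch 1, peaking near 3e-05, ending at zero). A sketch of that schedule (assumed behaviour reconstructed from the log; Flair's exact step bookkeeping may differ slightly):

```python
def linear_schedule_lr(step, total_steps, base_lr=3e-5, warmup_fraction=0.1):
    """Learning rate under linear warmup followed by linear decay.

    A sketch of the LinearScheduler named in the log, not Flair's code:
    ramp 0 -> base_lr over the warmup steps, then base_lr -> 0 linearly.
    """
    warmup_steps = int(total_steps * warmup_fraction)
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)

total = 447 * 10  # 447 mini-batches per epoch, 10 epochs
print(linear_schedule_lr(44, total))   # ~2.95e-06, shown as "lr: 0.000003" in epoch 1
print(linear_schedule_lr(total, total))  # 0.0 at the final step
```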
+ 2023-10-18 17:55:06,196 Final evaluation on model from best epoch (best-model.pt)
+ 2023-10-18 17:55:06,196 - metric: "('micro avg', 'f1-score')"
+ 2023-10-18 17:55:06,197 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:55:06,197 Computation:
+ 2023-10-18 17:55:06,197 - compute on device: cuda:0
+ 2023-10-18 17:55:06,197 - embedding storage: none
+ 2023-10-18 17:55:06,197 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:55:06,197 Model training base path: "hmbench-hipe2020/de-dbmdz/bert-tiny-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2"
+ 2023-10-18 17:55:06,197 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:55:06,197 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:55:06,197 Logging anything other than scalars to TensorBoard is currently not supported.
+ 2023-10-18 17:55:07,276 epoch 1 - iter 44/447 - loss 3.21526268 - time (sec): 1.08 - samples/sec: 8384.48 - lr: 0.000003 - momentum: 0.000000
+ 2023-10-18 17:55:08,312 epoch 1 - iter 88/447 - loss 3.11068415 - time (sec): 2.12 - samples/sec: 8839.97 - lr: 0.000006 - momentum: 0.000000
+ 2023-10-18 17:55:09,320 epoch 1 - iter 132/447 - loss 2.96404838 - time (sec): 3.12 - samples/sec: 8377.12 - lr: 0.000009 - momentum: 0.000000
+ 2023-10-18 17:55:10,315 epoch 1 - iter 176/447 - loss 2.74904136 - time (sec): 4.12 - samples/sec: 8216.71 - lr: 0.000012 - momentum: 0.000000
+ 2023-10-18 17:55:11,325 epoch 1 - iter 220/447 - loss 2.49019133 - time (sec): 5.13 - samples/sec: 8152.73 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-18 17:55:12,348 epoch 1 - iter 264/447 - loss 2.21707471 - time (sec): 6.15 - samples/sec: 8184.32 - lr: 0.000018 - momentum: 0.000000
+ 2023-10-18 17:55:13,366 epoch 1 - iter 308/447 - loss 1.98807103 - time (sec): 7.17 - samples/sec: 8266.22 - lr: 0.000021 - momentum: 0.000000
+ 2023-10-18 17:55:14,401 epoch 1 - iter 352/447 - loss 1.79499806 - time (sec): 8.20 - samples/sec: 8382.75 - lr: 0.000024 - momentum: 0.000000
+ 2023-10-18 17:55:15,414 epoch 1 - iter 396/447 - loss 1.66282855 - time (sec): 9.22 - samples/sec: 8369.73 - lr: 0.000027 - momentum: 0.000000
+ 2023-10-18 17:55:16,456 epoch 1 - iter 440/447 - loss 1.56914304 - time (sec): 10.26 - samples/sec: 8314.49 - lr: 0.000029 - momentum: 0.000000
+ 2023-10-18 17:55:16,622 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:55:16,623 EPOCH 1 done: loss 1.5580 - lr: 0.000029
+ 2023-10-18 17:55:18,857 DEV : loss 0.4793676435947418 - f1-score (micro avg) 0.0
+ 2023-10-18 17:55:18,886 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:55:19,932 epoch 2 - iter 44/447 - loss 0.53138539 - time (sec): 1.04 - samples/sec: 7990.03 - lr: 0.000030 - momentum: 0.000000
+ 2023-10-18 17:55:20,938 epoch 2 - iter 88/447 - loss 0.56693289 - time (sec): 2.05 - samples/sec: 8359.24 - lr: 0.000029 - momentum: 0.000000
+ 2023-10-18 17:55:21,950 epoch 2 - iter 132/447 - loss 0.55886578 - time (sec): 3.06 - samples/sec: 8124.75 - lr: 0.000029 - momentum: 0.000000
+ 2023-10-18 17:55:22,968 epoch 2 - iter 176/447 - loss 0.55590184 - time (sec): 4.08 - samples/sec: 8087.50 - lr: 0.000029 - momentum: 0.000000
+ 2023-10-18 17:55:24,032 epoch 2 - iter 220/447 - loss 0.56384809 - time (sec): 5.14 - samples/sec: 8272.06 - lr: 0.000028 - momentum: 0.000000
+ 2023-10-18 17:55:24,999 epoch 2 - iter 264/447 - loss 0.55035800 - time (sec): 6.11 - samples/sec: 8342.82 - lr: 0.000028 - momentum: 0.000000
+ 2023-10-18 17:55:26,073 epoch 2 - iter 308/447 - loss 0.54447525 - time (sec): 7.19 - samples/sec: 8485.09 - lr: 0.000028 - momentum: 0.000000
+ 2023-10-18 17:55:27,099 epoch 2 - iter 352/447 - loss 0.54060510 - time (sec): 8.21 - samples/sec: 8347.52 - lr: 0.000027 - momentum: 0.000000
+ 2023-10-18 17:55:28,159 epoch 2 - iter 396/447 - loss 0.53839069 - time (sec): 9.27 - samples/sec: 8313.01 - lr: 0.000027 - momentum: 0.000000
+ 2023-10-18 17:55:29,206 epoch 2 - iter 440/447 - loss 0.53391658 - time (sec): 10.32 - samples/sec: 8278.84 - lr: 0.000027 - momentum: 0.000000
+ 2023-10-18 17:55:29,356 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:55:29,356 EPOCH 2 done: loss 0.5351 - lr: 0.000027
+ 2023-10-18 17:55:34,588 DEV : loss 0.37521103024482727 - f1-score (micro avg) 0.0091
+ 2023-10-18 17:55:34,615 saving best model
+ 2023-10-18 17:55:34,648 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:55:35,674 epoch 3 - iter 44/447 - loss 0.48728016 - time (sec): 1.02 - samples/sec: 8784.04 - lr: 0.000026 - momentum: 0.000000
+ 2023-10-18 17:55:36,702 epoch 3 - iter 88/447 - loss 0.48785073 - time (sec): 2.05 - samples/sec: 8542.35 - lr: 0.000026 - momentum: 0.000000
+ 2023-10-18 17:55:37,757 epoch 3 - iter 132/447 - loss 0.48141366 - time (sec): 3.11 - samples/sec: 8435.54 - lr: 0.000026 - momentum: 0.000000
+ 2023-10-18 17:55:38,791 epoch 3 - iter 176/447 - loss 0.49254271 - time (sec): 4.14 - samples/sec: 8208.75 - lr: 0.000025 - momentum: 0.000000
+ 2023-10-18 17:55:39,800 epoch 3 - iter 220/447 - loss 0.48009553 - time (sec): 5.15 - samples/sec: 8137.02 - lr: 0.000025 - momentum: 0.000000
+ 2023-10-18 17:55:40,791 epoch 3 - iter 264/447 - loss 0.48043658 - time (sec): 6.14 - samples/sec: 8202.99 - lr: 0.000025 - momentum: 0.000000
+ 2023-10-18 17:55:41,817 epoch 3 - iter 308/447 - loss 0.46945156 - time (sec): 7.17 - samples/sec: 8223.78 - lr: 0.000024 - momentum: 0.000000
+ 2023-10-18 17:55:42,923 epoch 3 - iter 352/447 - loss 0.47147629 - time (sec): 8.27 - samples/sec: 8205.12 - lr: 0.000024 - momentum: 0.000000
+ 2023-10-18 17:55:43,958 epoch 3 - iter 396/447 - loss 0.46777770 - time (sec): 9.31 - samples/sec: 8238.31 - lr: 0.000024 - momentum: 0.000000
+ 2023-10-18 17:55:44,956 epoch 3 - iter 440/447 - loss 0.46362258 - time (sec): 10.31 - samples/sec: 8291.65 - lr: 0.000023 - momentum: 0.000000
+ 2023-10-18 17:55:45,101 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:55:45,101 EPOCH 3 done: loss 0.4632 - lr: 0.000023
+ 2023-10-18 17:55:50,289 DEV : loss 0.3417363464832306 - f1-score (micro avg) 0.121
+ 2023-10-18 17:55:50,316 saving best model
+ 2023-10-18 17:55:50,353 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:55:51,337 epoch 4 - iter 44/447 - loss 0.45316300 - time (sec): 0.98 - samples/sec: 8248.65 - lr: 0.000023 - momentum: 0.000000
+ 2023-10-18 17:55:52,461 epoch 4 - iter 88/447 - loss 0.40894581 - time (sec): 2.11 - samples/sec: 8613.06 - lr: 0.000023 - momentum: 0.000000
+ 2023-10-18 17:55:53,478 epoch 4 - iter 132/447 - loss 0.41135455 - time (sec): 3.12 - samples/sec: 8514.01 - lr: 0.000022 - momentum: 0.000000
+ 2023-10-18 17:55:54,490 epoch 4 - iter 176/447 - loss 0.41549718 - time (sec): 4.14 - samples/sec: 8527.39 - lr: 0.000022 - momentum: 0.000000
+ 2023-10-18 17:55:55,517 epoch 4 - iter 220/447 - loss 0.40581863 - time (sec): 5.16 - samples/sec: 8532.55 - lr: 0.000022 - momentum: 0.000000
+ 2023-10-18 17:55:56,476 epoch 4 - iter 264/447 - loss 0.41064648 - time (sec): 6.12 - samples/sec: 8543.42 - lr: 0.000021 - momentum: 0.000000
+ 2023-10-18 17:55:57,474 epoch 4 - iter 308/447 - loss 0.40762911 - time (sec): 7.12 - samples/sec: 8506.55 - lr: 0.000021 - momentum: 0.000000
+ 2023-10-18 17:55:58,507 epoch 4 - iter 352/447 - loss 0.41255573 - time (sec): 8.15 - samples/sec: 8467.32 - lr: 0.000021 - momentum: 0.000000
+ 2023-10-18 17:55:59,560 epoch 4 - iter 396/447 - loss 0.41309072 - time (sec): 9.21 - samples/sec: 8372.24 - lr: 0.000020 - momentum: 0.000000
+ 2023-10-18 17:56:00,588 epoch 4 - iter 440/447 - loss 0.41180366 - time (sec): 10.23 - samples/sec: 8331.22 - lr: 0.000020 - momentum: 0.000000
+ 2023-10-18 17:56:00,754 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:56:00,754 EPOCH 4 done: loss 0.4114 - lr: 0.000020
+ 2023-10-18 17:56:06,011 DEV : loss 0.3258393108844757 - f1-score (micro avg) 0.2263
+ 2023-10-18 17:56:06,039 saving best model
+ 2023-10-18 17:56:06,080 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:56:07,143 epoch 5 - iter 44/447 - loss 0.39673676 - time (sec): 1.06 - samples/sec: 8067.49 - lr: 0.000020 - momentum: 0.000000
+ 2023-10-18 17:56:08,174 epoch 5 - iter 88/447 - loss 0.38054984 - time (sec): 2.09 - samples/sec: 8516.95 - lr: 0.000019 - momentum: 0.000000
+ 2023-10-18 17:56:09,178 epoch 5 - iter 132/447 - loss 0.38129747 - time (sec): 3.10 - samples/sec: 8224.78 - lr: 0.000019 - momentum: 0.000000
+ 2023-10-18 17:56:10,186 epoch 5 - iter 176/447 - loss 0.38376034 - time (sec): 4.10 - samples/sec: 8330.52 - lr: 0.000019 - momentum: 0.000000
+ 2023-10-18 17:56:11,164 epoch 5 - iter 220/447 - loss 0.38176883 - time (sec): 5.08 - samples/sec: 8263.63 - lr: 0.000018 - momentum: 0.000000
+ 2023-10-18 17:56:12,170 epoch 5 - iter 264/447 - loss 0.39071888 - time (sec): 6.09 - samples/sec: 8230.29 - lr: 0.000018 - momentum: 0.000000
+ 2023-10-18 17:56:13,215 epoch 5 - iter 308/447 - loss 0.39004818 - time (sec): 7.13 - samples/sec: 8307.87 - lr: 0.000018 - momentum: 0.000000
+ 2023-10-18 17:56:14,251 epoch 5 - iter 352/447 - loss 0.39166105 - time (sec): 8.17 - samples/sec: 8394.62 - lr: 0.000017 - momentum: 0.000000
+ 2023-10-18 17:56:15,268 epoch 5 - iter 396/447 - loss 0.38803979 - time (sec): 9.19 - samples/sec: 8403.08 - lr: 0.000017 - momentum: 0.000000
+ 2023-10-18 17:56:16,234 epoch 5 - iter 440/447 - loss 0.38221068 - time (sec): 10.15 - samples/sec: 8356.57 - lr: 0.000017 - momentum: 0.000000
+ 2023-10-18 17:56:16,414 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:56:16,414 EPOCH 5 done: loss 0.3805 - lr: 0.000017
+ 2023-10-18 17:56:21,341 DEV : loss 0.31553900241851807 - f1-score (micro avg) 0.2805
+ 2023-10-18 17:56:21,370 saving best model
+ 2023-10-18 17:56:21,411 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:56:22,414 epoch 6 - iter 44/447 - loss 0.36401615 - time (sec): 1.00 - samples/sec: 8342.53 - lr: 0.000016 - momentum: 0.000000
+ 2023-10-18 17:56:23,388 epoch 6 - iter 88/447 - loss 0.36764697 - time (sec): 1.98 - samples/sec: 8250.88 - lr: 0.000016 - momentum: 0.000000
+ 2023-10-18 17:56:24,398 epoch 6 - iter 132/447 - loss 0.35531431 - time (sec): 2.99 - samples/sec: 8010.00 - lr: 0.000016 - momentum: 0.000000
+ 2023-10-18 17:56:25,413 epoch 6 - iter 176/447 - loss 0.37688784 - time (sec): 4.00 - samples/sec: 8058.37 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-18 17:56:26,407 epoch 6 - iter 220/447 - loss 0.37863630 - time (sec): 5.00 - samples/sec: 8094.73 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-18 17:56:27,494 epoch 6 - iter 264/447 - loss 0.38822234 - time (sec): 6.08 - samples/sec: 8250.16 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-18 17:56:28,473 epoch 6 - iter 308/447 - loss 0.38211922 - time (sec): 7.06 - samples/sec: 8350.66 - lr: 0.000014 - momentum: 0.000000
+ 2023-10-18 17:56:29,470 epoch 6 - iter 352/447 - loss 0.36918547 - time (sec): 8.06 - samples/sec: 8369.52 - lr: 0.000014 - momentum: 0.000000
+ 2023-10-18 17:56:30,866 epoch 6 - iter 396/447 - loss 0.37353967 - time (sec): 9.45 - samples/sec: 8139.09 - lr: 0.000014 - momentum: 0.000000
+ 2023-10-18 17:56:31,890 epoch 6 - iter 440/447 - loss 0.37112856 - time (sec): 10.48 - samples/sec: 8148.79 - lr: 0.000013 - momentum: 0.000000
+ 2023-10-18 17:56:32,042 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:56:32,042 EPOCH 6 done: loss 0.3704 - lr: 0.000013
+ 2023-10-18 17:56:36,973 DEV : loss 0.3080426752567291 - f1-score (micro avg) 0.3137
+ 2023-10-18 17:56:37,001 saving best model
+ 2023-10-18 17:56:37,033 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:56:38,063 epoch 7 - iter 44/447 - loss 0.34458627 - time (sec): 1.03 - samples/sec: 8532.09 - lr: 0.000013 - momentum: 0.000000
+ 2023-10-18 17:56:39,058 epoch 7 - iter 88/447 - loss 0.36276274 - time (sec): 2.02 - samples/sec: 8477.47 - lr: 0.000013 - momentum: 0.000000
+ 2023-10-18 17:56:40,050 epoch 7 - iter 132/447 - loss 0.36409525 - time (sec): 3.02 - samples/sec: 8458.28 - lr: 0.000012 - momentum: 0.000000
+ 2023-10-18 17:56:41,028 epoch 7 - iter 176/447 - loss 0.36044782 - time (sec): 3.99 - samples/sec: 8392.40 - lr: 0.000012 - momentum: 0.000000
+ 2023-10-18 17:56:42,036 epoch 7 - iter 220/447 - loss 0.35380827 - time (sec): 5.00 - samples/sec: 8372.21 - lr: 0.000012 - momentum: 0.000000
+ 2023-10-18 17:56:43,123 epoch 7 - iter 264/447 - loss 0.35154941 - time (sec): 6.09 - samples/sec: 8383.26 - lr: 0.000011 - momentum: 0.000000
+ 2023-10-18 17:56:44,136 epoch 7 - iter 308/447 - loss 0.35471366 - time (sec): 7.10 - samples/sec: 8397.72 - lr: 0.000011 - momentum: 0.000000
+ 2023-10-18 17:56:45,256 epoch 7 - iter 352/447 - loss 0.35780965 - time (sec): 8.22 - samples/sec: 8385.37 - lr: 0.000011 - momentum: 0.000000
+ 2023-10-18 17:56:46,261 epoch 7 - iter 396/447 - loss 0.35812707 - time (sec): 9.23 - samples/sec: 8401.29 - lr: 0.000010 - momentum: 0.000000
+ 2023-10-18 17:56:47,236 epoch 7 - iter 440/447 - loss 0.35473205 - time (sec): 10.20 - samples/sec: 8363.23 - lr: 0.000010 - momentum: 0.000000
+ 2023-10-18 17:56:47,395 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:56:47,395 EPOCH 7 done: loss 0.3549 - lr: 0.000010
+ 2023-10-18 17:56:52,657 DEV : loss 0.3085794150829315 - f1-score (micro avg) 0.3269
+ 2023-10-18 17:56:52,684 saving best model
+ 2023-10-18 17:56:52,717 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:56:53,703 epoch 8 - iter 44/447 - loss 0.36921770 - time (sec): 0.99 - samples/sec: 8919.36 - lr: 0.000010 - momentum: 0.000000
+ 2023-10-18 17:56:54,749 epoch 8 - iter 88/447 - loss 0.35527406 - time (sec): 2.03 - samples/sec: 8941.97 - lr: 0.000009 - momentum: 0.000000
+ 2023-10-18 17:56:55,762 epoch 8 - iter 132/447 - loss 0.36100556 - time (sec): 3.04 - samples/sec: 8581.87 - lr: 0.000009 - momentum: 0.000000
+ 2023-10-18 17:56:56,773 epoch 8 - iter 176/447 - loss 0.35302564 - time (sec): 4.06 - samples/sec: 8608.51 - lr: 0.000009 - momentum: 0.000000
+ 2023-10-18 17:56:57,805 epoch 8 - iter 220/447 - loss 0.34583755 - time (sec): 5.09 - samples/sec: 8697.64 - lr: 0.000008 - momentum: 0.000000
+ 2023-10-18 17:56:58,817 epoch 8 - iter 264/447 - loss 0.34574712 - time (sec): 6.10 - samples/sec: 8621.15 - lr: 0.000008 - momentum: 0.000000
+ 2023-10-18 17:56:59,826 epoch 8 - iter 308/447 - loss 0.34898159 - time (sec): 7.11 - samples/sec: 8491.51 - lr: 0.000008 - momentum: 0.000000
+ 2023-10-18 17:57:00,842 epoch 8 - iter 352/447 - loss 0.34473749 - time (sec): 8.12 - samples/sec: 8519.03 - lr: 0.000007 - momentum: 0.000000
+ 2023-10-18 17:57:01,853 epoch 8 - iter 396/447 - loss 0.35007107 - time (sec): 9.14 - samples/sec: 8506.48 - lr: 0.000007 - momentum: 0.000000
+ 2023-10-18 17:57:02,814 epoch 8 - iter 440/447 - loss 0.34545478 - time (sec): 10.10 - samples/sec: 8429.17 - lr: 0.000007 - momentum: 0.000000
+ 2023-10-18 17:57:02,969 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:57:02,969 EPOCH 8 done: loss 0.3436 - lr: 0.000007
+ 2023-10-18 17:57:08,280 DEV : loss 0.31093230843544006 - f1-score (micro avg) 0.3267
+ 2023-10-18 17:57:08,308 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:57:09,296 epoch 9 - iter 44/447 - loss 0.27711530 - time (sec): 0.99 - samples/sec: 8467.89 - lr: 0.000006 - momentum: 0.000000
+ 2023-10-18 17:57:10,263 epoch 9 - iter 88/447 - loss 0.30403006 - time (sec): 1.95 - samples/sec: 8410.40 - lr: 0.000006 - momentum: 0.000000
+ 2023-10-18 17:57:11,341 epoch 9 - iter 132/447 - loss 0.32196509 - time (sec): 3.03 - samples/sec: 8560.05 - lr: 0.000006 - momentum: 0.000000
+ 2023-10-18 17:57:12,347 epoch 9 - iter 176/447 - loss 0.33477590 - time (sec): 4.04 - samples/sec: 8658.73 - lr: 0.000005 - momentum: 0.000000
+ 2023-10-18 17:57:13,354 epoch 9 - iter 220/447 - loss 0.33986829 - time (sec): 5.05 - samples/sec: 8487.89 - lr: 0.000005 - momentum: 0.000000
+ 2023-10-18 17:57:14,336 epoch 9 - iter 264/447 - loss 0.34157564 - time (sec): 6.03 - samples/sec: 8451.12 - lr: 0.000005 - momentum: 0.000000
+ 2023-10-18 17:57:15,340 epoch 9 - iter 308/447 - loss 0.34210988 - time (sec): 7.03 - samples/sec: 8489.03 - lr: 0.000004 - momentum: 0.000000
+ 2023-10-18 17:57:16,382 epoch 9 - iter 352/447 - loss 0.33347179 - time (sec): 8.07 - samples/sec: 8556.03 - lr: 0.000004 - momentum: 0.000000
+ 2023-10-18 17:57:17,378 epoch 9 - iter 396/447 - loss 0.34133189 - time (sec): 9.07 - samples/sec: 8512.57 - lr: 0.000004 - momentum: 0.000000
+ 2023-10-18 17:57:18,400 epoch 9 - iter 440/447 - loss 0.34251923 - time (sec): 10.09 - samples/sec: 8463.40 - lr: 0.000003 - momentum: 0.000000
+ 2023-10-18 17:57:18,555 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:57:18,555 EPOCH 9 done: loss 0.3423 - lr: 0.000003
+ 2023-10-18 17:57:23,821 DEV : loss 0.3027634918689728 - f1-score (micro avg) 0.336
+ 2023-10-18 17:57:23,848 saving best model
+ 2023-10-18 17:57:23,883 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:57:24,923 epoch 10 - iter 44/447 - loss 0.32917961 - time (sec): 1.04 - samples/sec: 9385.39 - lr: 0.000003 - momentum: 0.000000
+ 2023-10-18 17:57:25,908 epoch 10 - iter 88/447 - loss 0.34487478 - time (sec): 2.02 - samples/sec: 8835.74 - lr: 0.000003 - momentum: 0.000000
+ 2023-10-18 17:57:26,943 epoch 10 - iter 132/447 - loss 0.32864293 - time (sec): 3.06 - samples/sec: 8583.72 - lr: 0.000002 - momentum: 0.000000
+ 2023-10-18 17:57:27,980 epoch 10 - iter 176/447 - loss 0.33560550 - time (sec): 4.10 - samples/sec: 8595.93 - lr: 0.000002 - momentum: 0.000000
+ 2023-10-18 17:57:28,966 epoch 10 - iter 220/447 - loss 0.33553915 - time (sec): 5.08 - samples/sec: 8386.68 - lr: 0.000002 - momentum: 0.000000
+ 2023-10-18 17:57:29,980 epoch 10 - iter 264/447 - loss 0.33465517 - time (sec): 6.10 - samples/sec: 8469.08 - lr: 0.000001 - momentum: 0.000000
+ 2023-10-18 17:57:30,990 epoch 10 - iter 308/447 - loss 0.33940697 - time (sec): 7.11 - samples/sec: 8554.99 - lr: 0.000001 - momentum: 0.000000
+ 2023-10-18 17:57:31,891 epoch 10 - iter 352/447 - loss 0.34259012 - time (sec): 8.01 - samples/sec: 8631.10 - lr: 0.000001 - momentum: 0.000000
+ 2023-10-18 17:57:32,846 epoch 10 - iter 396/447 - loss 0.33938014 - time (sec): 8.96 - samples/sec: 8600.64 - lr: 0.000000 - momentum: 0.000000
+ 2023-10-18 17:57:33,869 epoch 10 - iter 440/447 - loss 0.33500316 - time (sec): 9.99 - samples/sec: 8516.18 - lr: 0.000000 - momentum: 0.000000
+ 2023-10-18 17:57:34,041 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:57:34,041 EPOCH 10 done: loss 0.3344 - lr: 0.000000
+ 2023-10-18 17:57:39,029 DEV : loss 0.30143699049949646 - f1-score (micro avg) 0.3358
+ 2023-10-18 17:57:39,089 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 17:57:39,089 Loading model from best epoch ...
+ 2023-10-18 17:57:39,170 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-prod, B-prod, E-prod, I-prod, S-time, B-time, E-time, I-time
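The 21-tag dictionary above is a BIOES scheme over five entity types (loc, pers, org, prod, time): S marks single-token entities, B/I/E mark the begin, inside, and end of multi-token ones, and O is outside any entity. A minimal decoder sketch (hypothetical helper, not Flair's own implementation) turning such tag sequences back into entity spans:

```python
def bioes_to_spans(tags):
    """Decode a BIOES tag sequence into (label, start, end_exclusive) spans.

    Sketch only: well-formed input is assumed; malformed transitions
    (e.g. an E whose label does not match the open B) are silently dropped.
    """
    spans, start, label = [], None, None
    for i, tag in enumerate(tags):
        if tag == "O":
            start = label = None
            continue
        prefix, lab = tag.split("-", 1)
        if prefix == "S":                       # single-token entity
            spans.append((lab, i, i + 1))
            start = label = None
        elif prefix == "B":                     # open a multi-token entity
            start, label = i, lab
        elif prefix == "E" and label == lab and start is not None:
            spans.append((lab, start, i + 1))   # close the open entity
            start = label = None
        # "I" simply continues an open entity
    return spans

print(bioes_to_spans(["B-loc", "E-loc", "O", "S-pers"]))
# [('loc', 0, 2), ('pers', 3, 4)]
```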
+ 2023-10-18 17:57:41,510
+ Results:
+ - F-score (micro) 0.3516
+ - F-score (macro) 0.1343
+ - Accuracy 0.2215
+
+ By class:
+ precision recall f1-score support
+
+ loc 0.4709 0.5436 0.5047 596
+ pers 0.1922 0.1471 0.1667 333
+ org 0.0000 0.0000 0.0000 132
+ prod 0.0000 0.0000 0.0000 66
+ time 0.0000 0.0000 0.0000 49
+
+ micro avg 0.3943 0.3172 0.3516 1176
+ macro avg 0.1326 0.1382 0.1343 1176
+ weighted avg 0.2931 0.3172 0.3030 1176
+
+ 2023-10-18 17:57:41,510 ----------------------------------------------------------------------------------------------------
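The summary scores follow directly from the per-class table: macro F1 averages the per-class F1 values (the three zero-support-performance classes drag it down), while micro F1 is the harmonic mean of the pooled micro precision and recall. A quick plain-Python check reproducing both numbers, with precision/recall/support copied from the table above:

```python
per_class = {
    # label: (precision, recall, support) from the test-set table above
    "loc":  (0.4709, 0.5436, 596),
    "pers": (0.1922, 0.1471, 333),
    "org":  (0.0000, 0.0000, 132),
    "prod": (0.0000, 0.0000, 66),
    "time": (0.0000, 0.0000, 49),
}

def f1(p, r):
    """Harmonic mean of precision and recall (0 when both are 0)."""
    return 0.0 if p + r == 0 else 2 * p * r / (p + r)

# Macro F1: unweighted mean of per-class F1 scores.
macro_f1 = sum(f1(p, r) for p, r, _ in per_class.values()) / len(per_class)

# Micro F1: computed from the pooled micro precision/recall in the log.
micro_f1 = f1(0.3943, 0.3172)

print(round(macro_f1, 4), round(micro_f1, 4))  # 0.1343 0.3516
```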