stefan-it committed on
Commit 6da7fd8
1 Parent(s): 7113fa3

Upload folder using huggingface_hub
hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4/best-model.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:8eab3d215d32bcb8a3a9080cef6f0356d0e4eb7ba8ee28fc55c431895eadaef0
+ size 443334288
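The two `.pt` checkpoints are stored as Git LFS pointer files; the `oid` line carries the SHA-256 of the actual blob and `size` its byte count. A minimal sketch of checking a downloaded blob against such a pointer (the helper names and parsing are illustrative, not part of this repo):

```python
import hashlib

def parse_lfs_pointer(text: str) -> dict:
    """Parse the three key-value lines of a Git LFS pointer file."""
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    return {
        "oid": fields["oid"].removeprefix("sha256:"),
        "size": int(fields["size"]),
    }

def verify_blob(data: bytes, pointer: dict) -> bool:
    """True if the blob matches the pointer's size and SHA-256 digest."""
    return (len(data) == pointer["size"]
            and hashlib.sha256(data).hexdigest() == pointer["oid"])
```

For the real files one would hash the downloaded `best-model.pt` in chunks rather than reading it fully into memory.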
hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4/dev.tsv ADDED
The diff for this file is too large to render. See raw diff
 
hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4/final-model.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:07b0c53c9d262b3246cfef9ddd00ac49d48da65c713de24001a084e046369129
+ size 443334491
hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4/loss.tsv ADDED
@@ -0,0 +1,11 @@
+ EPOCH	TIMESTAMP	LEARNING_RATE	TRAIN_LOSS	DEV_LOSS	DEV_PRECISION	DEV_RECALL	DEV_F1	DEV_ACCURACY
+ 1	15:10:30	0.0000	0.6324	0.1381	0.7131	0.7274	0.7202	0.5861
+ 2	15:13:08	0.0000	0.1269	0.1146	0.7268	0.8058	0.7643	0.6496
+ 3	15:15:46	0.0000	0.0706	0.1263	0.7484	0.8247	0.7847	0.6767
+ 4	15:18:25	0.0000	0.0478	0.1450	0.8085	0.8414	0.8246	0.7254
+ 5	15:21:03	0.0000	0.0336	0.1630	0.8072	0.8345	0.8206	0.7234
+ 6	15:23:41	0.0000	0.0251	0.1828	0.7889	0.8305	0.8092	0.7087
+ 7	15:26:18	0.0000	0.0198	0.1995	0.7971	0.8236	0.8101	0.7119
+ 8	15:28:56	0.0000	0.0142	0.1940	0.8119	0.8408	0.8261	0.7311
+ 9	15:31:34	0.0000	0.0105	0.2051	0.8101	0.8356	0.8227	0.7277
+ 10	15:34:11	0.0000	0.0083	0.2059	0.8116	0.8316	0.8215	0.7246
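The `loss.tsv` above has one whitespace-separated row per epoch; dev F1 peaks at epoch 8 while train loss keeps falling, which is why `best-model.pt` differs from `final-model.pt`. A small sketch (column handling assumed from the header shown) of selecting the best epoch by `DEV_F1`:

```python
def best_epoch(tsv_text: str) -> tuple[int, float]:
    """Return (epoch, dev_f1) for the row with the highest DEV_F1."""
    header, *rows = [line.split() for line in tsv_text.strip().splitlines()]
    f1_col = header.index("DEV_F1")
    epoch_col = header.index("EPOCH")
    best = max(rows, key=lambda r: float(r[f1_col]))
    return int(best[epoch_col]), float(best[f1_col])
```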
hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4/test.tsv ADDED
The diff for this file is too large to render. See raw diff
 
hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4/training.log ADDED
@@ -0,0 +1,242 @@
+ 2023-09-04 15:07:57,477 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:07:57,478 Model: "SequenceTagger(
+   (embeddings): TransformerWordEmbeddings(
+     (model): BertModel(
+       (embeddings): BertEmbeddings(
+         (word_embeddings): Embedding(32001, 768)
+         (position_embeddings): Embedding(512, 768)
+         (token_type_embeddings): Embedding(2, 768)
+         (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+         (dropout): Dropout(p=0.1, inplace=False)
+       )
+       (encoder): BertEncoder(
+         (layer): ModuleList(
+           (0-11): 12 x BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+         )
+       )
+       (pooler): BertPooler(
+         (dense): Linear(in_features=768, out_features=768, bias=True)
+         (activation): Tanh()
+       )
+     )
+   )
+   (locked_dropout): LockedDropout(p=0.5)
+   (linear): Linear(in_features=768, out_features=21, bias=True)
+   (loss_function): CrossEntropyLoss()
+ )"
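The module shapes printed above account for the ~443 MB checkpoint size: summing them gives roughly 110.6 M float32 parameters, i.e. about 442.5 MB of weights, with the small remainder being serialization overhead and Flair metadata. A back-of-the-envelope check, with every shape taken from the printout:

```python
def bert_tagger_param_count(vocab=32001, hidden=768, layers=12,
                            inter=3072, max_pos=512, tags=21) -> int:
    """Approximate parameter count of the SequenceTagger printed above."""
    # word/position/token-type embeddings plus embedding LayerNorm (weight, bias)
    emb = vocab * hidden + max_pos * hidden + 2 * hidden + 2 * hidden
    attn = 4 * (hidden * hidden + hidden)                     # Q, K, V, output dense
    ffn = (hidden * inter + inter) + (inter * hidden + hidden)  # intermediate + output
    norms = 2 * 2 * hidden                                    # two LayerNorms per layer
    pooler = hidden * hidden + hidden
    head = hidden * tags + tags                               # final tagging linear
    return emb + layers * (attn + ffn + norms) + pooler + head
```

This is an approximation for orientation only; the exact on-disk sizes are the `size` fields in the LFS pointers.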
+ 2023-09-04 15:07:57,478 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:07:57,478 MultiCorpus: 5901 train + 1287 dev + 1505 test sentences
+  - NER_HIPE_2022 Corpus: 5901 train + 1287 dev + 1505 test sentences - /app/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/fr/with_doc_seperator
+ 2023-09-04 15:07:57,478 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:07:57,478 Train: 5901 sentences
+ 2023-09-04 15:07:57,478 (train_with_dev=False, train_with_test=False)
+ 2023-09-04 15:07:57,478 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:07:57,478 Training Params:
+ 2023-09-04 15:07:57,478  - learning_rate: "3e-05"
+ 2023-09-04 15:07:57,478  - mini_batch_size: "8"
+ 2023-09-04 15:07:57,479  - max_epochs: "10"
+ 2023-09-04 15:07:57,479  - shuffle: "True"
+ 2023-09-04 15:07:57,479 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:07:57,479 Plugins:
+ 2023-09-04 15:07:57,479  - LinearScheduler | warmup_fraction: '0.1'
+ 2023-09-04 15:07:57,479 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:07:57,479 Final evaluation on model from best epoch (best-model.pt)
+ 2023-09-04 15:07:57,479  - metric: "('micro avg', 'f1-score')"
+ 2023-09-04 15:07:57,479 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:07:57,479 Computation:
+ 2023-09-04 15:07:57,479  - compute on device: cuda:0
+ 2023-09-04 15:07:57,479  - embedding storage: none
+ 2023-09-04 15:07:57,479 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:07:57,479 Model training base path: "hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4"
+ 2023-09-04 15:07:57,479 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:07:57,479 ----------------------------------------------------------------------------------------------------
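The `LinearScheduler | warmup_fraction: '0.1'` plugin explains the `lr:` column in the per-iteration lines below: with 738 iterations per epoch over 10 epochs (7380 steps), the rate ramps linearly to the 3e-05 peak over the first 10% of steps and then decays linearly to zero. A sketch of that shape (step accounting may differ from Flair's internals by an off-by-one; this is illustrative only):

```python
def linear_schedule_lr(step: int, total_steps: int = 7380,
                       peak: float = 3e-05, warmup_fraction: float = 0.1) -> float:
    """Learning rate under linear warmup followed by linear decay to zero."""
    warmup = int(total_steps * warmup_fraction)
    if step < warmup:
        return peak * step / warmup
    return peak * (total_steps - step) / (total_steps - warmup)
```

At iteration 73 of epoch 1 this gives ~0.000003, matching the first logged `lr:` value.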
+ 2023-09-04 15:08:11,872 epoch 1 - iter 73/738 - loss 2.86248977 - time (sec): 14.39 - samples/sec: 1188.37 - lr: 0.000003 - momentum: 0.000000
+ 2023-09-04 15:08:28,062 epoch 1 - iter 146/738 - loss 1.87378558 - time (sec): 30.58 - samples/sec: 1176.56 - lr: 0.000006 - momentum: 0.000000
+ 2023-09-04 15:08:41,224 epoch 1 - iter 219/738 - loss 1.44076320 - time (sec): 43.74 - samples/sec: 1190.68 - lr: 0.000009 - momentum: 0.000000
+ 2023-09-04 15:08:53,992 epoch 1 - iter 292/738 - loss 1.19510265 - time (sec): 56.51 - samples/sec: 1200.16 - lr: 0.000012 - momentum: 0.000000
+ 2023-09-04 15:09:06,658 epoch 1 - iter 365/738 - loss 1.02747532 - time (sec): 69.18 - samples/sec: 1206.88 - lr: 0.000015 - momentum: 0.000000
+ 2023-09-04 15:09:20,422 epoch 1 - iter 438/738 - loss 0.90817250 - time (sec): 82.94 - samples/sec: 1205.48 - lr: 0.000018 - momentum: 0.000000
+ 2023-09-04 15:09:32,092 epoch 1 - iter 511/738 - loss 0.82740367 - time (sec): 94.61 - samples/sec: 1209.87 - lr: 0.000021 - momentum: 0.000000
+ 2023-09-04 15:09:45,830 epoch 1 - iter 584/738 - loss 0.75485949 - time (sec): 108.35 - samples/sec: 1204.68 - lr: 0.000024 - momentum: 0.000000
+ 2023-09-04 15:09:59,433 epoch 1 - iter 657/738 - loss 0.69355197 - time (sec): 121.95 - samples/sec: 1203.67 - lr: 0.000027 - momentum: 0.000000
+ 2023-09-04 15:10:14,975 epoch 1 - iter 730/738 - loss 0.63642854 - time (sec): 137.49 - samples/sec: 1199.07 - lr: 0.000030 - momentum: 0.000000
+ 2023-09-04 15:10:16,267 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:10:16,267 EPOCH 1 done: loss 0.6324 - lr: 0.000030
+ 2023-09-04 15:10:30,427 DEV : loss 0.13814392685890198 - f1-score (micro avg)  0.7202
+ 2023-09-04 15:10:30,455 saving best model
+ 2023-09-04 15:10:30,930 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:10:43,191 epoch 2 - iter 73/738 - loss 0.15642957 - time (sec): 12.26 - samples/sec: 1196.70 - lr: 0.000030 - momentum: 0.000000
+ 2023-09-04 15:10:55,459 epoch 2 - iter 146/738 - loss 0.15187407 - time (sec): 24.53 - samples/sec: 1213.71 - lr: 0.000029 - momentum: 0.000000
+ 2023-09-04 15:11:10,504 epoch 2 - iter 219/738 - loss 0.14818061 - time (sec): 39.57 - samples/sec: 1210.82 - lr: 0.000029 - momentum: 0.000000
+ 2023-09-04 15:11:23,998 epoch 2 - iter 292/738 - loss 0.14577470 - time (sec): 53.07 - samples/sec: 1207.19 - lr: 0.000029 - momentum: 0.000000
+ 2023-09-04 15:11:37,517 epoch 2 - iter 365/738 - loss 0.14070710 - time (sec): 66.59 - samples/sec: 1204.25 - lr: 0.000028 - momentum: 0.000000
+ 2023-09-04 15:11:51,786 epoch 2 - iter 438/738 - loss 0.13613518 - time (sec): 80.85 - samples/sec: 1204.78 - lr: 0.000028 - momentum: 0.000000
+ 2023-09-04 15:12:05,952 epoch 2 - iter 511/738 - loss 0.13285248 - time (sec): 95.02 - samples/sec: 1194.66 - lr: 0.000028 - momentum: 0.000000
+ 2023-09-04 15:12:19,375 epoch 2 - iter 584/738 - loss 0.13200454 - time (sec): 108.44 - samples/sec: 1196.77 - lr: 0.000027 - momentum: 0.000000
+ 2023-09-04 15:12:35,244 epoch 2 - iter 657/738 - loss 0.12861740 - time (sec): 124.31 - samples/sec: 1191.28 - lr: 0.000027 - momentum: 0.000000
+ 2023-09-04 15:12:49,248 epoch 2 - iter 730/738 - loss 0.12694763 - time (sec): 138.32 - samples/sec: 1190.88 - lr: 0.000027 - momentum: 0.000000
+ 2023-09-04 15:12:50,559 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:12:50,559 EPOCH 2 done: loss 0.1269 - lr: 0.000027
+ 2023-09-04 15:13:08,319 DEV : loss 0.11461225152015686 - f1-score (micro avg)  0.7643
+ 2023-09-04 15:13:08,347 saving best model
+ 2023-09-04 15:13:09,723 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:13:23,207 epoch 3 - iter 73/738 - loss 0.08262600 - time (sec): 13.48 - samples/sec: 1206.96 - lr: 0.000026 - momentum: 0.000000
+ 2023-09-04 15:13:37,922 epoch 3 - iter 146/738 - loss 0.07783046 - time (sec): 28.20 - samples/sec: 1199.25 - lr: 0.000026 - momentum: 0.000000
+ 2023-09-04 15:13:50,649 epoch 3 - iter 219/738 - loss 0.07457843 - time (sec): 40.92 - samples/sec: 1202.63 - lr: 0.000026 - momentum: 0.000000
+ 2023-09-04 15:14:05,710 epoch 3 - iter 292/738 - loss 0.08068010 - time (sec): 55.99 - samples/sec: 1198.49 - lr: 0.000025 - momentum: 0.000000
+ 2023-09-04 15:14:18,785 epoch 3 - iter 365/738 - loss 0.07733871 - time (sec): 69.06 - samples/sec: 1199.55 - lr: 0.000025 - momentum: 0.000000
+ 2023-09-04 15:14:32,483 epoch 3 - iter 438/738 - loss 0.07510957 - time (sec): 82.76 - samples/sec: 1193.29 - lr: 0.000025 - momentum: 0.000000
+ 2023-09-04 15:14:45,867 epoch 3 - iter 511/738 - loss 0.07428255 - time (sec): 96.14 - samples/sec: 1198.61 - lr: 0.000024 - momentum: 0.000000
+ 2023-09-04 15:15:00,677 epoch 3 - iter 584/738 - loss 0.07258361 - time (sec): 110.95 - samples/sec: 1195.74 - lr: 0.000024 - momentum: 0.000000
+ 2023-09-04 15:15:13,673 epoch 3 - iter 657/738 - loss 0.07018245 - time (sec): 123.95 - samples/sec: 1196.07 - lr: 0.000024 - momentum: 0.000000
+ 2023-09-04 15:15:27,981 epoch 3 - iter 730/738 - loss 0.07051097 - time (sec): 138.26 - samples/sec: 1193.67 - lr: 0.000023 - momentum: 0.000000
+ 2023-09-04 15:15:29,057 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:15:29,058 EPOCH 3 done: loss 0.0706 - lr: 0.000023
+ 2023-09-04 15:15:46,860 DEV : loss 0.12629321217536926 - f1-score (micro avg)  0.7847
+ 2023-09-04 15:15:46,888 saving best model
+ 2023-09-04 15:15:48,249 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:16:02,993 epoch 4 - iter 73/738 - loss 0.04289321 - time (sec): 14.74 - samples/sec: 1210.88 - lr: 0.000023 - momentum: 0.000000
+ 2023-09-04 15:16:15,752 epoch 4 - iter 146/738 - loss 0.04495463 - time (sec): 27.50 - samples/sec: 1204.09 - lr: 0.000023 - momentum: 0.000000
+ 2023-09-04 15:16:32,912 epoch 4 - iter 219/738 - loss 0.04311179 - time (sec): 44.66 - samples/sec: 1184.67 - lr: 0.000022 - momentum: 0.000000
+ 2023-09-04 15:16:47,592 epoch 4 - iter 292/738 - loss 0.04838265 - time (sec): 59.34 - samples/sec: 1173.84 - lr: 0.000022 - momentum: 0.000000
+ 2023-09-04 15:17:00,520 epoch 4 - iter 365/738 - loss 0.04798678 - time (sec): 72.27 - samples/sec: 1181.69 - lr: 0.000022 - momentum: 0.000000
+ 2023-09-04 15:17:15,753 epoch 4 - iter 438/738 - loss 0.04663458 - time (sec): 87.50 - samples/sec: 1182.61 - lr: 0.000021 - momentum: 0.000000
+ 2023-09-04 15:17:28,483 epoch 4 - iter 511/738 - loss 0.04610997 - time (sec): 100.23 - samples/sec: 1187.93 - lr: 0.000021 - momentum: 0.000000
+ 2023-09-04 15:17:41,349 epoch 4 - iter 584/738 - loss 0.04766936 - time (sec): 113.10 - samples/sec: 1185.21 - lr: 0.000021 - momentum: 0.000000
+ 2023-09-04 15:17:53,035 epoch 4 - iter 657/738 - loss 0.04878602 - time (sec): 124.78 - samples/sec: 1191.55 - lr: 0.000020 - momentum: 0.000000
+ 2023-09-04 15:18:06,125 epoch 4 - iter 730/738 - loss 0.04813588 - time (sec): 137.87 - samples/sec: 1195.65 - lr: 0.000020 - momentum: 0.000000
+ 2023-09-04 15:18:07,361 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:18:07,361 EPOCH 4 done: loss 0.0478 - lr: 0.000020
+ 2023-09-04 15:18:25,148 DEV : loss 0.1449851542711258 - f1-score (micro avg)  0.8246
+ 2023-09-04 15:18:25,176 saving best model
+ 2023-09-04 15:18:26,491 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:18:40,829 epoch 5 - iter 73/738 - loss 0.03757261 - time (sec): 14.34 - samples/sec: 1175.64 - lr: 0.000020 - momentum: 0.000000
+ 2023-09-04 15:18:53,223 epoch 5 - iter 146/738 - loss 0.03398262 - time (sec): 26.73 - samples/sec: 1202.24 - lr: 0.000019 - momentum: 0.000000
+ 2023-09-04 15:19:05,893 epoch 5 - iter 219/738 - loss 0.03453452 - time (sec): 39.40 - samples/sec: 1221.49 - lr: 0.000019 - momentum: 0.000000
+ 2023-09-04 15:19:20,376 epoch 5 - iter 292/738 - loss 0.03535832 - time (sec): 53.88 - samples/sec: 1214.70 - lr: 0.000019 - momentum: 0.000000
+ 2023-09-04 15:19:34,656 epoch 5 - iter 365/738 - loss 0.03277805 - time (sec): 68.16 - samples/sec: 1196.75 - lr: 0.000018 - momentum: 0.000000
+ 2023-09-04 15:19:48,355 epoch 5 - iter 438/738 - loss 0.03204077 - time (sec): 81.86 - samples/sec: 1192.57 - lr: 0.000018 - momentum: 0.000000
+ 2023-09-04 15:20:05,121 epoch 5 - iter 511/738 - loss 0.03323374 - time (sec): 98.63 - samples/sec: 1183.15 - lr: 0.000018 - momentum: 0.000000
+ 2023-09-04 15:20:16,779 epoch 5 - iter 584/738 - loss 0.03395649 - time (sec): 110.29 - samples/sec: 1193.38 - lr: 0.000017 - momentum: 0.000000
+ 2023-09-04 15:20:31,468 epoch 5 - iter 657/738 - loss 0.03392424 - time (sec): 124.98 - samples/sec: 1189.35 - lr: 0.000017 - momentum: 0.000000
+ 2023-09-04 15:20:44,868 epoch 5 - iter 730/738 - loss 0.03386304 - time (sec): 138.38 - samples/sec: 1191.00 - lr: 0.000017 - momentum: 0.000000
+ 2023-09-04 15:20:46,072 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:20:46,072 EPOCH 5 done: loss 0.0336 - lr: 0.000017
+ 2023-09-04 15:21:03,892 DEV : loss 0.16297703981399536 - f1-score (micro avg)  0.8206
+ 2023-09-04 15:21:03,921 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:21:18,215 epoch 6 - iter 73/738 - loss 0.02179470 - time (sec): 14.29 - samples/sec: 1178.36 - lr: 0.000016 - momentum: 0.000000
+ 2023-09-04 15:21:32,516 epoch 6 - iter 146/738 - loss 0.02530039 - time (sec): 28.59 - samples/sec: 1155.41 - lr: 0.000016 - momentum: 0.000000
+ 2023-09-04 15:21:44,583 epoch 6 - iter 219/738 - loss 0.02335614 - time (sec): 40.66 - samples/sec: 1171.33 - lr: 0.000016 - momentum: 0.000000
+ 2023-09-04 15:21:57,035 epoch 6 - iter 292/738 - loss 0.02558086 - time (sec): 53.11 - samples/sec: 1182.77 - lr: 0.000015 - momentum: 0.000000
+ 2023-09-04 15:22:11,708 epoch 6 - iter 365/738 - loss 0.02563490 - time (sec): 67.79 - samples/sec: 1178.91 - lr: 0.000015 - momentum: 0.000000
+ 2023-09-04 15:22:22,999 epoch 6 - iter 438/738 - loss 0.02483109 - time (sec): 79.08 - samples/sec: 1188.60 - lr: 0.000015 - momentum: 0.000000
+ 2023-09-04 15:22:37,280 epoch 6 - iter 511/738 - loss 0.02354249 - time (sec): 93.36 - samples/sec: 1190.62 - lr: 0.000014 - momentum: 0.000000
+ 2023-09-04 15:22:52,561 epoch 6 - iter 584/738 - loss 0.02407573 - time (sec): 108.64 - samples/sec: 1191.50 - lr: 0.000014 - momentum: 0.000000
+ 2023-09-04 15:23:09,298 epoch 6 - iter 657/738 - loss 0.02428604 - time (sec): 125.37 - samples/sec: 1189.00 - lr: 0.000014 - momentum: 0.000000
+ 2023-09-04 15:23:22,753 epoch 6 - iter 730/738 - loss 0.02498198 - time (sec): 138.83 - samples/sec: 1189.25 - lr: 0.000013 - momentum: 0.000000
+ 2023-09-04 15:23:23,779 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:23:23,779 EPOCH 6 done: loss 0.0251 - lr: 0.000013
+ 2023-09-04 15:23:41,509 DEV : loss 0.1827543079853058 - f1-score (micro avg)  0.8092
+ 2023-09-04 15:23:41,538 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:23:54,237 epoch 7 - iter 73/738 - loss 0.02181177 - time (sec): 12.70 - samples/sec: 1212.83 - lr: 0.000013 - momentum: 0.000000
+ 2023-09-04 15:24:06,417 epoch 7 - iter 146/738 - loss 0.02084610 - time (sec): 24.88 - samples/sec: 1200.09 - lr: 0.000013 - momentum: 0.000000
+ 2023-09-04 15:24:20,568 epoch 7 - iter 219/738 - loss 0.02110233 - time (sec): 39.03 - samples/sec: 1209.11 - lr: 0.000012 - momentum: 0.000000
+ 2023-09-04 15:24:33,733 epoch 7 - iter 292/738 - loss 0.01901752 - time (sec): 52.19 - samples/sec: 1204.34 - lr: 0.000012 - momentum: 0.000000
+ 2023-09-04 15:24:48,456 epoch 7 - iter 365/738 - loss 0.02024199 - time (sec): 66.92 - samples/sec: 1188.39 - lr: 0.000012 - momentum: 0.000000
+ 2023-09-04 15:25:02,191 epoch 7 - iter 438/738 - loss 0.02027023 - time (sec): 80.65 - samples/sec: 1189.40 - lr: 0.000011 - momentum: 0.000000
+ 2023-09-04 15:25:15,277 epoch 7 - iter 511/738 - loss 0.02061434 - time (sec): 93.74 - samples/sec: 1196.34 - lr: 0.000011 - momentum: 0.000000
+ 2023-09-04 15:25:29,003 epoch 7 - iter 584/738 - loss 0.02034981 - time (sec): 107.46 - samples/sec: 1196.32 - lr: 0.000011 - momentum: 0.000000
+ 2023-09-04 15:25:45,325 epoch 7 - iter 657/738 - loss 0.02041326 - time (sec): 123.79 - samples/sec: 1195.65 - lr: 0.000010 - momentum: 0.000000
+ 2023-09-04 15:25:59,500 epoch 7 - iter 730/738 - loss 0.01986392 - time (sec): 137.96 - samples/sec: 1191.73 - lr: 0.000010 - momentum: 0.000000
+ 2023-09-04 15:26:01,204 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:26:01,204 EPOCH 7 done: loss 0.0198 - lr: 0.000010
+ 2023-09-04 15:26:18,945 DEV : loss 0.19946980476379395 - f1-score (micro avg)  0.8101
+ 2023-09-04 15:26:18,974 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:26:33,611 epoch 8 - iter 73/738 - loss 0.01320934 - time (sec): 14.63 - samples/sec: 1199.27 - lr: 0.000010 - momentum: 0.000000
+ 2023-09-04 15:26:46,284 epoch 8 - iter 146/738 - loss 0.01087234 - time (sec): 27.31 - samples/sec: 1190.62 - lr: 0.000009 - momentum: 0.000000
+ 2023-09-04 15:27:00,406 epoch 8 - iter 219/738 - loss 0.01118795 - time (sec): 41.43 - samples/sec: 1193.36 - lr: 0.000009 - momentum: 0.000000
+ 2023-09-04 15:27:13,072 epoch 8 - iter 292/738 - loss 0.01173749 - time (sec): 54.10 - samples/sec: 1196.64 - lr: 0.000009 - momentum: 0.000000
+ 2023-09-04 15:27:27,562 epoch 8 - iter 365/738 - loss 0.01497626 - time (sec): 68.59 - samples/sec: 1182.80 - lr: 0.000008 - momentum: 0.000000
+ 2023-09-04 15:27:43,186 epoch 8 - iter 438/738 - loss 0.01439217 - time (sec): 84.21 - samples/sec: 1175.93 - lr: 0.000008 - momentum: 0.000000
+ 2023-09-04 15:27:54,652 epoch 8 - iter 511/738 - loss 0.01407249 - time (sec): 95.68 - samples/sec: 1189.87 - lr: 0.000008 - momentum: 0.000000
+ 2023-09-04 15:28:09,602 epoch 8 - iter 584/738 - loss 0.01366217 - time (sec): 110.63 - samples/sec: 1183.80 - lr: 0.000007 - momentum: 0.000000
+ 2023-09-04 15:28:22,514 epoch 8 - iter 657/738 - loss 0.01343997 - time (sec): 123.54 - samples/sec: 1186.85 - lr: 0.000007 - momentum: 0.000000
+ 2023-09-04 15:28:37,442 epoch 8 - iter 730/738 - loss 0.01419631 - time (sec): 138.47 - samples/sec: 1190.92 - lr: 0.000007 - momentum: 0.000000
+ 2023-09-04 15:28:38,663 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:28:38,664 EPOCH 8 done: loss 0.0142 - lr: 0.000007
+ 2023-09-04 15:28:56,470 DEV : loss 0.1939600259065628 - f1-score (micro avg)  0.8261
+ 2023-09-04 15:28:56,500 saving best model
+ 2023-09-04 15:28:57,845 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:29:11,878 epoch 9 - iter 73/738 - loss 0.01087719 - time (sec): 14.03 - samples/sec: 1194.75 - lr: 0.000006 - momentum: 0.000000
+ 2023-09-04 15:29:26,380 epoch 9 - iter 146/738 - loss 0.01176628 - time (sec): 28.53 - samples/sec: 1173.05 - lr: 0.000006 - momentum: 0.000000
+ 2023-09-04 15:29:38,093 epoch 9 - iter 219/738 - loss 0.01045319 - time (sec): 40.25 - samples/sec: 1199.96 - lr: 0.000006 - momentum: 0.000000
+ 2023-09-04 15:29:50,788 epoch 9 - iter 292/738 - loss 0.01059609 - time (sec): 52.94 - samples/sec: 1200.20 - lr: 0.000005 - momentum: 0.000000
+ 2023-09-04 15:30:04,953 epoch 9 - iter 365/738 - loss 0.01182372 - time (sec): 67.11 - samples/sec: 1182.59 - lr: 0.000005 - momentum: 0.000000
+ 2023-09-04 15:30:20,455 epoch 9 - iter 438/738 - loss 0.01097159 - time (sec): 82.61 - samples/sec: 1174.63 - lr: 0.000005 - momentum: 0.000000
+ 2023-09-04 15:30:35,578 epoch 9 - iter 511/738 - loss 0.01034011 - time (sec): 97.73 - samples/sec: 1173.99 - lr: 0.000004 - momentum: 0.000000
+ 2023-09-04 15:30:48,048 epoch 9 - iter 584/738 - loss 0.01093906 - time (sec): 110.20 - samples/sec: 1182.16 - lr: 0.000004 - momentum: 0.000000
+ 2023-09-04 15:31:01,277 epoch 9 - iter 657/738 - loss 0.01060596 - time (sec): 123.43 - samples/sec: 1182.92 - lr: 0.000004 - momentum: 0.000000
+ 2023-09-04 15:31:15,726 epoch 9 - iter 730/738 - loss 0.01058406 - time (sec): 137.88 - samples/sec: 1193.37 - lr: 0.000003 - momentum: 0.000000
+ 2023-09-04 15:31:17,063 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:31:17,064 EPOCH 9 done: loss 0.0105 - lr: 0.000003
+ 2023-09-04 15:31:34,810 DEV : loss 0.20513701438903809 - f1-score (micro avg)  0.8227
+ 2023-09-04 15:31:34,839 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:31:48,240 epoch 10 - iter 73/738 - loss 0.00152471 - time (sec): 13.40 - samples/sec: 1191.48 - lr: 0.000003 - momentum: 0.000000
+ 2023-09-04 15:32:01,756 epoch 10 - iter 146/738 - loss 0.00902703 - time (sec): 26.92 - samples/sec: 1210.66 - lr: 0.000003 - momentum: 0.000000
+ 2023-09-04 15:32:13,703 epoch 10 - iter 219/738 - loss 0.00963469 - time (sec): 38.86 - samples/sec: 1221.31 - lr: 0.000002 - momentum: 0.000000
+ 2023-09-04 15:32:28,695 epoch 10 - iter 292/738 - loss 0.01078399 - time (sec): 53.85 - samples/sec: 1218.41 - lr: 0.000002 - momentum: 0.000000
+ 2023-09-04 15:32:44,713 epoch 10 - iter 365/738 - loss 0.01129022 - time (sec): 69.87 - samples/sec: 1201.46 - lr: 0.000002 - momentum: 0.000000
+ 2023-09-04 15:32:57,832 epoch 10 - iter 438/738 - loss 0.01051066 - time (sec): 82.99 - samples/sec: 1200.25 - lr: 0.000001 - momentum: 0.000000
+ 2023-09-04 15:33:11,789 epoch 10 - iter 511/738 - loss 0.00970689 - time (sec): 96.95 - samples/sec: 1203.46 - lr: 0.000001 - momentum: 0.000000
+ 2023-09-04 15:33:27,378 epoch 10 - iter 584/738 - loss 0.00932251 - time (sec): 112.54 - samples/sec: 1194.23 - lr: 0.000001 - momentum: 0.000000
+ 2023-09-04 15:33:40,681 epoch 10 - iter 657/738 - loss 0.00881038 - time (sec): 125.84 - samples/sec: 1193.25 - lr: 0.000000 - momentum: 0.000000
+ 2023-09-04 15:33:52,640 epoch 10 - iter 730/738 - loss 0.00838896 - time (sec): 137.80 - samples/sec: 1195.44 - lr: 0.000000 - momentum: 0.000000
+ 2023-09-04 15:33:53,938 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:33:53,938 EPOCH 10 done: loss 0.0083 - lr: 0.000000
+ 2023-09-04 15:34:11,671 DEV : loss 0.20588278770446777 - f1-score (micro avg)  0.8215
+ 2023-09-04 15:34:12,180 ----------------------------------------------------------------------------------------------------
+ 2023-09-04 15:34:12,181 Loading model from best epoch ...
+ 2023-09-04 15:34:14,139 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-time, B-time, E-time, I-time, S-prod, B-prod, E-prod, I-prod
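The 21 tags follow the BIOES scheme over five entity types (loc, pers, org, time, prod): S- marks a single-token entity, B-/I-/E- mark the begin, inside, and end of a multi-token one. A simplified decoder sketch for turning such a tag sequence into spans (it only closes spans on E- tags and ignores malformed sequences; Flair's own decoding is more defensive):

```python
def bioes_to_spans(tags: list[str]) -> list[tuple[int, int, str]]:
    """Decode a BIOES tag sequence into (start, end_inclusive, label) spans."""
    spans, start = [], None
    for i, tag in enumerate(tags):
        if tag == "O":
            start = None
            continue
        prefix, label = tag.split("-", 1)
        if prefix == "S":
            spans.append((i, i, label))
        elif prefix == "B":
            start = i
        elif prefix == "E" and start is not None:
            spans.append((start, i, label))
            start = None
    return spans
```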
+ 2023-09-04 15:34:29,363
+ Results:
+ - F-score (micro) 0.7885
+ - F-score (macro) 0.688
+ - Accuracy 0.6729
+
+ By class:
+               precision    recall  f1-score   support
+
+          loc     0.8694    0.8613    0.8653       858
+         pers     0.7423    0.8045    0.7721       537
+          org     0.4765    0.6136    0.5364       132
+         time     0.5231    0.6296    0.5714        54
+         prod     0.7193    0.6721    0.6949        61
+
+    micro avg     0.7697    0.8082    0.7885      1642
+    macro avg     0.6661    0.7162    0.6880      1642
+ weighted avg     0.7793    0.8082    0.7924      1642
+
+ 2023-09-04 15:34:29,363 ----------------------------------------------------------------------------------------------------
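As a sanity check on the test-set table above: each F1 is the harmonic mean of precision and recall, and the macro score is the unweighted mean of the per-class F1 values. Both headline numbers can be reproduced from the table:

```python
def f1(precision: float, recall: float) -> float:
    """F1 as the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

def macro_f1(per_class_f1: list[float]) -> float:
    """Unweighted mean of per-class F1 scores."""
    return sum(per_class_f1) / len(per_class_f1)
```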