Upload folder using huggingface_hub
- hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3/best-model.pt +3 -0
- hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3/dev.tsv +0 -0
- hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3/final-model.pt +3 -0
- hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3/loss.tsv +11 -0
- hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3/test.tsv +0 -0
- hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3/training.log +244 -0
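The commit title refers to the standard `huggingface_hub` folder upload. A minimal sketch of how such a commit is typically created, assuming the `HfApi.upload_folder` call; the repo id below is a placeholder and not part of this commit:

from huggingface_hub import HfApi

# Sketch only: push the local run folder as a single commit.
# folder_path mirrors the directory committed here; repo_id is a placeholder.
api = HfApi()
api.upload_folder(
    folder_path="hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3",
    path_in_repo="hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3",
    repo_id="<user>/<model-repo>",  # placeholder
    repo_type="model",
    commit_message="Upload folder using huggingface_hub",
)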
hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3/best-model.pt
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0b0ad79c44744c1dafd86e5bf2a85cdabcc177000df5b3675cfdc7c5d978f650
size 443334288
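The *.pt files in this commit are stored as Git LFS pointers (version, oid, size) rather than raw weights. A minimal sketch, assuming `huggingface_hub`, of resolving such a pointer to the actual checkpoint; the repo id is again a placeholder:

from huggingface_hub import hf_hub_download

# Sketch only: download the real ~443 MB checkpoint behind the LFS pointer.
local_path = hf_hub_download(
    repo_id="<user>/<model-repo>",  # placeholder
    filename="hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3/best-model.pt",
)
print(local_path)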
hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3/dev.tsv
ADDED
The diff for this file is too large to render. See raw diff.
hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3/final-model.pt
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:12cc79d121c394b42b292e08a4b70d4019acbdc60ebdbed55a7a26865eafb1a5
size 443334491
hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3/loss.tsv
ADDED
@@ -0,0 +1,11 @@
EPOCH  TIMESTAMP  LEARNING_RATE  TRAIN_LOSS  DEV_LOSS  DEV_PRECISION  DEV_RECALL  DEV_F1  DEV_ACCURACY
1      13:14:25   0.0000         0.5892      0.1587    0.6165         0.6987      0.6550  0.5098
2      13:17:03   0.0000         0.1282      0.1271    0.7215         0.8070      0.7618  0.6422
3      13:19:41   0.0000         0.0701      0.1342    0.8198         0.8053      0.8125  0.7123
4      13:22:20   0.0000         0.0480      0.1354    0.8137         0.8230      0.8183  0.7221
5      13:24:59   0.0000         0.0338      0.1619    0.7995         0.8517      0.8247  0.7211
6      13:27:36   0.0000         0.0241      0.1825    0.8040         0.8436      0.8234  0.7274
7      13:30:13   0.0000         0.0162      0.1901    0.8071         0.8482      0.8271  0.7306
8      13:32:51   0.0000         0.0122      0.1996    0.8367         0.8568      0.8466  0.7533
9      13:35:30   0.0000         0.0105      0.2048    0.8151         0.8557      0.8349  0.7411
10     13:38:07   0.0000         0.0072      0.2036    0.8169         0.8534      0.8347  0.7420
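The folder name encodes the run configuration (batch size 8, 10 epochs, lr 3e-05, first-subtoken pooling, last transformer layer only, no CRF), and the training.log below records a Flair fine-tuning run with those settings on the HIPE-2020 French corpus. A minimal sketch of how such a run is typically launched, assuming the Flair API; the corpus loader arguments are illustrative and not taken from the commit:

from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Illustrative corpus setup; exact loader arguments are an assumption.
corpus = NER_HIPE_2022(dataset_name="hipe2020", language="fr")
label_dict = corpus.make_label_dictionary(label_type="ner")

embeddings = TransformerWordEmbeddings(
    "dbmdz/bert-base-historic-multilingual-cased",
    layers="-1",               # last layer only ("layers-1" in the folder name)
    subtoken_pooling="first",  # "poolingfirst"
    fine_tune=True,
)
tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,             # "crfFalse"
    use_rnn=False,
    reproject_embeddings=False,
)

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3",
    learning_rate=3e-05,
    mini_batch_size=8,
    max_epochs=10,
)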
hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3/test.tsv
ADDED
The diff for this file is too large to render. See raw diff.
hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3/training.log
ADDED
@@ -0,0 +1,244 @@
2023-09-04 13:11:51,975 ----------------------------------------------------------------------------------------------------
2023-09-04 13:11:51,976 Model: "SequenceTagger(
(embeddings): TransformerWordEmbeddings(
(model): BertModel(
(embeddings): BertEmbeddings(
(word_embeddings): Embedding(32001, 768)
(position_embeddings): Embedding(512, 768)
(token_type_embeddings): Embedding(2, 768)
(LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
(dropout): Dropout(p=0.1, inplace=False)
)
(encoder): BertEncoder(
(layer): ModuleList(
(0-11): 12 x BertLayer(
(attention): BertAttention(
(self): BertSelfAttention(
(query): Linear(in_features=768, out_features=768, bias=True)
(key): Linear(in_features=768, out_features=768, bias=True)
(value): Linear(in_features=768, out_features=768, bias=True)
(dropout): Dropout(p=0.1, inplace=False)
)
(output): BertSelfOutput(
(dense): Linear(in_features=768, out_features=768, bias=True)
(LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
(dropout): Dropout(p=0.1, inplace=False)
)
)
(intermediate): BertIntermediate(
(dense): Linear(in_features=768, out_features=3072, bias=True)
(intermediate_act_fn): GELUActivation()
)
(output): BertOutput(
(dense): Linear(in_features=3072, out_features=768, bias=True)
(LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
(dropout): Dropout(p=0.1, inplace=False)
)
)
)
)
(pooler): BertPooler(
(dense): Linear(in_features=768, out_features=768, bias=True)
(activation): Tanh()
)
)
)
(locked_dropout): LockedDropout(p=0.5)
(linear): Linear(in_features=768, out_features=21, bias=True)
(loss_function): CrossEntropyLoss()
)"
2023-09-04 13:11:51,976 ----------------------------------------------------------------------------------------------------
2023-09-04 13:11:51,976 MultiCorpus: 5901 train + 1287 dev + 1505 test sentences
- NER_HIPE_2022 Corpus: 5901 train + 1287 dev + 1505 test sentences - /app/.flair/datasets/ner_hipe_2022/v2.1/hipe2020/fr/with_doc_seperator
2023-09-04 13:11:51,977 ----------------------------------------------------------------------------------------------------
2023-09-04 13:11:51,977 Train: 5901 sentences
2023-09-04 13:11:51,977 (train_with_dev=False, train_with_test=False)
2023-09-04 13:11:51,977 ----------------------------------------------------------------------------------------------------
2023-09-04 13:11:51,977 Training Params:
2023-09-04 13:11:51,977 - learning_rate: "3e-05"
2023-09-04 13:11:51,977 - mini_batch_size: "8"
2023-09-04 13:11:51,977 - max_epochs: "10"
2023-09-04 13:11:51,977 - shuffle: "True"
2023-09-04 13:11:51,977 ----------------------------------------------------------------------------------------------------
2023-09-04 13:11:51,977 Plugins:
2023-09-04 13:11:51,977 - LinearScheduler | warmup_fraction: '0.1'
2023-09-04 13:11:51,977 ----------------------------------------------------------------------------------------------------
2023-09-04 13:11:51,977 Final evaluation on model from best epoch (best-model.pt)
2023-09-04 13:11:51,977 - metric: "('micro avg', 'f1-score')"
2023-09-04 13:11:51,977 ----------------------------------------------------------------------------------------------------
2023-09-04 13:11:51,977 Computation:
2023-09-04 13:11:51,977 - compute on device: cuda:0
2023-09-04 13:11:51,977 - embedding storage: none
2023-09-04 13:11:51,977 ----------------------------------------------------------------------------------------------------
2023-09-04 13:11:51,978 Model training base path: "hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3"
2023-09-04 13:11:51,978 ----------------------------------------------------------------------------------------------------
2023-09-04 13:11:51,978 ----------------------------------------------------------------------------------------------------
2023-09-04 13:12:03,953 epoch 1 - iter 73/738 - loss 2.75602228 - time (sec): 11.97 - samples/sec: 1273.50 - lr: 0.000003 - momentum: 0.000000
2023-09-04 13:12:17,588 epoch 1 - iter 146/738 - loss 1.78975133 - time (sec): 25.61 - samples/sec: 1274.41 - lr: 0.000006 - momentum: 0.000000
2023-09-04 13:12:30,242 epoch 1 - iter 219/738 - loss 1.38693502 - time (sec): 38.26 - samples/sec: 1246.97 - lr: 0.000009 - momentum: 0.000000
2023-09-04 13:12:43,939 epoch 1 - iter 292/738 - loss 1.13179773 - time (sec): 51.96 - samples/sec: 1232.31 - lr: 0.000012 - momentum: 0.000000
2023-09-04 13:12:57,996 epoch 1 - iter 365/738 - loss 0.96224049 - time (sec): 66.02 - samples/sec: 1228.92 - lr: 0.000015 - momentum: 0.000000
2023-09-04 13:13:12,327 epoch 1 - iter 438/738 - loss 0.84747624 - time (sec): 80.35 - samples/sec: 1221.50 - lr: 0.000018 - momentum: 0.000000
2023-09-04 13:13:27,827 epoch 1 - iter 511/738 - loss 0.75714596 - time (sec): 95.85 - samples/sec: 1207.80 - lr: 0.000021 - momentum: 0.000000
2023-09-04 13:13:40,696 epoch 1 - iter 584/738 - loss 0.69586826 - time (sec): 108.72 - samples/sec: 1208.13 - lr: 0.000024 - momentum: 0.000000
2023-09-04 13:13:56,760 epoch 1 - iter 657/738 - loss 0.63385064 - time (sec): 124.78 - samples/sec: 1196.11 - lr: 0.000027 - momentum: 0.000000
2023-09-04 13:14:09,623 epoch 1 - iter 730/738 - loss 0.59220811 - time (sec): 137.64 - samples/sec: 1195.82 - lr: 0.000030 - momentum: 0.000000
2023-09-04 13:14:11,411 ----------------------------------------------------------------------------------------------------
2023-09-04 13:14:11,412 EPOCH 1 done: loss 0.5892 - lr: 0.000030
2023-09-04 13:14:25,706 DEV : loss 0.1587315797805786 - f1-score (micro avg) 0.655
2023-09-04 13:14:25,738 saving best model
2023-09-04 13:14:26,224 ----------------------------------------------------------------------------------------------------
2023-09-04 13:14:39,513 epoch 2 - iter 73/738 - loss 0.14661112 - time (sec): 13.29 - samples/sec: 1235.59 - lr: 0.000030 - momentum: 0.000000
2023-09-04 13:14:54,190 epoch 2 - iter 146/738 - loss 0.14644812 - time (sec): 27.96 - samples/sec: 1196.07 - lr: 0.000029 - momentum: 0.000000
2023-09-04 13:15:09,943 epoch 2 - iter 219/738 - loss 0.14424972 - time (sec): 43.72 - samples/sec: 1185.36 - lr: 0.000029 - momentum: 0.000000
2023-09-04 13:15:22,687 epoch 2 - iter 292/738 - loss 0.13918826 - time (sec): 56.46 - samples/sec: 1192.27 - lr: 0.000029 - momentum: 0.000000
2023-09-04 13:15:35,226 epoch 2 - iter 365/738 - loss 0.13897294 - time (sec): 69.00 - samples/sec: 1202.28 - lr: 0.000028 - momentum: 0.000000
2023-09-04 13:15:51,089 epoch 2 - iter 438/738 - loss 0.13695682 - time (sec): 84.86 - samples/sec: 1190.76 - lr: 0.000028 - momentum: 0.000000
2023-09-04 13:16:04,920 epoch 2 - iter 511/738 - loss 0.13451500 - time (sec): 98.70 - samples/sec: 1190.42 - lr: 0.000028 - momentum: 0.000000
2023-09-04 13:16:18,494 epoch 2 - iter 584/738 - loss 0.13136503 - time (sec): 112.27 - samples/sec: 1187.92 - lr: 0.000027 - momentum: 0.000000
2023-09-04 13:16:31,506 epoch 2 - iter 657/738 - loss 0.12943913 - time (sec): 125.28 - samples/sec: 1190.07 - lr: 0.000027 - momentum: 0.000000
2023-09-04 13:16:44,391 epoch 2 - iter 730/738 - loss 0.12776621 - time (sec): 138.17 - samples/sec: 1194.73 - lr: 0.000027 - momentum: 0.000000
2023-09-04 13:16:45,552 ----------------------------------------------------------------------------------------------------
2023-09-04 13:16:45,552 EPOCH 2 done: loss 0.1282 - lr: 0.000027
2023-09-04 13:17:03,310 DEV : loss 0.12707410752773285 - f1-score (micro avg) 0.7618
2023-09-04 13:17:03,352 saving best model
2023-09-04 13:17:04,706 ----------------------------------------------------------------------------------------------------
2023-09-04 13:17:16,977 epoch 3 - iter 73/738 - loss 0.05888291 - time (sec): 12.27 - samples/sec: 1236.67 - lr: 0.000026 - momentum: 0.000000
2023-09-04 13:17:30,274 epoch 3 - iter 146/738 - loss 0.06740253 - time (sec): 25.57 - samples/sec: 1244.96 - lr: 0.000026 - momentum: 0.000000
2023-09-04 13:17:43,955 epoch 3 - iter 219/738 - loss 0.06793000 - time (sec): 39.25 - samples/sec: 1228.69 - lr: 0.000026 - momentum: 0.000000
2023-09-04 13:17:59,973 epoch 3 - iter 292/738 - loss 0.06543958 - time (sec): 55.27 - samples/sec: 1220.52 - lr: 0.000025 - momentum: 0.000000
2023-09-04 13:18:14,484 epoch 3 - iter 365/738 - loss 0.06924468 - time (sec): 69.78 - samples/sec: 1201.50 - lr: 0.000025 - momentum: 0.000000
2023-09-04 13:18:27,108 epoch 3 - iter 438/738 - loss 0.06805337 - time (sec): 82.40 - samples/sec: 1207.20 - lr: 0.000025 - momentum: 0.000000
2023-09-04 13:18:42,057 epoch 3 - iter 511/738 - loss 0.07060851 - time (sec): 97.35 - samples/sec: 1199.40 - lr: 0.000024 - momentum: 0.000000
2023-09-04 13:18:55,060 epoch 3 - iter 584/738 - loss 0.06855825 - time (sec): 110.35 - samples/sec: 1201.96 - lr: 0.000024 - momentum: 0.000000
2023-09-04 13:19:09,159 epoch 3 - iter 657/738 - loss 0.07127784 - time (sec): 124.45 - samples/sec: 1196.62 - lr: 0.000024 - momentum: 0.000000
2023-09-04 13:19:22,578 epoch 3 - iter 730/738 - loss 0.07001988 - time (sec): 137.87 - samples/sec: 1193.91 - lr: 0.000023 - momentum: 0.000000
2023-09-04 13:19:24,071 ----------------------------------------------------------------------------------------------------
2023-09-04 13:19:24,071 EPOCH 3 done: loss 0.0701 - lr: 0.000023
2023-09-04 13:19:41,811 DEV : loss 0.13421885669231415 - f1-score (micro avg) 0.8125
2023-09-04 13:19:41,840 saving best model
2023-09-04 13:19:43,182 ----------------------------------------------------------------------------------------------------
2023-09-04 13:19:57,692 epoch 4 - iter 73/738 - loss 0.05014642 - time (sec): 14.51 - samples/sec: 1191.67 - lr: 0.000023 - momentum: 0.000000
2023-09-04 13:20:11,003 epoch 4 - iter 146/738 - loss 0.04616425 - time (sec): 27.82 - samples/sec: 1209.40 - lr: 0.000023 - momentum: 0.000000
2023-09-04 13:20:25,225 epoch 4 - iter 219/738 - loss 0.04529066 - time (sec): 42.04 - samples/sec: 1208.84 - lr: 0.000022 - momentum: 0.000000
2023-09-04 13:20:38,127 epoch 4 - iter 292/738 - loss 0.04577257 - time (sec): 54.94 - samples/sec: 1212.18 - lr: 0.000022 - momentum: 0.000000
2023-09-04 13:20:50,019 epoch 4 - iter 365/738 - loss 0.04568850 - time (sec): 66.84 - samples/sec: 1219.99 - lr: 0.000022 - momentum: 0.000000
2023-09-04 13:21:04,952 epoch 4 - iter 438/738 - loss 0.04500584 - time (sec): 81.77 - samples/sec: 1206.64 - lr: 0.000021 - momentum: 0.000000
2023-09-04 13:21:21,135 epoch 4 - iter 511/738 - loss 0.04786348 - time (sec): 97.95 - samples/sec: 1200.02 - lr: 0.000021 - momentum: 0.000000
2023-09-04 13:21:33,905 epoch 4 - iter 584/738 - loss 0.04748241 - time (sec): 110.72 - samples/sec: 1198.21 - lr: 0.000021 - momentum: 0.000000
2023-09-04 13:21:47,311 epoch 4 - iter 657/738 - loss 0.04592685 - time (sec): 124.13 - samples/sec: 1195.38 - lr: 0.000020 - momentum: 0.000000
2023-09-04 13:22:00,502 epoch 4 - iter 730/738 - loss 0.04810729 - time (sec): 137.32 - samples/sec: 1199.11 - lr: 0.000020 - momentum: 0.000000
2023-09-04 13:22:02,199 ----------------------------------------------------------------------------------------------------
2023-09-04 13:22:02,199 EPOCH 4 done: loss 0.0480 - lr: 0.000020
2023-09-04 13:22:19,999 DEV : loss 0.13538895547389984 - f1-score (micro avg) 0.8183
2023-09-04 13:22:20,028 saving best model
2023-09-04 13:22:21,380 ----------------------------------------------------------------------------------------------------
2023-09-04 13:22:36,051 epoch 5 - iter 73/738 - loss 0.04671179 - time (sec): 14.67 - samples/sec: 1233.47 - lr: 0.000020 - momentum: 0.000000
2023-09-04 13:22:49,208 epoch 5 - iter 146/738 - loss 0.03446949 - time (sec): 27.83 - samples/sec: 1221.96 - lr: 0.000019 - momentum: 0.000000
2023-09-04 13:23:04,474 epoch 5 - iter 219/738 - loss 0.03659458 - time (sec): 43.09 - samples/sec: 1192.53 - lr: 0.000019 - momentum: 0.000000
2023-09-04 13:23:17,370 epoch 5 - iter 292/738 - loss 0.03531527 - time (sec): 55.99 - samples/sec: 1192.59 - lr: 0.000019 - momentum: 0.000000
2023-09-04 13:23:29,921 epoch 5 - iter 365/738 - loss 0.03436687 - time (sec): 68.54 - samples/sec: 1200.03 - lr: 0.000018 - momentum: 0.000000
2023-09-04 13:23:43,300 epoch 5 - iter 438/738 - loss 0.03255600 - time (sec): 81.92 - samples/sec: 1193.43 - lr: 0.000018 - momentum: 0.000000
2023-09-04 13:23:57,599 epoch 5 - iter 511/738 - loss 0.03506066 - time (sec): 96.22 - samples/sec: 1187.46 - lr: 0.000018 - momentum: 0.000000
2023-09-04 13:24:11,271 epoch 5 - iter 584/738 - loss 0.03440222 - time (sec): 109.89 - samples/sec: 1183.17 - lr: 0.000017 - momentum: 0.000000
2023-09-04 13:24:23,715 epoch 5 - iter 657/738 - loss 0.03342787 - time (sec): 122.33 - samples/sec: 1189.46 - lr: 0.000017 - momentum: 0.000000
2023-09-04 13:24:39,828 epoch 5 - iter 730/738 - loss 0.03369684 - time (sec): 138.45 - samples/sec: 1189.46 - lr: 0.000017 - momentum: 0.000000
2023-09-04 13:24:41,217 ----------------------------------------------------------------------------------------------------
2023-09-04 13:24:41,217 EPOCH 5 done: loss 0.0338 - lr: 0.000017
2023-09-04 13:24:58,989 DEV : loss 0.1618889570236206 - f1-score (micro avg) 0.8247
2023-09-04 13:24:59,017 saving best model
2023-09-04 13:25:00,709 ----------------------------------------------------------------------------------------------------
2023-09-04 13:25:15,781 epoch 6 - iter 73/738 - loss 0.02828833 - time (sec): 15.07 - samples/sec: 1198.38 - lr: 0.000016 - momentum: 0.000000
2023-09-04 13:25:30,703 epoch 6 - iter 146/738 - loss 0.02403498 - time (sec): 29.99 - samples/sec: 1181.15 - lr: 0.000016 - momentum: 0.000000
2023-09-04 13:25:46,210 epoch 6 - iter 219/738 - loss 0.02668753 - time (sec): 45.50 - samples/sec: 1181.33 - lr: 0.000016 - momentum: 0.000000
2023-09-04 13:25:59,184 epoch 6 - iter 292/738 - loss 0.02491381 - time (sec): 58.47 - samples/sec: 1191.98 - lr: 0.000015 - momentum: 0.000000
2023-09-04 13:26:12,291 epoch 6 - iter 365/738 - loss 0.02476714 - time (sec): 71.58 - samples/sec: 1196.85 - lr: 0.000015 - momentum: 0.000000
2023-09-04 13:26:26,470 epoch 6 - iter 438/738 - loss 0.02513810 - time (sec): 85.76 - samples/sec: 1189.15 - lr: 0.000015 - momentum: 0.000000
2023-09-04 13:26:40,199 epoch 6 - iter 511/738 - loss 0.02481393 - time (sec): 99.49 - samples/sec: 1188.50 - lr: 0.000014 - momentum: 0.000000
2023-09-04 13:26:52,644 epoch 6 - iter 584/738 - loss 0.02428163 - time (sec): 111.93 - samples/sec: 1191.01 - lr: 0.000014 - momentum: 0.000000
2023-09-04 13:27:05,379 epoch 6 - iter 657/738 - loss 0.02398217 - time (sec): 124.67 - samples/sec: 1193.36 - lr: 0.000014 - momentum: 0.000000
2023-09-04 13:27:17,979 epoch 6 - iter 730/738 - loss 0.02425206 - time (sec): 137.27 - samples/sec: 1198.33 - lr: 0.000013 - momentum: 0.000000
2023-09-04 13:27:19,256 ----------------------------------------------------------------------------------------------------
2023-09-04 13:27:19,256 EPOCH 6 done: loss 0.0241 - lr: 0.000013
2023-09-04 13:27:36,813 DEV : loss 0.1825319528579712 - f1-score (micro avg) 0.8234
2023-09-04 13:27:36,842 ----------------------------------------------------------------------------------------------------
2023-09-04 13:27:48,635 epoch 7 - iter 73/738 - loss 0.00914290 - time (sec): 11.79 - samples/sec: 1304.15 - lr: 0.000013 - momentum: 0.000000
2023-09-04 13:28:03,792 epoch 7 - iter 146/738 - loss 0.01069055 - time (sec): 26.95 - samples/sec: 1224.80 - lr: 0.000013 - momentum: 0.000000
2023-09-04 13:28:15,680 epoch 7 - iter 219/738 - loss 0.01521915 - time (sec): 38.84 - samples/sec: 1230.65 - lr: 0.000012 - momentum: 0.000000
2023-09-04 13:28:31,130 epoch 7 - iter 292/738 - loss 0.01541385 - time (sec): 54.29 - samples/sec: 1200.87 - lr: 0.000012 - momentum: 0.000000
2023-09-04 13:28:43,444 epoch 7 - iter 365/738 - loss 0.01508426 - time (sec): 66.60 - samples/sec: 1211.95 - lr: 0.000012 - momentum: 0.000000
2023-09-04 13:28:55,691 epoch 7 - iter 438/738 - loss 0.01526207 - time (sec): 78.85 - samples/sec: 1215.71 - lr: 0.000011 - momentum: 0.000000
2023-09-04 13:29:13,342 epoch 7 - iter 511/738 - loss 0.01443015 - time (sec): 96.50 - samples/sec: 1198.49 - lr: 0.000011 - momentum: 0.000000
2023-09-04 13:29:27,917 epoch 7 - iter 584/738 - loss 0.01549634 - time (sec): 111.07 - samples/sec: 1196.09 - lr: 0.000011 - momentum: 0.000000
2023-09-04 13:29:42,011 epoch 7 - iter 657/738 - loss 0.01657500 - time (sec): 125.17 - samples/sec: 1193.80 - lr: 0.000010 - momentum: 0.000000
2023-09-04 13:29:54,812 epoch 7 - iter 730/738 - loss 0.01625377 - time (sec): 137.97 - samples/sec: 1195.47 - lr: 0.000010 - momentum: 0.000000
2023-09-04 13:29:55,998 ----------------------------------------------------------------------------------------------------
2023-09-04 13:29:55,998 EPOCH 7 done: loss 0.0162 - lr: 0.000010
2023-09-04 13:30:13,528 DEV : loss 0.19014191627502441 - f1-score (micro avg) 0.8271
2023-09-04 13:30:13,559 saving best model
2023-09-04 13:30:14,932 ----------------------------------------------------------------------------------------------------
2023-09-04 13:30:28,527 epoch 8 - iter 73/738 - loss 0.01108812 - time (sec): 13.59 - samples/sec: 1234.53 - lr: 0.000010 - momentum: 0.000000
2023-09-04 13:30:43,067 epoch 8 - iter 146/738 - loss 0.01146445 - time (sec): 28.13 - samples/sec: 1216.69 - lr: 0.000009 - momentum: 0.000000
2023-09-04 13:30:58,044 epoch 8 - iter 219/738 - loss 0.01584998 - time (sec): 43.11 - samples/sec: 1214.91 - lr: 0.000009 - momentum: 0.000000
2023-09-04 13:31:12,736 epoch 8 - iter 292/738 - loss 0.01561122 - time (sec): 57.80 - samples/sec: 1195.87 - lr: 0.000009 - momentum: 0.000000
2023-09-04 13:31:26,522 epoch 8 - iter 365/738 - loss 0.01502835 - time (sec): 71.59 - samples/sec: 1193.29 - lr: 0.000008 - momentum: 0.000000
2023-09-04 13:31:40,747 epoch 8 - iter 438/738 - loss 0.01318786 - time (sec): 85.81 - samples/sec: 1193.34 - lr: 0.000008 - momentum: 0.000000
2023-09-04 13:31:53,389 epoch 8 - iter 511/738 - loss 0.01283994 - time (sec): 98.46 - samples/sec: 1199.78 - lr: 0.000008 - momentum: 0.000000
2023-09-04 13:32:05,728 epoch 8 - iter 584/738 - loss 0.01185692 - time (sec): 110.79 - samples/sec: 1197.11 - lr: 0.000007 - momentum: 0.000000
2023-09-04 13:32:17,932 epoch 8 - iter 657/738 - loss 0.01228898 - time (sec): 123.00 - samples/sec: 1202.88 - lr: 0.000007 - momentum: 0.000000
2023-09-04 13:32:31,626 epoch 8 - iter 730/738 - loss 0.01226298 - time (sec): 136.69 - samples/sec: 1202.13 - lr: 0.000007 - momentum: 0.000000
2023-09-04 13:32:33,521 ----------------------------------------------------------------------------------------------------
2023-09-04 13:32:33,522 EPOCH 8 done: loss 0.0122 - lr: 0.000007
2023-09-04 13:32:51,210 DEV : loss 0.19962483644485474 - f1-score (micro avg) 0.8466
2023-09-04 13:32:51,239 saving best model
2023-09-04 13:32:52,602 ----------------------------------------------------------------------------------------------------
2023-09-04 13:33:08,735 epoch 9 - iter 73/738 - loss 0.01247255 - time (sec): 16.13 - samples/sec: 1107.03 - lr: 0.000006 - momentum: 0.000000
2023-09-04 13:33:23,042 epoch 9 - iter 146/738 - loss 0.00811625 - time (sec): 30.44 - samples/sec: 1144.30 - lr: 0.000006 - momentum: 0.000000
2023-09-04 13:33:36,126 epoch 9 - iter 219/738 - loss 0.00728959 - time (sec): 43.52 - samples/sec: 1155.49 - lr: 0.000006 - momentum: 0.000000
2023-09-04 13:33:49,468 epoch 9 - iter 292/738 - loss 0.00705434 - time (sec): 56.86 - samples/sec: 1172.72 - lr: 0.000005 - momentum: 0.000000
2023-09-04 13:34:01,292 epoch 9 - iter 365/738 - loss 0.00947955 - time (sec): 68.69 - samples/sec: 1182.24 - lr: 0.000005 - momentum: 0.000000
2023-09-04 13:34:15,930 epoch 9 - iter 438/738 - loss 0.01064723 - time (sec): 83.33 - samples/sec: 1177.40 - lr: 0.000005 - momentum: 0.000000
2023-09-04 13:34:30,665 epoch 9 - iter 511/738 - loss 0.01062089 - time (sec): 98.06 - samples/sec: 1181.74 - lr: 0.000004 - momentum: 0.000000
2023-09-04 13:34:43,192 epoch 9 - iter 584/738 - loss 0.01063647 - time (sec): 110.59 - samples/sec: 1194.00 - lr: 0.000004 - momentum: 0.000000
2023-09-04 13:34:56,263 epoch 9 - iter 657/738 - loss 0.01080669 - time (sec): 123.66 - samples/sec: 1199.00 - lr: 0.000004 - momentum: 0.000000
2023-09-04 13:35:10,415 epoch 9 - iter 730/738 - loss 0.01055278 - time (sec): 137.81 - samples/sec: 1194.07 - lr: 0.000003 - momentum: 0.000000
2023-09-04 13:35:12,137 ----------------------------------------------------------------------------------------------------
2023-09-04 13:35:12,137 EPOCH 9 done: loss 0.0105 - lr: 0.000003
2023-09-04 13:35:29,981 DEV : loss 0.20482125878334045 - f1-score (micro avg) 0.8349
2023-09-04 13:35:30,010 ----------------------------------------------------------------------------------------------------
2023-09-04 13:35:44,528 epoch 10 - iter 73/738 - loss 0.00718317 - time (sec): 14.52 - samples/sec: 1234.09 - lr: 0.000003 - momentum: 0.000000
2023-09-04 13:35:56,824 epoch 10 - iter 146/738 - loss 0.00740199 - time (sec): 26.81 - samples/sec: 1227.37 - lr: 0.000003 - momentum: 0.000000
2023-09-04 13:36:11,371 epoch 10 - iter 219/738 - loss 0.00534209 - time (sec): 41.36 - samples/sec: 1207.24 - lr: 0.000002 - momentum: 0.000000
2023-09-04 13:36:24,891 epoch 10 - iter 292/738 - loss 0.00518679 - time (sec): 54.88 - samples/sec: 1189.48 - lr: 0.000002 - momentum: 0.000000
2023-09-04 13:36:40,128 epoch 10 - iter 365/738 - loss 0.00715314 - time (sec): 70.12 - samples/sec: 1197.55 - lr: 0.000002 - momentum: 0.000000
2023-09-04 13:36:53,702 epoch 10 - iter 438/738 - loss 0.00673732 - time (sec): 83.69 - samples/sec: 1195.61 - lr: 0.000001 - momentum: 0.000000
2023-09-04 13:37:07,468 epoch 10 - iter 511/738 - loss 0.00696276 - time (sec): 97.46 - samples/sec: 1196.19 - lr: 0.000001 - momentum: 0.000000
2023-09-04 13:37:21,101 epoch 10 - iter 584/738 - loss 0.00713198 - time (sec): 111.09 - samples/sec: 1199.97 - lr: 0.000001 - momentum: 0.000000
2023-09-04 13:37:33,771 epoch 10 - iter 657/738 - loss 0.00680166 - time (sec): 123.76 - samples/sec: 1204.49 - lr: 0.000000 - momentum: 0.000000
2023-09-04 13:37:48,728 epoch 10 - iter 730/738 - loss 0.00710272 - time (sec): 138.72 - samples/sec: 1190.21 - lr: 0.000000 - momentum: 0.000000
2023-09-04 13:37:49,854 ----------------------------------------------------------------------------------------------------
2023-09-04 13:37:49,855 EPOCH 10 done: loss 0.0072 - lr: 0.000000
2023-09-04 13:38:07,696 DEV : loss 0.20363713800907135 - f1-score (micro avg) 0.8347
2023-09-04 13:38:08,211 ----------------------------------------------------------------------------------------------------
2023-09-04 13:38:08,212 Loading model from best epoch ...
2023-09-04 13:38:10,073 SequenceTagger predicts: Dictionary with 21 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org, S-time, B-time, E-time, I-time, S-prod, B-prod, E-prod, I-prod
2023-09-04 13:38:25,478
Results:
- F-score (micro) 0.7973
- F-score (macro) 0.6802
- Accuracy 0.6864

By class:
              precision    recall  f1-score   support

         loc     0.8719    0.8730    0.8725       858
        pers     0.7474    0.8156    0.7801       537
         org     0.5769    0.5682    0.5725       132
        prod     0.6418    0.7049    0.6719        61
        time     0.4615    0.5556    0.5042        54

   micro avg     0.7821    0.8130    0.7973      1642
   macro avg     0.6599    0.7035    0.6802      1642
weighted avg     0.7855    0.8130    0.7986      1642

2023-09-04 13:38:25,478 ----------------------------------------------------------------------------------------------------
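Once downloaded, the best-model.pt checkpoint from this commit can be used directly as a Flair tagger. A minimal usage sketch, assuming the Flair API; the local path mirrors the uploaded folder layout and the example sentence is arbitrary:

from flair.data import Sentence
from flair.models import SequenceTagger

# Load the checkpoint saved as best-model.pt in this folder (local path is illustrative).
tagger = SequenceTagger.load(
    "hmbench-hipe2020/fr-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-3/best-model.pt"
)

# Tag an arbitrary French sentence and print the predicted spans.
sentence = Sentence("Victor Hugo est né à Besançon en 1802 .")
tagger.predict(sentence)
for span in sentence.get_spans("ner"):
    print(span.text, span.tag, span.score)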