Upload folder using huggingface_hub

- best-model.pt +3 -0
- dev.tsv +0 -0
- loss.tsv +11 -0
- runs/events.out.tfevents.1697745066.46dc0c540dd0.4731.4 +3 -0
- test.tsv +0 -0
- training.log +248 -0
best-model.pt
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:a218f6230b0fd0f0c82d1355b01441148bee01cdd1941fc7a11ab6f9f299e021
+size 19048098
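The checkpoint above is stored as a Git LFS pointer rather than the binary weights themselves. A minimal sketch of reading such a pointer file (the `parse_lfs_pointer` helper is illustrative, not part of any library):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its key/value fields.

    LFS pointers are plain text: each line is "<key> <value>".
    """
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The pointer content of best-model.pt, as committed above
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:a218f6230b0fd0f0c82d1355b01441148bee01cdd1941fc7a11ab6f9f299e021
size 19048098"""

info = parse_lfs_pointer(pointer)
print(info["size"])  # 19048098 (bytes)
```

The `size` field is the byte count of the real file, so the checkpoint is roughly 19 MB, consistent with a BERT-tiny-sized tagger.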
dev.tsv
ADDED
The diff for this file is too large to render.
See raw diff
loss.tsv
ADDED
@@ -0,0 +1,11 @@
+EPOCH	TIMESTAMP	LEARNING_RATE	TRAIN_LOSS	DEV_LOSS	DEV_PRECISION	DEV_RECALL	DEV_F1	DEV_ACCURACY
+1	19:51:40	0.0000	1.0801	0.3227	0.2206	0.1020	0.1395	0.0765
+2	19:52:14	0.0000	0.4333	0.2583	0.3764	0.3293	0.3512	0.2255
+3	19:52:47	0.0000	0.3587	0.2308	0.4181	0.4517	0.4343	0.2975
+4	19:53:21	0.0000	0.3211	0.2132	0.4536	0.4857	0.4691	0.3269
+5	19:53:54	0.0000	0.2925	0.2065	0.4537	0.5129	0.4815	0.3384
+6	19:54:28	0.0000	0.2727	0.2003	0.4576	0.5143	0.4843	0.3387
+7	19:55:02	0.0000	0.2580	0.2005	0.4796	0.5265	0.5019	0.3550
+8	19:55:35	0.0000	0.2499	0.1967	0.4678	0.5442	0.5031	0.3571
+9	19:56:08	0.0000	0.2419	0.1973	0.4773	0.5429	0.5080	0.3624
+10	19:56:42	0.0000	0.2373	0.1960	0.4722	0.5551	0.5103	0.3643
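Flair writes one row per epoch to loss.tsv, so the file is easy to inspect programmatically. A minimal sketch, using a few abridged rows from the table above (whitespace-split here for readability; the real file is tab-separated):

```python
# Abridged rows from loss.tsv (epochs 1, 5, 10)
header = ("EPOCH TIMESTAMP LEARNING_RATE TRAIN_LOSS DEV_LOSS "
          "DEV_PRECISION DEV_RECALL DEV_F1 DEV_ACCURACY").split()
data = [
    "1 19:51:40 0.0000 1.0801 0.3227 0.2206 0.1020 0.1395 0.0765".split(),
    "5 19:53:54 0.0000 0.2925 0.2065 0.4537 0.5129 0.4815 0.3384".split(),
    "10 19:56:42 0.0000 0.2373 0.1960 0.4722 0.5551 0.5103 0.3643".split(),
]
rows = [dict(zip(header, r)) for r in data]

# Pick the epoch with the best dev micro-F1
best = max(rows, key=lambda r: float(r["DEV_F1"]))
print(best["EPOCH"], best["DEV_F1"])  # 10 0.5103
```

On the full table the dev F1 rises monotonically through epoch 10, which is why the final epoch is also the saved best model.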
runs/events.out.tfevents.1697745066.46dc0c540dd0.4731.4
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:e6cf4f18a618d628b0b869dfa1c6938a434d058c8ca79deda6cc882f866aea88
+size 999862
test.tsv
ADDED
The diff for this file is too large to render.
See raw diff
training.log
ADDED
@@ -0,0 +1,248 @@
+2023-10-19 19:51:06,768 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:51:06,768 Model: "SequenceTagger(
+  (embeddings): TransformerWordEmbeddings(
+    (model): BertModel(
+      (embeddings): BertEmbeddings(
+        (word_embeddings): Embedding(32001, 128)
+        (position_embeddings): Embedding(512, 128)
+        (token_type_embeddings): Embedding(2, 128)
+        (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
+        (dropout): Dropout(p=0.1, inplace=False)
+      )
+      (encoder): BertEncoder(
+        (layer): ModuleList(
+          (0-1): 2 x BertLayer(
+            (attention): BertAttention(
+              (self): BertSelfAttention(
+                (query): Linear(in_features=128, out_features=128, bias=True)
+                (key): Linear(in_features=128, out_features=128, bias=True)
+                (value): Linear(in_features=128, out_features=128, bias=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+              (output): BertSelfOutput(
+                (dense): Linear(in_features=128, out_features=128, bias=True)
+                (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+            )
+            (intermediate): BertIntermediate(
+              (dense): Linear(in_features=128, out_features=512, bias=True)
+              (intermediate_act_fn): GELUActivation()
+            )
+            (output): BertOutput(
+              (dense): Linear(in_features=512, out_features=128, bias=True)
+              (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
+              (dropout): Dropout(p=0.1, inplace=False)
+            )
+          )
+        )
+      )
+      (pooler): BertPooler(
+        (dense): Linear(in_features=128, out_features=128, bias=True)
+        (activation): Tanh()
+      )
+    )
+  )
+  (locked_dropout): LockedDropout(p=0.5)
+  (linear): Linear(in_features=128, out_features=17, bias=True)
+  (loss_function): CrossEntropyLoss()
+)"
+2023-10-19 19:51:06,768 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:51:06,768 MultiCorpus: 7142 train + 698 dev + 2570 test sentences
+ - NER_HIPE_2022 Corpus: 7142 train + 698 dev + 2570 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/newseye/fr/with_doc_seperator
+2023-10-19 19:51:06,768 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:51:06,768 Train:  7142 sentences
+2023-10-19 19:51:06,768         (train_with_dev=False, train_with_test=False)
+2023-10-19 19:51:06,768 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:51:06,768 Training Params:
+2023-10-19 19:51:06,768  - learning_rate: "3e-05"
+2023-10-19 19:51:06,768  - mini_batch_size: "4"
+2023-10-19 19:51:06,769  - max_epochs: "10"
+2023-10-19 19:51:06,769  - shuffle: "True"
+2023-10-19 19:51:06,769 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:51:06,769 Plugins:
+2023-10-19 19:51:06,769  - TensorboardLogger
+2023-10-19 19:51:06,769  - LinearScheduler | warmup_fraction: '0.1'
+2023-10-19 19:51:06,769 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:51:06,769 Final evaluation on model from best epoch (best-model.pt)
+2023-10-19 19:51:06,769  - metric: "('micro avg', 'f1-score')"
+2023-10-19 19:51:06,769 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:51:06,769 Computation:
+2023-10-19 19:51:06,769  - compute on device: cuda:0
+2023-10-19 19:51:06,769  - embedding storage: none
+2023-10-19 19:51:06,769 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:51:06,769 Model training base path: "hmbench-newseye/fr-dbmdz/bert-tiny-historic-multilingual-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2"
+2023-10-19 19:51:06,769 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:51:06,769 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:51:06,769 Logging anything other than scalars to TensorBoard is currently not supported.
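The training params and plugins above imply the shape of the learning-rate curve: a LinearScheduler with `warmup_fraction: '0.1'` and a peak learning rate of 3e-05, over 10 epochs of 1786 batches each. The per-iteration `lr:` values in the epoch logs that follow are consistent with a linear warmup to the peak followed by a linear decay to zero. A minimal sketch of that schedule (an illustration inferred from the logged values, not Flair's actual implementation):

```python
def linear_schedule_lr(step: int, total_steps: int, peak_lr: float = 3e-05,
                       warmup_fraction: float = 0.1) -> float:
    """Linear warmup to peak_lr over the first warmup_fraction of steps,
    then linear decay to zero over the remaining steps."""
    warmup_steps = int(total_steps * warmup_fraction)
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

total = 1786 * 10  # batches per epoch x epochs, as in this run
# At iter 178 of epoch 1 the log shows lr 0.000003; the sketch agrees:
print(linear_schedule_lr(178, total))  # ~3e-06
```

With warmup_fraction 0.1 the warmup spans exactly epoch 1, matching the log: lr climbs to 0.000030 by the end of epoch 1, then falls to 0.000000 by the end of epoch 10.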
+2023-10-19 19:51:09,948 epoch 1 - iter 178/1786 - loss 2.79687367 - time (sec): 3.18 - samples/sec: 8405.70 - lr: 0.000003 - momentum: 0.000000
+2023-10-19 19:51:13,196 epoch 1 - iter 356/1786 - loss 2.50644631 - time (sec): 6.43 - samples/sec: 7917.10 - lr: 0.000006 - momentum: 0.000000
+2023-10-19 19:51:16,242 epoch 1 - iter 534/1786 - loss 2.11669177 - time (sec): 9.47 - samples/sec: 7847.71 - lr: 0.000009 - momentum: 0.000000
+2023-10-19 19:51:19,447 epoch 1 - iter 712/1786 - loss 1.78468271 - time (sec): 12.68 - samples/sec: 7832.18 - lr: 0.000012 - momentum: 0.000000
+2023-10-19 19:51:22,749 epoch 1 - iter 890/1786 - loss 1.58099334 - time (sec): 15.98 - samples/sec: 7696.48 - lr: 0.000015 - momentum: 0.000000
+2023-10-19 19:51:25,906 epoch 1 - iter 1068/1786 - loss 1.44626933 - time (sec): 19.14 - samples/sec: 7688.35 - lr: 0.000018 - momentum: 0.000000
+2023-10-19 19:51:29,188 epoch 1 - iter 1246/1786 - loss 1.32135288 - time (sec): 22.42 - samples/sec: 7711.92 - lr: 0.000021 - momentum: 0.000000
+2023-10-19 19:51:32,297 epoch 1 - iter 1424/1786 - loss 1.22571550 - time (sec): 25.53 - samples/sec: 7701.62 - lr: 0.000024 - momentum: 0.000000
+2023-10-19 19:51:35,449 epoch 1 - iter 1602/1786 - loss 1.14718597 - time (sec): 28.68 - samples/sec: 7738.29 - lr: 0.000027 - momentum: 0.000000
+2023-10-19 19:51:38,565 epoch 1 - iter 1780/1786 - loss 1.08157944 - time (sec): 31.80 - samples/sec: 7802.23 - lr: 0.000030 - momentum: 0.000000
+2023-10-19 19:51:38,668 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:51:38,668 EPOCH 1 done: loss 1.0801 - lr: 0.000030
+2023-10-19 19:51:40,134 DEV : loss 0.3227035701274872 - f1-score (micro avg)  0.1395
+2023-10-19 19:51:40,149 saving best model
+2023-10-19 19:51:40,183 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:51:43,289 epoch 2 - iter 178/1786 - loss 0.50600833 - time (sec): 3.11 - samples/sec: 7616.92 - lr: 0.000030 - momentum: 0.000000
+2023-10-19 19:51:46,333 epoch 2 - iter 356/1786 - loss 0.46557022 - time (sec): 6.15 - samples/sec: 7898.55 - lr: 0.000029 - momentum: 0.000000
+2023-10-19 19:51:49,371 epoch 2 - iter 534/1786 - loss 0.46513364 - time (sec): 9.19 - samples/sec: 7766.67 - lr: 0.000029 - momentum: 0.000000
+2023-10-19 19:51:52,594 epoch 2 - iter 712/1786 - loss 0.45220596 - time (sec): 12.41 - samples/sec: 7737.80 - lr: 0.000029 - momentum: 0.000000
+2023-10-19 19:51:56,019 epoch 2 - iter 890/1786 - loss 0.45404700 - time (sec): 15.84 - samples/sec: 7667.36 - lr: 0.000028 - momentum: 0.000000
+2023-10-19 19:51:59,181 epoch 2 - iter 1068/1786 - loss 0.44855842 - time (sec): 19.00 - samples/sec: 7742.35 - lr: 0.000028 - momentum: 0.000000
+2023-10-19 19:52:02,417 epoch 2 - iter 1246/1786 - loss 0.44009549 - time (sec): 22.23 - samples/sec: 7836.30 - lr: 0.000028 - momentum: 0.000000
+2023-10-19 19:52:05,463 epoch 2 - iter 1424/1786 - loss 0.43715972 - time (sec): 25.28 - samples/sec: 7889.50 - lr: 0.000027 - momentum: 0.000000
+2023-10-19 19:52:08,578 epoch 2 - iter 1602/1786 - loss 0.43518582 - time (sec): 28.39 - samples/sec: 7891.60 - lr: 0.000027 - momentum: 0.000000
+2023-10-19 19:52:11,635 epoch 2 - iter 1780/1786 - loss 0.43341001 - time (sec): 31.45 - samples/sec: 7891.97 - lr: 0.000027 - momentum: 0.000000
+2023-10-19 19:52:11,727 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:52:11,727 EPOCH 2 done: loss 0.4333 - lr: 0.000027
+2023-10-19 19:52:14,547 DEV : loss 0.2583433985710144 - f1-score (micro avg)  0.3512
+2023-10-19 19:52:14,562 saving best model
+2023-10-19 19:52:14,594 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:52:17,843 epoch 3 - iter 178/1786 - loss 0.36689807 - time (sec): 3.25 - samples/sec: 7327.59 - lr: 0.000026 - momentum: 0.000000
+2023-10-19 19:52:21,112 epoch 3 - iter 356/1786 - loss 0.34773508 - time (sec): 6.52 - samples/sec: 7774.91 - lr: 0.000026 - momentum: 0.000000
+2023-10-19 19:52:24,217 epoch 3 - iter 534/1786 - loss 0.34418896 - time (sec): 9.62 - samples/sec: 7853.08 - lr: 0.000026 - momentum: 0.000000
+2023-10-19 19:52:27,388 epoch 3 - iter 712/1786 - loss 0.35168554 - time (sec): 12.79 - samples/sec: 7808.77 - lr: 0.000025 - momentum: 0.000000
+2023-10-19 19:52:30,291 epoch 3 - iter 890/1786 - loss 0.35595638 - time (sec): 15.70 - samples/sec: 8022.78 - lr: 0.000025 - momentum: 0.000000
+2023-10-19 19:52:33,085 epoch 3 - iter 1068/1786 - loss 0.35812865 - time (sec): 18.49 - samples/sec: 8195.12 - lr: 0.000025 - momentum: 0.000000
+2023-10-19 19:52:36,081 epoch 3 - iter 1246/1786 - loss 0.35668010 - time (sec): 21.49 - samples/sec: 8128.47 - lr: 0.000024 - momentum: 0.000000
+2023-10-19 19:52:39,100 epoch 3 - iter 1424/1786 - loss 0.35997953 - time (sec): 24.50 - samples/sec: 8132.30 - lr: 0.000024 - momentum: 0.000000
+2023-10-19 19:52:42,129 epoch 3 - iter 1602/1786 - loss 0.36107656 - time (sec): 27.53 - samples/sec: 8152.51 - lr: 0.000024 - momentum: 0.000000
+2023-10-19 19:52:45,104 epoch 3 - iter 1780/1786 - loss 0.35849403 - time (sec): 30.51 - samples/sec: 8127.99 - lr: 0.000023 - momentum: 0.000000
+2023-10-19 19:52:45,209 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:52:45,209 EPOCH 3 done: loss 0.3587 - lr: 0.000023
+2023-10-19 19:52:47,579 DEV : loss 0.2307889312505722 - f1-score (micro avg)  0.4343
+2023-10-19 19:52:47,594 saving best model
+2023-10-19 19:52:47,627 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:52:50,603 epoch 4 - iter 178/1786 - loss 0.32503973 - time (sec): 2.98 - samples/sec: 8709.72 - lr: 0.000023 - momentum: 0.000000
+2023-10-19 19:52:53,661 epoch 4 - iter 356/1786 - loss 0.33876825 - time (sec): 6.03 - samples/sec: 8239.92 - lr: 0.000023 - momentum: 0.000000
+2023-10-19 19:52:56,706 epoch 4 - iter 534/1786 - loss 0.34909168 - time (sec): 9.08 - samples/sec: 8139.72 - lr: 0.000022 - momentum: 0.000000
+2023-10-19 19:52:59,800 epoch 4 - iter 712/1786 - loss 0.33361868 - time (sec): 12.17 - samples/sec: 8180.84 - lr: 0.000022 - momentum: 0.000000
+2023-10-19 19:53:02,796 epoch 4 - iter 890/1786 - loss 0.32921053 - time (sec): 15.17 - samples/sec: 8202.07 - lr: 0.000022 - momentum: 0.000000
+2023-10-19 19:53:05,850 epoch 4 - iter 1068/1786 - loss 0.32434636 - time (sec): 18.22 - samples/sec: 8142.54 - lr: 0.000021 - momentum: 0.000000
+2023-10-19 19:53:08,954 epoch 4 - iter 1246/1786 - loss 0.32469591 - time (sec): 21.33 - samples/sec: 8072.21 - lr: 0.000021 - momentum: 0.000000
+2023-10-19 19:53:11,970 epoch 4 - iter 1424/1786 - loss 0.32393392 - time (sec): 24.34 - samples/sec: 8076.36 - lr: 0.000021 - momentum: 0.000000
+2023-10-19 19:53:15,084 epoch 4 - iter 1602/1786 - loss 0.32421030 - time (sec): 27.46 - samples/sec: 8099.84 - lr: 0.000020 - momentum: 0.000000
+2023-10-19 19:53:18,196 epoch 4 - iter 1780/1786 - loss 0.32088289 - time (sec): 30.57 - samples/sec: 8119.54 - lr: 0.000020 - momentum: 0.000000
+2023-10-19 19:53:18,287 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:53:18,287 EPOCH 4 done: loss 0.3211 - lr: 0.000020
+2023-10-19 19:53:21,110 DEV : loss 0.2131872922182083 - f1-score (micro avg)  0.4691
+2023-10-19 19:53:21,125 saving best model
+2023-10-19 19:53:21,160 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:53:24,078 epoch 5 - iter 178/1786 - loss 0.31966126 - time (sec): 2.92 - samples/sec: 8571.95 - lr: 0.000020 - momentum: 0.000000
+2023-10-19 19:53:27,158 epoch 5 - iter 356/1786 - loss 0.30979983 - time (sec): 6.00 - samples/sec: 8542.72 - lr: 0.000019 - momentum: 0.000000
+2023-10-19 19:53:30,224 epoch 5 - iter 534/1786 - loss 0.30109050 - time (sec): 9.06 - samples/sec: 8366.97 - lr: 0.000019 - momentum: 0.000000
+2023-10-19 19:53:33,352 epoch 5 - iter 712/1786 - loss 0.30298422 - time (sec): 12.19 - samples/sec: 8163.31 - lr: 0.000019 - momentum: 0.000000
+2023-10-19 19:53:36,499 epoch 5 - iter 890/1786 - loss 0.30175508 - time (sec): 15.34 - samples/sec: 7974.28 - lr: 0.000018 - momentum: 0.000000
+2023-10-19 19:53:39,560 epoch 5 - iter 1068/1786 - loss 0.29354489 - time (sec): 18.40 - samples/sec: 8027.67 - lr: 0.000018 - momentum: 0.000000
+2023-10-19 19:53:42,547 epoch 5 - iter 1246/1786 - loss 0.29628320 - time (sec): 21.39 - samples/sec: 7996.02 - lr: 0.000018 - momentum: 0.000000
+2023-10-19 19:53:45,679 epoch 5 - iter 1424/1786 - loss 0.29373074 - time (sec): 24.52 - samples/sec: 8009.08 - lr: 0.000017 - momentum: 0.000000
+2023-10-19 19:53:48,821 epoch 5 - iter 1602/1786 - loss 0.29243459 - time (sec): 27.66 - samples/sec: 8049.63 - lr: 0.000017 - momentum: 0.000000
+2023-10-19 19:53:51,974 epoch 5 - iter 1780/1786 - loss 0.29269761 - time (sec): 30.81 - samples/sec: 8046.10 - lr: 0.000017 - momentum: 0.000000
+2023-10-19 19:53:52,092 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:53:52,092 EPOCH 5 done: loss 0.2925 - lr: 0.000017
+2023-10-19 19:53:54,463 DEV : loss 0.20653365552425385 - f1-score (micro avg)  0.4815
+2023-10-19 19:53:54,477 saving best model
+2023-10-19 19:53:54,510 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:53:57,812 epoch 6 - iter 178/1786 - loss 0.26481799 - time (sec): 3.30 - samples/sec: 7606.85 - lr: 0.000016 - momentum: 0.000000
+2023-10-19 19:54:00,948 epoch 6 - iter 356/1786 - loss 0.26935936 - time (sec): 6.44 - samples/sec: 7536.84 - lr: 0.000016 - momentum: 0.000000
+2023-10-19 19:54:04,139 epoch 6 - iter 534/1786 - loss 0.27348573 - time (sec): 9.63 - samples/sec: 7506.83 - lr: 0.000016 - momentum: 0.000000
+2023-10-19 19:54:07,207 epoch 6 - iter 712/1786 - loss 0.27134878 - time (sec): 12.70 - samples/sec: 7760.38 - lr: 0.000015 - momentum: 0.000000
+2023-10-19 19:54:10,303 epoch 6 - iter 890/1786 - loss 0.26920475 - time (sec): 15.79 - samples/sec: 7915.32 - lr: 0.000015 - momentum: 0.000000
+2023-10-19 19:54:13,384 epoch 6 - iter 1068/1786 - loss 0.27009143 - time (sec): 18.87 - samples/sec: 7904.76 - lr: 0.000015 - momentum: 0.000000
+2023-10-19 19:54:16,439 epoch 6 - iter 1246/1786 - loss 0.27126902 - time (sec): 21.93 - samples/sec: 7886.87 - lr: 0.000014 - momentum: 0.000000
+2023-10-19 19:54:19,503 epoch 6 - iter 1424/1786 - loss 0.27292987 - time (sec): 24.99 - samples/sec: 7902.13 - lr: 0.000014 - momentum: 0.000000
+2023-10-19 19:54:22,691 epoch 6 - iter 1602/1786 - loss 0.27225979 - time (sec): 28.18 - samples/sec: 7926.75 - lr: 0.000014 - momentum: 0.000000
+2023-10-19 19:54:25,821 epoch 6 - iter 1780/1786 - loss 0.27262868 - time (sec): 31.31 - samples/sec: 7923.30 - lr: 0.000013 - momentum: 0.000000
+2023-10-19 19:54:25,918 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:54:25,918 EPOCH 6 done: loss 0.2727 - lr: 0.000013
+2023-10-19 19:54:28,745 DEV : loss 0.2003042846918106 - f1-score (micro avg)  0.4843
+2023-10-19 19:54:28,759 saving best model
+2023-10-19 19:54:28,791 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:54:31,964 epoch 7 - iter 178/1786 - loss 0.24380929 - time (sec): 3.17 - samples/sec: 8323.54 - lr: 0.000013 - momentum: 0.000000
+2023-10-19 19:54:35,071 epoch 7 - iter 356/1786 - loss 0.25726475 - time (sec): 6.28 - samples/sec: 8238.91 - lr: 0.000013 - momentum: 0.000000
+2023-10-19 19:54:38,117 epoch 7 - iter 534/1786 - loss 0.25410518 - time (sec): 9.33 - samples/sec: 8058.86 - lr: 0.000012 - momentum: 0.000000
+2023-10-19 19:54:41,153 epoch 7 - iter 712/1786 - loss 0.25769061 - time (sec): 12.36 - samples/sec: 7964.33 - lr: 0.000012 - momentum: 0.000000
+2023-10-19 19:54:44,262 epoch 7 - iter 890/1786 - loss 0.25823402 - time (sec): 15.47 - samples/sec: 7972.49 - lr: 0.000012 - momentum: 0.000000
+2023-10-19 19:54:47,394 epoch 7 - iter 1068/1786 - loss 0.25747742 - time (sec): 18.60 - samples/sec: 7959.44 - lr: 0.000011 - momentum: 0.000000
+2023-10-19 19:54:50,595 epoch 7 - iter 1246/1786 - loss 0.25719904 - time (sec): 21.80 - samples/sec: 7962.56 - lr: 0.000011 - momentum: 0.000000
+2023-10-19 19:54:53,694 epoch 7 - iter 1424/1786 - loss 0.25589988 - time (sec): 24.90 - samples/sec: 8050.60 - lr: 0.000011 - momentum: 0.000000
+2023-10-19 19:54:56,745 epoch 7 - iter 1602/1786 - loss 0.25845771 - time (sec): 27.95 - samples/sec: 8021.59 - lr: 0.000010 - momentum: 0.000000
+2023-10-19 19:54:59,724 epoch 7 - iter 1780/1786 - loss 0.25787745 - time (sec): 30.93 - samples/sec: 8027.94 - lr: 0.000010 - momentum: 0.000000
+2023-10-19 19:54:59,817 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:54:59,818 EPOCH 7 done: loss 0.2580 - lr: 0.000010
+2023-10-19 19:55:02,193 DEV : loss 0.2004682868719101 - f1-score (micro avg)  0.5019
+2023-10-19 19:55:02,209 saving best model
+2023-10-19 19:55:02,246 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:55:05,316 epoch 8 - iter 178/1786 - loss 0.24861152 - time (sec): 3.07 - samples/sec: 8175.43 - lr: 0.000010 - momentum: 0.000000
+2023-10-19 19:55:08,419 epoch 8 - iter 356/1786 - loss 0.24086117 - time (sec): 6.17 - samples/sec: 8098.43 - lr: 0.000009 - momentum: 0.000000
+2023-10-19 19:55:11,525 epoch 8 - iter 534/1786 - loss 0.24815113 - time (sec): 9.28 - samples/sec: 7956.56 - lr: 0.000009 - momentum: 0.000000
+2023-10-19 19:55:14,603 epoch 8 - iter 712/1786 - loss 0.24997628 - time (sec): 12.36 - samples/sec: 7987.55 - lr: 0.000009 - momentum: 0.000000
+2023-10-19 19:55:17,675 epoch 8 - iter 890/1786 - loss 0.24787170 - time (sec): 15.43 - samples/sec: 8019.54 - lr: 0.000008 - momentum: 0.000000
+2023-10-19 19:55:20,753 epoch 8 - iter 1068/1786 - loss 0.24807736 - time (sec): 18.51 - samples/sec: 8065.54 - lr: 0.000008 - momentum: 0.000000
+2023-10-19 19:55:23,762 epoch 8 - iter 1246/1786 - loss 0.24768224 - time (sec): 21.52 - samples/sec: 8087.21 - lr: 0.000008 - momentum: 0.000000
+2023-10-19 19:55:26,840 epoch 8 - iter 1424/1786 - loss 0.24665818 - time (sec): 24.59 - samples/sec: 8095.57 - lr: 0.000007 - momentum: 0.000000
+2023-10-19 19:55:29,829 epoch 8 - iter 1602/1786 - loss 0.24985874 - time (sec): 27.58 - samples/sec: 8098.97 - lr: 0.000007 - momentum: 0.000000
+2023-10-19 19:55:32,995 epoch 8 - iter 1780/1786 - loss 0.25000798 - time (sec): 30.75 - samples/sec: 8064.42 - lr: 0.000007 - momentum: 0.000000
+2023-10-19 19:55:33,104 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:55:33,104 EPOCH 8 done: loss 0.2499 - lr: 0.000007
+2023-10-19 19:55:35,972 DEV : loss 0.19668100774288177 - f1-score (micro avg)  0.5031
+2023-10-19 19:55:35,986 saving best model
+2023-10-19 19:55:36,020 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:55:39,211 epoch 9 - iter 178/1786 - loss 0.23800143 - time (sec): 3.19 - samples/sec: 8302.38 - lr: 0.000006 - momentum: 0.000000
+2023-10-19 19:55:42,260 epoch 9 - iter 356/1786 - loss 0.23583672 - time (sec): 6.24 - samples/sec: 8301.37 - lr: 0.000006 - momentum: 0.000000
+2023-10-19 19:55:45,281 epoch 9 - iter 534/1786 - loss 0.23499003 - time (sec): 9.26 - samples/sec: 8227.23 - lr: 0.000006 - momentum: 0.000000
+2023-10-19 19:55:48,244 epoch 9 - iter 712/1786 - loss 0.23888394 - time (sec): 12.22 - samples/sec: 8139.72 - lr: 0.000005 - momentum: 0.000000
+2023-10-19 19:55:51,288 epoch 9 - iter 890/1786 - loss 0.24062823 - time (sec): 15.27 - samples/sec: 8068.23 - lr: 0.000005 - momentum: 0.000000
+2023-10-19 19:55:54,409 epoch 9 - iter 1068/1786 - loss 0.24208841 - time (sec): 18.39 - samples/sec: 8095.06 - lr: 0.000005 - momentum: 0.000000
+2023-10-19 19:55:57,156 epoch 9 - iter 1246/1786 - loss 0.24572741 - time (sec): 21.13 - samples/sec: 8235.25 - lr: 0.000004 - momentum: 0.000000
+2023-10-19 19:56:00,158 epoch 9 - iter 1424/1786 - loss 0.24461181 - time (sec): 24.14 - samples/sec: 8218.35 - lr: 0.000004 - momentum: 0.000000
+2023-10-19 19:56:03,242 epoch 9 - iter 1602/1786 - loss 0.24434313 - time (sec): 27.22 - samples/sec: 8220.44 - lr: 0.000004 - momentum: 0.000000
+2023-10-19 19:56:06,397 epoch 9 - iter 1780/1786 - loss 0.24230508 - time (sec): 30.38 - samples/sec: 8165.70 - lr: 0.000003 - momentum: 0.000000
+2023-10-19 19:56:06,497 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:56:06,497 EPOCH 9 done: loss 0.2419 - lr: 0.000003
+2023-10-19 19:56:08,853 DEV : loss 0.1973438411951065 - f1-score (micro avg)  0.508
+2023-10-19 19:56:08,867 saving best model
+2023-10-19 19:56:08,900 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:56:11,976 epoch 10 - iter 178/1786 - loss 0.24379983 - time (sec): 3.08 - samples/sec: 7549.01 - lr: 0.000003 - momentum: 0.000000
+2023-10-19 19:56:15,113 epoch 10 - iter 356/1786 - loss 0.25116343 - time (sec): 6.21 - samples/sec: 7700.35 - lr: 0.000003 - momentum: 0.000000
+2023-10-19 19:56:18,116 epoch 10 - iter 534/1786 - loss 0.25060293 - time (sec): 9.22 - samples/sec: 7865.97 - lr: 0.000002 - momentum: 0.000000
+2023-10-19 19:56:21,260 epoch 10 - iter 712/1786 - loss 0.25011516 - time (sec): 12.36 - samples/sec: 7898.81 - lr: 0.000002 - momentum: 0.000000
+2023-10-19 19:56:24,503 epoch 10 - iter 890/1786 - loss 0.24555631 - time (sec): 15.60 - samples/sec: 7783.06 - lr: 0.000002 - momentum: 0.000000
+2023-10-19 19:56:27,557 epoch 10 - iter 1068/1786 - loss 0.24118692 - time (sec): 18.66 - samples/sec: 7834.94 - lr: 0.000001 - momentum: 0.000000
+2023-10-19 19:56:30,299 epoch 10 - iter 1246/1786 - loss 0.23655811 - time (sec): 21.40 - samples/sec: 8048.55 - lr: 0.000001 - momentum: 0.000000
+2023-10-19 19:56:33,323 epoch 10 - iter 1424/1786 - loss 0.23172756 - time (sec): 24.42 - samples/sec: 8110.64 - lr: 0.000001 - momentum: 0.000000
+2023-10-19 19:56:36,431 epoch 10 - iter 1602/1786 - loss 0.23408678 - time (sec): 27.53 - samples/sec: 8114.79 - lr: 0.000000 - momentum: 0.000000
+2023-10-19 19:56:39,501 epoch 10 - iter 1780/1786 - loss 0.23658798 - time (sec): 30.60 - samples/sec: 8109.58 - lr: 0.000000 - momentum: 0.000000
+2023-10-19 19:56:39,599 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:56:39,599 EPOCH 10 done: loss 0.2373 - lr: 0.000000
+2023-10-19 19:56:42,416 DEV : loss 0.19595196843147278 - f1-score (micro avg)  0.5103
+2023-10-19 19:56:42,430 saving best model
+2023-10-19 19:56:42,489 ----------------------------------------------------------------------------------------------------
+2023-10-19 19:56:42,489 Loading model from best epoch ...
+2023-10-19 19:56:42,562 SequenceTagger predicts: Dictionary with 17 tags: O, S-PER, B-PER, E-PER, I-PER, S-LOC, B-LOC, E-LOC, I-LOC, S-ORG, B-ORG, E-ORG, I-ORG, S-HumanProd, B-HumanProd, E-HumanProd, I-HumanProd
+2023-10-19 19:56:47,193
+Results:
+- F-score (micro) 0.414
+- F-score (macro) 0.2562
+- Accuracy 0.271
+
+By class:
+              precision    recall  f1-score   support
+
+         LOC     0.3964    0.5169    0.4487      1095
+         PER     0.4249    0.4951    0.4573      1012
+         ORG     0.1581    0.0952    0.1189       357
+   HumanProd     0.0000    0.0000    0.0000        33
+
+   micro avg     0.3901    0.4409    0.4140      2497
+   macro avg     0.2449    0.2768    0.2562      2497
+weighted avg     0.3686    0.4409    0.3991      2497
+
+2023-10-19 19:56:47,193 ----------------------------------------------------------------------------------------------------
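The aggregate rows in the final test table follow directly from the per-class rows: the macro average is the unweighted mean of the per-class F1 scores, and the weighted average weights each class by its support. A minimal sketch recomputing them from the table above:

```python
# (f1-score, support) per class, copied from the evaluation table
classes = {
    "LOC": (0.4487, 1095),
    "PER": (0.4573, 1012),
    "ORG": (0.1189, 357),
    "HumanProd": (0.0000, 33),
}

# Macro F1: unweighted mean over classes (the rare HumanProd class,
# with F1 = 0, drags it well below the micro F1)
macro_f1 = sum(f1 for f1, _ in classes.values()) / len(classes)

# Weighted F1: support-weighted mean over classes
total = sum(n for _, n in classes.values())
weighted_f1 = sum(f1 * n for f1, n in classes.values()) / total

print(round(macro_f1, 4), round(weighted_f1, 4))  # 0.2562 0.3991
```

Both values match the table's "macro avg" and "weighted avg" rows, and the gap between micro F1 (0.4140) and macro F1 (0.2562) reflects the model's weakness on the two rarer classes, ORG and HumanProd.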