Upload folder using huggingface_hub

Files changed:
- best-model.pt (+3, -0)
- dev.tsv (+0, -0)
- loss.tsv (+11, -0)
- test.tsv (+0, -0)
- training.log (+239, -0)
best-model.pt ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:263cd62a7c59f7c69c1b7f715b728c3b3672686706296272b5eaa56d2c0987b2
+size 443311111
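The three added lines above are a Git LFS pointer file, not the ~443 MB checkpoint itself: the actual weights live in LFS storage, keyed by the SHA-256 object id. As a minimal sketch of what those fields mean, the pointer can be read like this (`parse_lfs_pointer` is a hypothetical helper for illustration; real clients use `git lfs` itself):

```python
def parse_lfs_pointer(text):
    """Split a Git LFS pointer file into its version, oid, and size fields.

    Hypothetical helper for illustration only; the pointer format is
    `key SP value` per line, with oid written as `algorithm:digest`.
    """
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algorithm, digest = fields["oid"].split(":", 1)
    return {
        "version": fields["version"],
        "oid_algorithm": algorithm,
        "oid": digest,
        "size": int(fields["size"]),  # size of the real object, in bytes
    }

pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:263cd62a7c59f7c69c1b7f715b728c3b3672686706296272b5eaa56d2c0987b2
size 443311111
"""
info = parse_lfs_pointer(pointer)
print(info["size"])  # 443311111
```

So the diff for best-model.pt is always exactly three lines, regardless of how large the checkpoint is.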
dev.tsv ADDED
The diff for this file is too large to render. See raw diff.
loss.tsv ADDED
@@ -0,0 +1,11 @@
+EPOCH TIMESTAMP LEARNING_RATE TRAIN_LOSS DEV_LOSS DEV_PRECISION DEV_RECALL DEV_F1 DEV_ACCURACY
+1 10:58:28 0.0000 0.4213 0.1286 0.7712 0.6477 0.7041 0.5544
+2 10:59:32 0.0000 0.1014 0.0938 0.8188 0.6116 0.7002 0.5487
+3 11:00:36 0.0000 0.0631 0.0863 0.8468 0.7707 0.8069 0.6876
+4 11:01:42 0.0000 0.0412 0.0869 0.8429 0.8151 0.8288 0.7199
+5 11:02:45 0.0000 0.0308 0.1313 0.8619 0.7541 0.8044 0.6816
+6 11:03:49 0.0000 0.0223 0.1269 0.8591 0.7934 0.8249 0.7124
+7 11:04:54 0.0000 0.0174 0.1546 0.8547 0.7779 0.8145 0.6985
+8 11:05:58 0.0000 0.0132 0.1508 0.8720 0.8161 0.8431 0.7390
+9 11:07:03 0.0000 0.0095 0.1579 0.8472 0.8192 0.8330 0.7282
+10 11:08:08 0.0000 0.0077 0.1602 0.8551 0.8110 0.8324 0.7262
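loss.tsv tracks one row per epoch, and best-model.pt corresponds to the epoch with the highest DEV_F1 (the training log's "saving best model" entries end at epoch 8). A small sketch recovering that from the table, with the rows copied verbatim from loss.tsv above:

```python
# Find the epoch with the highest DEV_F1 in a Flair-style loss.tsv.
loss_tsv = """\
EPOCH TIMESTAMP LEARNING_RATE TRAIN_LOSS DEV_LOSS DEV_PRECISION DEV_RECALL DEV_F1 DEV_ACCURACY
1 10:58:28 0.0000 0.4213 0.1286 0.7712 0.6477 0.7041 0.5544
2 10:59:32 0.0000 0.1014 0.0938 0.8188 0.6116 0.7002 0.5487
3 11:00:36 0.0000 0.0631 0.0863 0.8468 0.7707 0.8069 0.6876
4 11:01:42 0.0000 0.0412 0.0869 0.8429 0.8151 0.8288 0.7199
5 11:02:45 0.0000 0.0308 0.1313 0.8619 0.7541 0.8044 0.6816
6 11:03:49 0.0000 0.0223 0.1269 0.8591 0.7934 0.8249 0.7124
7 11:04:54 0.0000 0.0174 0.1546 0.8547 0.7779 0.8145 0.6985
8 11:05:58 0.0000 0.0132 0.1508 0.8720 0.8161 0.8431 0.7390
9 11:07:03 0.0000 0.0095 0.1579 0.8472 0.8192 0.8330 0.7282
10 11:08:08 0.0000 0.0077 0.1602 0.8551 0.8110 0.8324 0.7262
"""
header, *rows = (line.split() for line in loss_tsv.strip().splitlines())
f1_col = header.index("DEV_F1")
best_epoch = max(rows, key=lambda row: float(row[f1_col]))
print(best_epoch[0], best_epoch[f1_col])  # 8 0.8431
```

Note that LEARNING_RATE is printed as 0.0000 here only because of rounding; the log below shows the actual peak of 3e-05.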
test.tsv ADDED
The diff for this file is too large to render. See raw diff.
training.log ADDED
@@ -0,0 +1,239 @@
+2023-10-14 10:57:26,424 ----------------------------------------------------------------------------------------------------
+2023-10-14 10:57:26,425 Model: "SequenceTagger(
+  (embeddings): TransformerWordEmbeddings(
+    (model): BertModel(
+      (embeddings): BertEmbeddings(
+        (word_embeddings): Embedding(32001, 768)
+        (position_embeddings): Embedding(512, 768)
+        (token_type_embeddings): Embedding(2, 768)
+        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+        (dropout): Dropout(p=0.1, inplace=False)
+      )
+      (encoder): BertEncoder(
+        (layer): ModuleList(
+          (0-11): 12 x BertLayer(
+            (attention): BertAttention(
+              (self): BertSelfAttention(
+                (query): Linear(in_features=768, out_features=768, bias=True)
+                (key): Linear(in_features=768, out_features=768, bias=True)
+                (value): Linear(in_features=768, out_features=768, bias=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+              (output): BertSelfOutput(
+                (dense): Linear(in_features=768, out_features=768, bias=True)
+                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+            )
+            (intermediate): BertIntermediate(
+              (dense): Linear(in_features=768, out_features=3072, bias=True)
+              (intermediate_act_fn): GELUActivation()
+            )
+            (output): BertOutput(
+              (dense): Linear(in_features=3072, out_features=768, bias=True)
+              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+              (dropout): Dropout(p=0.1, inplace=False)
+            )
+          )
+        )
+      )
+      (pooler): BertPooler(
+        (dense): Linear(in_features=768, out_features=768, bias=True)
+        (activation): Tanh()
+      )
+    )
+  )
+  (locked_dropout): LockedDropout(p=0.5)
+  (linear): Linear(in_features=768, out_features=13, bias=True)
+  (loss_function): CrossEntropyLoss()
+)"
+2023-10-14 10:57:26,425 ----------------------------------------------------------------------------------------------------
+2023-10-14 10:57:26,425 MultiCorpus: 5777 train + 722 dev + 723 test sentences
+ - NER_ICDAR_EUROPEANA Corpus: 5777 train + 722 dev + 723 test sentences - /root/.flair/datasets/ner_icdar_europeana/nl
+2023-10-14 10:57:26,425 ----------------------------------------------------------------------------------------------------
+2023-10-14 10:57:26,425 Train:  5777 sentences
+2023-10-14 10:57:26,425         (train_with_dev=False, train_with_test=False)
+2023-10-14 10:57:26,425 ----------------------------------------------------------------------------------------------------
+2023-10-14 10:57:26,425 Training Params:
+2023-10-14 10:57:26,425  - learning_rate: "3e-05"
+2023-10-14 10:57:26,425  - mini_batch_size: "8"
+2023-10-14 10:57:26,425  - max_epochs: "10"
+2023-10-14 10:57:26,425  - shuffle: "True"
+2023-10-14 10:57:26,425 ----------------------------------------------------------------------------------------------------
+2023-10-14 10:57:26,426 Plugins:
+2023-10-14 10:57:26,426  - LinearScheduler | warmup_fraction: '0.1'
+2023-10-14 10:57:26,426 ----------------------------------------------------------------------------------------------------
+2023-10-14 10:57:26,426 Final evaluation on model from best epoch (best-model.pt)
+2023-10-14 10:57:26,426  - metric: "('micro avg', 'f1-score')"
+2023-10-14 10:57:26,426 ----------------------------------------------------------------------------------------------------
+2023-10-14 10:57:26,426 Computation:
+2023-10-14 10:57:26,426  - compute on device: cuda:0
+2023-10-14 10:57:26,426  - embedding storage: none
+2023-10-14 10:57:26,426 ----------------------------------------------------------------------------------------------------
+2023-10-14 10:57:26,426 Model training base path: "hmbench-icdar/nl-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-4"
+2023-10-14 10:57:26,426 ----------------------------------------------------------------------------------------------------
+2023-10-14 10:57:26,426 ----------------------------------------------------------------------------------------------------
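The logged hyperparameters above (lr 3e-05, batch size 8, 10 epochs, linear warmup 0.1, no CRF, last-layer pooling) are the usual Flair fine-tuning setup. As a hedged sketch of the kind of script that would produce a log like this one — the actual script is not part of the commit, and the dataset loader name and output path here are assumptions inferred from the cache path and the model base path:

```python
# Hedged reconstruction of a Flair fine-tuning run matching the logged
# hyperparameters. Not the repository's actual training script.
TRAINING_PARAMS = {
    "learning_rate": 3e-05,
    "mini_batch_size": 8,
    "max_epochs": 10,
    "base_model": "dbmdz/bert-base-historic-multilingual-cased",
}

def fine_tune(params=TRAINING_PARAMS):
    # Imports kept inside the function so the sketch reads without flair installed.
    from flair.datasets import NER_ICDAR_EUROPEANA  # assumed loader name (cache path: ner_icdar_europeana/nl)
    from flair.embeddings import TransformerWordEmbeddings
    from flair.models import SequenceTagger
    from flair.trainers import ModelTrainer

    corpus = NER_ICDAR_EUROPEANA(language="nl")
    label_dict = corpus.make_label_dictionary(label_type="ner")
    embeddings = TransformerWordEmbeddings(params["base_model"], layers="-1")
    tagger = SequenceTagger(
        hidden_size=256,
        embeddings=embeddings,
        tag_dictionary=label_dict,
        tag_type="ner",
        use_crf=False,   # "crfFalse" in the model base path above
        use_rnn=False,   # matches the plain Linear(768, 13) head in the model dump
    )
    trainer = ModelTrainer(tagger, corpus)
    trainer.fine_tune(
        "resources/taggers/icdar-nl",  # hypothetical output path
        learning_rate=params["learning_rate"],
        mini_batch_size=params["mini_batch_size"],
        max_epochs=params["max_epochs"],
    )
```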
+2023-10-14 10:57:32,692 epoch 1 - iter 72/723 - loss 2.31506993 - time (sec): 6.26 - samples/sec: 2976.58 - lr: 0.000003 - momentum: 0.000000
+2023-10-14 10:57:38,328 epoch 1 - iter 144/723 - loss 1.40098098 - time (sec): 11.90 - samples/sec: 3035.24 - lr: 0.000006 - momentum: 0.000000
+2023-10-14 10:57:44,275 epoch 1 - iter 216/723 - loss 1.02739094 - time (sec): 17.85 - samples/sec: 2985.78 - lr: 0.000009 - momentum: 0.000000
+2023-10-14 10:57:49,976 epoch 1 - iter 288/723 - loss 0.82673104 - time (sec): 23.55 - samples/sec: 2980.75 - lr: 0.000012 - momentum: 0.000000
+2023-10-14 10:57:55,969 epoch 1 - iter 360/723 - loss 0.69531849 - time (sec): 29.54 - samples/sec: 2988.43 - lr: 0.000015 - momentum: 0.000000
+2023-10-14 10:58:01,677 epoch 1 - iter 432/723 - loss 0.60767546 - time (sec): 35.25 - samples/sec: 3009.88 - lr: 0.000018 - momentum: 0.000000
+2023-10-14 10:58:07,438 epoch 1 - iter 504/723 - loss 0.54162968 - time (sec): 41.01 - samples/sec: 3014.13 - lr: 0.000021 - momentum: 0.000000
+2023-10-14 10:58:14,071 epoch 1 - iter 576/723 - loss 0.48974416 - time (sec): 47.64 - samples/sec: 2993.71 - lr: 0.000024 - momentum: 0.000000
+2023-10-14 10:58:20,287 epoch 1 - iter 648/723 - loss 0.45020073 - time (sec): 53.86 - samples/sec: 2969.41 - lr: 0.000027 - momentum: 0.000000
+2023-10-14 10:58:25,550 epoch 1 - iter 720/723 - loss 0.42217115 - time (sec): 59.12 - samples/sec: 2971.06 - lr: 0.000030 - momentum: 0.000000
+2023-10-14 10:58:25,760 ----------------------------------------------------------------------------------------------------
+2023-10-14 10:58:25,760 EPOCH 1 done: loss 0.4213 - lr: 0.000030
+2023-10-14 10:58:28,750 DEV : loss 0.12855297327041626 - f1-score (micro avg)  0.7041
+2023-10-14 10:58:28,768 saving best model
+2023-10-14 10:58:29,152 ----------------------------------------------------------------------------------------------------
+2023-10-14 10:58:34,794 epoch 2 - iter 72/723 - loss 0.12156124 - time (sec): 5.64 - samples/sec: 2875.61 - lr: 0.000030 - momentum: 0.000000
+2023-10-14 10:58:40,804 epoch 2 - iter 144/723 - loss 0.11146254 - time (sec): 11.65 - samples/sec: 2900.46 - lr: 0.000029 - momentum: 0.000000
+2023-10-14 10:58:46,896 epoch 2 - iter 216/723 - loss 0.11686624 - time (sec): 17.74 - samples/sec: 2917.23 - lr: 0.000029 - momentum: 0.000000
+2023-10-14 10:58:53,571 epoch 2 - iter 288/723 - loss 0.11075586 - time (sec): 24.42 - samples/sec: 2902.96 - lr: 0.000029 - momentum: 0.000000
+2023-10-14 10:58:59,700 epoch 2 - iter 360/723 - loss 0.10651151 - time (sec): 30.55 - samples/sec: 2907.89 - lr: 0.000028 - momentum: 0.000000
+2023-10-14 10:59:05,443 epoch 2 - iter 432/723 - loss 0.10541663 - time (sec): 36.29 - samples/sec: 2909.61 - lr: 0.000028 - momentum: 0.000000
+2023-10-14 10:59:10,993 epoch 2 - iter 504/723 - loss 0.10566055 - time (sec): 41.84 - samples/sec: 2918.83 - lr: 0.000028 - momentum: 0.000000
+2023-10-14 10:59:16,688 epoch 2 - iter 576/723 - loss 0.10315725 - time (sec): 47.53 - samples/sec: 2933.43 - lr: 0.000027 - momentum: 0.000000
+2023-10-14 10:59:22,656 epoch 2 - iter 648/723 - loss 0.10168054 - time (sec): 53.50 - samples/sec: 2939.05 - lr: 0.000027 - momentum: 0.000000
+2023-10-14 10:59:28,642 epoch 2 - iter 720/723 - loss 0.10158311 - time (sec): 59.49 - samples/sec: 2953.47 - lr: 0.000027 - momentum: 0.000000
+2023-10-14 10:59:28,880 ----------------------------------------------------------------------------------------------------
+2023-10-14 10:59:28,880 EPOCH 2 done: loss 0.1014 - lr: 0.000027
+2023-10-14 10:59:32,771 DEV : loss 0.0938434973359108 - f1-score (micro avg)  0.7002
+2023-10-14 10:59:32,788 ----------------------------------------------------------------------------------------------------
+2023-10-14 10:59:38,964 epoch 3 - iter 72/723 - loss 0.06077009 - time (sec): 6.17 - samples/sec: 2935.87 - lr: 0.000026 - momentum: 0.000000
+2023-10-14 10:59:45,122 epoch 3 - iter 144/723 - loss 0.06201343 - time (sec): 12.33 - samples/sec: 2887.51 - lr: 0.000026 - momentum: 0.000000
+2023-10-14 10:59:50,971 epoch 3 - iter 216/723 - loss 0.06525859 - time (sec): 18.18 - samples/sec: 2855.52 - lr: 0.000026 - momentum: 0.000000
+2023-10-14 10:59:56,686 epoch 3 - iter 288/723 - loss 0.06305458 - time (sec): 23.90 - samples/sec: 2897.57 - lr: 0.000025 - momentum: 0.000000
+2023-10-14 11:00:02,887 epoch 3 - iter 360/723 - loss 0.06172832 - time (sec): 30.10 - samples/sec: 2913.59 - lr: 0.000025 - momentum: 0.000000
+2023-10-14 11:00:08,780 epoch 3 - iter 432/723 - loss 0.06330693 - time (sec): 35.99 - samples/sec: 2921.37 - lr: 0.000025 - momentum: 0.000000
+2023-10-14 11:00:15,222 epoch 3 - iter 504/723 - loss 0.06448096 - time (sec): 42.43 - samples/sec: 2919.30 - lr: 0.000024 - momentum: 0.000000
+2023-10-14 11:00:20,867 epoch 3 - iter 576/723 - loss 0.06382838 - time (sec): 48.08 - samples/sec: 2918.70 - lr: 0.000024 - momentum: 0.000000
+2023-10-14 11:00:26,929 epoch 3 - iter 648/723 - loss 0.06273635 - time (sec): 54.14 - samples/sec: 2908.32 - lr: 0.000024 - momentum: 0.000000
+2023-10-14 11:00:33,161 epoch 3 - iter 720/723 - loss 0.06321271 - time (sec): 60.37 - samples/sec: 2905.22 - lr: 0.000023 - momentum: 0.000000
+2023-10-14 11:00:33,487 ----------------------------------------------------------------------------------------------------
+2023-10-14 11:00:33,488 EPOCH 3 done: loss 0.0631 - lr: 0.000023
+2023-10-14 11:00:36,981 DEV : loss 0.08630853146314621 - f1-score (micro avg)  0.8069
+2023-10-14 11:00:36,999 saving best model
+2023-10-14 11:00:37,532 ----------------------------------------------------------------------------------------------------
+2023-10-14 11:00:43,552 epoch 4 - iter 72/723 - loss 0.03595234 - time (sec): 6.02 - samples/sec: 2919.12 - lr: 0.000023 - momentum: 0.000000
+2023-10-14 11:00:49,964 epoch 4 - iter 144/723 - loss 0.04721867 - time (sec): 12.43 - samples/sec: 2884.50 - lr: 0.000023 - momentum: 0.000000
+2023-10-14 11:00:56,286 epoch 4 - iter 216/723 - loss 0.04688850 - time (sec): 18.75 - samples/sec: 2815.40 - lr: 0.000022 - momentum: 0.000000
+2023-10-14 11:01:02,694 epoch 4 - iter 288/723 - loss 0.04288223 - time (sec): 25.16 - samples/sec: 2808.48 - lr: 0.000022 - momentum: 0.000000
+2023-10-14 11:01:08,221 epoch 4 - iter 360/723 - loss 0.04159157 - time (sec): 30.69 - samples/sec: 2840.38 - lr: 0.000022 - momentum: 0.000000
+2023-10-14 11:01:14,265 epoch 4 - iter 432/723 - loss 0.04113805 - time (sec): 36.73 - samples/sec: 2877.65 - lr: 0.000021 - momentum: 0.000000
+2023-10-14 11:01:20,221 epoch 4 - iter 504/723 - loss 0.04112768 - time (sec): 42.69 - samples/sec: 2875.66 - lr: 0.000021 - momentum: 0.000000
+2023-10-14 11:01:26,212 epoch 4 - iter 576/723 - loss 0.04139046 - time (sec): 48.68 - samples/sec: 2886.33 - lr: 0.000021 - momentum: 0.000000
+2023-10-14 11:01:32,305 epoch 4 - iter 648/723 - loss 0.04073847 - time (sec): 54.77 - samples/sec: 2895.25 - lr: 0.000020 - momentum: 0.000000
+2023-10-14 11:01:38,302 epoch 4 - iter 720/723 - loss 0.04096798 - time (sec): 60.77 - samples/sec: 2889.77 - lr: 0.000020 - momentum: 0.000000
+2023-10-14 11:01:38,500 ----------------------------------------------------------------------------------------------------
+2023-10-14 11:01:38,501 EPOCH 4 done: loss 0.0412 - lr: 0.000020
+2023-10-14 11:01:42,023 DEV : loss 0.08693055063486099 - f1-score (micro avg)  0.8288
+2023-10-14 11:01:42,042 saving best model
+2023-10-14 11:01:42,516 ----------------------------------------------------------------------------------------------------
+2023-10-14 11:01:48,799 epoch 5 - iter 72/723 - loss 0.03262287 - time (sec): 6.28 - samples/sec: 2932.38 - lr: 0.000020 - momentum: 0.000000
+2023-10-14 11:01:54,248 epoch 5 - iter 144/723 - loss 0.03076967 - time (sec): 11.73 - samples/sec: 3021.49 - lr: 0.000019 - momentum: 0.000000
+2023-10-14 11:02:00,487 epoch 5 - iter 216/723 - loss 0.02828212 - time (sec): 17.97 - samples/sec: 3012.82 - lr: 0.000019 - momentum: 0.000000
+2023-10-14 11:02:06,303 epoch 5 - iter 288/723 - loss 0.03125818 - time (sec): 23.78 - samples/sec: 2978.07 - lr: 0.000019 - momentum: 0.000000
+2023-10-14 11:02:11,988 epoch 5 - iter 360/723 - loss 0.02911501 - time (sec): 29.47 - samples/sec: 2974.33 - lr: 0.000018 - momentum: 0.000000
+2023-10-14 11:02:17,343 epoch 5 - iter 432/723 - loss 0.02975581 - time (sec): 34.82 - samples/sec: 2979.90 - lr: 0.000018 - momentum: 0.000000
+2023-10-14 11:02:23,371 epoch 5 - iter 504/723 - loss 0.02903271 - time (sec): 40.85 - samples/sec: 2985.46 - lr: 0.000018 - momentum: 0.000000
+2023-10-14 11:02:29,523 epoch 5 - iter 576/723 - loss 0.03013901 - time (sec): 47.00 - samples/sec: 2972.90 - lr: 0.000017 - momentum: 0.000000
+2023-10-14 11:02:35,695 epoch 5 - iter 648/723 - loss 0.03126828 - time (sec): 53.18 - samples/sec: 2976.24 - lr: 0.000017 - momentum: 0.000000
+2023-10-14 11:02:41,551 epoch 5 - iter 720/723 - loss 0.03072279 - time (sec): 59.03 - samples/sec: 2976.35 - lr: 0.000017 - momentum: 0.000000
+2023-10-14 11:02:41,720 ----------------------------------------------------------------------------------------------------
+2023-10-14 11:02:41,721 EPOCH 5 done: loss 0.0308 - lr: 0.000017
+2023-10-14 11:02:45,637 DEV : loss 0.13127633929252625 - f1-score (micro avg)  0.8044
+2023-10-14 11:02:45,653 ----------------------------------------------------------------------------------------------------
+2023-10-14 11:02:51,633 epoch 6 - iter 72/723 - loss 0.01842860 - time (sec): 5.98 - samples/sec: 2902.58 - lr: 0.000016 - momentum: 0.000000
+2023-10-14 11:02:57,718 epoch 6 - iter 144/723 - loss 0.02104149 - time (sec): 12.06 - samples/sec: 2907.63 - lr: 0.000016 - momentum: 0.000000
+2023-10-14 11:03:03,381 epoch 6 - iter 216/723 - loss 0.02082948 - time (sec): 17.73 - samples/sec: 2961.34 - lr: 0.000016 - momentum: 0.000000
+2023-10-14 11:03:09,572 epoch 6 - iter 288/723 - loss 0.02184263 - time (sec): 23.92 - samples/sec: 2949.13 - lr: 0.000015 - momentum: 0.000000
+2023-10-14 11:03:15,404 epoch 6 - iter 360/723 - loss 0.02072094 - time (sec): 29.75 - samples/sec: 2938.63 - lr: 0.000015 - momentum: 0.000000
+2023-10-14 11:03:21,504 epoch 6 - iter 432/723 - loss 0.02069835 - time (sec): 35.85 - samples/sec: 2930.06 - lr: 0.000015 - momentum: 0.000000
+2023-10-14 11:03:27,963 epoch 6 - iter 504/723 - loss 0.02129765 - time (sec): 42.31 - samples/sec: 2904.18 - lr: 0.000014 - momentum: 0.000000
+2023-10-14 11:03:34,651 epoch 6 - iter 576/723 - loss 0.02050509 - time (sec): 49.00 - samples/sec: 2903.26 - lr: 0.000014 - momentum: 0.000000
+2023-10-14 11:03:40,533 epoch 6 - iter 648/723 - loss 0.02175670 - time (sec): 54.88 - samples/sec: 2899.49 - lr: 0.000014 - momentum: 0.000000
+2023-10-14 11:03:46,229 epoch 6 - iter 720/723 - loss 0.02231132 - time (sec): 60.57 - samples/sec: 2901.48 - lr: 0.000013 - momentum: 0.000000
+2023-10-14 11:03:46,397 ----------------------------------------------------------------------------------------------------
+2023-10-14 11:03:46,397 EPOCH 6 done: loss 0.0223 - lr: 0.000013
+2023-10-14 11:03:49,947 DEV : loss 0.12690779566764832 - f1-score (micro avg)  0.8249
+2023-10-14 11:03:49,964 ----------------------------------------------------------------------------------------------------
+2023-10-14 11:03:56,165 epoch 7 - iter 72/723 - loss 0.00937309 - time (sec): 6.20 - samples/sec: 2830.04 - lr: 0.000013 - momentum: 0.000000
+2023-10-14 11:04:02,678 epoch 7 - iter 144/723 - loss 0.01350239 - time (sec): 12.71 - samples/sec: 2882.76 - lr: 0.000013 - momentum: 0.000000
+2023-10-14 11:04:08,323 epoch 7 - iter 216/723 - loss 0.01358360 - time (sec): 18.36 - samples/sec: 2915.57 - lr: 0.000012 - momentum: 0.000000
+2023-10-14 11:04:15,072 epoch 7 - iter 288/723 - loss 0.01452212 - time (sec): 25.11 - samples/sec: 2864.73 - lr: 0.000012 - momentum: 0.000000
+2023-10-14 11:04:20,800 epoch 7 - iter 360/723 - loss 0.01439804 - time (sec): 30.84 - samples/sec: 2869.66 - lr: 0.000012 - momentum: 0.000000
+2023-10-14 11:04:26,288 epoch 7 - iter 432/723 - loss 0.01516061 - time (sec): 36.32 - samples/sec: 2893.06 - lr: 0.000011 - momentum: 0.000000
+2023-10-14 11:04:32,788 epoch 7 - iter 504/723 - loss 0.01712164 - time (sec): 42.82 - samples/sec: 2892.01 - lr: 0.000011 - momentum: 0.000000
+2023-10-14 11:04:38,780 epoch 7 - iter 576/723 - loss 0.01729432 - time (sec): 48.81 - samples/sec: 2906.42 - lr: 0.000011 - momentum: 0.000000
+2023-10-14 11:04:44,374 epoch 7 - iter 648/723 - loss 0.01749135 - time (sec): 54.41 - samples/sec: 2924.73 - lr: 0.000010 - momentum: 0.000000
+2023-10-14 11:04:50,330 epoch 7 - iter 720/723 - loss 0.01711051 - time (sec): 60.37 - samples/sec: 2912.37 - lr: 0.000010 - momentum: 0.000000
+2023-10-14 11:04:50,524 ----------------------------------------------------------------------------------------------------
+2023-10-14 11:04:50,525 EPOCH 7 done: loss 0.0174 - lr: 0.000010
+2023-10-14 11:04:54,070 DEV : loss 0.15461641550064087 - f1-score (micro avg)  0.8145
+2023-10-14 11:04:54,088 ----------------------------------------------------------------------------------------------------
+2023-10-14 11:04:59,959 epoch 8 - iter 72/723 - loss 0.01665349 - time (sec): 5.87 - samples/sec: 2848.03 - lr: 0.000010 - momentum: 0.000000
+2023-10-14 11:05:06,094 epoch 8 - iter 144/723 - loss 0.01446601 - time (sec): 12.00 - samples/sec: 2881.67 - lr: 0.000009 - momentum: 0.000000
+2023-10-14 11:05:12,405 epoch 8 - iter 216/723 - loss 0.01410012 - time (sec): 18.32 - samples/sec: 2857.36 - lr: 0.000009 - momentum: 0.000000
+2023-10-14 11:05:18,576 epoch 8 - iter 288/723 - loss 0.01366919 - time (sec): 24.49 - samples/sec: 2845.71 - lr: 0.000009 - momentum: 0.000000
+2023-10-14 11:05:24,652 epoch 8 - iter 360/723 - loss 0.01239443 - time (sec): 30.56 - samples/sec: 2877.55 - lr: 0.000008 - momentum: 0.000000
+2023-10-14 11:05:30,144 epoch 8 - iter 432/723 - loss 0.01265536 - time (sec): 36.05 - samples/sec: 2900.06 - lr: 0.000008 - momentum: 0.000000
+2023-10-14 11:05:36,539 epoch 8 - iter 504/723 - loss 0.01358128 - time (sec): 42.45 - samples/sec: 2891.45 - lr: 0.000008 - momentum: 0.000000
+2023-10-14 11:05:42,619 epoch 8 - iter 576/723 - loss 0.01348813 - time (sec): 48.53 - samples/sec: 2893.15 - lr: 0.000007 - momentum: 0.000000
+2023-10-14 11:05:48,324 epoch 8 - iter 648/723 - loss 0.01308338 - time (sec): 54.23 - samples/sec: 2905.38 - lr: 0.000007 - momentum: 0.000000
+2023-10-14 11:05:54,660 epoch 8 - iter 720/723 - loss 0.01321059 - time (sec): 60.57 - samples/sec: 2896.19 - lr: 0.000007 - momentum: 0.000000
+2023-10-14 11:05:54,878 ----------------------------------------------------------------------------------------------------
+2023-10-14 11:05:54,878 EPOCH 8 done: loss 0.0132 - lr: 0.000007
+2023-10-14 11:05:58,864 DEV : loss 0.150771364569664 - f1-score (micro avg)  0.8431
+2023-10-14 11:05:58,880 saving best model
+2023-10-14 11:05:59,413 ----------------------------------------------------------------------------------------------------
+2023-10-14 11:06:05,658 epoch 9 - iter 72/723 - loss 0.01017083 - time (sec): 6.24 - samples/sec: 2911.90 - lr: 0.000006 - momentum: 0.000000
+2023-10-14 11:06:11,476 epoch 9 - iter 144/723 - loss 0.00911976 - time (sec): 12.06 - samples/sec: 2930.73 - lr: 0.000006 - momentum: 0.000000
+2023-10-14 11:06:17,232 epoch 9 - iter 216/723 - loss 0.00838583 - time (sec): 17.81 - samples/sec: 2926.00 - lr: 0.000006 - momentum: 0.000000
+2023-10-14 11:06:23,911 epoch 9 - iter 288/723 - loss 0.00904573 - time (sec): 24.49 - samples/sec: 2913.97 - lr: 0.000005 - momentum: 0.000000
+2023-10-14 11:06:29,832 epoch 9 - iter 360/723 - loss 0.00913035 - time (sec): 30.41 - samples/sec: 2918.84 - lr: 0.000005 - momentum: 0.000000
+2023-10-14 11:06:36,140 epoch 9 - iter 432/723 - loss 0.00938579 - time (sec): 36.72 - samples/sec: 2905.02 - lr: 0.000005 - momentum: 0.000000
+2023-10-14 11:06:41,831 epoch 9 - iter 504/723 - loss 0.00897802 - time (sec): 42.41 - samples/sec: 2912.32 - lr: 0.000004 - momentum: 0.000000
+2023-10-14 11:06:48,221 epoch 9 - iter 576/723 - loss 0.00894380 - time (sec): 48.80 - samples/sec: 2899.02 - lr: 0.000004 - momentum: 0.000000
+2023-10-14 11:06:54,001 epoch 9 - iter 648/723 - loss 0.00958515 - time (sec): 54.58 - samples/sec: 2903.49 - lr: 0.000004 - momentum: 0.000000
+2023-10-14 11:06:59,830 epoch 9 - iter 720/723 - loss 0.00943720 - time (sec): 60.41 - samples/sec: 2908.78 - lr: 0.000003 - momentum: 0.000000
+2023-10-14 11:07:00,059 ----------------------------------------------------------------------------------------------------
+2023-10-14 11:07:00,060 EPOCH 9 done: loss 0.0095 - lr: 0.000003
+2023-10-14 11:07:03,577 DEV : loss 0.15790636837482452 - f1-score (micro avg)  0.833
+2023-10-14 11:07:03,592 ----------------------------------------------------------------------------------------------------
+2023-10-14 11:07:09,324 epoch 10 - iter 72/723 - loss 0.00876310 - time (sec): 5.73 - samples/sec: 2913.63 - lr: 0.000003 - momentum: 0.000000
+2023-10-14 11:07:15,547 epoch 10 - iter 144/723 - loss 0.00697092 - time (sec): 11.95 - samples/sec: 2905.71 - lr: 0.000003 - momentum: 0.000000
+2023-10-14 11:07:22,240 epoch 10 - iter 216/723 - loss 0.00740430 - time (sec): 18.65 - samples/sec: 2809.93 - lr: 0.000002 - momentum: 0.000000
+2023-10-14 11:07:28,447 epoch 10 - iter 288/723 - loss 0.00869243 - time (sec): 24.85 - samples/sec: 2862.60 - lr: 0.000002 - momentum: 0.000000
+2023-10-14 11:07:34,727 epoch 10 - iter 360/723 - loss 0.00813367 - time (sec): 31.13 - samples/sec: 2878.44 - lr: 0.000002 - momentum: 0.000000
+2023-10-14 11:07:40,624 epoch 10 - iter 432/723 - loss 0.00870267 - time (sec): 37.03 - samples/sec: 2886.14 - lr: 0.000001 - momentum: 0.000000
+2023-10-14 11:07:46,030 epoch 10 - iter 504/723 - loss 0.00810991 - time (sec): 42.44 - samples/sec: 2888.69 - lr: 0.000001 - momentum: 0.000000
+2023-10-14 11:07:51,966 epoch 10 - iter 576/723 - loss 0.00778453 - time (sec): 48.37 - samples/sec: 2879.90 - lr: 0.000001 - momentum: 0.000000
+2023-10-14 11:07:58,270 epoch 10 - iter 648/723 - loss 0.00790965 - time (sec): 54.68 - samples/sec: 2885.46 - lr: 0.000000 - momentum: 0.000000
+2023-10-14 11:08:04,217 epoch 10 - iter 720/723 - loss 0.00773716 - time (sec): 60.62 - samples/sec: 2894.81 - lr: 0.000000 - momentum: 0.000000
+2023-10-14 11:08:04,452 ----------------------------------------------------------------------------------------------------
+2023-10-14 11:08:04,452 EPOCH 10 done: loss 0.0077 - lr: 0.000000
+2023-10-14 11:08:07,999 DEV : loss 0.16016288101673126 - f1-score (micro avg)  0.8324
+2023-10-14 11:08:08,469 ----------------------------------------------------------------------------------------------------
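The lr column in the log follows the LinearScheduler plugin declared in the training params: linear warmup over the first 10% of mini-batches (here 723 of 7230, i.e. exactly one epoch) up to the peak of 3e-05, then linear decay to zero. A minimal sketch of that schedule, with the formula reconstructed by fitting the logged values (an assumption, not Flair's source):

```python
def linear_schedule_lr(step, total_steps, peak_lr, warmup_fraction=0.1):
    """Linear warmup to peak_lr over the first warmup_fraction of steps,
    then linear decay to zero. Assumed form of the LinearScheduler plugin,
    reconstructed from the lr values printed in the log above.
    """
    warmup_steps = int(total_steps * warmup_fraction)
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

total = 723 * 10  # 723 mini-batches per epoch, 10 epochs
# epoch 1, iter 72: the log prints lr 0.000003
print(round(linear_schedule_lr(72, total, 3e-05), 6))  # 3e-06
```

This reproduces the logged values to the printed precision: lr peaks at 0.000030 at the end of epoch 1 and reaches 0.000000 by the last iterations of epoch 10.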
+2023-10-14 11:08:08,470 Loading model from best epoch ...
+2023-10-14 11:08:10,080 SequenceTagger predicts: Dictionary with 13 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-PER, B-PER, E-PER, I-PER, S-ORG, B-ORG, E-ORG, I-ORG
+2023-10-14 11:08:13,254
+Results:
+- F-score (micro) 0.8224
+- F-score (macro) 0.7445
+- Accuracy 0.7093
+
+By class:
+              precision    recall  f1-score   support
+
+         PER     0.8407    0.8320    0.8363       482
+         LOC     0.8710    0.8253    0.8475       458
+         ORG     0.5806    0.5217    0.5496        69
+
+   micro avg     0.8376    0.8077    0.8224      1009
+   macro avg     0.7641    0.7263    0.7445      1009
+weighted avg     0.8366    0.8077    0.8218      1009
+
+2023-10-14 11:08:13,254 ----------------------------------------------------------------------------------------------------
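The aggregate rows of the final test report can be cross-checked from the per-class rows: macro F1 is the unweighted mean of the class F1 scores, weighted F1 weights each class F1 by its support, and micro F1 is the harmonic mean of micro precision and recall. A small sanity-check sketch (values copied from the log; this is for verification only, not part of the repository):

```python
# label: (precision, recall, f1, support), copied from the "By class" table
per_class = {
    "PER": (0.8407, 0.8320, 0.8363, 482),
    "LOC": (0.8710, 0.8253, 0.8475, 458),
    "ORG": (0.5806, 0.5217, 0.5496, 69),
}
total = sum(s for *_, s in per_class.values())
macro_f1 = sum(f1 for _, _, f1, _ in per_class.values()) / len(per_class)
weighted_f1 = sum(f1 * s for _, _, f1, s in per_class.values()) / total
# micro F1 is the harmonic mean of micro precision and micro recall
p, r = 0.8376, 0.8077
micro_f1 = 2 * p * r / (p + r)
print(round(macro_f1, 4), round(weighted_f1, 4), round(micro_f1, 4))
# 0.7445 0.8218 0.8224
```

All three recomputed values match the report's macro avg, weighted avg, and micro avg rows to four decimals.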