ritutweets46 committed
Commit bad38f7
1 Parent(s): 7a9ce7c
End of training
Browse files:
- README.md +26 -27
- logs/events.out.tfevents.1710596829.70b3bcc79238.2120.0 +2 -2
- model.safetensors +1 -1
README.md
CHANGED
@@ -17,14 +17,14 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the funsd dataset.
 It achieves the following results on the evaluation set:
-- Loss: 1.
-- Answer: {'precision': 0.
-- Header: {'precision': 0.
-- Question: {'precision': 0.
-- Overall Precision: 0.
-- Overall Recall: 0.
-- Overall F1: 0.
-- Overall Accuracy: 0.
 
 ## Model description
 
@@ -50,32 +50,31 @@ The following hyperparameters were used during training:
 
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - num_epochs: 15
-- mixed_precision_training: Native AMP
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Answer | Header
-|
-| 1.
-| 1.
-| 1.
-| 1.
-| 1.
-|
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
 
 ### Framework versions
 
 - Transformers 4.38.2
-- Pytorch 2.1
 - Datasets 2.18.0
 - Tokenizers 0.15.2
 
 This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the funsd dataset.
 It achieves the following results on the evaluation set:
+- Loss: 1.1352
+- Answer: {'precision': 0.38767395626242546, 'recall': 0.4820766378244747, 'f1': 0.42975206611570255, 'number': 809}
+- Header: {'precision': 0.3181818181818182, 'recall': 0.23529411764705882, 'f1': 0.27053140096618356, 'number': 119}
+- Question: {'precision': 0.4954954954954955, 'recall': 0.6197183098591549, 'f1': 0.5506883604505632, 'number': 1065}
+- Overall Precision: 0.4444
+- Overall Recall: 0.5409
+- Overall F1: 0.4879
+- Overall Accuracy: 0.6048
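The overall figures in the card are micro-averages across the three entity types. As a sanity check — a minimal sketch, not part of the model card itself, with the label keys below chosen only for illustration — they can be recomputed from the per-entity `precision`, `recall`, and `number` fields:

```python
# Recompute the card's overall (micro-averaged) metrics from the per-entity
# stats above. Each entry gives precision, recall, and 'number' (the count
# of gold entities for that label).
per_entity = {
    "answer":   {"precision": 0.38767395626242546, "recall": 0.4820766378244747,  "number": 809},
    "header":   {"precision": 0.3181818181818182,  "recall": 0.23529411764705882, "number": 119},
    "question": {"precision": 0.4954954954954955,  "recall": 0.6197183098591549,  "number": 1065},
}

def micro_average(stats):
    # True positives per label = recall * number of gold entities;
    # predicted entities per label = true positives / precision.
    tp = sum(s["recall"] * s["number"] for s in stats.values())
    pred = sum(s["recall"] * s["number"] / s["precision"] for s in stats.values())
    gold = sum(s["number"] for s in stats.values())
    precision, recall = tp / pred, tp / gold
    f1 = 2 * precision * recall / (precision + recall)
    return round(precision, 4), round(recall, 4), round(f1, 4)

print(micro_average(per_entity))  # (0.4444, 0.5409, 0.4879), matching the card
```

The agreement with the reported Overall Precision/Recall/F1 confirms these overalls are micro- rather than macro-averaged.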
 
 ## Model description
 
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - num_epochs: 15
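The card lists `lr_scheduler_type: linear` over `num_epochs: 15`, and the training results table logs 10 optimizer steps per epoch, i.e. 150 steps total. A minimal sketch of such a schedule (mirroring the shape of transformers' `get_linear_schedule_with_warmup` with zero warmup; the base learning rate of 3e-5 is an assumption — it does not appear in this excerpt):

```python
# Sketch of a linear LR schedule: optional linear warmup to the base rate,
# then linear decay to 0 by the final step. base_lr=3e-5 is an assumed
# value; total_steps = 15 epochs x 10 steps/epoch from the results table.
def linear_lr(step, base_lr=3e-5, total_steps=150, warmup_steps=0):
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

for step in (0, 75, 150):
    print(f"step {step:3d}: lr = {linear_lr(step):.2e}")
```

With no warmup the rate starts at the base value and reaches exactly 0 at step 150, which is why late-epoch metrics in the table change only slightly.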
 
 ### Training results
 
+| Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+|:-------------:|:-----:|:----:|:---------------:|:------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
+| 1.7173 | 1.0 | 10 | 1.5055 | {'precision': 0.036076662908680945, 'recall': 0.03955500618046971, 'f1': 0.03773584905660377, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.2727272727272727, 'recall': 0.18873239436619718, 'f1': 0.22308546059933407, 'number': 1065} | 0.1435 | 0.1169 | 0.1288 | 0.3597 |
+| 1.4183 | 2.0 | 20 | 1.3144 | {'precision': 0.18861414606095459, 'recall': 0.4054388133498146, 'f1': 0.2574568288854003, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.24258404746209625, 'recall': 0.3455399061032864, 'f1': 0.28505034856700234, 'number': 1065} | 0.2136 | 0.3492 | 0.2650 | 0.4353 |
+| 1.257 | 3.0 | 30 | 1.1761 | {'precision': 0.2615723732549596, 'recall': 0.4400494437577256, 'f1': 0.32811059907834106, 'number': 809} | {'precision': 0.05, 'recall': 0.01680672268907563, 'f1': 0.025157232704402517, 'number': 119} | {'precision': 0.36586863106200124, 'recall': 0.5596244131455399, 'f1': 0.44246473645137346, 'number': 1065} | 0.3149 | 0.4787 | 0.3799 | 0.5312 |
+| 1.1319 | 4.0 | 40 | 1.0879 | {'precision': 0.3036978756884343, 'recall': 0.47713226205191595, 'f1': 0.3711538461538461, 'number': 809} | {'precision': 0.2345679012345679, 'recall': 0.15966386554621848, 'f1': 0.18999999999999997, 'number': 119} | {'precision': 0.42283464566929135, 'recall': 0.504225352112676, 'f1': 0.4599571734475375, 'number': 1065} | 0.3593 | 0.4727 | 0.4082 | 0.5793 |
+| 1.0046 | 5.0 | 50 | 1.1292 | {'precision': 0.32560137457044674, 'recall': 0.4684796044499382, 'f1': 0.38418651799290426, 'number': 809} | {'precision': 0.25301204819277107, 'recall': 0.17647058823529413, 'f1': 0.20792079207920794, 'number': 119} | {'precision': 0.4408831908831909, 'recall': 0.5812206572769953, 'f1': 0.5014175779667881, 'number': 1065} | 0.3844 | 0.5113 | 0.4388 | 0.5817 |
+| 0.9305 | 6.0 | 60 | 1.1583 | {'precision': 0.3395311236863379, 'recall': 0.519159456118665, 'f1': 0.41055718475073316, 'number': 809} | {'precision': 0.2835820895522388, 'recall': 0.15966386554621848, 'f1': 0.2043010752688172, 'number': 119} | {'precision': 0.4719387755102041, 'recall': 0.5211267605633803, 'f1': 0.49531459170013387, 'number': 1065} | 0.4008 | 0.4987 | 0.4444 | 0.5817 |
+| 0.8843 | 7.0 | 70 | 1.1142 | {'precision': 0.32987551867219916, 'recall': 0.3930778739184178, 'f1': 0.3587140439932318, 'number': 809} | {'precision': 0.25287356321839083, 'recall': 0.18487394957983194, 'f1': 0.21359223300970878, 'number': 119} | {'precision': 0.41626794258373206, 'recall': 0.6535211267605634, 'f1': 0.5085860431128973, 'number': 1065} | 0.3805 | 0.5198 | 0.4394 | 0.5831 |
+| 0.8326 | 8.0 | 80 | 1.0891 | {'precision': 0.33364661654135336, 'recall': 0.4388133498145859, 'f1': 0.3790710090763481, 'number': 809} | {'precision': 0.26582278481012656, 'recall': 0.17647058823529413, 'f1': 0.2121212121212121, 'number': 119} | {'precision': 0.42464040025015637, 'recall': 0.6375586854460094, 'f1': 0.5097597597597597, 'number': 1065} | 0.3848 | 0.5294 | 0.4456 | 0.5943 |
+| 0.7867 | 9.0 | 90 | 1.1168 | {'precision': 0.36489151873767256, 'recall': 0.4573547589616811, 'f1': 0.40592430060340096, 'number': 809} | {'precision': 0.27835051546391754, 'recall': 0.226890756302521, 'f1': 0.25, 'number': 119} | {'precision': 0.4975845410628019, 'recall': 0.5802816901408451, 'f1': 0.5357607282184654, 'number': 1065} | 0.4314 | 0.5093 | 0.4671 | 0.5919 |
+| 0.7846 | 10.0 | 100 | 1.1754 | {'precision': 0.38025415444770283, 'recall': 0.48084054388133496, 'f1': 0.42467248908296945, 'number': 809} | {'precision': 0.3614457831325301, 'recall': 0.25210084033613445, 'f1': 0.297029702970297, 'number': 119} | {'precision': 0.5054945054945055, 'recall': 0.5615023474178403, 'f1': 0.5320284697508897, 'number': 1065} | 0.4443 | 0.5103 | 0.4750 | 0.5923 |
+| 0.711 | 11.0 | 110 | 1.1427 | {'precision': 0.3814968814968815, 'recall': 0.453646477132262, 'f1': 0.41445511010728403, 'number': 809} | {'precision': 0.32967032967032966, 'recall': 0.25210084033613445, 'f1': 0.28571428571428575, 'number': 119} | {'precision': 0.4864667154352597, 'recall': 0.6244131455399061, 'f1': 0.5468750000000001, 'number': 1065} | 0.4388 | 0.5329 | 0.4813 | 0.6085 |
+| 0.7118 | 12.0 | 120 | 1.1172 | {'precision': 0.36363636363636365, 'recall': 0.4796044499381953, 'f1': 0.4136460554371002, 'number': 809} | {'precision': 0.3764705882352941, 'recall': 0.2689075630252101, 'f1': 0.3137254901960785, 'number': 119} | {'precision': 0.47493036211699163, 'recall': 0.64037558685446, 'f1': 0.5453818472610956, 'number': 1065} | 0.4258 | 0.5529 | 0.4811 | 0.6020 |
+| 0.6891 | 13.0 | 130 | 1.1580 | {'precision': 0.3810375670840787, 'recall': 0.5265760197775031, 'f1': 0.44213803840166066, 'number': 809} | {'precision': 0.3146067415730337, 'recall': 0.23529411764705882, 'f1': 0.2692307692307692, 'number': 119} | {'precision': 0.5264527320034692, 'recall': 0.5699530516431925, 'f1': 0.5473399458972048, 'number': 1065} | 0.4496 | 0.5324 | 0.4875 | 0.6035 |
+| 0.6544 | 14.0 | 140 | 1.1198 | {'precision': 0.38986556359875907, 'recall': 0.46600741656365885, 'f1': 0.4245495495495496, 'number': 809} | {'precision': 0.3333333333333333, 'recall': 0.24369747899159663, 'f1': 0.2815533980582524, 'number': 119} | {'precision': 0.48421807747489237, 'recall': 0.6338028169014085, 'f1': 0.5490036600244002, 'number': 1065} | 0.4416 | 0.5424 | 0.4868 | 0.6037 |
+| 0.6515 | 15.0 | 150 | 1.1352 | {'precision': 0.38767395626242546, 'recall': 0.4820766378244747, 'f1': 0.42975206611570255, 'number': 809} | {'precision': 0.3181818181818182, 'recall': 0.23529411764705882, 'f1': 0.27053140096618356, 'number': 119} | {'precision': 0.4954954954954955, 'recall': 0.6197183098591549, 'f1': 0.5506883604505632, 'number': 1065} | 0.4444 | 0.5409 | 0.4879 | 0.6048 |
 
 
 ### Framework versions
 
 - Transformers 4.38.2
+- Pytorch 2.2.1+cu121
 - Datasets 2.18.0
 - Tokenizers 0.15.2
logs/events.out.tfevents.1710596829.70b3bcc79238.2120.0
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:9faced30fbb5c97ee0075bcffbc0602cdf1033e1d82425ba1041a92a9ac9588a
+size 15739
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:8f0cd9e6c6799ca0b0125bd8d2488400ebdae102acfffbd1dfd06a7f75149555
 size 450558212