ritutweets46 committed
Commit bad38f7
1 Parent(s): 7a9ce7c

End of training

README.md CHANGED
@@ -17,14 +17,14 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the funsd dataset.
  It achieves the following results on the evaluation set:
- - Loss: 1.0745
- - Answer: {'precision': 0.3554006968641115, 'recall': 0.5043263288009888, 'f1': 0.41696474195196725, 'number': 809}
- - Header: {'precision': 0.3411764705882353, 'recall': 0.24369747899159663, 'f1': 0.28431372549019607, 'number': 119}
- - Question: {'precision': 0.4910979228486647, 'recall': 0.6215962441314554, 'f1': 0.5486945710733527, 'number': 1065}
- - Overall Precision: 0.4258
- - Overall Recall: 0.5514
- - Overall F1: 0.4805
- - Overall Accuracy: 0.6117
 
  ## Model description
 
@@ -50,32 +50,31 @@ The following hyperparameters were used during training:
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - num_epochs: 15
- - mixed_precision_training: Native AMP
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
- |:-------------:|:-----:|:----:|:---------------:|:------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
- | 1.7729 | 1.0 | 10 | 1.5447 | {'precision': 0.04415584415584416, 'recall': 0.042027194066749075, 'f1': 0.04306523115896137, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.22091584158415842, 'recall': 0.3352112676056338, 'f1': 0.26631853785900783, 'number': 1065} | 0.1639 | 0.1962 | 0.1786 | 0.3568 |
- | 1.4627 | 2.0 | 20 | 1.3779 | {'precision': 0.12121212121212122, 'recall': 0.2373300370828183, 'f1': 0.16046803175929797, 'number': 809} | {'precision': 0.04081632653061224, 'recall': 0.01680672268907563, 'f1': 0.023809523809523808, 'number': 119} | {'precision': 0.23460246360582307, 'recall': 0.39342723004694835, 'f1': 0.293931953700456, 'number': 1065} | 0.1793 | 0.3076 | 0.2265 | 0.4108 |
- | 1.2914 | 3.0 | 30 | 1.2345 | {'precision': 0.15814696485623003, 'recall': 0.24474660074165636, 'f1': 0.19213973799126638, 'number': 809} | {'precision': 0.15789473684210525, 'recall': 0.12605042016806722, 'f1': 0.14018691588785046, 'number': 119} | {'precision': 0.31094527363184077, 'recall': 0.5868544600938967, 'f1': 0.4065040650406504, 'number': 1065} | 0.2496 | 0.4205 | 0.3133 | 0.4524 |
- | 1.1698 | 4.0 | 40 | 1.1615 | {'precision': 0.2040990606319385, 'recall': 0.2954264524103832, 'f1': 0.2414141414141414, 'number': 809} | {'precision': 0.19387755102040816, 'recall': 0.15966386554621848, 'f1': 0.17511520737327188, 'number': 119} | {'precision': 0.351129363449692, 'recall': 0.6422535211267606, 'f1': 0.4540325257218719, 'number': 1065} | 0.2928 | 0.4727 | 0.3616 | 0.4925 |
- | 1.096 | 5.0 | 50 | 1.1141 | {'precision': 0.22423802612481858, 'recall': 0.3819530284301607, 'f1': 0.28257887517146774, 'number': 809} | {'precision': 0.2682926829268293, 'recall': 0.18487394957983194, 'f1': 0.21890547263681595, 'number': 119} | {'precision': 0.3757159221076747, 'recall': 0.615962441314554, 'f1': 0.4667378157239417, 'number': 1065} | 0.3079 | 0.4952 | 0.3797 | 0.5360 |
- | 1.0157 | 6.0 | 60 | 1.0480 | {'precision': 0.27807900852052675, 'recall': 0.4437577255871446, 'f1': 0.34190476190476193, 'number': 809} | {'precision': 0.3013698630136986, 'recall': 0.18487394957983194, 'f1': 0.22916666666666669, 'number': 119} | {'precision': 0.45481049562682213, 'recall': 0.5859154929577465, 'f1': 0.512105047189167, 'number': 1065} | 0.3673 | 0.5043 | 0.4250 | 0.5881 |
- | 0.9412 | 7.0 | 70 | 1.0314 | {'precision': 0.29177057356608477, 'recall': 0.4338689740420272, 'f1': 0.34890656063618286, 'number': 809} | {'precision': 0.2926829268292683, 'recall': 0.20168067226890757, 'f1': 0.23880597014925373, 'number': 119} | {'precision': 0.45625451916124365, 'recall': 0.5924882629107981, 'f1': 0.5155228758169934, 'number': 1065} | 0.3771 | 0.5048 | 0.4317 | 0.5961 |
- | 0.8828 | 8.0 | 80 | 1.0804 | {'precision': 0.3174061433447099, 'recall': 0.45982694684796044, 'f1': 0.37556789500252397, 'number': 809} | {'precision': 0.2828282828282828, 'recall': 0.23529411764705882, 'f1': 0.25688073394495414, 'number': 119} | {'precision': 0.46117804551539493, 'recall': 0.6469483568075117, 'f1': 0.5384915982805784, 'number': 1065} | 0.3939 | 0.5464 | 0.4578 | 0.5872 |
- | 0.8304 | 9.0 | 90 | 1.0436 | {'precision': 0.3404255319148936, 'recall': 0.49443757725587145, 'f1': 0.40322580645161293, 'number': 809} | {'precision': 0.36363636363636365, 'recall': 0.23529411764705882, 'f1': 0.2857142857142857, 'number': 119} | {'precision': 0.4878765613519471, 'recall': 0.6234741784037559, 'f1': 0.5474031327287716, 'number': 1065} | 0.4179 | 0.5479 | 0.4742 | 0.6095 |
- | 0.814 | 10.0 | 100 | 1.0871 | {'precision': 0.3464391691394659, 'recall': 0.5772558714462299, 'f1': 0.4330088085303662, 'number': 809} | {'precision': 0.4166666666666667, 'recall': 0.25210084033613445, 'f1': 0.31413612565445026, 'number': 119} | {'precision': 0.5084294587400178, 'recall': 0.5380281690140845, 'f1': 0.5228102189781022, 'number': 1065} | 0.4201 | 0.5369 | 0.4714 | 0.5989 |
- | 0.7273 | 11.0 | 110 | 1.0650 | {'precision': 0.3483348334833483, 'recall': 0.4783683559950556, 'f1': 0.40312499999999996, 'number': 809} | {'precision': 0.30434782608695654, 'recall': 0.23529411764705882, 'f1': 0.2654028436018957, 'number': 119} | {'precision': 0.4900953778429934, 'recall': 0.6272300469483568, 'f1': 0.5502471169686985, 'number': 1065} | 0.4221 | 0.5434 | 0.4751 | 0.6139 |
- | 0.7257 | 12.0 | 120 | 1.1221 | {'precision': 0.34212629896083135, 'recall': 0.5290482076637825, 'f1': 0.41553398058252433, 'number': 809} | {'precision': 0.38666666666666666, 'recall': 0.24369747899159663, 'f1': 0.29896907216494845, 'number': 119} | {'precision': 0.48787878787878786, 'recall': 0.6046948356807512, 'f1': 0.5400419287211741, 'number': 1065} | 0.4161 | 0.5524 | 0.4747 | 0.6032 |
- | 0.694 | 13.0 | 130 | 1.0688 | {'precision': 0.3702451394759087, 'recall': 0.5414091470951793, 'f1': 0.43975903614457834, 'number': 809} | {'precision': 0.345679012345679, 'recall': 0.23529411764705882, 'f1': 0.27999999999999997, 'number': 119} | {'precision': 0.5052041633306645, 'recall': 0.5924882629107981, 'f1': 0.5453759723422645, 'number': 1065} | 0.4365 | 0.5504 | 0.4869 | 0.6148 |
- | 0.6617 | 14.0 | 140 | 1.0465 | {'precision': 0.3598901098901099, 'recall': 0.4857849196538937, 'f1': 0.41346659652814305, 'number': 809} | {'precision': 0.3411764705882353, 'recall': 0.24369747899159663, 'f1': 0.28431372549019607, 'number': 119} | {'precision': 0.48916184971098264, 'recall': 0.6356807511737089, 'f1': 0.5528787260106166, 'number': 1065} | 0.4291 | 0.5514 | 0.4827 | 0.6191 |
- | 0.6536 | 15.0 | 150 | 1.0745 | {'precision': 0.3554006968641115, 'recall': 0.5043263288009888, 'f1': 0.41696474195196725, 'number': 809} | {'precision': 0.3411764705882353, 'recall': 0.24369747899159663, 'f1': 0.28431372549019607, 'number': 119} | {'precision': 0.4910979228486647, 'recall': 0.6215962441314554, 'f1': 0.5486945710733527, 'number': 1065} | 0.4258 | 0.5514 | 0.4805 | 0.6117 |
 
 
  ### Framework versions
 
  - Transformers 4.38.2
- - Pytorch 2.1.0+cu121
  - Datasets 2.18.0
  - Tokenizers 0.15.2
 
 
  This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the funsd dataset.
  It achieves the following results on the evaluation set:
+ - Loss: 1.1352
+ - Answer: {'precision': 0.38767395626242546, 'recall': 0.4820766378244747, 'f1': 0.42975206611570255, 'number': 809}
+ - Header: {'precision': 0.3181818181818182, 'recall': 0.23529411764705882, 'f1': 0.27053140096618356, 'number': 119}
+ - Question: {'precision': 0.4954954954954955, 'recall': 0.6197183098591549, 'f1': 0.5506883604505632, 'number': 1065}
+ - Overall Precision: 0.4444
+ - Overall Recall: 0.5409
+ - Overall F1: 0.4879
+ - Overall Accuracy: 0.6048
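As a quick consistency check, each per-type F1 above is the harmonic mean of its precision and recall; a minimal sketch using the Answer scores:

```python
import math

# Precision and recall for the "Answer" type, copied from the
# evaluation results above.
precision = 0.38767395626242546
recall = 0.4820766378244747

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)

assert math.isclose(f1, 0.42975206611570255, rel_tol=1e-9)
```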
 
  ## Model description
 
 
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - num_epochs: 15
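With `lr_scheduler_type: linear`, the learning rate decays linearly to zero over the run, here 150 optimizer steps (15 epochs × 10 steps per epoch, per the results table). A minimal sketch of that schedule, assuming no warmup and a placeholder initial rate of 2e-05 (the actual value is not shown in this excerpt):

```python
def linear_lr(step: int, total_steps: int = 150, base_lr: float = 2e-05) -> float:
    """Learning rate after `step` optimizer steps under a linear decay
    schedule with no warmup. base_lr is a placeholder assumption; the
    actual initial rate is not shown in this excerpt."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

assert linear_lr(0) == 2e-05    # full rate at the start
assert linear_lr(75) == 1e-05   # halfway through, half the rate
assert linear_lr(150) == 0.0    # fully decayed at the end
```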
 
 
  ### Training results
 
+ | Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
+ | 1.7173 | 1.0 | 10 | 1.5055 | {'precision': 0.036076662908680945, 'recall': 0.03955500618046971, 'f1': 0.03773584905660377, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.2727272727272727, 'recall': 0.18873239436619718, 'f1': 0.22308546059933407, 'number': 1065} | 0.1435 | 0.1169 | 0.1288 | 0.3597 |
+ | 1.4183 | 2.0 | 20 | 1.3144 | {'precision': 0.18861414606095459, 'recall': 0.4054388133498146, 'f1': 0.2574568288854003, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.24258404746209625, 'recall': 0.3455399061032864, 'f1': 0.28505034856700234, 'number': 1065} | 0.2136 | 0.3492 | 0.2650 | 0.4353 |
+ | 1.257 | 3.0 | 30 | 1.1761 | {'precision': 0.2615723732549596, 'recall': 0.4400494437577256, 'f1': 0.32811059907834106, 'number': 809} | {'precision': 0.05, 'recall': 0.01680672268907563, 'f1': 0.025157232704402517, 'number': 119} | {'precision': 0.36586863106200124, 'recall': 0.5596244131455399, 'f1': 0.44246473645137346, 'number': 1065} | 0.3149 | 0.4787 | 0.3799 | 0.5312 |
+ | 1.1319 | 4.0 | 40 | 1.0879 | {'precision': 0.3036978756884343, 'recall': 0.47713226205191595, 'f1': 0.3711538461538461, 'number': 809} | {'precision': 0.2345679012345679, 'recall': 0.15966386554621848, 'f1': 0.18999999999999997, 'number': 119} | {'precision': 0.42283464566929135, 'recall': 0.504225352112676, 'f1': 0.4599571734475375, 'number': 1065} | 0.3593 | 0.4727 | 0.4082 | 0.5793 |
+ | 1.0046 | 5.0 | 50 | 1.1292 | {'precision': 0.32560137457044674, 'recall': 0.4684796044499382, 'f1': 0.38418651799290426, 'number': 809} | {'precision': 0.25301204819277107, 'recall': 0.17647058823529413, 'f1': 0.20792079207920794, 'number': 119} | {'precision': 0.4408831908831909, 'recall': 0.5812206572769953, 'f1': 0.5014175779667881, 'number': 1065} | 0.3844 | 0.5113 | 0.4388 | 0.5817 |
+ | 0.9305 | 6.0 | 60 | 1.1583 | {'precision': 0.3395311236863379, 'recall': 0.519159456118665, 'f1': 0.41055718475073316, 'number': 809} | {'precision': 0.2835820895522388, 'recall': 0.15966386554621848, 'f1': 0.2043010752688172, 'number': 119} | {'precision': 0.4719387755102041, 'recall': 0.5211267605633803, 'f1': 0.49531459170013387, 'number': 1065} | 0.4008 | 0.4987 | 0.4444 | 0.5817 |
+ | 0.8843 | 7.0 | 70 | 1.1142 | {'precision': 0.32987551867219916, 'recall': 0.3930778739184178, 'f1': 0.3587140439932318, 'number': 809} | {'precision': 0.25287356321839083, 'recall': 0.18487394957983194, 'f1': 0.21359223300970878, 'number': 119} | {'precision': 0.41626794258373206, 'recall': 0.6535211267605634, 'f1': 0.5085860431128973, 'number': 1065} | 0.3805 | 0.5198 | 0.4394 | 0.5831 |
+ | 0.8326 | 8.0 | 80 | 1.0891 | {'precision': 0.33364661654135336, 'recall': 0.4388133498145859, 'f1': 0.3790710090763481, 'number': 809} | {'precision': 0.26582278481012656, 'recall': 0.17647058823529413, 'f1': 0.2121212121212121, 'number': 119} | {'precision': 0.42464040025015637, 'recall': 0.6375586854460094, 'f1': 0.5097597597597597, 'number': 1065} | 0.3848 | 0.5294 | 0.4456 | 0.5943 |
+ | 0.7867 | 9.0 | 90 | 1.1168 | {'precision': 0.36489151873767256, 'recall': 0.4573547589616811, 'f1': 0.40592430060340096, 'number': 809} | {'precision': 0.27835051546391754, 'recall': 0.226890756302521, 'f1': 0.25, 'number': 119} | {'precision': 0.4975845410628019, 'recall': 0.5802816901408451, 'f1': 0.5357607282184654, 'number': 1065} | 0.4314 | 0.5093 | 0.4671 | 0.5919 |
+ | 0.7846 | 10.0 | 100 | 1.1754 | {'precision': 0.38025415444770283, 'recall': 0.48084054388133496, 'f1': 0.42467248908296945, 'number': 809} | {'precision': 0.3614457831325301, 'recall': 0.25210084033613445, 'f1': 0.297029702970297, 'number': 119} | {'precision': 0.5054945054945055, 'recall': 0.5615023474178403, 'f1': 0.5320284697508897, 'number': 1065} | 0.4443 | 0.5103 | 0.4750 | 0.5923 |
+ | 0.711 | 11.0 | 110 | 1.1427 | {'precision': 0.3814968814968815, 'recall': 0.453646477132262, 'f1': 0.41445511010728403, 'number': 809} | {'precision': 0.32967032967032966, 'recall': 0.25210084033613445, 'f1': 0.28571428571428575, 'number': 119} | {'precision': 0.4864667154352597, 'recall': 0.6244131455399061, 'f1': 0.5468750000000001, 'number': 1065} | 0.4388 | 0.5329 | 0.4813 | 0.6085 |
+ | 0.7118 | 12.0 | 120 | 1.1172 | {'precision': 0.36363636363636365, 'recall': 0.4796044499381953, 'f1': 0.4136460554371002, 'number': 809} | {'precision': 0.3764705882352941, 'recall': 0.2689075630252101, 'f1': 0.3137254901960785, 'number': 119} | {'precision': 0.47493036211699163, 'recall': 0.64037558685446, 'f1': 0.5453818472610956, 'number': 1065} | 0.4258 | 0.5529 | 0.4811 | 0.6020 |
+ | 0.6891 | 13.0 | 130 | 1.1580 | {'precision': 0.3810375670840787, 'recall': 0.5265760197775031, 'f1': 0.44213803840166066, 'number': 809} | {'precision': 0.3146067415730337, 'recall': 0.23529411764705882, 'f1': 0.2692307692307692, 'number': 119} | {'precision': 0.5264527320034692, 'recall': 0.5699530516431925, 'f1': 0.5473399458972048, 'number': 1065} | 0.4496 | 0.5324 | 0.4875 | 0.6035 |
+ | 0.6544 | 14.0 | 140 | 1.1198 | {'precision': 0.38986556359875907, 'recall': 0.46600741656365885, 'f1': 0.4245495495495496, 'number': 809} | {'precision': 0.3333333333333333, 'recall': 0.24369747899159663, 'f1': 0.2815533980582524, 'number': 119} | {'precision': 0.48421807747489237, 'recall': 0.6338028169014085, 'f1': 0.5490036600244002, 'number': 1065} | 0.4416 | 0.5424 | 0.4868 | 0.6037 |
+ | 0.6515 | 15.0 | 150 | 1.1352 | {'precision': 0.38767395626242546, 'recall': 0.4820766378244747, 'f1': 0.42975206611570255, 'number': 809} | {'precision': 0.3181818181818182, 'recall': 0.23529411764705882, 'f1': 0.27053140096618356, 'number': 119} | {'precision': 0.4954954954954955, 'recall': 0.6197183098591549, 'f1': 0.5506883604505632, 'number': 1065} | 0.4444 | 0.5409 | 0.4879 | 0.6048 |
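The Overall Precision/Recall columns are micro-averages over all predicted entities, not a mean of the per-type scores. The true-positive/false-positive/false-negative counts below are inferred from the reported per-type precision, recall, and support in the final row (e.g. Answer recall 0.4821 × 809 ≈ 390 true positives); treat them as a reconstruction, not logged values:

```python
# TP/FP/FN per entity type, reconstructed from the per-type
# precision/recall/support of the epoch-15 row above (an inference,
# not logged values).
counts = {
    "answer":   {"tp": 390, "fp": 616, "fn": 419},  # support 809, 1006 predicted
    "header":   {"tp": 28,  "fp": 60,  "fn": 91},   # support 119, 88 predicted
    "question": {"tp": 660, "fp": 672, "fn": 405},  # support 1065, 1332 predicted
}

# Micro-averaging pools the counts across types before dividing.
tp = sum(c["tp"] for c in counts.values())
fp = sum(c["fp"] for c in counts.values())
fn = sum(c["fn"] for c in counts.values())

overall_precision = tp / (tp + fp)  # ≈ 0.4444
overall_recall = tp / (tp + fn)     # ≈ 0.5409
overall_f1 = (2 * overall_precision * overall_recall
              / (overall_precision + overall_recall))  # ≈ 0.4879
```

Rounded to four decimals, these reproduce the Overall Precision, Recall, and F1 reported for epoch 15.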
 
 
  ### Framework versions
 
  - Transformers 4.38.2
+ - Pytorch 2.2.1+cu121
  - Datasets 2.18.0
  - Tokenizers 0.15.2
logs/events.out.tfevents.1710596829.70b3bcc79238.2120.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:2d6ee5d3dc74144127b3d6ccfb1921898783257ac41f90183fca02ed7c6fa58e
- size 13955
 
  version https://git-lfs.github.com/spec/v1
+ oid sha256:9faced30fbb5c97ee0075bcffbc0602cdf1033e1d82425ba1041a92a9ac9588a
+ size 15739
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:bdfdbbc6c9b6741286cacfb9a6488e4213102eba59e44cec24a8c68d67b9c8ca
  size 450558212
 
  version https://git-lfs.github.com/spec/v1
+ oid sha256:8f0cd9e6c6799ca0b0125bd8d2488400ebdae102acfffbd1dfd06a7f75149555
  size 450558212