Saed2023 committed on
Commit 6741db9
1 Parent(s): b918b64

End of training
README.md CHANGED
@@ -16,14 +16,14 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [SCUT-DLVCLab/lilt-roberta-en-base](https://huggingface.co/SCUT-DLVCLab/lilt-roberta-en-base) on the funsd-layoutlmv3 dataset.
 It achieves the following results on the evaluation set:
- - Loss: 1.5275
- - Answer: {'precision': 0.8656542056074766, 'recall': 0.9069767441860465, 'f1': 0.885833831440526, 'number': 817}
- - Header: {'precision': 0.6050420168067226, 'recall': 0.6050420168067226, 'f1': 0.6050420168067226, 'number': 119}
- - Question: {'precision': 0.8847184986595175, 'recall': 0.9192200557103064, 'f1': 0.9016393442622952, 'number': 1077}
- - Overall Precision: 0.8610
- - Overall Recall: 0.8957
- - Overall F1: 0.8780
- - Overall Accuracy: 0.8063
+ - Loss: 1.8784
+ - Answer: {'precision': 0.8651817116060961, 'recall': 0.9033047735618115, 'f1': 0.8838323353293414, 'number': 817}
+ - Header: {'precision': 0.6504854368932039, 'recall': 0.5630252100840336, 'f1': 0.6036036036036037, 'number': 119}
+ - Question: {'precision': 0.9073394495412844, 'recall': 0.9182915506035283, 'f1': 0.912782648823258, 'number': 1077}
+ - Overall Precision: 0.8768
+ - Overall Recall: 0.8912
+ - Overall F1: 0.8840
+ - Overall Accuracy: 0.7948
 
 ## Model description
 
@@ -52,20 +52,20 @@ The following hyperparameters were used during training:
 
 ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
- |:-------------:|:------:|:----:|:---------------:|:--------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
- | 0.4351 | 10.53 | 200 | 1.0793 | {'precision': 0.8371824480369515, 'recall': 0.8873929008567931, 'f1': 0.8615567439096851, 'number': 817} | {'precision': 0.5833333333333334, 'recall': 0.47058823529411764, 'f1': 0.5209302325581395, 'number': 119} | {'precision': 0.8660071942446043, 'recall': 0.8941504178272981, 'f1': 0.8798538145271813, 'number': 1077} | 0.8409 | 0.8664 | 0.8534 | 0.7890 |
- | 0.0438 | 21.05 | 400 | 1.4722 | {'precision': 0.8228822882288229, 'recall': 0.9155446756425949, 'f1': 0.8667439165701043, 'number': 817} | {'precision': 0.5916666666666667, 'recall': 0.5966386554621849, 'f1': 0.5941422594142259, 'number': 119} | {'precision': 0.9014218009478673, 'recall': 0.883008356545961, 'f1': 0.8921200750469044, 'number': 1077} | 0.8493 | 0.8793 | 0.8640 | 0.7809 |
- | 0.0138 | 31.58 | 600 | 1.7220 | {'precision': 0.8554641598119859, 'recall': 0.8910648714810282, 'f1': 0.8729016786570742, 'number': 817} | {'precision': 0.5138888888888888, 'recall': 0.6218487394957983, 'f1': 0.5627376425855513, 'number': 119} | {'precision': 0.8821396192203083, 'recall': 0.903435468895079, 'f1': 0.8926605504587157, 'number': 1077} | 0.8460 | 0.8818 | 0.8635 | 0.7762 |
- | 0.0085 | 42.11 | 800 | 1.5618 | {'precision': 0.8459796149490374, 'recall': 0.9143206854345165, 'f1': 0.8788235294117647, 'number': 817} | {'precision': 0.6078431372549019, 'recall': 0.5210084033613446, 'f1': 0.5610859728506787, 'number': 119} | {'precision': 0.8905109489051095, 'recall': 0.9062209842154132, 'f1': 0.8982972848596411, 'number': 1077} | 0.8578 | 0.8867 | 0.8720 | 0.7965 |
- | 0.004 | 52.63 | 1000 | 1.5275 | {'precision': 0.8656542056074766, 'recall': 0.9069767441860465, 'f1': 0.885833831440526, 'number': 817} | {'precision': 0.6050420168067226, 'recall': 0.6050420168067226, 'f1': 0.6050420168067226, 'number': 119} | {'precision': 0.8847184986595175, 'recall': 0.9192200557103064, 'f1': 0.9016393442622952, 'number': 1077} | 0.8610 | 0.8957 | 0.8780 | 0.8063 |
- | 0.0029 | 63.16 | 1200 | 1.6222 | {'precision': 0.8576349024110218, 'recall': 0.9143206854345165, 'f1': 0.8850710900473934, 'number': 817} | {'precision': 0.6565656565656566, 'recall': 0.5462184873949579, 'f1': 0.5963302752293578, 'number': 119} | {'precision': 0.8782452999104745, 'recall': 0.9108635097493036, 'f1': 0.8942570647219691, 'number': 1077} | 0.8591 | 0.8907 | 0.8746 | 0.7879 |
- | 0.0018 | 73.68 | 1400 | 1.7916 | {'precision': 0.8514285714285714, 'recall': 0.9118727050183598, 'f1': 0.8806146572104018, 'number': 817} | {'precision': 0.5390625, 'recall': 0.5798319327731093, 'f1': 0.5587044534412955, 'number': 119} | {'precision': 0.8845096241979835, 'recall': 0.8960074280408542, 'f1': 0.890221402214022, 'number': 1077} | 0.8496 | 0.8838 | 0.8663 | 0.7738 |
- | 0.0013 | 84.21 | 1600 | 1.9358 | {'precision': 0.8530751708428246, 'recall': 0.9167686658506732, 'f1': 0.8837758112094395, 'number': 817} | {'precision': 0.6263736263736264, 'recall': 0.4789915966386555, 'f1': 0.5428571428571428, 'number': 119} | {'precision': 0.8936363636363637, 'recall': 0.9127205199628597, 'f1': 0.9030776297657327, 'number': 1077} | 0.8647 | 0.8887 | 0.8765 | 0.7762 |
- | 0.0006 | 94.74 | 1800 | 1.9653 | {'precision': 0.8605990783410138, 'recall': 0.9143206854345165, 'f1': 0.886646884272997, 'number': 817} | {'precision': 0.5818181818181818, 'recall': 0.5378151260504201, 'f1': 0.5589519650655022, 'number': 119} | {'precision': 0.8934802571166207, 'recall': 0.903435468895079, 'f1': 0.8984302862419206, 'number': 1077} | 0.8631 | 0.8862 | 0.8745 | 0.7717 |
- | 0.0005 | 105.26 | 2000 | 1.9637 | {'precision': 0.8527397260273972, 'recall': 0.9143206854345165, 'f1': 0.8824571766095688, 'number': 817} | {'precision': 0.5619047619047619, 'recall': 0.4957983193277311, 'f1': 0.5267857142857143, 'number': 119} | {'precision': 0.8894927536231884, 'recall': 0.9117920148560817, 'f1': 0.9005043558000918, 'number': 1077} | 0.8576 | 0.8882 | 0.8726 | 0.7686 |
- | 0.0003 | 115.79 | 2200 | 1.8611 | {'precision': 0.8609501738122828, 'recall': 0.9094247246022031, 'f1': 0.8845238095238096, 'number': 817} | {'precision': 0.5929203539823009, 'recall': 0.5630252100840336, 'f1': 0.5775862068965517, 'number': 119} | {'precision': 0.8886861313868614, 'recall': 0.904363974001857, 'f1': 0.8964565117349288, 'number': 1077} | 0.8610 | 0.8862 | 0.8734 | 0.7839 |
- | 0.0002 | 126.32 | 2400 | 1.9031 | {'precision': 0.860919540229885, 'recall': 0.9167686658506732, 'f1': 0.887966804979253, 'number': 817} | {'precision': 0.5887850467289719, 'recall': 0.5294117647058824, 'f1': 0.5575221238938053, 'number': 119} | {'precision': 0.8902104300091491, 'recall': 0.903435468895079, 'f1': 0.8967741935483872, 'number': 1077} | 0.8623 | 0.8867 | 0.8744 | 0.7805 |
+ | Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+ |:-------------:|:------:|:----:|:---------------:|:--------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
+ | 0.4369 | 10.53 | 200 | 0.9022 | {'precision': 0.8049065420560748, 'recall': 0.8433292533659731, 'f1': 0.8236700537955769, 'number': 817} | {'precision': 0.5317460317460317, 'recall': 0.5630252100840336, 'f1': 0.5469387755102041, 'number': 119} | {'precision': 0.8837420526793823, 'recall': 0.903435468895079, 'f1': 0.8934802571166208, 'number': 1077} | 0.8301 | 0.8589 | 0.8442 | 0.7888 |
+ | 0.047 | 21.05 | 400 | 1.3222 | {'precision': 0.8382526564344747, 'recall': 0.8690330477356181, 'f1': 0.8533653846153846, 'number': 817} | {'precision': 0.5447761194029851, 'recall': 0.6134453781512605, 'f1': 0.5770750988142292, 'number': 119} | {'precision': 0.8667866786678667, 'recall': 0.8941504178272981, 'f1': 0.8802559414990858, 'number': 1077} | 0.8346 | 0.8674 | 0.8507 | 0.7837 |
+ | 0.015 | 31.58 | 600 | 1.4745 | {'precision': 0.8549528301886793, 'recall': 0.8873929008567931, 'f1': 0.8708708708708709, 'number': 817} | {'precision': 0.5867768595041323, 'recall': 0.5966386554621849, 'f1': 0.5916666666666667, 'number': 119} | {'precision': 0.8755635707844905, 'recall': 0.9015784586815228, 'f1': 0.888380603842635, 'number': 1077} | 0.8503 | 0.8778 | 0.8638 | 0.7969 |
+ | 0.0051 | 42.11 | 800 | 1.5719 | {'precision': 0.8768472906403941, 'recall': 0.8714810281517748, 'f1': 0.8741559238796808, 'number': 817} | {'precision': 0.5736434108527132, 'recall': 0.6218487394957983, 'f1': 0.596774193548387, 'number': 119} | {'precision': 0.8794326241134752, 'recall': 0.9210770659238626, 'f1': 0.8997732426303855, 'number': 1077} | 0.8594 | 0.8833 | 0.8711 | 0.7923 |
+ | 0.0041 | 52.63 | 1000 | 1.6771 | {'precision': 0.8352402745995423, 'recall': 0.8935128518971848, 'f1': 0.8633944411590775, 'number': 817} | {'precision': 0.6568627450980392, 'recall': 0.5630252100840336, 'f1': 0.6063348416289592, 'number': 119} | {'precision': 0.8865116279069768, 'recall': 0.8848653667595172, 'f1': 0.8856877323420075, 'number': 1077} | 0.8532 | 0.8693 | 0.8612 | 0.7877 |
+ | 0.0039 | 63.16 | 1200 | 1.6064 | {'precision': 0.8609112709832134, 'recall': 0.8788249694002448, 'f1': 0.8697758933979407, 'number': 817} | {'precision': 0.6106194690265486, 'recall': 0.5798319327731093, 'f1': 0.5948275862068966, 'number': 119} | {'precision': 0.8897777777777778, 'recall': 0.9294336118848654, 'f1': 0.9091734786557675, 'number': 1077} | 0.8629 | 0.8882 | 0.8754 | 0.8009 |
+ | 0.0019 | 73.68 | 1400 | 1.7674 | {'precision': 0.8533178114086146, 'recall': 0.8971848225214198, 'f1': 0.8747016706443913, 'number': 817} | {'precision': 0.5769230769230769, 'recall': 0.5042016806722689, 'f1': 0.5381165919282511, 'number': 119} | {'precision': 0.8842676311030742, 'recall': 0.9080779944289693, 'f1': 0.8960146587265231, 'number': 1077} | 0.8560 | 0.8798 | 0.8677 | 0.7981 |
+ | 0.0007 | 84.21 | 1600 | 1.8380 | {'precision': 0.8469387755102041, 'recall': 0.9143206854345165, 'f1': 0.8793407886992348, 'number': 817} | {'precision': 0.6017699115044248, 'recall': 0.5714285714285714, 'f1': 0.5862068965517241, 'number': 119} | {'precision': 0.8931159420289855, 'recall': 0.9155060352831941, 'f1': 0.9041723979825768, 'number': 1077} | 0.8580 | 0.8947 | 0.8760 | 0.7931 |
+ | 0.0007 | 94.74 | 1800 | 1.8108 | {'precision': 0.8600478468899522, 'recall': 0.8800489596083231, 'f1': 0.8699334543254689, 'number': 817} | {'precision': 0.6435643564356436, 'recall': 0.5462184873949579, 'f1': 0.5909090909090908, 'number': 119} | {'precision': 0.8722849695916595, 'recall': 0.9322191272051996, 'f1': 0.9012567324955117, 'number': 1077} | 0.8563 | 0.8882 | 0.8720 | 0.7887 |
+ | 0.0004 | 105.26 | 2000 | 1.9035 | {'precision': 0.8627906976744186, 'recall': 0.9082007343941249, 'f1': 0.8849135360763267, 'number': 817} | {'precision': 0.6285714285714286, 'recall': 0.5546218487394958, 'f1': 0.5892857142857143, 'number': 119} | {'precision': 0.8955495004541326, 'recall': 0.9155060352831941, 'f1': 0.9054178145087237, 'number': 1077} | 0.8683 | 0.8912 | 0.8796 | 0.7965 |
+ | 0.0002 | 115.79 | 2200 | 1.8784 | {'precision': 0.8651817116060961, 'recall': 0.9033047735618115, 'f1': 0.8838323353293414, 'number': 817} | {'precision': 0.6504854368932039, 'recall': 0.5630252100840336, 'f1': 0.6036036036036037, 'number': 119} | {'precision': 0.9073394495412844, 'recall': 0.9182915506035283, 'f1': 0.912782648823258, 'number': 1077} | 0.8768 | 0.8912 | 0.8840 | 0.7948 |
+ | 0.0002 | 126.32 | 2400 | 1.9075 | {'precision': 0.8640093786635404, 'recall': 0.9020807833537332, 'f1': 0.8826347305389222, 'number': 817} | {'precision': 0.6296296296296297, 'recall': 0.5714285714285714, 'f1': 0.5991189427312775, 'number': 119} | {'precision': 0.9041970802919708, 'recall': 0.9201485608170845, 'f1': 0.9121030832949838, 'number': 1077} | 0.8731 | 0.8922 | 0.8826 | 0.7959 |
 
 
 ### Framework versions
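Not part of the commit: the metrics in the updated card are internally consistent, assuming the per-entity `f1` is the harmonic mean of precision and recall and the "Overall" numbers are micro-averages over the three entity types (the convention used by seqeval-style evaluation). A quick sketch checking this against the final evaluation entries; the helper and dictionary names below are ours:

```python
# Sanity-check the final metrics copied from the updated model card.
# Assumption: f1 = 2pr / (p + r) per entity, and "Overall" values are
# micro-averages (pooled true-positive / predicted / gold counts).

def f1(p: float, r: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * p * r / (p + r)

# Final evaluation entries from the updated card
entities = {
    "Answer":   {"precision": 0.8651817116060961, "recall": 0.9033047735618115,
                 "f1": 0.8838323353293414, "number": 817},
    "Header":   {"precision": 0.6504854368932039, "recall": 0.5630252100840336,
                 "f1": 0.6036036036036037, "number": 119},
    "Question": {"precision": 0.9073394495412844, "recall": 0.9182915506035283,
                 "f1": 0.912782648823258,  "number": 1077},
}

# Per-entity f1 is the harmonic mean of precision and recall
for m in entities.values():
    assert abs(f1(m["precision"], m["recall"]) - m["f1"]) < 1e-9

# Recover integer counts: tp = recall * gold, predicted = tp / precision
tp = sum(round(m["recall"] * m["number"]) for m in entities.values())
pred = sum(round(m["recall"] * m["number"] / m["precision"])
           for m in entities.values())
gold = sum(m["number"] for m in entities.values())

assert round(tp / pred, 4) == 0.8768              # Overall Precision
assert round(tp / gold, 4) == 0.8912              # Overall Recall
assert round(f1(tp / pred, tp / gold), 4) == 0.8840  # Overall F1
```

(Overall Accuracy is token-level and cannot be reconstructed from these entity counts alone.)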
logs/events.out.tfevents.1687183382.8c93e043151d.4792.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:72bfed2abb88ca630026ff7f4bd8ec5871a57c3bada01cc8b6e3ff1878ba12a7
- size 12340
+ oid sha256:fcb9c75bbd8b4487cabbbb86102a504949700c70d0ba5852f5c28cb38815e5e8
+ size 12694
logs/events.out.tfevents.1687185248.8c93e043151d.4792.1 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d44dbc9ec65f9d4cc72fd6742caed0da97dcdf1683a74d146b2b04c6dc0fe085
+ size 592
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:8a7020622d6aae29168e0b8cbe8047f143c9c0dbe0adb995a7657e3eebda17cb
+ oid sha256:2c16aab1a6da7b03f266853d58d06e280a9614ac0991e9bee72920e7495b7b36
 size 520821201