Sebabrata committed ff27899 (1 parent: f7a2c7e): update model card README.md
---
license: cc-by-nc-sa-4.0
tags:
- generated_from_trainer
model-index:
- name: lmv2-g-paystb-999-doc-09-11
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# lmv2-g-paystb-999-doc-09-11

This model is a fine-tuned version of [microsoft/layoutlmv2-base-uncased](https://huggingface.co/microsoft/layoutlmv2-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1207
- Employee Address Precision: 0.8152
- Employee Address Recall: 0.8523
- Employee Address F1: 0.8333
- Employee Address Number: 88
- Employee Name Precision: 0.9511
- Employee Name Recall: 0.9722
- Employee Name F1: 0.9615
- Employee Name Number: 180
- Employer Address Precision: 0.8151
- Employer Address Recall: 0.8509
- Employer Address F1: 0.8326
- Employer Address Number: 114
- Employer Name Precision: 0.8564
- Employer Name Recall: 0.8857
- Employer Name F1: 0.8708
- Employer Name Number: 175
- Gross Pay Precision: 0.8976
- Gross Pay Recall: 0.8324
- Gross Pay F1: 0.8638
- Gross Pay Number: 179
- Net Pay Precision: 0.8994
- Net Pay Recall: 0.8846
- Net Pay F1: 0.8920
- Net Pay Number: 182
- Pay Date Precision: 0.9107
- Pay Date Recall: 0.8718
- Pay Date F1: 0.8908
- Pay Date Number: 117
- Ssn Number Precision: 0.9032
- Ssn Number Recall: 0.9032
- Ssn Number F1: 0.9032
- Ssn Number Number: 31
- Overall Precision: 0.8853
- Overall Recall: 0.8837
- Overall F1: 0.8845
- Overall Accuracy: 0.9834

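Each per-entity F1 above is the harmonic mean of that entity's precision and recall, and "Number" is the entity's support (how many gold spans it has in the evaluation set). A quick sketch to check, e.g., the Employee Address and Employee Name figures:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Employee Address: precision 0.8152, recall 0.8523
print(round(f1_score(0.8152, 0.8523), 4))  # → 0.8333

# Employee Name: precision 0.9511, recall 0.9722
print(round(f1_score(0.9511, 0.9722), 4))  # → 0.9615
```

Both match the F1 values listed above to four decimal places.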
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- num_epochs: 30

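Taken together, these settings imply the step counts seen in the results table: 799 optimizer steps per epoch and 23,970 total. A minimal sketch of that arithmetic, assuming the per-epoch step count comes from roughly 799 training examples at batch size 1 with no gradient accumulation (the dataset size itself is not stated in the card):

```python
# Hyperparameters from the card; the training-set size of 799 examples is
# an assumption inferred from the 799 optimizer steps per epoch at batch size 1.
train_examples = 799
train_batch_size = 1
num_epochs = 30
learning_rate = 4e-05

steps_per_epoch = train_examples // train_batch_size  # 799
total_steps = steps_per_epoch * num_epochs            # 23970

# lr_scheduler_type "constant" means no warmup and no decay: the learning
# rate is the same 4e-05 at every optimizer step.
lr_schedule = [learning_rate for _ in range(total_steps)]

print(steps_per_epoch, total_steps)  # → 799 23970
```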
### Training results

| Training Loss | Epoch | Step | Validation Loss | Employee Address Precision | Employee Address Recall | Employee Address F1 | Employee Address Number | Employee Name Precision | Employee Name Recall | Employee Name F1 | Employee Name Number | Employer Address Precision | Employer Address Recall | Employer Address F1 | Employer Address Number | Employer Name Precision | Employer Name Recall | Employer Name F1 | Employer Name Number | Gross Pay Precision | Gross Pay Recall | Gross Pay F1 | Gross Pay Number | Net Pay Precision | Net Pay Recall | Net Pay F1 | Net Pay Number | Pay Date Precision | Pay Date Recall | Pay Date F1 | Pay Date Number | Ssn Number Precision | Ssn Number Recall | Ssn Number F1 | Ssn Number Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------------------------:|:-----------------------:|:-------------------:|:-----------------------:|:-----------------------:|:--------------------:|:----------------:|:--------------------:|:--------------------------:|:-----------------------:|:-------------------:|:-----------------------:|:-----------------------:|:--------------------:|:----------------:|:--------------------:|:-------------------:|:----------------:|:------------:|:----------------:|:-----------------:|:--------------:|:----------:|:--------------:|:------------------:|:---------------:|:-----------:|:---------------:|:--------------------:|:-----------------:|:-------------:|:-----------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 0.8872 | 1.0 | 799 | 0.3053 | 0.0 | 0.0 | 0.0 | 88 | 0.5146 | 0.5889 | 0.5492 | 180 | 0.4213 | 0.7281 | 0.5338 | 114 | 0.6468 | 0.7429 | 0.6915 | 175 | 0.0 | 0.0 | 0.0 | 179 | 0.4143 | 0.4780 | 0.4439 | 182 | 0.0 | 0.0 | 0.0 | 117 | 0.8148 | 0.7097 | 0.7586 | 31 | 0.5089 | 0.4015 | 0.4489 | 0.9454 |
| 0.2088 | 2.0 | 1598 | 0.1402 | 0.7527 | 0.7955 | 0.7735 | 88 | 0.9270 | 0.9167 | 0.9218 | 180 | 0.7209 | 0.8158 | 0.7654 | 114 | 0.7949 | 0.8857 | 0.8378 | 175 | 0.8425 | 0.6872 | 0.7569 | 179 | 0.8079 | 0.7857 | 0.7967 | 182 | 0.7068 | 0.8034 | 0.752 | 117 | 0.8889 | 0.7742 | 0.8276 | 31 | 0.8043 | 0.8133 | 0.8088 | 0.9769 |
| 0.1044 | 3.0 | 2397 | 0.1181 | 0.6881 | 0.8523 | 0.7614 | 88 | 0.9227 | 0.9278 | 0.9252 | 180 | 0.7323 | 0.8158 | 0.7718 | 114 | 0.8297 | 0.8629 | 0.8459 | 175 | 0.8402 | 0.7933 | 0.8161 | 179 | 0.8132 | 0.8132 | 0.8132 | 182 | 0.8509 | 0.8291 | 0.8398 | 117 | 0.8333 | 0.8065 | 0.8197 | 31 | 0.8208 | 0.8424 | 0.8315 | 0.9780 |
| 0.0659 | 4.0 | 3196 | 0.1117 | 0.8105 | 0.875 | 0.8415 | 88 | 0.9101 | 0.9556 | 0.9322 | 180 | 0.784 | 0.8596 | 0.8201 | 114 | 0.8032 | 0.8629 | 0.8320 | 175 | 0.8247 | 0.8939 | 0.8579 | 179 | 0.7885 | 0.9011 | 0.8410 | 182 | 0.8120 | 0.9231 | 0.864 | 117 | 0.8966 | 0.8387 | 0.8667 | 31 | 0.8234 | 0.8968 | 0.8586 | 0.9791 |
| 0.0505 | 5.0 | 3995 | 0.0975 | 0.8636 | 0.8636 | 0.8636 | 88 | 0.9309 | 0.9722 | 0.9511 | 180 | 0.8115 | 0.8684 | 0.8390 | 114 | 0.8516 | 0.8857 | 0.8683 | 175 | 0.8772 | 0.8380 | 0.8571 | 179 | 0.8914 | 0.8571 | 0.8739 | 182 | 0.8333 | 0.9402 | 0.8835 | 117 | 0.8929 | 0.8065 | 0.8475 | 31 | 0.8711 | 0.8874 | 0.8792 | 0.9826 |
| 0.0407 | 6.0 | 4794 | 0.1120 | 0.7660 | 0.8182 | 0.7912 | 88 | 0.9301 | 0.9611 | 0.9454 | 180 | 0.7344 | 0.8246 | 0.7769 | 114 | 0.8245 | 0.8857 | 0.8540 | 175 | 0.8352 | 0.8492 | 0.8421 | 179 | 0.8261 | 0.8352 | 0.8306 | 182 | 0.8333 | 0.9402 | 0.8835 | 117 | 0.8788 | 0.9355 | 0.9062 | 31 | 0.8314 | 0.8790 | 0.8545 | 0.9794 |
| 0.032 | 7.0 | 5593 | 0.1585 | 0.7526 | 0.8295 | 0.7892 | 88 | 0.9454 | 0.9611 | 0.9532 | 180 | 0.8120 | 0.8333 | 0.8225 | 114 | 0.8457 | 0.8457 | 0.8457 | 175 | 0.8084 | 0.7542 | 0.7803 | 179 | 0.8426 | 0.5 | 0.6276 | 182 | 0.8271 | 0.9402 | 0.8800 | 117 | 0.8710 | 0.8710 | 0.8710 | 31 | 0.8427 | 0.7992 | 0.8204 | 0.9774 |
| 0.0298 | 8.0 | 6392 | 0.1335 | 0.7822 | 0.8977 | 0.8360 | 88 | 0.9344 | 0.95 | 0.9421 | 180 | 0.68 | 0.8947 | 0.7727 | 114 | 0.7536 | 0.8914 | 0.8168 | 175 | 0.8478 | 0.8715 | 0.8595 | 179 | 0.8876 | 0.8242 | 0.8547 | 182 | 0.8095 | 0.8718 | 0.8395 | 117 | 0.9 | 0.8710 | 0.8852 | 31 | 0.82 | 0.8846 | 0.8511 | 0.9765 |
| 0.0231 | 9.0 | 7191 | 0.1189 | 0.8315 | 0.8409 | 0.8362 | 88 | 0.9396 | 0.95 | 0.9448 | 180 | 0.7672 | 0.7807 | 0.7739 | 114 | 0.7989 | 0.84 | 0.8189 | 175 | 0.8508 | 0.8603 | 0.8556 | 179 | 0.8659 | 0.8516 | 0.8587 | 182 | 0.9217 | 0.9060 | 0.9138 | 117 | 0.9 | 0.8710 | 0.8852 | 31 | 0.8578 | 0.8659 | 0.8618 | 0.9814 |
| 0.0216 | 10.0 | 7990 | 0.1207 | 0.8152 | 0.8523 | 0.8333 | 88 | 0.9511 | 0.9722 | 0.9615 | 180 | 0.8151 | 0.8509 | 0.8326 | 114 | 0.8564 | 0.8857 | 0.8708 | 175 | 0.8976 | 0.8324 | 0.8638 | 179 | 0.8994 | 0.8846 | 0.8920 | 182 | 0.9107 | 0.8718 | 0.8908 | 117 | 0.9032 | 0.9032 | 0.9032 | 31 | 0.8853 | 0.8837 | 0.8845 | 0.9834 |
| 0.0229 | 11.0 | 8789 | 0.1297 | 0.8242 | 0.8523 | 0.8380 | 88 | 0.9663 | 0.9556 | 0.9609 | 180 | 0.8197 | 0.8772 | 0.8475 | 114 | 0.8772 | 0.8571 | 0.8671 | 175 | 0.8539 | 0.8492 | 0.8515 | 179 | 0.8817 | 0.8187 | 0.8490 | 182 | 0.9126 | 0.8034 | 0.8545 | 117 | 0.8056 | 0.9355 | 0.8657 | 31 | 0.8788 | 0.8640 | 0.8713 | 0.9816 |
| 0.0221 | 12.0 | 9588 | 0.1326 | 0.8462 | 0.875 | 0.8603 | 88 | 0.9318 | 0.9111 | 0.9213 | 180 | 0.7338 | 0.8947 | 0.8063 | 114 | 0.7487 | 0.8514 | 0.7968 | 175 | 0.8757 | 0.8659 | 0.8708 | 179 | 0.8871 | 0.9066 | 0.8967 | 182 | 0.8651 | 0.9316 | 0.8971 | 117 | 0.7436 | 0.9355 | 0.8286 | 31 | 0.8385 | 0.8912 | 0.8640 | 0.9810 |
| 0.0179 | 13.0 | 10387 | 0.1413 | 0.8085 | 0.8636 | 0.8352 | 88 | 0.9553 | 0.95 | 0.9526 | 180 | 0.8065 | 0.8772 | 0.8403 | 114 | 0.8098 | 0.8514 | 0.8301 | 175 | 0.8659 | 0.8659 | 0.8659 | 179 | 0.8729 | 0.8681 | 0.8705 | 182 | 0.8934 | 0.9316 | 0.9121 | 117 | 0.9333 | 0.9032 | 0.9180 | 31 | 0.8655 | 0.8874 | 0.8763 | 0.9825 |
| 0.0143 | 14.0 | 11186 | 0.1267 | 0.8315 | 0.8409 | 0.8362 | 88 | 0.9454 | 0.9611 | 0.9532 | 180 | 0.7372 | 0.8860 | 0.8048 | 114 | 0.8054 | 0.8514 | 0.8278 | 175 | 0.8043 | 0.8268 | 0.8154 | 179 | 0.7861 | 0.8681 | 0.8251 | 182 | 0.8889 | 0.8889 | 0.8889 | 117 | 0.8571 | 0.9677 | 0.9091 | 31 | 0.8285 | 0.8790 | 0.8530 | 0.9810 |
| 0.0139 | 15.0 | 11985 | 0.1592 | 0.7449 | 0.8295 | 0.7849 | 88 | 0.9355 | 0.9667 | 0.9508 | 180 | 0.8319 | 0.8246 | 0.8282 | 114 | 0.8125 | 0.8171 | 0.8148 | 175 | 0.8708 | 0.8659 | 0.8683 | 179 | 0.8944 | 0.8846 | 0.8895 | 182 | 0.9027 | 0.8718 | 0.8870 | 117 | 0.8788 | 0.9355 | 0.9062 | 31 | 0.8644 | 0.8734 | 0.8689 | 0.9807 |
| 0.0163 | 16.0 | 12784 | 0.1449 | 0.6496 | 0.8636 | 0.7415 | 88 | 0.8687 | 0.9556 | 0.9101 | 180 | 0.8448 | 0.8596 | 0.8522 | 114 | 0.7935 | 0.8343 | 0.8134 | 175 | 0.8168 | 0.8715 | 0.8432 | 179 | 0.8557 | 0.9121 | 0.8830 | 182 | 0.8413 | 0.9060 | 0.8724 | 117 | 0.8788 | 0.9355 | 0.9062 | 31 | 0.8188 | 0.8902 | 0.8530 | 0.9792 |
| 0.013 | 17.0 | 13583 | 0.1449 | 0.7353 | 0.8523 | 0.7895 | 88 | 0.9422 | 0.9056 | 0.9235 | 180 | 0.808 | 0.8860 | 0.8452 | 114 | 0.8207 | 0.8629 | 0.8412 | 175 | 0.8531 | 0.8436 | 0.8483 | 179 | 0.9416 | 0.7967 | 0.8631 | 182 | 0.9304 | 0.9145 | 0.9224 | 117 | 0.8571 | 0.9677 | 0.9091 | 31 | 0.8667 | 0.8659 | 0.8663 | 0.9805 |
| 0.0107 | 18.0 | 14382 | 0.1510 | 0.7315 | 0.8977 | 0.8061 | 88 | 0.9048 | 0.95 | 0.9268 | 180 | 0.8 | 0.8421 | 0.8205 | 114 | 0.8152 | 0.8571 | 0.8357 | 175 | 0.8844 | 0.8547 | 0.8693 | 179 | 0.8486 | 0.8626 | 0.8556 | 182 | 0.9076 | 0.9231 | 0.9153 | 117 | 0.8824 | 0.9677 | 0.9231 | 31 | 0.8489 | 0.8856 | 0.8669 | 0.9816 |
| 0.0108 | 19.0 | 15181 | 0.1424 | 0.7979 | 0.8523 | 0.8242 | 88 | 0.8895 | 0.9389 | 0.9135 | 180 | 0.8333 | 0.8772 | 0.8547 | 114 | 0.8075 | 0.8629 | 0.8343 | 175 | 0.8636 | 0.8492 | 0.8563 | 179 | 0.8596 | 0.8407 | 0.8500 | 182 | 0.9008 | 0.9316 | 0.9160 | 117 | 0.9032 | 0.9032 | 0.9032 | 31 | 0.8541 | 0.8790 | 0.8664 | 0.9829 |
| 0.0122 | 20.0 | 15980 | 0.1525 | 0.7228 | 0.8295 | 0.7725 | 88 | 0.9185 | 0.9389 | 0.9286 | 180 | 0.792 | 0.8684 | 0.8285 | 114 | 0.7513 | 0.8114 | 0.7802 | 175 | 0.8523 | 0.8380 | 0.8451 | 179 | 0.8870 | 0.8626 | 0.8747 | 182 | 0.8468 | 0.8974 | 0.8714 | 117 | 0.9655 | 0.9032 | 0.9333 | 31 | 0.8353 | 0.8659 | 0.8503 | 0.9813 |
| 0.0238 | 21.0 | 16779 | 0.1477 | 0.8605 | 0.8409 | 0.8506 | 88 | 0.9441 | 0.9389 | 0.9415 | 180 | 0.8279 | 0.8860 | 0.8559 | 114 | 0.8177 | 0.8457 | 0.8315 | 175 | 0.8523 | 0.8380 | 0.8451 | 179 | 0.9186 | 0.8681 | 0.8927 | 182 | 0.8417 | 0.8632 | 0.8523 | 117 | 0.8788 | 0.9355 | 0.9062 | 31 | 0.8700 | 0.8724 | 0.8712 | 0.9821 |
| 0.0124 | 22.0 | 17578 | 0.1458 | 0.7255 | 0.8409 | 0.7789 | 88 | 0.9435 | 0.9278 | 0.9356 | 180 | 0.7786 | 0.8947 | 0.8327 | 114 | 0.7968 | 0.8514 | 0.8232 | 175 | 0.8659 | 0.8659 | 0.8659 | 179 | 0.9029 | 0.8681 | 0.8852 | 182 | 0.8889 | 0.8889 | 0.8889 | 117 | 0.9032 | 0.9032 | 0.9032 | 31 | 0.8526 | 0.8790 | 0.8656 | 0.9819 |
| 0.0116 | 23.0 | 18377 | 0.1504 | 0.7895 | 0.8523 | 0.8197 | 88 | 0.9399 | 0.9556 | 0.9477 | 180 | 0.8065 | 0.8772 | 0.8403 | 114 | 0.7525 | 0.8686 | 0.8064 | 175 | 0.8655 | 0.8268 | 0.8457 | 179 | 0.8441 | 0.8626 | 0.8533 | 182 | 0.8 | 0.9231 | 0.8571 | 117 | 0.8333 | 0.9677 | 0.8955 | 31 | 0.8322 | 0.8837 | 0.8571 | 0.9803 |
| 0.0093 | 24.0 | 19176 | 0.1616 | 0.75 | 0.75 | 0.75 | 88 | 0.8978 | 0.9278 | 0.9126 | 180 | 0.8393 | 0.8246 | 0.8319 | 114 | 0.8261 | 0.8686 | 0.8468 | 175 | 0.8844 | 0.8547 | 0.8693 | 179 | 0.8757 | 0.8516 | 0.8635 | 182 | 0.8992 | 0.9145 | 0.9068 | 117 | 0.9062 | 0.9355 | 0.9206 | 31 | 0.8618 | 0.8659 | 0.8638 | 0.9804 |
| 0.0104 | 25.0 | 19975 | 0.1614 | 0.8182 | 0.8182 | 0.8182 | 88 | 0.9101 | 0.9556 | 0.9322 | 180 | 0.8197 | 0.8772 | 0.8475 | 114 | 0.8466 | 0.8514 | 0.8490 | 175 | 0.8475 | 0.8380 | 0.8427 | 179 | 0.8833 | 0.8736 | 0.8785 | 182 | 0.8548 | 0.9060 | 0.8797 | 117 | 0.8529 | 0.9355 | 0.8923 | 31 | 0.8596 | 0.8790 | 0.8692 | 0.9820 |
| 0.0105 | 26.0 | 20774 | 0.1362 | 0.7978 | 0.8068 | 0.8023 | 88 | 0.9076 | 0.9278 | 0.9176 | 180 | 0.8534 | 0.8684 | 0.8609 | 114 | 0.8541 | 0.9029 | 0.8778 | 175 | 0.8254 | 0.8715 | 0.8478 | 179 | 0.8852 | 0.8901 | 0.8877 | 182 | 0.9043 | 0.8889 | 0.8966 | 117 | 0.8485 | 0.9032 | 0.875 | 31 | 0.8638 | 0.8865 | 0.875 | 0.9822 |
| 0.0086 | 27.0 | 21573 | 0.1691 | 0.8172 | 0.8636 | 0.8398 | 88 | 0.9385 | 0.9333 | 0.9359 | 180 | 0.7407 | 0.8772 | 0.8032 | 114 | 0.7812 | 0.8571 | 0.8174 | 175 | 0.8539 | 0.8492 | 0.8515 | 179 | 0.875 | 0.8846 | 0.8798 | 182 | 0.9076 | 0.9231 | 0.9153 | 117 | 0.625 | 0.9677 | 0.7595 | 31 | 0.8378 | 0.8865 | 0.8614 | 0.9791 |
| 0.0092 | 28.0 | 22372 | 0.1536 | 0.7789 | 0.8409 | 0.8087 | 88 | 0.9266 | 0.9111 | 0.9188 | 180 | 0.8487 | 0.8860 | 0.8670 | 114 | 0.8588 | 0.8686 | 0.8636 | 175 | 0.8982 | 0.8380 | 0.8671 | 179 | 0.9 | 0.8901 | 0.8950 | 182 | 0.8783 | 0.8632 | 0.8707 | 117 | 0.9375 | 0.9677 | 0.9524 | 31 | 0.8795 | 0.8762 | 0.8778 | 0.9826 |
| 0.0065 | 29.0 | 23171 | 0.1676 | 0.8202 | 0.8295 | 0.8249 | 88 | 0.9444 | 0.9444 | 0.9444 | 180 | 0.7951 | 0.8509 | 0.8220 | 114 | 0.7685 | 0.8914 | 0.8254 | 175 | 0.9060 | 0.7542 | 0.8232 | 179 | 0.9153 | 0.8901 | 0.9025 | 182 | 0.9 | 0.8462 | 0.8722 | 117 | 0.9091 | 0.9677 | 0.9375 | 31 | 0.8674 | 0.8649 | 0.8661 | 0.9812 |
| 0.0118 | 30.0 | 23970 | 0.1803 | 0.8636 | 0.8636 | 0.8636 | 88 | 0.9293 | 0.95 | 0.9396 | 180 | 0.6690 | 0.8509 | 0.7490 | 114 | 0.8261 | 0.8686 | 0.8468 | 175 | 0.8 | 0.8715 | 0.8342 | 179 | 0.8421 | 0.8791 | 0.8602 | 182 | 0.8780 | 0.9231 | 0.9 | 117 | 0.8788 | 0.9355 | 0.9062 | 31 | 0.8310 | 0.8902 | 0.8596 | 0.9783 |

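The evaluation numbers reported at the top of this card match the epoch-10 row (validation loss 0.1207). Note that validation loss actually bottoms out at 0.0975 in epoch 5 and drifts upward afterwards, which suggests some overfitting in later epochs. A minimal sketch of selecting the best epoch by validation loss from the table above:

```python
# Validation losses per epoch, transcribed from the training-results table.
val_loss = {
     1: 0.3053,  2: 0.1402,  3: 0.1181,  4: 0.1117,  5: 0.0975,
     6: 0.1120,  7: 0.1585,  8: 0.1335,  9: 0.1189, 10: 0.1207,
    11: 0.1297, 12: 0.1326, 13: 0.1413, 14: 0.1267, 15: 0.1592,
    16: 0.1449, 17: 0.1449, 18: 0.1510, 19: 0.1424, 20: 0.1525,
    21: 0.1477, 22: 0.1458, 23: 0.1504, 24: 0.1616, 25: 0.1614,
    26: 0.1362, 27: 0.1691, 28: 0.1536, 29: 0.1676, 30: 0.1803,
}

# Pick the epoch with the lowest validation loss.
best_epoch = min(val_loss, key=val_loss.get)
print(best_epoch, val_loss[best_epoch])  # → 5 0.0975
```

In practice this is what `load_best_model_at_end` with `metric_for_best_model="eval_loss"` would do in the Trainer, though the card does not say whether that option was enabled here.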
### Framework versions

- Transformers 4.22.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.2.2
- Tokenizers 0.12.1