---
license: cc-by-nc-sa-4.0
tags:
- generated_from_trainer
model-index:
- name: lmv2-g-bnkstm-994-doc-09-10
  results: []
---
# lmv2-g-bnkstm-994-doc-09-10
This model is a fine-tuned version of [microsoft/layoutlmv2-base-uncased](https://huggingface.co/microsoft/layoutlmv2-base-uncased) on the None dataset.
It achieves the following results on the evaluation set (an illustrative inference sketch follows the metric list):
- Loss: 0.0926
- Account Number Precision: 0.8889
- Account Number Recall: 0.9014
- Account Number F1: 0.8951
- Account Number Number: 142
- Bank Name Precision: 0.7993
- Bank Name Recall: 0.8484
- Bank Name F1: 0.8231
- Bank Name Number: 277
- Cust Address Precision: 0.8563
- Cust Address Recall: 0.8827
- Cust Address F1: 0.8693
- Cust Address Number: 162
- Cust Name Precision: 0.9181
- Cust Name Recall: 0.9290
- Cust Name F1: 0.9235
- Cust Name Number: 169
- Ending Balance Precision: 0.7706
- Ending Balance Recall: 0.7892
- Ending Balance F1: 0.7798
- Ending Balance Number: 166
- Starting Balance Precision: 0.9051
- Starting Balance Recall: 0.8720
- Starting Balance F1: 0.8882
- Starting Balance Number: 164
- Statement Date Precision: 0.8817
- Statement Date Recall: 0.8765
- Statement Date F1: 0.8791
- Statement Date Number: 170
- Overall Precision: 0.8531
- Overall Recall: 0.8688
- Overall F1: 0.8609
- Overall Accuracy: 0.9850
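
As a usage illustration (not part of the original card), the sketch below shows how a LayoutLMv2 token-classification checkpoint like this one could be loaded for inference with Hugging Face Transformers. The checkpoint identifier, the input image path, and the OCR dependencies (`detectron2`, `pytesseract`) required by `LayoutLMv2Processor` are assumptions rather than details stated in this card.

```python
# Minimal inference sketch, assuming the checkpoint is available locally or on the Hub
# under "lmv2-g-bnkstm-994-doc-09-10" and that detectron2 + pytesseract are installed
# (LayoutLMv2Processor runs Tesseract OCR on the image by default).
import torch
from PIL import Image
from transformers import LayoutLMv2Processor, LayoutLMv2ForTokenClassification

processor = LayoutLMv2Processor.from_pretrained("microsoft/layoutlmv2-base-uncased")
model = LayoutLMv2ForTokenClassification.from_pretrained("lmv2-g-bnkstm-994-doc-09-10")

image = Image.open("statement_page.png").convert("RGB")  # hypothetical bank-statement scan
encoding = processor(image, return_tensors="pt")          # OCR words + bounding boxes + pixels

with torch.no_grad():
    outputs = model(**encoding)

# Map each token's highest-scoring class id to its label name.
pred_ids = outputs.logits.argmax(-1).squeeze().tolist()
tokens = processor.tokenizer.convert_ids_to_tokens(encoding["input_ids"].squeeze().tolist())
for token, pred in zip(tokens, pred_ids):
    print(token, model.config.id2label[pred])
```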
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the illustrative `TrainingArguments` sketch after this list):
- learning_rate: 4e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- num_epochs: 30
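
For readability, here is a hedged sketch of how these values map onto `transformers.TrainingArguments`; the `output_dir` and the surrounding `Trainer` setup are illustrative assumptions, not taken from the original training script.

```python
# Hedged sketch: TrainingArguments mirroring the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="lmv2-g-bnkstm-994-doc-09-10",  # hypothetical output directory
    learning_rate=4e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    lr_scheduler_type="constant",
    num_train_epochs=30,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08, as listed above.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```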
### Training results
Training Loss | Epoch | Step | Validation Loss | Account Number Precision | Account Number Recall | Account Number F1 | Account Number Number | Bank Name Precision | Bank Name Recall | Bank Name F1 | Bank Name Number | Cust Address Precision | Cust Address Recall | Cust Address F1 | Cust Address Number | Cust Name Precision | Cust Name Recall | Cust Name F1 | Cust Name Number | Ending Balance Precision | Ending Balance Recall | Ending Balance F1 | Ending Balance Number | Starting Balance Precision | Starting Balance Recall | Starting Balance F1 | Starting Balance Number | Statement Date Precision | Statement Date Recall | Statement Date F1 | Statement Date Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0.7648 | 1.0 | 795 | 0.2550 | 0.8514 | 0.4437 | 0.5833 | 142 | 0.6229 | 0.5307 | 0.5731 | 277 | 0.5650 | 0.7778 | 0.6545 | 162 | 0.6682 | 0.8698 | 0.7558 | 169 | 0.0 | 0.0 | 0.0 | 166 | 0.0 | 0.0 | 0.0 | 164 | 0.6040 | 0.3588 | 0.4502 | 170 | 0.6370 | 0.4352 | 0.5171 | 0.9623 |
0.1725 | 2.0 | 1590 | 0.1128 | 0.6067 | 0.7606 | 0.675 | 142 | 0.7294 | 0.7978 | 0.7621 | 277 | 0.8150 | 0.8704 | 0.8418 | 162 | 0.8966 | 0.9231 | 0.9096 | 169 | 0.7786 | 0.6566 | 0.7124 | 166 | 0.7576 | 0.7622 | 0.7599 | 164 | 0.8509 | 0.8059 | 0.8278 | 170 | 0.7705 | 0.7976 | 0.7838 | 0.9816 |
0.0877 | 3.0 | 2385 | 0.0877 | 0.7857 | 0.9296 | 0.8516 | 142 | 0.7872 | 0.8014 | 0.7943 | 277 | 0.7709 | 0.8519 | 0.8094 | 162 | 0.8827 | 0.9349 | 0.9080 | 169 | 0.7673 | 0.7349 | 0.7508 | 166 | 0.8313 | 0.8415 | 0.8364 | 164 | 0.7716 | 0.8941 | 0.8283 | 170 | 0.7985 | 0.8496 | 0.8233 | 0.9830 |
0.0564 | 4.0 | 3180 | 0.0826 | 0.8503 | 0.8803 | 0.8651 | 142 | 0.7566 | 0.8303 | 0.7917 | 277 | 0.7895 | 0.8333 | 0.8108 | 162 | 0.8824 | 0.8876 | 0.8850 | 169 | 0.7049 | 0.7771 | 0.7393 | 166 | 0.7717 | 0.8659 | 0.8161 | 164 | 0.8363 | 0.8412 | 0.8387 | 170 | 0.7925 | 0.8432 | 0.8171 | 0.9828 |
0.0402 | 5.0 | 3975 | 0.0889 | 0.8815 | 0.8380 | 0.8592 | 142 | 0.7758 | 0.7870 | 0.7814 | 277 | 0.8266 | 0.8827 | 0.8537 | 162 | 0.8983 | 0.9408 | 0.9191 | 169 | 0.6378 | 0.7108 | 0.6724 | 166 | 0.8707 | 0.7805 | 0.8232 | 164 | 0.8508 | 0.9059 | 0.8775 | 170 | 0.8124 | 0.8312 | 0.8217 | 0.9837 |
0.0332 | 6.0 | 4770 | 0.0864 | 0.7778 | 0.9366 | 0.8498 | 142 | 0.8175 | 0.8412 | 0.8292 | 277 | 0.8704 | 0.8704 | 0.8704 | 162 | 0.9167 | 0.9112 | 0.9139 | 169 | 0.7702 | 0.7470 | 0.7584 | 166 | 0.8424 | 0.8476 | 0.8450 | 164 | 0.8728 | 0.8882 | 0.8805 | 170 | 0.8366 | 0.86 | 0.8481 | 0.9846 |
0.0285 | 7.0 | 5565 | 0.0858 | 0.7516 | 0.8310 | 0.7893 | 142 | 0.8156 | 0.8303 | 0.8229 | 277 | 0.8373 | 0.8580 | 0.8476 | 162 | 0.9133 | 0.9349 | 0.9240 | 169 | 0.8288 | 0.7289 | 0.7756 | 166 | 0.8144 | 0.8293 | 0.8218 | 164 | 0.8353 | 0.8353 | 0.8353 | 170 | 0.8279 | 0.8352 | 0.8315 | 0.9840 |
0.027 | 8.0 | 6360 | 0.1033 | 0.8841 | 0.8592 | 0.8714 | 142 | 0.7695 | 0.8556 | 0.8103 | 277 | 0.7816 | 0.8395 | 0.8095 | 162 | 0.9075 | 0.9290 | 0.9181 | 169 | 0.8538 | 0.6687 | 0.75 | 166 | 0.8861 | 0.8537 | 0.8696 | 164 | 0.8492 | 0.8941 | 0.8711 | 170 | 0.8373 | 0.844 | 0.8406 | 0.9837 |
0.0237 | 9.0 | 7155 | 0.0922 | 0.8792 | 0.9225 | 0.9003 | 142 | 0.8262 | 0.8412 | 0.8336 | 277 | 0.8421 | 0.8889 | 0.8649 | 162 | 0.8983 | 0.9408 | 0.9191 | 169 | 0.8113 | 0.7771 | 0.7938 | 166 | 0.7641 | 0.9085 | 0.8301 | 164 | 0.8466 | 0.8765 | 0.8613 | 170 | 0.8358 | 0.8752 | 0.8550 | 0.9850 |
0.023 | 10.0 | 7950 | 0.0935 | 0.8493 | 0.8732 | 0.8611 | 142 | 0.7848 | 0.8556 | 0.8187 | 277 | 0.8246 | 0.8704 | 0.8468 | 162 | 0.9080 | 0.9349 | 0.9213 | 169 | 0.8133 | 0.7349 | 0.7722 | 166 | 0.8867 | 0.8110 | 0.8471 | 164 | 0.8735 | 0.8529 | 0.8631 | 170 | 0.8419 | 0.848 | 0.8450 | 0.9841 |
0.0197 | 11.0 | 8745 | 0.0926 | 0.8889 | 0.9014 | 0.8951 | 142 | 0.7993 | 0.8484 | 0.8231 | 277 | 0.8563 | 0.8827 | 0.8693 | 162 | 0.9181 | 0.9290 | 0.9235 | 169 | 0.7706 | 0.7892 | 0.7798 | 166 | 0.9051 | 0.8720 | 0.8882 | 164 | 0.8817 | 0.8765 | 0.8791 | 170 | 0.8531 | 0.8688 | 0.8609 | 0.9850 |
0.0193 | 12.0 | 9540 | 0.1035 | 0.7514 | 0.9366 | 0.8339 | 142 | 0.8127 | 0.8773 | 0.8438 | 277 | 0.8103 | 0.8704 | 0.8393 | 162 | 0.9405 | 0.9349 | 0.9377 | 169 | 0.6983 | 0.7530 | 0.7246 | 166 | 0.8011 | 0.8841 | 0.8406 | 164 | 0.8462 | 0.9059 | 0.8750 | 170 | 0.8081 | 0.8792 | 0.8421 | 0.9836 |
0.0166 | 13.0 | 10335 | 0.1077 | 0.8889 | 0.8451 | 0.8664 | 142 | 0.8062 | 0.8412 | 0.8233 | 277 | 0.7953 | 0.8395 | 0.8168 | 162 | 0.8786 | 0.8994 | 0.8889 | 169 | 0.8069 | 0.7048 | 0.7524 | 166 | 0.8167 | 0.8963 | 0.8547 | 164 | 0.8671 | 0.8824 | 0.8746 | 170 | 0.8333 | 0.844 | 0.8386 | 0.9836 |
0.016 | 14.0 | 11130 | 0.1247 | 0.8521 | 0.8521 | 0.8521 | 142 | 0.8456 | 0.8303 | 0.8379 | 277 | 0.8050 | 0.7901 | 0.7975 | 162 | 0.9167 | 0.9112 | 0.9139 | 169 | 0.8392 | 0.7229 | 0.7767 | 166 | 0.8521 | 0.8780 | 0.8649 | 164 | 0.9262 | 0.8118 | 0.8652 | 170 | 0.8611 | 0.828 | 0.8442 | 0.9836 |
0.0153 | 15.0 | 11925 | 0.1030 | 0.8280 | 0.9155 | 0.8696 | 142 | 0.7637 | 0.8051 | 0.7838 | 277 | 0.8452 | 0.8765 | 0.8606 | 162 | 0.9337 | 0.9172 | 0.9254 | 169 | 0.7551 | 0.6687 | 0.7093 | 166 | 0.8616 | 0.8354 | 0.8483 | 164 | 0.8287 | 0.8824 | 0.8547 | 170 | 0.8252 | 0.8384 | 0.8317 | 0.9834 |
0.0139 | 16.0 | 12720 | 0.0920 | 0.8075 | 0.9155 | 0.8581 | 142 | 0.7735 | 0.8628 | 0.8157 | 277 | 0.7663 | 0.8704 | 0.8150 | 162 | 0.8870 | 0.9290 | 0.9075 | 169 | 0.7647 | 0.7831 | 0.7738 | 166 | 0.8571 | 0.8780 | 0.8675 | 164 | 0.6630 | 0.7176 | 0.6893 | 170 | 0.7857 | 0.8504 | 0.8167 | 0.9832 |
0.0124 | 17.0 | 13515 | 0.1057 | 0.8013 | 0.8521 | 0.8259 | 142 | 0.8087 | 0.8087 | 0.8087 | 277 | 0.7663 | 0.8704 | 0.8150 | 162 | 0.9186 | 0.9349 | 0.9267 | 169 | 0.8322 | 0.7169 | 0.7702 | 166 | 0.8563 | 0.8720 | 0.8640 | 164 | 0.8603 | 0.9059 | 0.8825 | 170 | 0.8327 | 0.848 | 0.8403 | 0.9829 |
0.0135 | 18.0 | 14310 | 0.1001 | 0.8323 | 0.9085 | 0.8687 | 142 | 0.8363 | 0.8484 | 0.8423 | 277 | 0.8494 | 0.8704 | 0.8598 | 162 | 0.8462 | 0.9112 | 0.8775 | 169 | 0.7925 | 0.7590 | 0.7754 | 166 | 0.8286 | 0.8841 | 0.8555 | 164 | 0.8686 | 0.8941 | 0.8812 | 170 | 0.8368 | 0.8656 | 0.8510 | 0.9839 |
0.0125 | 19.0 | 15105 | 0.1200 | 0.8562 | 0.8803 | 0.8681 | 142 | 0.8 | 0.8520 | 0.8252 | 277 | 0.7705 | 0.8704 | 0.8174 | 162 | 0.8864 | 0.9231 | 0.9043 | 169 | 0.7716 | 0.7530 | 0.7622 | 166 | 0.8642 | 0.8537 | 0.8589 | 164 | 0.85 | 0.9 | 0.8743 | 170 | 0.8252 | 0.8608 | 0.8426 | 0.9843 |
0.0098 | 20.0 | 15900 | 0.1097 | 0.8993 | 0.8803 | 0.8897 | 142 | 0.7933 | 0.8592 | 0.8250 | 277 | 0.8144 | 0.8395 | 0.8267 | 162 | 0.8641 | 0.9408 | 0.9008 | 169 | 0.82 | 0.7410 | 0.7785 | 166 | 0.8704 | 0.8598 | 0.8650 | 164 | 0.8876 | 0.8824 | 0.8850 | 170 | 0.8434 | 0.8576 | 0.8505 | 0.9846 |
0.0128 | 21.0 | 16695 | 0.1090 | 0.8993 | 0.8803 | 0.8897 | 142 | 0.8294 | 0.8773 | 0.8526 | 277 | 0.8107 | 0.8457 | 0.8278 | 162 | 0.8678 | 0.8935 | 0.8805 | 169 | 0.8133 | 0.7349 | 0.7722 | 166 | 0.8218 | 0.8720 | 0.8462 | 164 | 0.8889 | 0.8471 | 0.8675 | 170 | 0.8446 | 0.852 | 0.8483 | 0.9838 |
0.01 | 22.0 | 17490 | 0.1280 | 0.9 | 0.8239 | 0.8603 | 142 | 0.7848 | 0.8556 | 0.8187 | 277 | 0.8057 | 0.8704 | 0.8368 | 162 | 0.8674 | 0.9290 | 0.8971 | 169 | 0.7595 | 0.7229 | 0.7407 | 166 | 0.8412 | 0.8720 | 0.8563 | 164 | 0.7989 | 0.8882 | 0.8412 | 170 | 0.8169 | 0.8528 | 0.8344 | 0.9832 |
0.0096 | 23.0 | 18285 | 0.1023 | 0.8889 | 0.9014 | 0.8951 | 142 | 0.8041 | 0.8448 | 0.8239 | 277 | 0.8253 | 0.8457 | 0.8354 | 162 | 0.8415 | 0.9112 | 0.875 | 169 | 0.7683 | 0.7590 | 0.7636 | 166 | 0.8118 | 0.8415 | 0.8263 | 164 | 0.7979 | 0.8824 | 0.8380 | 170 | 0.8170 | 0.8536 | 0.8349 | 0.9843 |
0.0088 | 24.0 | 19080 | 0.1172 | 0.8649 | 0.9014 | 0.8828 | 142 | 0.8298 | 0.8448 | 0.8372 | 277 | 0.7816 | 0.8395 | 0.8095 | 162 | 0.8674 | 0.9290 | 0.8971 | 169 | 0.7257 | 0.7651 | 0.7449 | 166 | 0.8136 | 0.8780 | 0.8446 | 164 | 0.8229 | 0.8471 | 0.8348 | 170 | 0.8155 | 0.856 | 0.8353 | 0.9829 |
0.0083 | 25.0 | 19875 | 0.1090 | 0.7401 | 0.9225 | 0.8213 | 142 | 0.8363 | 0.8484 | 0.8423 | 277 | 0.8057 | 0.8704 | 0.8368 | 162 | 0.8889 | 0.8994 | 0.8941 | 169 | 0.8176 | 0.7289 | 0.7707 | 166 | 0.7609 | 0.8537 | 0.8046 | 164 | 0.8488 | 0.8588 | 0.8538 | 170 | 0.8150 | 0.8528 | 0.8335 | 0.9830 |
0.0105 | 26.0 | 20670 | 0.1191 | 0.7241 | 0.8873 | 0.7975 | 142 | 0.7468 | 0.8412 | 0.7912 | 277 | 0.8161 | 0.8765 | 0.8452 | 162 | 0.8254 | 0.9231 | 0.8715 | 169 | 0.7384 | 0.7651 | 0.7515 | 166 | 0.8333 | 0.8537 | 0.8434 | 164 | 0.8378 | 0.9118 | 0.8732 | 170 | 0.7853 | 0.8632 | 0.8224 | 0.9814 |
0.0103 | 27.0 | 21465 | 0.1125 | 0.8378 | 0.8732 | 0.8552 | 142 | 0.8566 | 0.8628 | 0.8597 | 277 | 0.8046 | 0.8642 | 0.8333 | 162 | 0.8764 | 0.9231 | 0.8991 | 169 | 0.8289 | 0.7590 | 0.7925 | 166 | 0.8466 | 0.8415 | 0.8440 | 164 | 0.8929 | 0.8824 | 0.8876 | 170 | 0.8502 | 0.8584 | 0.8543 | 0.9847 |
0.0081 | 28.0 | 22260 | 0.1301 | 0.8601 | 0.8662 | 0.8632 | 142 | 0.8489 | 0.8520 | 0.8505 | 277 | 0.8225 | 0.8580 | 0.8399 | 162 | 0.8870 | 0.9290 | 0.9075 | 169 | 0.8067 | 0.7289 | 0.7658 | 166 | 0.8625 | 0.8415 | 0.8519 | 164 | 0.8613 | 0.8765 | 0.8688 | 170 | 0.8504 | 0.8504 | 0.8504 | 0.9850 |
0.0079 | 29.0 | 23055 | 0.1458 | 0.9104 | 0.8592 | 0.8841 | 142 | 0.8185 | 0.8303 | 0.8244 | 277 | 0.7730 | 0.7778 | 0.7754 | 162 | 0.8191 | 0.9112 | 0.8627 | 169 | 0.8013 | 0.7530 | 0.7764 | 166 | 0.8304 | 0.8659 | 0.8478 | 164 | 0.8941 | 0.8941 | 0.8941 | 170 | 0.8321 | 0.8408 | 0.8365 | 0.9834 |
0.0084 | 30.0 | 23850 | 0.1264 | 0.8435 | 0.8732 | 0.8581 | 142 | 0.8328 | 0.8628 | 0.8475 | 277 | 0.8256 | 0.8765 | 0.8503 | 162 | 0.9023 | 0.9290 | 0.9155 | 169 | 0.8531 | 0.7349 | 0.7896 | 166 | 0.8598 | 0.8598 | 0.8598 | 164 | 0.8757 | 0.8706 | 0.8732 | 170 | 0.8543 | 0.8584 | 0.8563 | 0.9848 |
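
The per-entity precision, recall, F1, and support ("Number") columns above are consistent with seqeval-style entity-level evaluation, as commonly used in Hugging Face token-classification recipes. Below is a minimal sketch of that kind of scoring, assuming BIO-tagged label sequences; the label names are illustrative, not the model's actual label set.

```python
# Minimal sketch of entity-level scoring with seqeval; "bank_name" and
# "account_number" are illustrative BIO labels, not taken from this card.
from seqeval.metrics import classification_report

y_true = [["B-bank_name", "I-bank_name", "O", "B-account_number"]]
y_pred = [["B-bank_name", "I-bank_name", "O", "O"]]

# Prints per-entity precision/recall/F1 with support, plus overall averages.
print(classification_report(y_true, y_pred))
```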
### Framework versions
- Transformers 4.22.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.2.2
- Tokenizers 0.12.1