lmv2-g-w9-2018-148-doc-07-07_1

This model is a fine-tuned version of microsoft/layoutlmv2-base-uncased on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0160
  • Address Precision: 0.9667
  • Address Recall: 0.9667
  • Address F1: 0.9667
  • Address Number: 30
  • Business Name Precision: 1.0
  • Business Name Recall: 1.0
  • Business Name F1: 1.0
  • Business Name Number: 29
  • City State Zip Code Precision: 1.0
  • City State Zip Code Recall: 1.0
  • City State Zip Code F1: 1.0
  • City State Zip Code Number: 30
  • EIN Precision: 0.0
  • EIN Recall: 0.0
  • EIN F1: 0.0
  • EIN Number: 1
  • List Account Number Precision: 1.0
  • List Account Number Recall: 1.0
  • List Account Number F1: 1.0
  • List Account Number Number: 11
  • Name Precision: 1.0
  • Name Recall: 1.0
  • Name F1: 1.0
  • Name Number: 30
  • SSN Precision: 0.8333
  • SSN Recall: 1.0
  • SSN F1: 0.9091
  • SSN Number: 10
  • Overall Precision: 0.9789
  • Overall Recall: 0.9858
  • Overall F1: 0.9823
  • Overall Accuracy: 0.9995
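
The per-entity and overall F1 values above are the harmonic mean of the corresponding precision and recall. A minimal check (values copied from the evaluation results above):

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall; 0.0 when both are 0."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Matches the reported figures above:
overall = round(f1(0.9789, 0.9858), 4)  # Overall F1: 0.9823
ssn = round(f1(0.8333, 1.0), 4)         # SSN F1: 0.9091
```

Note that the EIN scores of 0.0 rest on a support of a single entity, so they carry little signal.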

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 4e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: constant
  • num_epochs: 30
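
For reproduction, the list above maps onto the transformers `TrainingArguments` API roughly as follows. This is a sketch: `output_dir` is a placeholder, and the Adam betas/epsilon shown are also the Trainer defaults.

```python
# Hyperparameters from the card, as keyword arguments for
# transformers.TrainingArguments (shown as a plain dict sketch).
training_args = dict(
    output_dir="layoutlmv2-finetuned",  # hypothetical path, not from the card
    learning_rate=4e-05,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="constant",
    num_train_epochs=30,
)
```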

Training results

Per-entity columns give Precision (P), Recall (R), F1, and support (#).

| Train Loss | Epoch | Step | Val Loss | Address P | Address R | Address F1 | Address # | Business Name P | Business Name R | Business Name F1 | Business Name # | City/State/Zip P | City/State/Zip R | City/State/Zip F1 | City/State/Zip # | EIN P | EIN R | EIN F1 | EIN # | List Acct No. P | List Acct No. R | List Acct No. F1 | List Acct No. # | Name P | Name R | Name F1 | Name # | SSN P | SSN R | SSN F1 | SSN # | Overall P | Overall R | Overall F1 | Overall Acc |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.5672 | 1.0 | 118 | 1.1527 | 0.0 | 0.0 | 0.0 | 30 | 0.0 | 0.0 | 0.0 | 29 | 0.0 | 0.0 | 0.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 0.0 | 0.0 | 0.0 | 11 | 0.0 | 0.0 | 0.0 | 30 | 0.0 | 0.0 | 0.0 | 10 | 0.0 | 0.0 | 0.0 | 0.9642 |
| 0.8804 | 2.0 | 236 | 0.5661 | 0.2095 | 0.7333 | 0.3259 | 30 | 0.0 | 0.0 | 0.0 | 29 | 0.0 | 0.0 | 0.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 0.0 | 0.0 | 0.0 | 11 | 0.0 | 0.0 | 0.0 | 30 | 0.0 | 0.0 | 0.0 | 10 | 0.2095 | 0.1560 | 0.1789 | 0.9704 |
| 0.3739 | 3.0 | 354 | 0.2118 | 0.9375 | 1.0 | 0.9677 | 30 | 0.7143 | 0.1724 | 0.2778 | 29 | 0.9375 | 1.0 | 0.9677 | 30 | 0.0 | 0.0 | 0.0 | 1 | 0.8182 | 0.8182 | 0.8182 | 11 | 0.5 | 1.0 | 0.6667 | 30 | 0.75 | 0.9 | 0.8182 | 10 | 0.7338 | 0.8014 | 0.7661 | 0.9932 |
| 0.1626 | 4.0 | 472 | 0.1155 | 0.9375 | 1.0 | 0.9677 | 30 | 0.8710 | 0.9310 | 0.9 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 0.6923 | 0.8182 | 0.7500 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.7 | 0.7 | 0.7 | 10 | 0.9110 | 0.9433 | 0.9268 | 0.9976 |
| 0.1031 | 5.0 | 590 | 0.0817 | 0.9355 | 0.9667 | 0.9508 | 30 | 0.8125 | 0.8966 | 0.8525 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 0.6923 | 0.8182 | 0.7500 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.8182 | 0.9 | 0.8571 | 10 | 0.9048 | 0.9433 | 0.9236 | 0.9981 |
| 0.0769 | 6.0 | 708 | 0.0634 | 0.9355 | 0.9667 | 0.9508 | 30 | 0.9333 | 0.9655 | 0.9492 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 0.6923 | 0.8182 | 0.7500 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.8182 | 0.9 | 0.8571 | 10 | 0.9310 | 0.9574 | 0.9441 | 0.9984 |
| 0.0614 | 7.0 | 826 | 0.0518 | 0.9667 | 0.9667 | 0.9667 | 30 | 1.0 | 1.0 | 1.0 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 0.6923 | 0.8182 | 0.7500 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.8182 | 0.9 | 0.8571 | 10 | 0.9510 | 0.9645 | 0.9577 | 0.9991 |
| 0.0509 | 8.0 | 944 | 0.0432 | 0.9667 | 0.9667 | 0.9667 | 30 | 1.0 | 1.0 | 1.0 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 0.8333 | 0.9091 | 0.8696 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.8182 | 0.9 | 0.8571 | 10 | 0.9648 | 0.9716 | 0.9682 | 0.9994 |
| 0.0431 | 9.0 | 1062 | 0.0369 | 0.9667 | 0.9667 | 0.9667 | 30 | 1.0 | 1.0 | 1.0 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.8182 | 0.9 | 0.8571 | 10 | 0.9787 | 0.9787 | 0.9787 | 0.9994 |
| 0.037 | 10.0 | 1180 | 0.0313 | 0.9667 | 0.9667 | 0.9667 | 30 | 1.0 | 1.0 | 1.0 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.8182 | 0.9 | 0.8571 | 10 | 0.9787 | 0.9787 | 0.9787 | 0.9994 |
| 0.0328 | 11.0 | 1298 | 0.0281 | 0.9667 | 0.9667 | 0.9667 | 30 | 1.0 | 1.0 | 1.0 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.7143 | 1.0 | 0.8333 | 10 | 0.9653 | 0.9858 | 0.9754 | 0.9994 |
| 0.0295 | 12.0 | 1416 | 0.0246 | 0.7429 | 0.8667 | 0.8 | 30 | 1.0 | 1.0 | 1.0 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.6667 | 0.8 | 0.7273 | 10 | 0.9116 | 0.9504 | 0.9306 | 0.9991 |
| 0.0251 | 13.0 | 1534 | 0.0207 | 0.9677 | 1.0 | 0.9836 | 30 | 0.9333 | 0.9655 | 0.9492 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.8333 | 1.0 | 0.9091 | 10 | 0.9653 | 0.9858 | 0.9754 | 0.9994 |
| 0.0231 | 14.0 | 1652 | 0.0210 | 0.9667 | 0.9667 | 0.9667 | 30 | 1.0 | 0.9655 | 0.9825 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.8333 | 1.0 | 0.9091 | 10 | 0.9787 | 0.9787 | 0.9787 | 0.9991 |
| 0.0184 | 15.0 | 1770 | 0.0160 | 0.9667 | 0.9667 | 0.9667 | 30 | 1.0 | 1.0 | 1.0 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.8333 | 1.0 | 0.9091 | 10 | 0.9789 | 0.9858 | 0.9823 | 0.9995 |
| 0.0162 | 16.0 | 1888 | 0.0142 | 0.9667 | 0.9667 | 0.9667 | 30 | 1.0 | 1.0 | 1.0 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.8333 | 1.0 | 0.9091 | 10 | 0.9789 | 0.9858 | 0.9823 | 0.9995 |
| 0.0142 | 17.0 | 2006 | 0.0127 | 0.9667 | 0.9667 | 0.9667 | 30 | 1.0 | 1.0 | 1.0 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.8333 | 1.0 | 0.9091 | 10 | 0.9789 | 0.9858 | 0.9823 | 0.9995 |
| 0.0123 | 18.0 | 2124 | 0.0114 | 0.9667 | 0.9667 | 0.9667 | 30 | 1.0 | 1.0 | 1.0 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.8333 | 1.0 | 0.9091 | 10 | 0.9789 | 0.9858 | 0.9823 | 0.9995 |
| 0.0118 | 19.0 | 2242 | 0.0152 | 0.9677 | 1.0 | 0.9836 | 30 | 0.6765 | 0.7931 | 0.7302 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 0.8333 | 0.9091 | 0.8696 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.8182 | 0.9 | 0.8571 | 10 | 0.8859 | 0.9362 | 0.9103 | 0.9986 |
| 0.0104 | 20.0 | 2360 | 0.0125 | 0.9677 | 1.0 | 0.9836 | 30 | 1.0 | 0.9655 | 0.9825 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.9091 | 1.0 | 0.9524 | 10 | 0.9789 | 0.9858 | 0.9823 | 0.9992 |
| 0.0092 | 21.0 | 2478 | 0.0113 | 0.9677 | 1.0 | 0.9836 | 30 | 1.0 | 0.9655 | 0.9825 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.8333 | 1.0 | 0.9091 | 10 | 0.9653 | 0.9858 | 0.9754 | 0.9993 |
| 0.0089 | 22.0 | 2596 | 0.0111 | 0.9677 | 1.0 | 0.9836 | 30 | 1.0 | 0.9655 | 0.9825 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.8333 | 1.0 | 0.9091 | 10 | 0.9789 | 0.9858 | 0.9823 | 0.9992 |
| 0.0076 | 23.0 | 2714 | 0.0107 | 0.9677 | 1.0 | 0.9836 | 30 | 0.9310 | 0.9310 | 0.9310 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.8333 | 1.0 | 0.9091 | 10 | 0.9650 | 0.9787 | 0.9718 | 0.9991 |
| 0.0074 | 24.0 | 2832 | 0.0105 | 0.9677 | 1.0 | 0.9836 | 30 | 0.9310 | 0.9310 | 0.9310 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.8182 | 0.9 | 0.8571 | 10 | 0.9514 | 0.9716 | 0.9614 | 0.9990 |
| 0.007 | 25.0 | 2950 | 0.0092 | 0.9677 | 1.0 | 0.9836 | 30 | 1.0 | 0.9655 | 0.9825 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.7692 | 1.0 | 0.8696 | 10 | 0.9720 | 0.9858 | 0.9789 | 0.9991 |
| 0.0062 | 26.0 | 3068 | 0.0061 | 0.9677 | 1.0 | 0.9836 | 30 | 1.0 | 1.0 | 1.0 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.7143 | 1.0 | 0.8333 | 10 | 0.9655 | 0.9929 | 0.9790 | 0.9994 |
| 0.0057 | 27.0 | 3186 | 0.0056 | 0.9677 | 1.0 | 0.9836 | 30 | 1.0 | 1.0 | 1.0 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.8182 | 0.9 | 0.8571 | 10 | 0.9720 | 0.9858 | 0.9789 | 0.9995 |
| 0.0047 | 28.0 | 3304 | 0.0054 | 0.9677 | 1.0 | 0.9836 | 30 | 1.0 | 1.0 | 1.0 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.7143 | 1.0 | 0.8333 | 10 | 0.9655 | 0.9929 | 0.9790 | 0.9994 |
| 0.0042 | 29.0 | 3422 | 0.0052 | 0.9677 | 1.0 | 0.9836 | 30 | 1.0 | 1.0 | 1.0 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.7143 | 1.0 | 0.8333 | 10 | 0.9655 | 0.9929 | 0.9790 | 0.9994 |
| 0.0039 | 30.0 | 3540 | 0.0049 | 0.9677 | 1.0 | 0.9836 | 30 | 1.0 | 1.0 | 1.0 | 29 | 1.0 | 1.0 | 1.0 | 30 | 0.0 | 0.0 | 0.0 | 1 | 1.0 | 1.0 | 1.0 | 11 | 1.0 | 1.0 | 1.0 | 30 | 0.7143 | 1.0 | 0.8333 | 10 | 0.9655 | 0.9929 | 0.9790 | 0.9994 |
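
The per-entity scores are entity-level (seqeval-style): a predicted entity counts as correct only when both its type and its exact span match a gold entity. A simplified sketch of that scoring over BIO tags, assuming the usual seqeval conventions rather than reproducing that library's code:

```python
def extract_entities(tags):
    """Collect (label, start, end) spans from a BIO tag sequence."""
    entities = set()
    start = label = None
    for i, tag in enumerate(list(tags) + ["O"]):  # sentinel closes a trailing entity
        inside = tag.startswith("I-") and label == tag[2:]
        if not inside:
            if label is not None:           # close the entity in progress
                entities.add((label, start, i))
                start = label = None
            if tag != "O":                  # B- opens an entity; stray I- is tolerated
                start, label = i, tag[2:]
    return entities

def entity_prf(gold_tags, pred_tags):
    """Entity-level precision, recall, F1 for one tag sequence pair."""
    gold = extract_entities(gold_tags)
    pred = extract_entities(pred_tags)
    tp = len(gold & pred)
    p = tp / len(pred) if pred else 0.0
    r = tp / len(gold) if gold else 0.0
    f = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f

# Toy example: second entity has the right span but the wrong type.
gold = ["B-SSN", "I-SSN", "O", "B-NAME"]
pred = ["B-SSN", "I-SSN", "O", "B-SSN"]
entity_prf(gold, pred)  # → (0.5, 0.5, 0.5)
```

Exact-span matching explains why a single boundary error can move a per-entity score sharply when the support is small, as with the SSN column (10 entities) above.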

Framework versions

  • Transformers 4.21.0.dev0
  • Pytorch 1.11.0+cu113
  • Datasets 2.2.2
  • Tokenizers 0.12.1