lmv2-g-receipts4

This model is a fine-tuned version of microsoft/layoutlmv2-large-uncased for extracting fields from receipts; the fine-tuning dataset is not named in this card. It achieves the following results on the evaluation set, where "Number" is the support (the count of gold entities of that type); a hedged usage sketch follows the list:

  • Loss: 0.3037
  • Purchase Time Precision: 0.8868
  • Purchase Time Recall: 0.94
  • Purchase Time F1: 0.9126
  • Purchase Time Number: 50
  • Receipt Date Precision: 0.8272
  • Receipt Date Recall: 0.8590
  • Receipt Date F1: 0.8428
  • Receipt Date Number: 78
  • Sub Total Precision: 0.8333
  • Sub Total Recall: 0.7778
  • Sub Total F1: 0.8046
  • Sub Total Number: 45
  • Supplier Address Precision: 0.6981
  • Supplier Address Recall: 0.8409
  • Supplier Address F1: 0.7629
  • Supplier Address Number: 44
  • Supplier Name Precision: 0.7540
  • Supplier Name Recall: 0.7851
  • Supplier Name F1: 0.7692
  • Supplier Name Number: 121
  • Tip Amount Precision: 1.0
  • Tip Amount Recall: 1.0
  • Tip Amount F1: 1.0
  • Tip Amount Number: 1
  • Total Precision: 0.9348
  • Total Recall: 0.9348
  • Total F1: 0.9348
  • Total Number: 92
  • Total Tax Amount Precision: 0.8438
  • Total Tax Amount Recall: 0.8438
  • Total Tax Amount F1: 0.8438
  • Total Tax Amount Number: 32
  • Overall Precision: 0.8229
  • Overall Recall: 0.8531
  • Overall F1: 0.8378
  • Overall Accuracy: 0.9794
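
Below is a minimal inference sketch, not an official example from this card: it assumes the checkpoint is published under a hypothetical Hub repo id "your-namespace/lmv2-g-receipts4", that detectron2 is installed (LayoutLMv2's visual backbone requires it), and that pytesseract is available so the processor can run its built-in OCR.

```python
# Hedged sketch: the repo id and image path are placeholders, not values from this card.
import torch
from PIL import Image
from transformers import LayoutLMv2Processor, LayoutLMv2ForTokenClassification

repo_id = "your-namespace/lmv2-g-receipts4"  # hypothetical; replace with the real repo id
# Standard LayoutLMv2 processor (OCR + uncased tokenizer); swap in the processor
# saved with your checkpoint if one is available.
processor = LayoutLMv2Processor.from_pretrained("microsoft/layoutlmv2-base-uncased")
model = LayoutLMv2ForTokenClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("receipt.png").convert("RGB")                    # placeholder image path
encoding = processor(image, return_tensors="pt", truncation=True)  # OCR + tokenization

with torch.no_grad():
    outputs = model(**encoding)

predicted_ids = outputs.logits.argmax(-1).squeeze().tolist()
tokens = processor.tokenizer.convert_ids_to_tokens(encoding["input_ids"].squeeze().tolist())
for token, label_id in zip(tokens, predicted_ids):
    print(token, model.config.id2label[label_id])
```

The predicted labels come from model.config.id2label and should correspond to the receipt fields listed above.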

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged sketch of the equivalent Trainer configuration follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
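
For reference, here is a hedged sketch of how these settings might map onto transformers.TrainingArguments and Trainer; the dataset, data collator, and label count are not given by this card, so they appear as clearly marked placeholders.

```python
# Hedged sketch of the listed hyperparameters as a Trainer configuration.
# The num_labels value and the dataset objects are assumptions/placeholders,
# not information from this card.
from transformers import (
    LayoutLMv2ForTokenClassification,
    Trainer,
    TrainingArguments,
)

model = LayoutLMv2ForTokenClassification.from_pretrained(
    "microsoft/layoutlmv2-large-uncased",
    num_labels=17,  # assumption: 8 receipt fields as B-/I- tags plus "O"
)

args = TrainingArguments(
    output_dir="lmv2-g-receipts4",
    learning_rate=1e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumption: the results table reports one evaluation per epoch
)

train_dataset = eval_dataset = None  # placeholders: the training data is not specified in this card

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
)
# trainer.train()  # run once real datasets (and a compute_metrics function) are supplied
```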

Training results

Each row below lists, in order: training loss, epoch, step, validation loss; then precision, recall, F1 and number (support) for each field in this order: Purchase Time, Receipt Date, Sub Total, Supplier Address, Supplier Name, Tip Amount, Total, Total Tax Amount; and finally the overall precision, recall, F1 and accuracy. A sketch of how metrics of this form are typically computed follows the table.
0.2363 1.0 892 0.1264 0.8519 0.92 0.8846 50 0.8214 0.8846 0.8519 78 0.7381 0.6889 0.7126 45 0.5636 0.7045 0.6263 44 0.6825 0.7107 0.6964 121 0.0 0.0 0.0 1 0.8737 0.9022 0.8877 92 0.8571 0.75 0.8000 32 0.7645 0.7991 0.7814 0.9724
0.0941 2.0 1784 0.1193 0.9592 0.94 0.9495 50 0.8293 0.8718 0.8500 78 0.6557 0.8889 0.7547 45 0.6042 0.6591 0.6304 44 0.7018 0.6612 0.6809 121 0.0 0.0 0.0 1 0.9167 0.8370 0.875 92 0.8710 0.8438 0.8571 32 0.7846 0.7948 0.7897 0.9774
0.0627 3.0 2676 0.1328 0.9057 0.96 0.9320 50 0.8023 0.8846 0.8415 78 0.7609 0.7778 0.7692 45 0.6481 0.7955 0.7143 44 0.6761 0.7934 0.7300 121 1.0 1.0 1.0 1 0.9639 0.8696 0.9143 92 0.8710 0.8438 0.8571 32 0.7883 0.8445 0.8154 0.9784
0.0413 4.0 3568 0.1526 0.9216 0.94 0.9307 50 0.8193 0.8718 0.8447 78 0.6792 0.8 0.7347 45 0.62 0.7045 0.6596 44 0.7231 0.7769 0.7490 121 0.0 0.0 0.0 1 0.9130 0.9130 0.9130 92 0.7 0.875 0.7778 32 0.7776 0.8380 0.8067 0.9773
0.0334 5.0 4460 0.1755 0.9245 0.98 0.9515 50 0.8313 0.8846 0.8571 78 0.8043 0.8222 0.8132 45 0.5789 0.75 0.6535 44 0.7109 0.7521 0.7309 121 1.0 1.0 1.0 1 0.9053 0.9348 0.9198 92 0.75 0.9375 0.8333 32 0.7873 0.8553 0.8199 0.9768
0.0258 6.0 5352 0.1885 0.9184 0.9 0.9091 50 0.8101 0.8205 0.8153 78 0.7292 0.7778 0.7527 45 0.5263 0.6818 0.5941 44 0.6667 0.7438 0.7031 121 0.0 0.0 0.0 1 0.9111 0.8913 0.9011 92 0.8235 0.875 0.8485 32 0.7602 0.8078 0.7832 0.9746
0.0186 7.0 6244 0.1609 0.9216 0.94 0.9307 50 0.8125 0.8333 0.8228 78 0.6481 0.7778 0.7071 45 0.7045 0.7045 0.7045 44 0.7188 0.7603 0.7390 121 1.0 1.0 1.0 1 0.8723 0.8913 0.8817 92 0.6905 0.9062 0.7838 32 0.7733 0.8251 0.7983 0.9779
0.0181 8.0 7136 0.1821 0.9375 0.9 0.9184 50 0.8312 0.8205 0.8258 78 0.7143 0.6667 0.6897 45 0.5536 0.7045 0.62 44 0.6667 0.7438 0.7031 121 1.0 1.0 1.0 1 0.9111 0.8913 0.9011 92 0.7568 0.875 0.8116 32 0.7634 0.8013 0.7819 0.9763
0.0128 9.0 8028 0.2082 0.9057 0.96 0.9320 50 0.8193 0.8718 0.8447 78 0.68 0.7556 0.7158 45 0.7447 0.7955 0.7692 44 0.7381 0.7686 0.7530 121 1.0 1.0 1.0 1 0.9425 0.8913 0.9162 92 0.7778 0.875 0.8235 32 0.8054 0.8402 0.8224 0.9778
0.0107 10.0 8920 0.1934 0.9184 0.9 0.9091 50 0.8333 0.8333 0.8333 78 0.7381 0.6889 0.7126 45 0.6809 0.7273 0.7033 44 0.744 0.7686 0.7561 121 1.0 1.0 1.0 1 0.9425 0.8913 0.9162 92 0.7317 0.9375 0.8219 32 0.8064 0.8186 0.8124 0.9788
0.0108 11.0 9812 0.2336 0.9184 0.9 0.9091 50 0.825 0.8462 0.8354 78 0.6481 0.7778 0.7071 45 0.5893 0.75 0.6600 44 0.7328 0.7934 0.7619 121 1.0 1.0 1.0 1 0.8795 0.7935 0.8343 92 0.7879 0.8125 0.8 32 0.7700 0.8099 0.7895 0.9750
0.0069 12.0 10704 0.2376 0.9020 0.92 0.9109 50 0.7529 0.8205 0.7853 78 0.8095 0.7556 0.7816 45 0.7083 0.7727 0.7391 44 0.7154 0.7686 0.7410 121 0.0 0.0 0.0 1 0.9022 0.9022 0.9022 92 0.7179 0.875 0.7887 32 0.7844 0.8251 0.8042 0.9770
0.0071 13.0 11596 0.2467 0.9038 0.94 0.9216 50 0.8148 0.8462 0.8302 78 0.7917 0.8444 0.8172 45 0.68 0.7727 0.7234 44 0.752 0.7769 0.7642 121 1.0 1.0 1.0 1 0.9195 0.8696 0.8939 92 0.7632 0.9062 0.8286 32 0.8071 0.8402 0.8233 0.9767
0.0052 14.0 12488 0.2818 0.92 0.92 0.92 50 0.7927 0.8333 0.8125 78 0.7778 0.7778 0.7778 45 0.7234 0.7727 0.7473 44 0.7015 0.7769 0.7373 121 1.0 1.0 1.0 1 0.9535 0.8913 0.9213 92 0.8 0.875 0.8358 32 0.8021 0.8315 0.8165 0.9767
0.0072 15.0 13380 0.2193 0.8333 0.9 0.8654 50 0.8 0.8205 0.8101 78 0.7609 0.7778 0.7692 45 0.6735 0.75 0.7097 44 0.7686 0.7686 0.7686 121 1.0 1.0 1.0 1 0.9518 0.8587 0.9029 92 0.7317 0.9375 0.8219 32 0.8 0.8207 0.8102 0.9783
0.0049 16.0 14272 0.2457 0.92 0.92 0.92 50 0.7738 0.8333 0.8025 78 0.7308 0.8444 0.7835 45 0.6122 0.6818 0.6452 44 0.7480 0.7851 0.7661 121 1.0 1.0 1.0 1 0.8989 0.8696 0.8840 92 0.725 0.9062 0.8056 32 0.7805 0.8294 0.8042 0.9767
0.0055 17.0 15164 0.2359 0.8545 0.94 0.8952 50 0.8025 0.8333 0.8176 78 0.7273 0.7111 0.7191 45 0.6939 0.7727 0.7312 44 0.7661 0.7851 0.7755 121 1.0 1.0 1.0 1 0.8925 0.9022 0.8973 92 0.6905 0.9062 0.7838 32 0.7894 0.8337 0.8109 0.9780
0.0045 18.0 16056 0.2472 0.92 0.92 0.92 50 0.8101 0.8205 0.8153 78 0.7805 0.7111 0.7442 45 0.6735 0.75 0.7097 44 0.7402 0.7769 0.7581 121 1.0 1.0 1.0 1 0.8876 0.8587 0.8729 92 0.7368 0.875 0.8000 32 0.7954 0.8143 0.8047 0.9781
0.0052 19.0 16948 0.2287 0.8868 0.94 0.9126 50 0.7683 0.8077 0.7875 78 0.7857 0.7333 0.7586 45 0.6226 0.75 0.6804 44 0.7287 0.7769 0.7520 121 1.0 1.0 1.0 1 0.9302 0.8696 0.8989 92 0.7073 0.9062 0.7945 32 0.7803 0.8207 0.8000 0.9765
0.0021 20.0 17840 0.2552 0.8824 0.9 0.8911 50 0.8025 0.8333 0.8176 78 0.775 0.6889 0.7294 45 0.6957 0.7273 0.7111 44 0.7541 0.7603 0.7572 121 1.0 1.0 1.0 1 0.9318 0.8913 0.9111 92 0.7143 0.9375 0.8108 32 0.8025 0.8164 0.8094 0.9779
0.0027 21.0 18732 0.2547 0.8679 0.92 0.8932 50 0.8148 0.8462 0.8302 78 0.6552 0.8444 0.7379 45 0.6154 0.7273 0.6667 44 0.7661 0.7851 0.7755 121 1.0 1.0 1.0 1 0.9 0.8804 0.8901 92 0.7436 0.9062 0.8169 32 0.7791 0.8380 0.8075 0.9780
0.0021 22.0 19624 0.2829 0.9020 0.92 0.9109 50 0.8 0.8205 0.8101 78 0.7660 0.8 0.7826 45 0.6364 0.7955 0.7071 44 0.7460 0.7769 0.7611 121 1.0 1.0 1.0 1 0.9022 0.9022 0.9022 92 0.6905 0.9062 0.7838 32 0.7854 0.8380 0.8109 0.9756
0.0022 23.0 20516 0.2834 0.8333 0.9 0.8654 50 0.8025 0.8333 0.8176 78 0.72 0.8 0.7579 45 0.6140 0.7955 0.6931 44 0.736 0.7603 0.7480 121 1.0 1.0 1.0 1 0.9111 0.8913 0.9011 92 0.7692 0.9375 0.8451 32 0.7767 0.8337 0.8042 0.9737
0.0016 24.0 21408 0.2631 0.9020 0.92 0.9109 50 0.8101 0.8205 0.8153 78 0.7447 0.7778 0.7609 45 0.6471 0.75 0.6947 44 0.7308 0.7851 0.7570 121 0.5 1.0 0.6667 1 0.9310 0.8804 0.9050 92 0.725 0.9062 0.8056 32 0.7885 0.8294 0.8084 0.9785
0.0035 25.0 22300 0.2889 0.8868 0.94 0.9126 50 0.8025 0.8333 0.8176 78 0.6481 0.7778 0.7071 45 0.625 0.7955 0.7 44 0.7068 0.7769 0.7402 121 1.0 1.0 1.0 1 0.9412 0.8696 0.9040 92 0.7692 0.9375 0.8451 32 0.7709 0.8359 0.8021 0.9764
0.0022 26.0 23192 0.3023 0.8545 0.94 0.8952 50 0.8519 0.8846 0.8679 78 0.6379 0.8222 0.7184 45 0.5862 0.7727 0.6667 44 0.6763 0.7769 0.7231 121 1.0 1.0 1.0 1 0.9213 0.8913 0.9061 92 0.7436 0.9062 0.8169 32 0.7558 0.8488 0.7996 0.9742
0.0024 27.0 24084 0.2836 0.9020 0.92 0.9109 50 0.8442 0.8333 0.8387 78 0.7447 0.7778 0.7609 45 0.6731 0.7955 0.7292 44 0.7308 0.7851 0.7570 121 1.0 1.0 1.0 1 0.9412 0.8696 0.9040 92 0.75 0.9375 0.8333 32 0.8012 0.8359 0.8182 0.9787
0.0015 28.0 24976 0.2825 0.8246 0.94 0.8785 50 0.825 0.8462 0.8354 78 0.7333 0.7333 0.7333 45 0.7045 0.7045 0.7045 44 0.7949 0.7686 0.7815 121 1.0 1.0 1.0 1 0.9176 0.8478 0.8814 92 0.7 0.875 0.7778 32 0.8038 0.8143 0.8090 0.9784
0.0011 29.0 25868 0.2815 0.8519 0.92 0.8846 50 0.8375 0.8590 0.8481 78 0.6545 0.8 0.7200 45 0.6731 0.7955 0.7292 44 0.7769 0.7769 0.7769 121 1.0 1.0 1.0 1 0.8925 0.9022 0.8973 92 0.7368 0.875 0.8000 32 0.7895 0.8423 0.8150 0.9775
0.001 30.0 26760 0.2851 0.8545 0.94 0.8952 50 0.8171 0.8590 0.8375 78 0.7083 0.7556 0.7312 45 0.7174 0.75 0.7333 44 0.75 0.7934 0.7711 121 1.0 1.0 1.0 1 0.9101 0.8804 0.8950 92 0.7568 0.875 0.8116 32 0.7963 0.8359 0.8156 0.9791
0.0009 31.0 27652 0.2652 0.8679 0.92 0.8932 50 0.8228 0.8333 0.8280 78 0.7609 0.7778 0.7692 45 0.7391 0.7727 0.7556 44 0.7638 0.8017 0.7823 121 0.5 1.0 0.6667 1 0.9213 0.8913 0.9061 92 0.7647 0.8125 0.7879 32 0.8109 0.8337 0.8222 0.9807
0.0008 32.0 28544 0.2681 0.8727 0.96 0.9143 50 0.8481 0.8590 0.8535 78 0.68 0.7556 0.7158 45 0.7143 0.7955 0.7527 44 0.7863 0.7603 0.7731 121 1.0 1.0 1.0 1 0.9101 0.8804 0.8950 92 0.7941 0.8438 0.8182 32 0.8122 0.8315 0.8218 0.9805
0.0009 33.0 29436 0.2700 0.8654 0.9 0.8824 50 0.8272 0.8590 0.8428 78 0.7609 0.7778 0.7692 45 0.6731 0.7955 0.7292 44 0.7385 0.7934 0.7649 121 1.0 1.0 1.0 1 0.9140 0.9239 0.9189 92 0.8 0.875 0.8358 32 0.8 0.8467 0.8227 0.9794
0.0008 34.0 30328 0.2832 0.9 0.9 0.9 50 0.8354 0.8462 0.8408 78 0.75 0.8 0.7742 45 0.6471 0.75 0.6947 44 0.8051 0.7851 0.7950 121 1.0 1.0 1.0 1 0.9333 0.9130 0.9231 92 0.8 0.875 0.8358 32 0.8220 0.8380 0.8299 0.9791
0.0009 35.0 31220 0.2778 0.92 0.92 0.92 50 0.8272 0.8590 0.8428 78 0.7347 0.8 0.7660 45 0.66 0.75 0.7021 44 0.7603 0.7603 0.7603 121 1.0 1.0 1.0 1 0.9231 0.9130 0.9180 92 0.875 0.875 0.875 32 0.8147 0.8359 0.8252 0.9787
0.0004 36.0 32112 0.2885 0.8889 0.96 0.9231 50 0.8293 0.8718 0.8500 78 0.7955 0.7778 0.7865 45 0.6863 0.7955 0.7368 44 0.7983 0.7851 0.7917 121 1.0 1.0 1.0 1 0.8925 0.9022 0.8973 92 0.8438 0.8438 0.8438 32 0.8235 0.8467 0.8349 0.9795
0.0007 37.0 33004 0.2868 0.8868 0.94 0.9126 50 0.85 0.8718 0.8608 78 0.7660 0.8 0.7826 45 0.7727 0.7727 0.7727 44 0.7540 0.7851 0.7692 121 1.0 1.0 1.0 1 0.9130 0.9130 0.9130 92 0.7941 0.8438 0.8182 32 0.8218 0.8467 0.8340 0.9796
0.003 38.0 33896 0.2946 0.8868 0.94 0.9126 50 0.8395 0.8718 0.8553 78 0.8 0.8 0.8000 45 0.7059 0.8182 0.7579 44 0.7705 0.7769 0.7737 121 1.0 1.0 1.0 1 0.9130 0.9130 0.9130 92 0.7941 0.8438 0.8182 32 0.8205 0.8488 0.8344 0.9788
0.0007 39.0 34788 0.2761 0.8846 0.92 0.9020 50 0.8293 0.8718 0.8500 78 0.7447 0.7778 0.7609 45 0.7778 0.7955 0.7865 44 0.7661 0.7851 0.7755 121 1.0 1.0 1.0 1 0.9222 0.9022 0.9121 92 0.7714 0.8438 0.8060 32 0.8193 0.8423 0.8307 0.9806
0.0004 40.0 35680 0.2942 0.9038 0.94 0.9216 50 0.8272 0.8590 0.8428 78 0.7609 0.7778 0.7692 45 0.6923 0.8182 0.7500 44 0.7368 0.8099 0.7717 121 1.0 1.0 1.0 1 0.9231 0.9130 0.9180 92 0.8182 0.8438 0.8308 32 0.8078 0.8531 0.8298 0.9785
0.0001 41.0 36572 0.2966 0.8519 0.92 0.8846 50 0.8171 0.8590 0.8375 78 0.72 0.8 0.7579 45 0.7447 0.7955 0.7692 44 0.7712 0.7521 0.7615 121 1.0 1.0 1.0 1 0.8737 0.9022 0.8877 92 0.7714 0.8438 0.8060 32 0.8008 0.8337 0.8169 0.9788
0.0003 42.0 37464 0.3037 0.8868 0.94 0.9126 50 0.8272 0.8590 0.8428 78 0.8333 0.7778 0.8046 45 0.6981 0.8409 0.7629 44 0.7540 0.7851 0.7692 121 1.0 1.0 1.0 1 0.9348 0.9348 0.9348 92 0.8438 0.8438 0.8438 32 0.8229 0.8531 0.8378 0.9794
0.0001 43.0 38356 0.3135 0.8519 0.92 0.8846 50 0.8148 0.8462 0.8302 78 0.7447 0.7778 0.7609 45 0.7174 0.75 0.7333 44 0.7692 0.7438 0.7563 121 1.0 1.0 1.0 1 0.9444 0.9239 0.9341 92 0.7714 0.8438 0.8060 32 0.8132 0.8272 0.8201 0.9785
0.0001 44.0 39248 0.3127 0.8868 0.94 0.9126 50 0.8148 0.8462 0.8302 78 0.7292 0.7778 0.7527 45 0.7 0.7955 0.7447 44 0.7966 0.7769 0.7866 121 1.0 1.0 1.0 1 0.9222 0.9022 0.9121 92 0.7105 0.8438 0.7714 32 0.8100 0.8380 0.8238 0.9784
0.0003 45.0 40140 0.3167 0.9038 0.94 0.9216 50 0.8148 0.8462 0.8302 78 0.7609 0.7778 0.7692 45 0.68 0.7727 0.7234 44 0.8034 0.7769 0.7899 121 1.0 1.0 1.0 1 0.9333 0.9130 0.9231 92 0.7297 0.8438 0.7826 32 0.8186 0.8380 0.8282 0.9782
0.0004 46.0 41032 0.3189 0.8868 0.94 0.9126 50 0.825 0.8462 0.8354 78 0.7609 0.7778 0.7692 45 0.7 0.7955 0.7447 44 0.7833 0.7769 0.7801 121 1.0 1.0 1.0 1 0.9432 0.9022 0.9222 92 0.7297 0.8438 0.7826 32 0.8168 0.8380 0.8273 0.9777
0.0001 47.0 41924 0.3171 0.9216 0.94 0.9307 50 0.825 0.8462 0.8354 78 0.7609 0.7778 0.7692 45 0.6667 0.7727 0.7158 44 0.7917 0.7851 0.7884 121 1.0 1.0 1.0 1 0.9326 0.9022 0.9171 92 0.7297 0.8438 0.7826 32 0.8168 0.8380 0.8273 0.9779
0.0001 48.0 42816 0.3186 0.9216 0.94 0.9307 50 0.825 0.8462 0.8354 78 0.7609 0.7778 0.7692 45 0.6667 0.7727 0.7158 44 0.7917 0.7851 0.7884 121 1.0 1.0 1.0 1 0.9326 0.9022 0.9171 92 0.75 0.8438 0.7941 32 0.8186 0.8380 0.8282 0.9779
0.0001 49.0 43708 0.3165 0.9216 0.94 0.9307 50 0.825 0.8462 0.8354 78 0.7609 0.7778 0.7692 45 0.6939 0.7727 0.7312 44 0.8051 0.7851 0.7950 121 1.0 1.0 1.0 1 0.9326 0.9022 0.9171 92 0.7714 0.8438 0.8060 32 0.8273 0.8380 0.8326 0.9781
0.0001 50.0 44600 0.3158 0.9216 0.94 0.9307 50 0.825 0.8462 0.8354 78 0.7609 0.7778 0.7692 45 0.6939 0.7727 0.7312 44 0.8051 0.7851 0.7950 121 1.0 1.0 1.0 1 0.9326 0.9022 0.9171 92 0.7714 0.8438 0.8060 32 0.8273 0.8380 0.8326 0.9782
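
The per-field precision/recall/F1/number columns and the overall scores above have the shape of seqeval output for token classification. The following is a hedged sketch of a compute_metrics function that would produce metrics in that form; the label list shown is an illustrative placeholder, not the label set used for this model.

```python
# Hedged sketch: computes per-entity and overall metrics with the seqeval
# metric from the evaluate library. The label_list below is a placeholder.
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_list = ["O", "B-TOTAL", "I-TOTAL"]  # placeholder: use the model's real BIO label list

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    # Special tokens are conventionally labelled -100 and must be skipped.
    true_labels = [
        [label_list[l] for l in label_row if l != -100]
        for label_row in labels
    ]
    true_predictions = [
        [label_list[p] for p, l in zip(pred_row, label_row) if l != -100]
        for pred_row, label_row in zip(predictions, labels)
    ]

    results = seqeval.compute(predictions=true_predictions, references=true_labels)

    # seqeval nests per-entity scores (precision/recall/f1/number); flattening
    # them yields metric names like the columns in the table above.
    flat = {}
    for key, value in results.items():
        if isinstance(value, dict):
            for metric, score in value.items():
                flat[f"{key}_{metric}"] = score
        else:
            flat[key] = value  # overall_precision, overall_recall, overall_f1, overall_accuracy
    return flat
```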

Framework versions

  • Transformers 4.25.0.dev0
  • Pytorch 1.12.1+cu113
  • Datasets 2.2.2
  • Tokenizers 0.13.2