---
license: cc-by-nc-sa-4.0
tags:
  - generated_from_trainer
model-index:
  - name: lmv2-g-voterid-117-doc-09-13
    results: []
---

# lmv2-g-voterid-117-doc-09-13

This model is a fine-tuned version of [microsoft/layoutlmv2-base-uncased](https://huggingface.co/microsoft/layoutlmv2-base-uncased) on an unspecified dataset. It achieves the following results on the evaluation set (a short note on the metric convention follows the list):

- Loss: 0.1322
- Age Precision: 1.0
- Age Recall: 1.0
- Age F1: 1.0
- Age Number: 3
- Dob Precision: 1.0
- Dob Recall: 1.0
- Dob F1: 1.0
- Dob Number: 5
- F H M Name Precision: 0.7917
- F H M Name Recall: 0.7917
- F H M Name F1: 0.7917
- F H M Name Number: 24
- Name Precision: 0.8462
- Name Recall: 0.9167
- Name F1: 0.8800
- Name Number: 24
- Sex Precision: 1.0
- Sex Recall: 1.0
- Sex F1: 1.0
- Sex Number: 8
- Voter Id Precision: 0.92
- Voter Id Recall: 0.9583
- Voter Id F1: 0.9388
- Voter Id Number: 24
- Overall Precision: 0.8791
- Overall Recall: 0.9091
- Overall F1: 0.8939
- Overall Accuracy: 0.9836
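
The "Number" column for each field is its support: how many gold entities of that type the evaluation set contains (only 3 ages and 5 dates of birth, so those perfect scores rest on very few examples). The card does not state the metric script, but scores in this layout are typically entity-level, seqeval-style; a minimal sketch under that assumption, with hypothetical BIO label names:

```python
from seqeval.metrics import precision_score, recall_score, f1_score

# Hypothetical BIO-tagged sequences; the model's actual label names are not
# documented, only the field types (age, dob, f h m name, name, sex, voter id).
y_true = [["B-NAME", "I-NAME", "O", "B-VOTER_ID", "O"]]
y_pred = [["B-NAME", "I-NAME", "O", "B-VOTER_ID", "O"]]

# seqeval scores whole entities, not tokens: a prediction counts only if the
# full span and its type both match the reference.
print(precision_score(y_true, y_pred))  # 1.0
print(recall_score(y_true, y_pred))     # 1.0
print(f1_score(y_true, y_pred))         # 1.0
```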

## Model description

More information needed

## Intended uses & limitations

More information needed
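
In the absence of documented usage, here is a minimal inference sketch for a LayoutLMv2 token-classification checkpoint such as this one. The repo id and image path are placeholders, and it assumes the default processor setup (Tesseract OCR via pytesseract) matches how the model was trained:

```python
import torch
from PIL import Image
from transformers import LayoutLMv2Processor, LayoutLMv2ForTokenClassification

# Placeholder repo id; substitute the actual Hub path of this checkpoint.
model_id = "lmv2-g-voterid-117-doc-09-13"

# The base processor runs Tesseract OCR on the image by default.
processor = LayoutLMv2Processor.from_pretrained("microsoft/layoutlmv2-base-uncased")
model = LayoutLMv2ForTokenClassification.from_pretrained(model_id)

image = Image.open("voter_id_card.png").convert("RGB")  # hypothetical input
encoding = processor(image, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**encoding)

# Map each token's top logit back to its label name (this includes special
# tokens and subword pieces, which a real pipeline would filter and merge).
predictions = outputs.logits.argmax(-1).squeeze().tolist()
labels = [model.config.id2label[p] for p in predictions]
print(labels)
```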

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent `TrainingArguments` follows the list):

- learning_rate: 4e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- num_epochs: 30
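
As a sketch, these settings map onto Transformers' `TrainingArguments` roughly as follows; the output directory is a placeholder, and the Adam betas and epsilon above are the library defaults, so they need no explicit arguments:

```python
from transformers import TrainingArguments

# Reconstruction of the reported hyperparameters; the actual training
# script is not included in this card.
training_args = TrainingArguments(
    output_dir="lmv2-g-voterid-117-doc-09-13",  # placeholder
    learning_rate=4e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    lr_scheduler_type="constant",
    num_train_epochs=30,
)
```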

### Training results

| Training Loss | Epoch | Step | Validation Loss | Age Precision | Age Recall | Age F1 | Age Number | Dob Precision | Dob Recall | Dob F1 | Dob Number | F H M Name Precision | F H M Name Recall | F H M Name F1 | F H M Name Number | Name Precision | Name Recall | Name F1 | Name Number | Sex Precision | Sex Recall | Sex F1 | Sex Number | Voter Id Precision | Voter Id Recall | Voter Id F1 | Voter Id Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 1.5488 | 1.0 | 93 | 1.2193 | 0.0 | 0.0 | 0.0 | 3 | 0.0 | 0.0 | 0.0 | 5 | 0.0 | 0.0 | 0.0 | 24 | 0.0 | 0.0 | 0.0 | 24 | 0.0 | 0.0 | 0.0 | 8 | 1.0 | 0.0833 | 0.1538 | 24 | 1.0 | 0.0227 | 0.0444 | 0.9100 |
| 1.0594 | 2.0 | 186 | 0.8695 | 0.0 | 0.0 | 0.0 | 3 | 0.0 | 0.0 | 0.0 | 5 | 0.0 | 0.0 | 0.0 | 24 | 0.0 | 0.0 | 0.0 | 24 | 0.0 | 0.0 | 0.0 | 8 | 0.6286 | 0.9167 | 0.7458 | 24 | 0.6286 | 0.25 | 0.3577 | 0.9173 |
| 0.763 | 3.0 | 279 | 0.6057 | 0.0 | 0.0 | 0.0 | 3 | 0.0 | 0.0 | 0.0 | 5 | 0.0667 | 0.0417 | 0.0513 | 24 | 0.0 | 0.0 | 0.0 | 24 | 0.0 | 0.0 | 0.0 | 8 | 0.6875 | 0.9167 | 0.7857 | 24 | 0.4694 | 0.2614 | 0.3358 | 0.9228 |
| 0.5241 | 4.0 | 372 | 0.4257 | 0.0 | 0.0 | 0.0 | 3 | 0.0 | 0.0 | 0.0 | 5 | 0.0 | 0.0 | 0.0 | 24 | 0.2381 | 0.4167 | 0.3030 | 24 | 0.0 | 0.0 | 0.0 | 8 | 0.7097 | 0.9167 | 0.8000 | 24 | 0.4384 | 0.3636 | 0.3975 | 0.9331 |
| 0.3847 | 5.0 | 465 | 0.3317 | 0.0 | 0.0 | 0.0 | 3 | 0.3333 | 0.4 | 0.3636 | 5 | 0.3889 | 0.2917 | 0.3333 | 24 | 0.2745 | 0.5833 | 0.3733 | 24 | 1.0 | 0.75 | 0.8571 | 8 | 0.88 | 0.9167 | 0.8980 | 24 | 0.4811 | 0.5795 | 0.5258 | 0.9574 |
| 0.3015 | 6.0 | 558 | 0.2654 | 0.0 | 0.0 | 0.0 | 3 | 0.3333 | 0.4 | 0.3636 | 5 | 0.48 | 0.5 | 0.4898 | 24 | 0.4737 | 0.75 | 0.5806 | 24 | 0.8889 | 1.0 | 0.9412 | 8 | 0.8462 | 0.9167 | 0.8800 | 24 | 0.5962 | 0.7045 | 0.6458 | 0.9653 |
| 0.2233 | 7.0 | 651 | 0.2370 | 1.0 | 0.6667 | 0.8 | 3 | 0.6667 | 0.8 | 0.7273 | 5 | 0.6957 | 0.6667 | 0.6809 | 24 | 0.625 | 0.8333 | 0.7143 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.8148 | 0.9167 | 0.8627 | 24 | 0.7347 | 0.8182 | 0.7742 | 0.9726 |
| 0.1814 | 8.0 | 744 | 0.2190 | 0.5 | 1.0 | 0.6667 | 3 | 0.6667 | 0.8 | 0.7273 | 5 | 0.6818 | 0.625 | 0.6522 | 24 | 0.7 | 0.875 | 0.7778 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.88 | 0.9167 | 0.8980 | 24 | 0.7526 | 0.8295 | 0.7892 | 0.9708 |
| 0.1547 | 9.0 | 837 | 0.1815 | 1.0 | 0.6667 | 0.8 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.7391 | 0.7083 | 0.7234 | 24 | 0.8 | 0.8333 | 0.8163 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.9583 | 0.9583 | 0.9583 | 24 | 0.8621 | 0.8523 | 0.8571 | 0.9836 |
| 0.1258 | 10.0 | 930 | 0.1799 | 1.0 | 1.0 | 1.0 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.5714 | 0.6667 | 0.6154 | 24 | 0.6897 | 0.8333 | 0.7547 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.92 | 0.9583 | 0.9388 | 24 | 0.7653 | 0.8523 | 0.8065 | 0.9805 |
| 0.1088 | 11.0 | 1023 | 0.1498 | 1.0 | 1.0 | 1.0 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.7037 | 0.7917 | 0.7451 | 24 | 0.7586 | 0.9167 | 0.8302 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.9583 | 0.9583 | 0.9583 | 24 | 0.8333 | 0.9091 | 0.8696 | 0.9842 |
| 0.0916 | 12.0 | 1116 | 0.1572 | 1.0 | 1.0 | 1.0 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.76 | 0.7917 | 0.7755 | 24 | 0.7241 | 0.875 | 0.7925 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.8519 | 0.9583 | 0.9020 | 24 | 0.8144 | 0.8977 | 0.8541 | 0.9805 |
| 0.0821 | 13.0 | 1209 | 0.1763 | 1.0 | 1.0 | 1.0 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.7391 | 0.7083 | 0.7234 | 24 | 0.7692 | 0.8333 | 0.8 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.9545 | 0.875 | 0.9130 | 24 | 0.8506 | 0.8409 | 0.8457 | 0.9812 |
| 0.0733 | 14.0 | 1302 | 0.1632 | 1.0 | 1.0 | 1.0 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.6538 | 0.7083 | 0.68 | 24 | 0.6452 | 0.8333 | 0.7273 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.9565 | 0.9167 | 0.9362 | 24 | 0.7812 | 0.8523 | 0.8152 | 0.9757 |
| 0.0691 | 15.0 | 1395 | 0.1536 | 1.0 | 1.0 | 1.0 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.75 | 0.75 | 0.75 | 24 | 0.7692 | 0.8333 | 0.8 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.88 | 0.9167 | 0.8980 | 24 | 0.8352 | 0.8636 | 0.8492 | 0.9812 |
| 0.063 | 16.0 | 1488 | 0.1420 | 1.0 | 1.0 | 1.0 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.7391 | 0.7083 | 0.7234 | 24 | 0.8519 | 0.9583 | 0.9020 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.9565 | 0.9167 | 0.9362 | 24 | 0.8764 | 0.8864 | 0.8814 | 0.9842 |
| 0.0565 | 17.0 | 1581 | 0.2375 | 1.0 | 1.0 | 1.0 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.7647 | 0.5417 | 0.6341 | 24 | 0.7727 | 0.7083 | 0.7391 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.9565 | 0.9167 | 0.9362 | 24 | 0.8718 | 0.7727 | 0.8193 | 0.9775 |
| 0.0567 | 18.0 | 1674 | 0.1838 | 0.75 | 1.0 | 0.8571 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.75 | 0.5 | 0.6 | 24 | 0.7407 | 0.8333 | 0.7843 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.9583 | 0.9583 | 0.9583 | 24 | 0.8452 | 0.8068 | 0.8256 | 0.9775 |
| 0.0515 | 19.0 | 1767 | 0.1360 | 1.0 | 1.0 | 1.0 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.6538 | 0.7083 | 0.68 | 24 | 0.8077 | 0.875 | 0.8400 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.9583 | 0.9583 | 0.9583 | 24 | 0.8370 | 0.875 | 0.8556 | 0.9830 |
| 0.0484 | 20.0 | 1860 | 0.1505 | 1.0 | 1.0 | 1.0 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.7391 | 0.7083 | 0.7234 | 24 | 0.875 | 0.875 | 0.875 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.9545 | 0.875 | 0.9130 | 24 | 0.8824 | 0.8523 | 0.8671 | 0.9842 |
| 0.0444 | 21.0 | 1953 | 0.1718 | 0.75 | 1.0 | 0.8571 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.6 | 0.625 | 0.6122 | 24 | 0.7407 | 0.8333 | 0.7843 | 24 | 0.8889 | 1.0 | 0.9412 | 8 | 0.9565 | 0.9167 | 0.9362 | 24 | 0.7849 | 0.8295 | 0.8066 | 0.9787 |
| 0.0449 | 22.0 | 2046 | 0.1626 | 1.0 | 1.0 | 1.0 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.7727 | 0.7083 | 0.7391 | 24 | 0.84 | 0.875 | 0.8571 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.9167 | 0.9167 | 0.9167 | 24 | 0.8736 | 0.8636 | 0.8686 | 0.9812 |
| 0.0355 | 23.0 | 2139 | 0.1532 | 1.0 | 1.0 | 1.0 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.8095 | 0.7083 | 0.7556 | 24 | 0.8462 | 0.9167 | 0.8800 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.9167 | 0.9167 | 0.9167 | 24 | 0.8851 | 0.875 | 0.8800 | 0.9824 |
| 0.0356 | 24.0 | 2232 | 0.1612 | 1.0 | 1.0 | 1.0 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.7391 | 0.7083 | 0.7234 | 24 | 0.84 | 0.875 | 0.8571 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.9545 | 0.875 | 0.9130 | 24 | 0.8721 | 0.8523 | 0.8621 | 0.9830 |
| 0.0332 | 25.0 | 2325 | 0.1237 | 1.0 | 1.0 | 1.0 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.7391 | 0.7083 | 0.7234 | 24 | 0.8846 | 0.9583 | 0.9200 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.92 | 0.9583 | 0.9388 | 24 | 0.8778 | 0.8977 | 0.8876 | 0.9848 |
| 0.029 | 26.0 | 2418 | 0.1259 | 1.0 | 1.0 | 1.0 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.7083 | 0.7083 | 0.7083 | 24 | 0.88 | 0.9167 | 0.8980 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.9545 | 0.875 | 0.9130 | 24 | 0.8736 | 0.8636 | 0.8686 | 0.9860 |
| 0.0272 | 27.0 | 2511 | 0.1316 | 0.75 | 1.0 | 0.8571 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.75 | 0.75 | 0.75 | 24 | 0.8214 | 0.9583 | 0.8846 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.92 | 0.9583 | 0.9388 | 24 | 0.8511 | 0.9091 | 0.8791 | 0.9799 |
| 0.0265 | 28.0 | 2604 | 0.1369 | 1.0 | 1.0 | 1.0 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.8095 | 0.7083 | 0.7556 | 24 | 0.7931 | 0.9583 | 0.8679 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.9565 | 0.9167 | 0.9362 | 24 | 0.8764 | 0.8864 | 0.8814 | 0.9830 |
| 0.0271 | 29.0 | 2697 | 0.1078 | 1.0 | 1.0 | 1.0 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.7143 | 0.8333 | 0.7692 | 24 | 0.8 | 0.8333 | 0.8163 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.9583 | 0.9583 | 0.9583 | 24 | 0.8495 | 0.8977 | 0.8729 | 0.9848 |
| 0.0219 | 30.0 | 2790 | 0.1322 | 1.0 | 1.0 | 1.0 | 3 | 1.0 | 1.0 | 1.0 | 5 | 0.7917 | 0.7917 | 0.7917 | 24 | 0.8462 | 0.9167 | 0.8800 | 24 | 1.0 | 1.0 | 1.0 | 8 | 0.92 | 0.9583 | 0.9388 | 24 | 0.8791 | 0.9091 | 0.8939 | 0.9836 |

## Framework versions

- Transformers 4.23.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.2.2
- Tokenizers 0.12.1