
# layoutlmv2-finetuned-sroie

This model is a fine-tuned version of [microsoft/layoutlmv2-base-uncased](https://huggingface.co/microsoft/layoutlmv2-base-uncased) on the SROIE (Scanned Receipts OCR and Information Extraction) dataset. It achieves the following results on the evaluation set, where each "Number" is the entity support, i.e. the count of gold entities of that type (see the metric sketch after the list):

- Loss: 0.0291
- Address Precision: 0.9341
- Address Recall: 0.9395
- Address F1: 0.9368
- Address Number: 347
- Company Precision: 0.9570
- Company Recall: 0.9625
- Company F1: 0.9598
- Company Number: 347
- Date Precision: 0.9885
- Date Recall: 0.9885
- Date F1: 0.9885
- Date Number: 347
- Total Precision: 0.9253
- Total Recall: 0.9280
- Total F1: 0.9266
- Total Number: 347
- Overall Precision: 0.9512
- Overall Recall: 0.9546
- Overall F1: 0.9529
- Overall Accuracy: 0.9961
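The per-entity precision/recall/F1 plus support ("Number") layout above matches the output shape of `seqeval`, the library the Hugging Face token-classification examples use for these metrics. A minimal sketch with toy tag sequences, assuming SROIE-style BIO labels (the exact label strings are an assumption):

```python
from seqeval.metrics import classification_report

# Toy gold/predicted tag sequences; the B-/I- entity names are assumed
# to follow the SROIE label set (ADDRESS, COMPANY, DATE, TOTAL).
y_true = [["B-COMPANY", "I-COMPANY", "O", "B-DATE", "B-TOTAL"]]
y_pred = [["B-COMPANY", "I-COMPANY", "O", "B-DATE", "O"]]

# Prints per-entity precision, recall, F1, and support (the "Number" above).
print(classification_report(y_true, y_pred))
```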

## Model description

More information needed

## Intended uses & limitations

More information needed
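As a minimal, hedged inference sketch (not an official usage guide): the model can be loaded for token classification with the standard `transformers` LayoutLMv2 API. The receipt image path below is hypothetical, and the processor's built-in OCR additionally requires `pytesseract` (plus `detectron2` for the visual backbone).

```python
from PIL import Image
from transformers import LayoutLMv2Processor, LayoutLMv2ForTokenClassification

# Processor from the base checkpoint; it applies OCR (pytesseract) by default.
processor = LayoutLMv2Processor.from_pretrained("microsoft/layoutlmv2-base-uncased")
model = LayoutLMv2ForTokenClassification.from_pretrained(
    "Theivaprakasham/layoutlmv2-finetuned-sroie"
)

image = Image.open("receipt.png").convert("RGB")  # hypothetical scanned receipt
encoding = processor(image, return_tensors="pt")
outputs = model(**encoding)

# Map each token's highest-scoring class id to its label name.
predictions = outputs.logits.argmax(-1).squeeze().tolist()
labels = [model.config.id2label[p] for p in predictions]
```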

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 3000
- mixed_precision_training: Native AMP
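As a minimal sketch, these settings map onto the `transformers` `TrainingArguments` API roughly as follows. The `output_dir` value is hypothetical, and the Adam betas/epsilon listed above are the library defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlmv2-finetuned-sroie",  # hypothetical output path
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=3000,   # training_steps: 3000
    fp16=True,        # "Native AMP" mixed-precision training
)
```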

### Training results

| Training Loss | Epoch | Step | Validation Loss | Address Precision | Address Recall | Address F1 | Address Number | Company Precision | Company Recall | Company F1 | Company Number | Date Precision | Date Recall | Date F1 | Date Number | Total Precision | Total Recall | Total F1 | Total Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| No log | 0.05 | 157 | 0.8162 | 0.3670 | 0.7233 | 0.4869 | 347 | 0.0617 | 0.0144 | 0.0234 | 347 | 0.0 | 0.0 | 0.0 | 347 | 0.0 | 0.0 | 0.0 | 347 | 0.3346 | 0.1844 | 0.2378 | 0.9342 |
| No log | 1.05 | 314 | 0.3490 | 0.8564 | 0.8934 | 0.8745 | 347 | 0.8610 | 0.9280 | 0.8932 | 347 | 0.7297 | 0.8559 | 0.7878 | 347 | 0.0 | 0.0 | 0.0 | 347 | 0.8128 | 0.6693 | 0.7341 | 0.9826 |
| No log | 2.05 | 471 | 0.1845 | 0.7970 | 0.9049 | 0.8475 | 347 | 0.9211 | 0.9424 | 0.9316 | 347 | 0.9885 | 0.9885 | 0.9885 | 347 | 0.0 | 0.0 | 0.0 | 347 | 0.8978 | 0.7089 | 0.7923 | 0.9835 |
| 0.7027 | 3.05 | 628 | 0.1194 | 0.9040 | 0.9222 | 0.9130 | 347 | 0.8880 | 0.9135 | 0.9006 | 347 | 0.9885 | 0.9885 | 0.9885 | 347 | 0.0 | 0.0 | 0.0 | 347 | 0.9263 | 0.7061 | 0.8013 | 0.9853 |
| 0.7027 | 4.05 | 785 | 0.0762 | 0.9397 | 0.9424 | 0.9410 | 347 | 0.8889 | 0.9222 | 0.9052 | 347 | 0.9885 | 0.9885 | 0.9885 | 347 | 0.7740 | 0.9078 | 0.8355 | 347 | 0.8926 | 0.9402 | 0.9158 | 0.9928 |
| 0.7027 | 5.05 | 942 | 0.0564 | 0.9282 | 0.9308 | 0.9295 | 347 | 0.9296 | 0.9510 | 0.9402 | 347 | 0.9885 | 0.9885 | 0.9885 | 347 | 0.7801 | 0.8588 | 0.8176 | 347 | 0.9036 | 0.9323 | 0.9177 | 0.9946 |
| 0.0935 | 6.05 | 1099 | 0.0548 | 0.9222 | 0.9222 | 0.9222 | 347 | 0.6975 | 0.7378 | 0.7171 | 347 | 0.9885 | 0.9885 | 0.9885 | 347 | 0.8608 | 0.8732 | 0.8670 | 347 | 0.8648 | 0.8804 | 0.8725 | 0.9921 |
| 0.0935 | 7.05 | 1256 | 0.0410 | 0.92 | 0.9280 | 0.9240 | 347 | 0.9486 | 0.9568 | 0.9527 | 347 | 0.9885 | 0.9885 | 0.9885 | 347 | 0.9091 | 0.9222 | 0.9156 | 347 | 0.9414 | 0.9488 | 0.9451 | 0.9961 |
| 0.0935 | 8.05 | 1413 | 0.0369 | 0.9368 | 0.9395 | 0.9381 | 347 | 0.9569 | 0.9597 | 0.9583 | 347 | 0.9772 | 0.9885 | 0.9828 | 347 | 0.9143 | 0.9222 | 0.9182 | 347 | 0.9463 | 0.9524 | 0.9494 | 0.9960 |
| 0.038 | 9.05 | 1570 | 0.0343 | 0.9282 | 0.9308 | 0.9295 | 347 | 0.9624 | 0.9597 | 0.9610 | 347 | 0.9885 | 0.9885 | 0.9885 | 347 | 0.9206 | 0.9020 | 0.9112 | 347 | 0.9500 | 0.9452 | 0.9476 | 0.9958 |
| 0.038 | 10.05 | 1727 | 0.0317 | 0.9395 | 0.9395 | 0.9395 | 347 | 0.9598 | 0.9625 | 0.9612 | 347 | 0.9885 | 0.9885 | 0.9885 | 347 | 0.9280 | 0.9280 | 0.9280 | 347 | 0.9539 | 0.9546 | 0.9543 | 0.9963 |
| 0.038 | 11.05 | 1884 | 0.0312 | 0.9368 | 0.9395 | 0.9381 | 347 | 0.9514 | 0.9597 | 0.9555 | 347 | 0.9885 | 0.9885 | 0.9885 | 347 | 0.9226 | 0.9280 | 0.9253 | 347 | 0.9498 | 0.9539 | 0.9518 | 0.9960 |
| 0.0236 | 12.05 | 2041 | 0.0318 | 0.9368 | 0.9395 | 0.9381 | 347 | 0.9570 | 0.9625 | 0.9598 | 347 | 0.9885 | 0.9885 | 0.9885 | 347 | 0.9043 | 0.8991 | 0.9017 | 347 | 0.9467 | 0.9474 | 0.9471 | 0.9956 |
| 0.0236 | 13.05 | 2198 | 0.0291 | 0.9337 | 0.9337 | 0.9337 | 347 | 0.9598 | 0.9625 | 0.9612 | 347 | 0.9885 | 0.9885 | 0.9885 | 347 | 0.9164 | 0.9164 | 0.9164 | 347 | 0.9496 | 0.9503 | 0.9499 | 0.9960 |
| 0.0236 | 14.05 | 2355 | 0.0300 | 0.9286 | 0.9366 | 0.9326 | 347 | 0.9459 | 0.9568 | 0.9513 | 347 | 0.9885 | 0.9885 | 0.9885 | 347 | 0.9275 | 0.9222 | 0.9249 | 347 | 0.9476 | 0.9510 | 0.9493 | 0.9959 |
| 0.0178 | 15.05 | 2512 | 0.0307 | 0.9366 | 0.9366 | 0.9366 | 347 | 0.9513 | 0.9568 | 0.9540 | 347 | 0.9885 | 0.9885 | 0.9885 | 347 | 0.9275 | 0.9222 | 0.9249 | 347 | 0.9510 | 0.9510 | 0.9510 | 0.9959 |
| 0.0178 | 16.05 | 2669 | 0.0300 | 0.9312 | 0.9366 | 0.9339 | 347 | 0.9543 | 0.9625 | 0.9584 | 347 | 0.9885 | 0.9885 | 0.9885 | 347 | 0.9171 | 0.9251 | 0.9211 | 347 | 0.9477 | 0.9532 | 0.9504 | 0.9959 |
| 0.0178 | 17.05 | 2826 | 0.0292 | 0.9368 | 0.9395 | 0.9381 | 347 | 0.9570 | 0.9625 | 0.9598 | 347 | 0.9885 | 0.9885 | 0.9885 | 347 | 0.9253 | 0.9280 | 0.9266 | 347 | 0.9519 | 0.9546 | 0.9532 | 0.9961 |
| 0.0178 | 18.05 | 2983 | 0.0291 | 0.9341 | 0.9395 | 0.9368 | 347 | 0.9570 | 0.9625 | 0.9598 | 347 | 0.9885 | 0.9885 | 0.9885 | 347 | 0.9253 | 0.9280 | 0.9266 | 347 | 0.9512 | 0.9546 | 0.9529 | 0.9961 |
| 0.0149 | 19.01 | 3000 | 0.0291 | 0.9341 | 0.9395 | 0.9368 | 347 | 0.9570 | 0.9625 | 0.9598 | 347 | 0.9885 | 0.9885 | 0.9885 | 347 | 0.9253 | 0.9280 | 0.9266 | 347 | 0.9512 | 0.9546 | 0.9529 | 0.9961 |

### Framework versions

- Transformers 4.16.2
- Pytorch 1.8.0+cu101
- Datasets 1.18.4.dev0
- Tokenizers 0.11.6