2024-01-09_one_stage_subgraphs_weighted_txt_vision_enc_all_ramp

This model is a fine-tuned version of microsoft/layoutlmv3-base on an unknown dataset. It achieves the following results on the evaluation set, where each "Exit n Accuracy" is the accuracy obtained when the prediction is read from intermediate exit n rather than the final classifier (see the sketch after this list):

  • Loss: 1.1363
  • Accuracy: 0.78
  • Exit 0 Accuracy: 0.1575
  • Exit 1 Accuracy: 0.2375
  • Exit 2 Accuracy: 0.3475
  • Exit 3 Accuracy: 0.27
  • Exit 4 Accuracy: 0.3275
  • Exit 5 Accuracy: 0.51
  • Exit 6 Accuracy: 0.6075
  • Exit 7 Accuracy: 0.7225
  • Exit 8 Accuracy: 0.7575
  • Exit 9 Accuracy: 0.76
  • Exit 10 Accuracy: 0.735
  • Exit 11 Accuracy: 0.7625
  • Exit 12 Accuracy: 0.785
  • Exit 13 Accuracy: 0.7775
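
The steadily improving per-exit numbers are consistent with an early-exit ("ramp") setup, where intermediate layers feed auxiliary classification heads. Below is a minimal, hypothetical sketch of confidence-thresholded early exiting over such per-exit logits; the actual exit policy and head design used by this model are not documented here, and `early_exit_predict`, `exit_logits`, and `threshold` are illustrative names only:

```python
import torch

def early_exit_predict(exit_logits, threshold=0.9):
    """Return the prediction of the first exit whose softmax confidence
    clears `threshold`; fall back to the deepest exit otherwise.

    exit_logits: list of [num_labels] tensors, one per exit (shallow -> deep).
    """
    for logits in exit_logits[:-1]:
        probs = torch.softmax(logits, dim=-1)
        confidence, label = probs.max(dim=-1)
        if confidence.item() >= threshold:
            return label.item()  # confident enough: exit early
    return exit_logits[-1].argmax(dim=-1).item()  # deepest exit as fallback

# Toy usage with 14 exits (as in this card) and 16 labels:
logits_per_exit = [torch.randn(16) for _ in range(14)]
print(early_exit_predict(logits_per_exit))
```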

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 2
  • eval_batch_size: 1
  • seed: 42
  • gradient_accumulation_steps: 24
  • total_train_batch_size: 48
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 60
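
For reference, these settings map onto a `transformers.TrainingArguments` configuration roughly as follows. This is a sketch, assuming a single device (so 2 × 24 accumulation steps gives the total train batch size of 48); the output path is a placeholder:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="out",                # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=24,  # 2 * 24 = total train batch size 48
    lr_scheduler_type="linear",
    num_train_epochs=60,
    # Adam settings matching the card: betas=(0.9, 0.999), epsilon=1e-08
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```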

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Exit 0 Accuracy | Exit 1 Accuracy | Exit 2 Accuracy | Exit 3 Accuracy | Exit 4 Accuracy | Exit 5 Accuracy | Exit 6 Accuracy | Exit 7 Accuracy | Exit 8 Accuracy | Exit 9 Accuracy | Exit 10 Accuracy | Exit 11 Accuracy | Exit 12 Accuracy | Exit 13 Accuracy |
|:---|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|
| No log | 0.96 | 16 | 2.6689 | 0.185 | 0.0625 | 0.09 | 0.06 | 0.085 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.1075 | 0.0725 |
| No log | 1.98 | 33 | 2.4891 | 0.2725 | 0.0875 | 0.1175 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.065 | 0.15 |
| No log | 3.0 | 50 | 2.2773 | 0.3425 | 0.1075 | 0.125 | 0.09 | 0.1125 | 0.0625 | 0.0625 | 0.0625 | 0.065 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0925 | 0.2375 |
| No log | 3.96 | 66 | 2.0666 | 0.4 | 0.1 | 0.1175 | 0.0925 | 0.115 | 0.0625 | 0.0775 | 0.0625 | 0.065 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.065 | 0.305 |
| No log | 4.98 | 83 | 1.8597 | 0.485 | 0.12 | 0.12 | 0.0775 | 0.1225 | 0.0625 | 0.085 | 0.065 | 0.1125 | 0.0625 | 0.0625 | 0.0625 | 0.0625 | 0.0975 | 0.3275 |
| No log | 6.0 | 100 | 1.5470 | 0.5875 | 0.1225 | 0.135 | 0.1075 | 0.12 | 0.0625 | 0.0625 | 0.095 | 0.18 | 0.0825 | 0.0625 | 0.0625 | 0.0625 | 0.125 | 0.4425 |
| No log | 6.96 | 116 | 1.3384 | 0.6575 | 0.13 | 0.145 | 0.115 | 0.0825 | 0.065 | 0.0725 | 0.0925 | 0.19 | 0.1725 | 0.0625 | 0.0625 | 0.0625 | 0.12 | 0.5825 |
| No log | 7.98 | 133 | 1.1479 | 0.7125 | 0.1225 | 0.1475 | 0.11 | 0.135 | 0.0725 | 0.0925 | 0.12 | 0.2425 | 0.265 | 0.0625 | 0.0625 | 0.0625 | 0.12 | 0.63 |
| No log | 9.0 | 150 | 1.0184 | 0.745 | 0.1175 | 0.16 | 0.1575 | 0.1425 | 0.0775 | 0.095 | 0.1525 | 0.315 | 0.3775 | 0.0625 | 0.0625 | 0.0625 | 0.1225 | 0.68 |
| No log | 9.96 | 166 | 0.9324 | 0.7725 | 0.1225 | 0.1625 | 0.1425 | 0.13 | 0.095 | 0.1225 | 0.18 | 0.31 | 0.465 | 0.0625 | 0.0625 | 0.0625 | 0.1225 | 0.7225 |
| No log | 10.98 | 183 | 0.9027 | 0.755 | 0.125 | 0.165 | 0.1775 | 0.135 | 0.0975 | 0.125 | 0.24 | 0.405 | 0.515 | 0.0625 | 0.0625 | 0.0625 | 0.145 | 0.7325 |
| No log | 12.0 | 200 | 0.8841 | 0.7725 | 0.1225 | 0.1725 | 0.2 | 0.1425 | 0.095 | 0.115 | 0.3075 | 0.4525 | 0.5375 | 0.0625 | 0.0625 | 0.0625 | 0.2325 | 0.7525 |
| No log | 12.96 | 216 | 0.8951 | 0.7625 | 0.125 | 0.1725 | 0.2325 | 0.0625 | 0.115 | 0.12 | 0.3675 | 0.505 | 0.5425 | 0.0625 | 0.0625 | 0.085 | 0.2975 | 0.76 |
| No log | 13.98 | 233 | 0.9242 | 0.75 | 0.135 | 0.1825 | 0.1825 | 0.12 | 0.1675 | 0.13 | 0.3325 | 0.485 | 0.6225 | 0.0625 | 0.0625 | 0.0975 | 0.425 | 0.75 |
| No log | 15.0 | 250 | 0.8407 | 0.7775 | 0.13 | 0.185 | 0.165 | 0.11 | 0.1075 | 0.1275 | 0.3575 | 0.51 | 0.62 | 0.0625 | 0.0625 | 0.0975 | 0.455 | 0.7875 |
| No log | 15.96 | 266 | 1.0176 | 0.7325 | 0.165 | 0.19 | 0.1475 | 0.0925 | 0.2 | 0.1075 | 0.445 | 0.4625 | 0.6225 | 0.0625 | 0.0625 | 0.1375 | 0.5 | 0.7375 |
| No log | 16.98 | 283 | 0.9081 | 0.7825 | 0.15 | 0.19 | 0.17 | 0.095 | 0.21 | 0.125 | 0.455 | 0.5225 | 0.645 | 0.135 | 0.0925 | 0.2125 | 0.6275 | 0.79 |
| No log | 18.0 | 300 | 0.9397 | 0.7775 | 0.1425 | 0.1975 | 0.235 | 0.1125 | 0.195 | 0.1775 | 0.4225 | 0.56 | 0.665 | 0.1175 | 0.0625 | 0.195 | 0.665 | 0.785 |
| No log | 18.96 | 316 | 1.0205 | 0.765 | 0.1375 | 0.205 | 0.155 | 0.1 | 0.245 | 0.2225 | 0.4825 | 0.6275 | 0.6925 | 0.2175 | 0.37 | 0.17 | 0.71 | 0.7675 |
| No log | 19.98 | 333 | 0.9132 | 0.7975 | 0.1425 | 0.2 | 0.2525 | 0.1225 | 0.2775 | 0.3125 | 0.5375 | 0.6275 | 0.695 | 0.2175 | 0.305 | 0.2775 | 0.755 | 0.7975 |
| No log | 21.0 | 350 | 0.9610 | 0.78 | 0.1525 | 0.1975 | 0.2 | 0.1675 | 0.265 | 0.3225 | 0.5275 | 0.62 | 0.69 | 0.3975 | 0.285 | 0.435 | 0.7425 | 0.78 |
| No log | 21.96 | 366 | 0.9429 | 0.785 | 0.155 | 0.1975 | 0.2325 | 0.1425 | 0.2675 | 0.36 | 0.4975 | 0.645 | 0.7175 | 0.54 | 0.3725 | 0.55 | 0.775 | 0.78 |
| No log | 22.98 | 383 | 0.9549 | 0.795 | 0.1525 | 0.205 | 0.2625 | 0.16 | 0.29 | 0.3375 | 0.535 | 0.6525 | 0.72 | 0.5025 | 0.3675 | 0.635 | 0.765 | 0.795 |
| No log | 24.0 | 400 | 0.9764 | 0.7775 | 0.155 | 0.205 | 0.235 | 0.19 | 0.275 | 0.39 | 0.515 | 0.655 | 0.725 | 0.625 | 0.4275 | 0.615 | 0.7725 | 0.7825 |
| No log | 24.96 | 416 | 0.9804 | 0.7775 | 0.16 | 0.2025 | 0.32 | 0.16 | 0.2725 | 0.4125 | 0.5225 | 0.6425 | 0.7325 | 0.6375 | 0.4425 | 0.685 | 0.7775 | 0.7825 |
| No log | 25.98 | 433 | 0.9886 | 0.775 | 0.165 | 0.2075 | 0.3025 | 0.17 | 0.355 | 0.4875 | 0.5325 | 0.665 | 0.7325 | 0.6925 | 0.52 | 0.6175 | 0.7775 | 0.7775 |
| No log | 27.0 | 450 | 0.9906 | 0.7775 | 0.145 | 0.21 | 0.28 | 0.2075 | 0.33 | 0.4225 | 0.52 | 0.6675 | 0.7375 | 0.705 | 0.59 | 0.6675 | 0.7725 | 0.775 |
| No log | 27.96 | 466 | 1.0077 | 0.7875 | 0.1475 | 0.21 | 0.2625 | 0.23 | 0.3525 | 0.445 | 0.515 | 0.665 | 0.74 | 0.695 | 0.5975 | 0.6625 | 0.78 | 0.785 |
| No log | 28.98 | 483 | 1.0239 | 0.7775 | 0.1625 | 0.2075 | 0.3325 | 0.2075 | 0.35 | 0.455 | 0.55 | 0.6725 | 0.7425 | 0.72 | 0.6275 | 0.6925 | 0.7725 | 0.7825 |
| 0.4097 | 30.0 | 500 | 1.0323 | 0.785 | 0.16 | 0.21 | 0.375 | 0.3 | 0.365 | 0.49 | 0.555 | 0.68 | 0.7425 | 0.73 | 0.5575 | 0.705 | 0.7825 | 0.785 |
| 0.4097 | 30.96 | 516 | 1.0327 | 0.785 | 0.1525 | 0.2175 | 0.2975 | 0.205 | 0.3425 | 0.4625 | 0.55 | 0.68 | 0.74 | 0.7275 | 0.6 | 0.6925 | 0.785 | 0.7875 |
| 0.4097 | 31.98 | 533 | 1.0470 | 0.775 | 0.1575 | 0.22 | 0.3275 | 0.2575 | 0.36 | 0.51 | 0.56 | 0.695 | 0.7375 | 0.7275 | 0.5925 | 0.6725 | 0.785 | 0.775 |
| 0.4097 | 33.0 | 550 | 1.0538 | 0.785 | 0.17 | 0.2225 | 0.335 | 0.2775 | 0.375 | 0.5075 | 0.5575 | 0.69 | 0.7425 | 0.73 | 0.5725 | 0.7125 | 0.785 | 0.78 |
| 0.4097 | 33.96 | 566 | 1.0645 | 0.7775 | 0.155 | 0.22 | 0.3225 | 0.2425 | 0.3425 | 0.4675 | 0.555 | 0.7025 | 0.745 | 0.725 | 0.56 | 0.7175 | 0.7775 | 0.78 |
| 0.4097 | 34.98 | 583 | 1.0745 | 0.7825 | 0.155 | 0.23 | 0.305 | 0.2625 | 0.3725 | 0.4575 | 0.5625 | 0.7025 | 0.7375 | 0.7325 | 0.6275 | 0.735 | 0.7775 | 0.7775 |
| 0.4097 | 36.0 | 600 | 1.1022 | 0.7775 | 0.155 | 0.2275 | 0.305 | 0.245 | 0.3675 | 0.4775 | 0.5725 | 0.7025 | 0.7375 | 0.735 | 0.6425 | 0.7425 | 0.775 | 0.78 |
| 0.4097 | 36.96 | 616 | 1.1185 | 0.7675 | 0.1575 | 0.23 | 0.3175 | 0.2825 | 0.3575 | 0.48 | 0.575 | 0.7 | 0.74 | 0.7325 | 0.6425 | 0.7475 | 0.775 | 0.7725 |
| 0.4097 | 37.98 | 633 | 1.1017 | 0.7775 | 0.1575 | 0.23 | 0.31 | 0.2575 | 0.3575 | 0.4875 | 0.58 | 0.6975 | 0.75 | 0.735 | 0.6475 | 0.76 | 0.7775 | 0.775 |
| 0.4097 | 39.0 | 650 | 1.1036 | 0.7775 | 0.16 | 0.2325 | 0.3225 | 0.265 | 0.3375 | 0.49 | 0.5775 | 0.695 | 0.7425 | 0.72 | 0.6825 | 0.7375 | 0.7825 | 0.7775 |
| 0.4097 | 39.96 | 666 | 1.1200 | 0.775 | 0.165 | 0.2325 | 0.345 | 0.255 | 0.3325 | 0.4875 | 0.5875 | 0.7 | 0.7375 | 0.7175 | 0.6875 | 0.7425 | 0.7725 | 0.7775 |
| 0.4097 | 40.98 | 683 | 1.1207 | 0.775 | 0.165 | 0.23 | 0.3375 | 0.2575 | 0.3125 | 0.4975 | 0.5875 | 0.705 | 0.75 | 0.7275 | 0.6925 | 0.7525 | 0.7775 | 0.775 |
| 0.4097 | 42.0 | 700 | 1.1443 | 0.7725 | 0.165 | 0.2375 | 0.3475 | 0.245 | 0.3175 | 0.4925 | 0.5875 | 0.705 | 0.7525 | 0.72 | 0.675 | 0.7475 | 0.7675 | 0.77 |
| 0.4097 | 42.96 | 716 | 1.1497 | 0.7675 | 0.165 | 0.235 | 0.3575 | 0.2475 | 0.3325 | 0.51 | 0.585 | 0.7 | 0.7575 | 0.7275 | 0.695 | 0.7625 | 0.77 | 0.76 |
| 0.4097 | 43.98 | 733 | 1.1605 | 0.7625 | 0.17 | 0.2375 | 0.355 | 0.2675 | 0.3375 | 0.505 | 0.5875 | 0.7075 | 0.75 | 0.73 | 0.6975 | 0.7525 | 0.77 | 0.7625 |
| 0.4097 | 45.0 | 750 | 1.1612 | 0.7675 | 0.165 | 0.2375 | 0.345 | 0.2775 | 0.3225 | 0.5 | 0.6 | 0.705 | 0.755 | 0.7425 | 0.7025 | 0.755 | 0.775 | 0.77 |
| 0.4097 | 45.96 | 766 | 1.1632 | 0.7725 | 0.16 | 0.24 | 0.34 | 0.2625 | 0.3175 | 0.5 | 0.6 | 0.705 | 0.7575 | 0.7425 | 0.7225 | 0.7675 | 0.77 | 0.7725 |
| 0.4097 | 46.98 | 783 | 1.1555 | 0.7725 | 0.16 | 0.24 | 0.34 | 0.2725 | 0.3175 | 0.505 | 0.6 | 0.71 | 0.7525 | 0.7525 | 0.7175 | 0.76 | 0.7775 | 0.7775 |
| 0.4097 | 48.0 | 800 | 1.1437 | 0.775 | 0.155 | 0.2375 | 0.3325 | 0.2725 | 0.3175 | 0.4925 | 0.6 | 0.715 | 0.7575 | 0.7475 | 0.715 | 0.765 | 0.7825 | 0.7825 |
| 0.4097 | 48.96 | 816 | 1.1423 | 0.7775 | 0.155 | 0.2375 | 0.3325 | 0.2625 | 0.3175 | 0.5 | 0.6 | 0.715 | 0.76 | 0.75 | 0.7225 | 0.76 | 0.7775 | 0.7775 |
| 0.4097 | 49.98 | 833 | 1.1504 | 0.775 | 0.16 | 0.24 | 0.345 | 0.2625 | 0.32 | 0.505 | 0.6025 | 0.7175 | 0.75 | 0.75 | 0.7275 | 0.765 | 0.775 | 0.775 |
| 0.4097 | 51.0 | 850 | 1.1660 | 0.77 | 0.1575 | 0.24 | 0.3475 | 0.27 | 0.3175 | 0.5 | 0.605 | 0.715 | 0.755 | 0.7475 | 0.7325 | 0.7625 | 0.775 | 0.77 |
| 0.4097 | 51.96 | 866 | 1.1543 | 0.7775 | 0.16 | 0.2375 | 0.3425 | 0.27 | 0.325 | 0.5025 | 0.6075 | 0.72 | 0.76 | 0.7475 | 0.735 | 0.76 | 0.7825 | 0.775 |
| 0.4097 | 52.98 | 883 | 1.1363 | 0.7825 | 0.155 | 0.2375 | 0.3425 | 0.2875 | 0.325 | 0.505 | 0.605 | 0.7175 | 0.755 | 0.755 | 0.73 | 0.7625 | 0.7825 | 0.78 |
| 0.4097 | 54.0 | 900 | 1.1357 | 0.78 | 0.1575 | 0.2375 | 0.3425 | 0.275 | 0.3275 | 0.51 | 0.6075 | 0.7225 | 0.7575 | 0.755 | 0.735 | 0.7625 | 0.785 | 0.7775 |
| 0.4097 | 54.96 | 916 | 1.1354 | 0.78 | 0.1575 | 0.2375 | 0.3425 | 0.2725 | 0.33 | 0.51 | 0.6075 | 0.725 | 0.755 | 0.755 | 0.7375 | 0.7625 | 0.7875 | 0.78 |
| 0.4097 | 55.98 | 933 | 1.1356 | 0.78 | 0.1575 | 0.2375 | 0.3425 | 0.2725 | 0.3275 | 0.51 | 0.6075 | 0.725 | 0.7575 | 0.76 | 0.7375 | 0.7625 | 0.7875 | 0.78 |
| 0.4097 | 57.0 | 950 | 1.1362 | 0.7825 | 0.1575 | 0.2375 | 0.3475 | 0.27 | 0.3275 | 0.51 | 0.6075 | 0.7225 | 0.7575 | 0.76 | 0.735 | 0.7625 | 0.785 | 0.7775 |
| 0.4097 | 57.6 | 960 | 1.1363 | 0.78 | 0.1575 | 0.2375 | 0.3475 | 0.27 | 0.3275 | 0.51 | 0.6075 | 0.7225 | 0.7575 | 0.76 | 0.735 | 0.7625 | 0.785 | 0.7775 |

Framework versions

  • Transformers 4.31.0
  • Pytorch 2.0.1+cu117
  • Datasets 2.13.1
  • Tokenizers 0.13.3
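
A quick, hypothetical sanity check that a local environment matches the versions listed above:

```python
import transformers, torch, datasets, tokenizers

# Expected: 4.31.0, 2.0.1+cu117, 2.13.1, 0.13.3
for mod in (transformers, torch, datasets, tokenizers):
    print(mod.__name__, mod.__version__)
```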