
# layout_qa_hparam_tuning

This model is a fine-tuned version of [microsoft/layoutlmv2-base-uncased](https://huggingface.co/microsoft/layoutlmv2-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 3.3973
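
The card provides no usage notes, so below is a minimal inference sketch. The task head is not documented; the repo name suggests extractive document question answering, so a `LayoutLMv2ForQuestionAnswering` checkpoint is assumed, and the image path and question are hypothetical placeholders.

```python
# A minimal usage sketch, assuming this checkpoint carries a QA head.
# LayoutLMv2 requires detectron2; the processor's built-in OCR requires pytesseract.
import torch
from PIL import Image
from transformers import LayoutLMv2Processor, LayoutLMv2ForQuestionAnswering

processor = LayoutLMv2Processor.from_pretrained("microsoft/layoutlmv2-base-uncased")
model = LayoutLMv2ForQuestionAnswering.from_pretrained("PrimWong/layout_qa_hparam_tuning")

image = Image.open("document.png").convert("RGB")  # hypothetical page image
encoding = processor(image, "What is the invoice total?", return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**encoding)

# Take the highest-scoring start/end positions and decode the answer span.
start = outputs.start_logits.argmax(-1).item()
end = outputs.end_logits.argmax(-1).item()
print(processor.tokenizer.decode(encoding["input_ids"][0][start : end + 1]))
```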

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-06
- train_batch_size: 5
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
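
For reference, a sketch of how these values map onto `transformers` `TrainingArguments`. The `output_dir` is hypothetical, and the 50-step evaluation cadence is inferred from the results table below rather than stated on the card; the Adam betas and epsilon above are the library defaults.

```python
# A hedged sketch of the training configuration, assuming the HF Trainer API.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layout_qa_hparam_tuning",  # hypothetical output path
    learning_rate=5e-6,
    per_device_train_batch_size=5,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="steps",  # evaluation every 50 steps, per the results table
    eval_steps=50,
)
```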

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 6.0364 | 0.28 | 50 | 5.7109 |
| 5.6991 | 0.55 | 100 | 5.3444 |
| 5.3564 | 0.83 | 150 | 5.0481 |
| 5.1086 | 1.1 | 200 | 4.8591 |
| 4.8464 | 1.38 | 250 | 4.6824 |
| 4.7178 | 1.66 | 300 | 4.5995 |
| 4.6003 | 1.93 | 350 | 4.4761 |
| 4.4415 | 2.21 | 400 | 4.3781 |
| 4.3911 | 2.49 | 450 | 4.3017 |
| 4.2507 | 2.76 | 500 | 4.2496 |
| 4.1975 | 3.04 | 550 | 4.2142 |
| 4.0971 | 3.31 | 600 | 4.1524 |
| 4.0671 | 3.59 | 650 | 4.1038 |
| 4.0225 | 3.87 | 700 | 4.0486 |
| 3.9641 | 4.14 | 750 | 4.0478 |
| 3.9662 | 4.42 | 800 | 4.0082 |
| 3.8185 | 4.7 | 850 | 4.0001 |
| 3.8798 | 4.97 | 900 | 3.9235 |
| 3.7622 | 5.25 | 950 | 3.9549 |
| 3.7109 | 5.52 | 1000 | 3.8668 |
| 3.7218 | 5.8 | 1050 | 3.8849 |
| 3.6718 | 6.08 | 1100 | 3.9426 |
| 3.6925 | 6.35 | 1150 | 3.8288 |
| 3.5893 | 6.63 | 1200 | 3.8240 |
| 3.5545 | 6.91 | 1250 | 3.8149 |
| 3.4922 | 7.18 | 1300 | 3.8104 |
| 3.5117 | 7.46 | 1350 | 3.8128 |
| 3.3699 | 7.73 | 1400 | 3.7590 |
| 3.4538 | 8.01 | 1450 | 3.7577 |
| 3.3669 | 8.29 | 1500 | 3.7370 |
| 3.3516 | 8.56 | 1550 | 3.7278 |
| 3.3264 | 8.84 | 1600 | 3.6671 |
| 3.3102 | 9.12 | 1650 | 3.6953 |
| 3.241 | 9.39 | 1700 | 3.6474 |
| 3.278 | 9.67 | 1750 | 3.8793 |
| 3.2593 | 9.94 | 1800 | 3.6447 |
| 3.1663 | 10.22 | 1850 | 3.8442 |
| 3.0952 | 10.5 | 1900 | 3.6431 |
| 3.1355 | 10.77 | 1950 | 3.6261 |
| 3.0874 | 11.05 | 2000 | 3.5631 |
| 3.0178 | 11.33 | 2050 | 3.5662 |
| 2.9257 | 11.6 | 2100 | 3.4744 |
| 2.9164 | 11.88 | 2150 | 3.4374 |
| 2.8061 | 12.15 | 2200 | 3.4550 |
| 2.8664 | 12.43 | 2250 | 3.4217 |
| 2.7886 | 12.71 | 2300 | 3.4294 |
| 2.8398 | 12.98 | 2350 | 3.3906 |
| 2.7823 | 13.26 | 2400 | 3.4311 |
| 2.7024 | 13.54 | 2450 | 3.4267 |
| 2.7443 | 13.81 | 2500 | 3.3412 |
| 2.6747 | 14.09 | 2550 | 3.3656 |
| 2.723 | 14.36 | 2600 | 3.5019 |
| 2.6278 | 14.64 | 2650 | 3.4287 |
| 2.5001 | 14.92 | 2700 | 3.5152 |
| 2.5718 | 15.19 | 2750 | 3.3615 |
| 2.5734 | 15.47 | 2800 | 3.3193 |
| 2.5112 | 15.75 | 2850 | 3.4028 |
| 2.4499 | 16.02 | 2900 | 3.4374 |
| 2.4631 | 16.3 | 2950 | 3.3910 |
| 2.4246 | 16.57 | 3000 | 3.2926 |
| 2.4075 | 16.85 | 3050 | 3.1869 |
| 2.3691 | 17.13 | 3100 | 3.2002 |
| 2.3557 | 17.4 | 3150 | 3.1995 |
| 2.309 | 17.68 | 3200 | 3.3596 |
| 2.2738 | 17.96 | 3250 | 3.2819 |
| 2.2371 | 18.23 | 3300 | 3.2793 |
| 2.2578 | 18.51 | 3350 | 3.1955 |
| 2.1887 | 18.78 | 3400 | 3.1516 |
| 2.2166 | 19.06 | 3450 | 3.1920 |
| 2.1767 | 19.34 | 3500 | 3.0891 |
| 2.1307 | 19.61 | 3550 | 3.1467 |
| 2.1769 | 19.89 | 3600 | 3.1935 |
| 2.0798 | 20.17 | 3650 | 3.2426 |
| 2.1029 | 20.44 | 3700 | 3.1828 |
| 2.0654 | 20.72 | 3750 | 3.2298 |
| 1.997 | 20.99 | 3800 | 3.2313 |
| 1.9933 | 21.27 | 3850 | 3.1501 |
| 2.0084 | 21.55 | 3900 | 3.0830 |
| 1.9963 | 21.82 | 3950 | 3.2029 |
| 1.889 | 22.1 | 4000 | 3.2676 |
| 2.0014 | 22.38 | 4050 | 3.0189 |
| 1.9031 | 22.65 | 4100 | 3.0549 |
| 1.9464 | 22.93 | 4150 | 3.2659 |
| 1.8972 | 23.2 | 4200 | 3.2271 |
| 1.8549 | 23.48 | 4250 | 3.0585 |
| 1.8106 | 23.76 | 4300 | 3.2286 |
| 1.8222 | 24.03 | 4350 | 3.2233 |
| 1.8537 | 24.31 | 4400 | 2.9525 |
| 1.7717 | 24.59 | 4450 | 3.1129 |
| 1.8045 | 24.86 | 4500 | 3.1795 |
| 1.7783 | 25.14 | 4550 | 3.1206 |
| 1.7119 | 25.41 | 4600 | 3.1325 |
| 1.6936 | 25.69 | 4650 | 3.0850 |
| 1.776 | 25.97 | 4700 | 2.8785 |
| 1.7269 | 26.24 | 4750 | 2.9847 |
| 1.6276 | 26.52 | 4800 | 3.0743 |
| 1.6228 | 26.8 | 4850 | 3.1257 |
| 1.7509 | 27.07 | 4900 | 3.0451 |
| 1.6658 | 27.35 | 4950 | 3.1540 |
| 1.6688 | 27.62 | 5000 | 2.9553 |
| 1.5576 | 27.9 | 5050 | 3.0843 |
| 1.5457 | 28.18 | 5100 | 3.1677 |
| 1.638 | 28.45 | 5150 | 3.0357 |
| 1.5004 | 28.73 | 5200 | 3.0918 |
| 1.6639 | 29.01 | 5250 | 3.0215 |
| 1.5465 | 29.28 | 5300 | 3.1257 |
| 1.4719 | 29.56 | 5350 | 3.0513 |
| 1.5599 | 29.83 | 5400 | 3.0366 |
| 1.5755 | 30.11 | 5450 | 2.9535 |
| 1.496 | 30.39 | 5500 | 3.0343 |
| 1.5915 | 30.66 | 5550 | 3.1121 |
| 1.4198 | 30.94 | 5600 | 3.0673 |
| 1.5062 | 31.22 | 5650 | 2.9743 |
| 1.3817 | 31.49 | 5700 | 3.0471 |
| 1.4361 | 31.77 | 5750 | 2.9827 |
| 1.4624 | 32.04 | 5800 | 3.2212 |
| 1.4895 | 32.32 | 5850 | 3.0745 |
| 1.4598 | 32.6 | 5900 | 3.0424 |
| 1.4379 | 32.87 | 5950 | 3.0214 |
| 1.429 | 33.15 | 6000 | 3.9556 |
| 1.4837 | 33.43 | 6050 | 3.0527 |
| 1.4427 | 33.7 | 6100 | 3.0360 |
| 1.6037 | 33.98 | 6150 | 3.0011 |
| 1.3789 | 34.25 | 6200 | 2.9842 |
| 1.4559 | 34.53 | 6250 | 2.9825 |
| 1.3494 | 34.81 | 6300 | 3.0216 |
| 1.3313 | 35.08 | 6350 | 2.9506 |
| 1.3074 | 35.36 | 6400 | 2.9899 |
| 1.3534 | 35.64 | 6450 | 3.3824 |
| 1.4189 | 35.91 | 6500 | 2.9109 |
| 1.2795 | 36.19 | 6550 | 3.2013 |
| 1.377 | 36.46 | 6600 | 3.1894 |
| 1.3627 | 36.74 | 6650 | 3.0203 |
| 1.3731 | 37.02 | 6700 | 3.0597 |
| 1.2557 | 37.29 | 6750 | 3.1781 |
| 1.362 | 37.57 | 6800 | 3.3320 |
| 1.3448 | 37.85 | 6850 | 3.0893 |
| 1.3337 | 38.12 | 6900 | 3.3698 |
| 1.3455 | 38.4 | 6950 | 3.0614 |
| 1.3397 | 38.67 | 7000 | 3.2179 |
| 1.2439 | 38.95 | 7050 | 3.1908 |
| 1.25 | 39.23 | 7100 | 3.3292 |
| 1.3099 | 39.5 | 7150 | 3.1604 |
| 1.3465 | 39.78 | 7200 | 3.1365 |
| 1.2703 | 40.06 | 7250 | 3.2937 |
| 1.2662 | 40.33 | 7300 | 3.3199 |
| 1.233 | 40.61 | 7350 | 3.1995 |
| 1.2786 | 40.88 | 7400 | 3.1360 |
| 1.3409 | 41.16 | 7450 | 3.1513 |
| 1.2395 | 41.44 | 7500 | 3.2488 |
| 1.1858 | 41.71 | 7550 | 3.3637 |
| 1.3312 | 41.99 | 7600 | 3.2043 |
| 1.2245 | 42.27 | 7650 | 3.3381 |
| 1.2631 | 42.54 | 7700 | 3.3504 |
| 1.257 | 42.82 | 7750 | 3.1843 |
| 1.1715 | 43.09 | 7800 | 3.3320 |
| 1.2017 | 43.37 | 7850 | 3.1980 |
| 1.2711 | 43.65 | 7900 | 3.2528 |
| 1.2091 | 43.92 | 7950 | 3.1928 |
| 1.2574 | 44.2 | 8000 | 3.4765 |
| 1.1915 | 44.48 | 8050 | 3.2830 |
| 1.1754 | 44.75 | 8100 | 3.3196 |
| 1.263 | 45.03 | 8150 | 3.2323 |
| 1.1522 | 45.3 | 8200 | 3.2954 |
| 1.1563 | 45.58 | 8250 | 3.3078 |
| 1.2196 | 45.86 | 8300 | 3.4295 |
| 1.2375 | 46.13 | 8350 | 3.3431 |
| 1.2307 | 46.41 | 8400 | 3.3140 |
| 1.1926 | 46.69 | 8450 | 3.3558 |
| 1.1743 | 46.96 | 8500 | 3.2817 |
| 1.1721 | 47.24 | 8550 | 3.2732 |
| 1.192 | 47.51 | 8600 | 3.3022 |
| 1.1642 | 47.79 | 8650 | 3.3513 |
| 1.2049 | 48.07 | 8700 | 3.3494 |
| 1.1157 | 48.34 | 8750 | 3.3900 |
| 1.2006 | 48.62 | 8800 | 3.3109 |
| 1.1384 | 48.9 | 8850 | 3.3915 |
| 1.1437 | 49.17 | 8900 | 3.4193 |
| 1.2226 | 49.45 | 8950 | 3.3782 |
| 1.1074 | 49.72 | 9000 | 3.3965 |
| 1.1955 | 50.0 | 9050 | 3.3973 |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.1+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0
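
When reproducing results, it may help to confirm the local environment matches the versions listed above; a small check, assuming these packages are importable:

```python
# Verify the installed versions against those this model was trained with.
import datasets, tokenizers, torch, transformers

print(transformers.__version__)  # expected 4.35.2
print(torch.__version__)         # expected 2.1.1+cu121
print(datasets.__version__)      # expected 2.15.0
print(tokenizers.__version__)    # expected 0.15.0
```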