
layout_lmqa2

This model was trained from scratch on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 5.9655

Model description

More information needed

Intended uses & limitations

More information needed
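
Although the card leaves this section unfilled, the model name suggests a LayoutLM-style document question-answering checkpoint. Below is a minimal inference sketch under that assumption; the architecture, pipeline task, and example inputs are not confirmed by the card.

```python
# Hedged usage sketch: assumes the checkpoint is compatible with the
# document-question-answering pipeline (LayoutLM-family extractive QA).
# The pipeline runs OCR via pytesseract unless word boxes are supplied.
from transformers import pipeline

qa = pipeline("document-question-answering", model="PrimWong/layout_lmqa2")
answer = qa(image="invoice.png",                     # hypothetical input image
            question="What is the invoice number?")  # hypothetical question
print(answer)
```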

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
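
For reference, here is a minimal sketch of how these settings map onto Transformers `TrainingArguments`. The `output_dir` and the step-based evaluation schedule are assumptions (the 50-step interval is inferred from the results table below), not documented in the card.

```python
# Hedged reconstruction of the training configuration listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layout_lmqa2",       # hypothetical output path
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="steps",     # inferred: metrics reported every 50 steps
    eval_steps=50,
    logging_steps=50,
)
```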

Training results

Training Loss | Epoch | Step | Validation Loss
2.2013 0.22 50 3.7438
2.3774 0.44 100 2.5267
2.2229 0.66 150 2.7223
2.1498 0.88 200 2.8705
1.4533 1.11 250 3.0854
1.5673 1.33 300 2.8091
1.5592 1.55 350 2.6279
1.485 1.77 400 3.3509
1.681 1.99 450 3.1878
1.1367 2.21 500 2.9910
1.1526 2.43 550 3.2224
1.1906 2.65 600 2.5136
1.074 2.88 650 2.6405
1.0517 3.1 700 3.2235
0.8895 3.32 750 3.3759
1.0624 3.54 800 3.0902
0.9233 3.76 850 2.7508
1.132 3.98 900 3.3255
0.8189 4.2 950 3.2207
0.7035 4.42 1000 3.0181
0.9637 4.65 1050 3.1403
0.8076 4.87 1100 3.2147
1.0471 5.09 1150 3.3560
0.5627 5.31 1200 3.4104
0.5421 5.53 1250 3.5375
0.5731 5.75 1300 3.0006
0.73 5.97 1350 3.0435
0.6019 6.19 1400 3.3767
0.4892 6.42 1450 3.6138
0.4614 6.64 1500 3.9243
0.5162 6.86 1550 3.5715
0.4991 7.08 1600 3.2088
0.3386 7.3 1650 3.5060
0.575 7.52 1700 3.4207
0.2257 7.74 1750 3.4735
0.5039 7.96 1800 3.5317
0.3965 8.19 1850 3.6146
0.3873 8.41 1900 3.2403
0.5208 8.63 1950 3.8434
0.3888 8.85 2000 3.9669
0.4219 9.07 2050 3.5631
0.3614 9.29 2100 3.9210
0.3197 9.51 2150 3.9130
0.2054 9.73 2200 3.7559
0.5968 9.96 2250 3.2244
0.2377 10.18 2300 3.5148
0.297 10.4 2350 3.5326
0.2642 10.62 2400 3.8176
0.2677 10.84 2450 4.1067
0.2095 11.06 2500 4.3299
0.3002 11.28 2550 3.9936
0.2364 11.5 2600 3.9769
0.1413 11.73 2650 3.9973
0.2019 11.95 2700 4.0319
0.1415 12.17 2750 4.2180
0.15 12.39 2800 4.3259
0.1762 12.61 2850 4.5584
0.4126 12.83 2900 3.9886
0.2911 13.05 2950 4.0256
0.1152 13.27 3000 4.5233
0.2455 13.5 3050 4.2867
0.2496 13.72 3100 4.4002
0.0928 13.94 3150 4.3525
0.3045 14.16 3200 4.2719
0.1555 14.38 3250 3.8901
0.1172 14.6 3300 4.4369
0.2843 14.82 3350 3.7676
0.3302 15.04 3400 3.3223
0.092 15.27 3450 4.1753
0.1656 15.49 3500 3.9628
0.1641 15.71 3550 4.3255
0.165 15.93 3600 3.8722
0.1983 16.15 3650 3.6260
0.0686 16.37 3700 3.9814
0.0061 16.59 3750 4.4310
0.1354 16.81 3800 4.7237
0.0754 17.04 3850 4.7184
0.0883 17.26 3900 4.3898
0.1116 17.48 3950 4.7914
0.1589 17.7 4000 4.5216
0.1113 17.92 4050 4.6836
0.0655 18.14 4100 4.9408
0.0051 18.36 4150 5.1494
0.0871 18.58 4200 4.7780
0.0981 18.81 4250 4.6118
0.21 19.03 4300 4.2467
0.048 19.25 4350 5.1837
0.068 19.47 4400 4.7416
0.1022 19.69 4450 5.4841
0.175 19.91 4500 4.9699
0.1534 20.13 4550 4.7240
0.0797 20.35 4600 4.8518
0.0188 20.58 4650 5.5081
0.1331 20.8 4700 5.1632
0.0603 21.02 4750 5.0985
0.0343 21.24 4800 4.7654
0.0045 21.46 4850 4.9135
0.0891 21.68 4900 4.9972
0.0801 21.9 4950 4.5666
0.0022 22.12 5000 4.8593
0.0517 22.35 5050 4.7227
0.0367 22.57 5100 5.0086
0.0481 22.79 5150 4.8354
0.139 23.01 5200 4.8345
0.1258 23.23 5250 4.4733
0.005 23.45 5300 4.7410
0.0116 23.67 5350 5.0803
0.1254 23.89 5400 4.4456
0.0638 24.12 5450 5.0900
0.0216 24.34 5500 5.2054
0.0039 24.56 5550 5.3762
0.0889 24.78 5600 5.5210
0.0839 25.0 5650 5.6013
0.0269 25.22 5700 5.2511
0.0363 25.44 5750 5.2066
0.0042 25.66 5800 5.3123
0.1419 25.88 5850 5.2073
0.0727 26.11 5900 5.0850
0.009 26.33 5950 5.2158
0.1018 26.55 6000 5.2223
0.0017 26.77 6050 5.2139
0.1191 26.99 6100 5.6648
0.0256 27.21 6150 5.3956
0.0618 27.43 6200 5.2004
0.0266 27.65 6250 5.1969
0.0005 27.88 6300 5.2097
0.0917 28.1 6350 4.6288
0.0186 28.32 6400 5.0034
0.1229 28.54 6450 5.4629
0.0064 28.76 6500 5.7815
0.0585 28.98 6550 5.3538
0.2033 29.2 6600 4.8341
0.104 29.42 6650 5.3541
0.074 29.65 6700 5.0912
0.0066 29.87 6750 5.3359
0.1029 30.09 6800 4.8182
0.1277 30.31 6850 4.3439
0.0568 30.53 6900 4.3320
0.0103 30.75 6950 5.0165
0.0159 30.97 7000 5.1813
0.0005 31.19 7050 5.3596
0.0467 31.42 7100 4.9367
0.0004 31.64 7150 5.1753
0.0026 31.86 7200 5.5320
0.0239 32.08 7250 5.3541
0.0004 32.3 7300 5.4588
0.0253 32.52 7350 5.6500
0.0197 32.74 7400 5.6978
0.0011 32.96 7450 5.8706
0.0411 33.19 7500 5.7531
0.0011 33.41 7550 5.7070
0.0195 33.63 7600 5.6306
0.0182 33.85 7650 5.5179
0.0098 34.07 7700 5.6809
0.0695 34.29 7750 6.0599
0.0017 34.51 7800 5.8505
0.0222 34.73 7850 5.8474
0.014 34.96 7900 5.9761
0.0014 35.18 7950 5.9167
0.068 35.4 8000 5.1020
0.0237 35.62 8050 5.1683
0.015 35.84 8100 5.1664
0.0006 36.06 8150 5.2310
0.0142 36.28 8200 5.4119
0.0004 36.5 8250 5.5409
0.0027 36.73 8300 5.5143
0.0228 36.95 8350 5.5045
0.0004 37.17 8400 5.4856
0.0029 37.39 8450 5.6607
0.0619 37.61 8500 5.7278
0.1015 37.83 8550 5.7307
0.0006 38.05 8600 6.0086
0.0845 38.27 8650 5.5904
0.0139 38.5 8700 5.7250
0.0033 38.72 8750 5.7300
0.0911 38.94 8800 5.3312
0.0015 39.16 8850 5.4900
0.0714 39.38 8900 5.4430
0.0742 39.6 8950 5.3748
0.0156 39.82 9000 5.3902
0.0696 40.04 9050 5.2539
0.0514 40.27 9100 5.3639
0.0013 40.49 9150 5.4466
0.0021 40.71 9200 5.5072
0.0005 40.93 9250 5.6767
0.0004 41.15 9300 5.7561
0.0458 41.37 9350 5.6678
0.0168 41.59 9400 5.6505
0.0005 41.81 9450 5.7674
0.0004 42.04 9500 5.8361
0.0028 42.26 9550 5.7886
0.0042 42.48 9600 5.7266
0.0004 42.7 9650 5.7970
0.0058 42.92 9700 5.8543
0.0627 43.14 9750 5.8685
0.0004 43.36 9800 5.8885
0.0003 43.58 9850 5.9231
0.0044 43.81 9900 5.9154
0.0047 44.03 9950 5.9383
0.0033 44.25 10000 5.9505
0.005 44.47 10050 5.9172
0.0649 44.69 10100 5.9263
0.0003 44.91 10150 5.8487
0.0003 45.13 10200 5.8564
0.0003 45.35 10250 5.8637
0.0034 45.58 10300 5.8800
0.0003 45.8 10350 5.9121
0.0052 46.02 10400 5.9066
0.0003 46.24 10450 5.9025
0.0043 46.46 10500 5.8860
0.0007 46.68 10550 5.9075
0.0003 46.9 10600 5.9482
0.0043 47.12 10650 5.9420
0.0003 47.35 10700 5.9459
0.0003 47.57 10750 5.9508
0.0075 47.79 10800 5.9489
0.0004 48.01 10850 5.9076
0.0485 48.23 10900 5.9280
0.0003 48.45 10950 5.9304
0.0031 48.67 11000 5.9398
0.0045 48.89 11050 5.9457
0.0003 49.12 11100 5.9482
0.0003 49.34 11150 5.9483
0.0003 49.56 11200 5.9479
0.0091 49.78 11250 5.9656
0.0003 50.0 11300 5.9655
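
The table shows the training loss collapsing toward ~0.0003 while the validation loss climbs from its early minimum (2.5136 at step 600) to 5.9655 by epoch 50, a clear overfitting signature. If retraining, checkpointing on validation loss would retain the better model. A minimal sketch using standard Trainer options (not part of the original training script):

```python
# Hedged sketch: evaluate/save every 50 steps and restore the checkpoint
# with the lowest validation loss instead of the final, overfit weights.
from transformers import TrainingArguments, EarlyStoppingCallback

training_args = TrainingArguments(
    output_dir="layout_lmqa2",        # hypothetical output path
    evaluation_strategy="steps",
    eval_steps=50,
    save_strategy="steps",            # must match the evaluation strategy
    save_steps=50,
    load_best_model_at_end=True,      # reload the best checkpoint when training ends
    metric_for_best_model="eval_loss",
    greater_is_better=False,          # lower validation loss is better
)

# Stop once validation loss fails to improve for 5 consecutive evaluations;
# pass this via callbacks=[early_stop] when constructing the Trainer.
early_stop = EarlyStoppingCallback(early_stopping_patience=5)
```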

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.1+cu121
  • Datasets 2.15.0
  • Tokenizers 0.15.0
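
To confirm a local environment matches these pins before loading the model, a quick check (note that the cu121 build of PyTorch comes from a separate install channel):

```python
# Print installed versions to compare against the pins listed above.
import transformers, torch, datasets, tokenizers

print("transformers", transformers.__version__)  # card reports 4.35.2
print("torch", torch.__version__)                # card reports 2.1.1+cu121
print("datasets", datasets.__version__)          # card reports 2.15.0
print("tokenizers", tokenizers.__version__)      # card reports 0.15.0
```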

Model size

  • 200M parameters (Safetensors, F32)