detr-finetuned-cppe-5-10k-steps

This model is a fine-tuned version of facebook/detr-resnet-50 on the namnguyen059/PDFextract dataset. It achieves the following results on the evaluation set:

  • Loss: 3.3774
  • Map: 0.0076
  • Map 50: 0.0237
  • Map 75: 0.0004
  • Map Small: 0.0
  • Map Medium: 0.0117
  • Map Large: 0.0
  • Mar 1: 0.0014
  • Mar 10: 0.0459
  • Mar 100: 0.0622
  • Mar Small: 0.0
  • Mar Medium: 0.0958
  • Mar Large: 0.0
  • Map Table: 0.0
  • Mar 100 Table: 0.0
  • Map Text: 0.0152
  • Mar 100 Text: 0.1243
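The Map/Mar values above are COCO-style detection metrics: a prediction counts as correct when its intersection-over-union (IoU) with a ground-truth box exceeds a threshold ("Map 50" uses IoU ≥ 0.5, "Map 75" uses IoU ≥ 0.75, plain "Map" averages over thresholds 0.5–0.95). As a minimal sketch of the underlying IoU computation (not the exact evaluation code used for this card), for axis-aligned `[x0, y0, x1, y1]` boxes:

```python
def box_iou(a, b):
    """Intersection-over-union of two axis-aligned boxes [x0, y0, x1, y1]."""
    # Intersection rectangle (empty if the boxes do not overlap).
    ix0, iy0 = max(a[0], b[0]), max(a[1], b[1])
    ix1, iy1 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix1 - ix0) * max(0.0, iy1 - iy0)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# Two 10x10 boxes shifted by half a width: intersection 50, union 150.
print(box_iou([0, 0, 10, 10], [5, 0, 15, 10]))  # → 0.333..., below the 0.5 threshold
```

The per-class columns ("Map Table", "Map Text") apply the same metric restricted to one label, which is why "Map Text" roughly tracks the overall "Map" here while "Map Table" stays at 0.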

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 1337
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
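With `lr_scheduler_type: linear`, the learning rate decays linearly from `learning_rate` to 0 over the total number of training steps (500 here: 5 steps/epoch × 100 epochs, per the table below). A rough sketch of that schedule, assuming zero warmup steps since the card does not list any:

```python
def linear_lr(step, total_steps=500, base_lr=5e-5, warmup_steps=0):
    """Linear warmup (if any) followed by linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

print(linear_lr(0))    # 5e-05 at the start
print(linear_lr(250))  # 2.5e-05 halfway through
print(linear_lr(500))  # 0.0 at the end
```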

Training results

Training Loss Epoch Step Validation Loss Map Map 50 Map 75 Map Small Map Medium Map Large Mar 1 Mar 10 Mar 100 Mar Small Mar Medium Mar Large Map Table Mar 100 Table Map Text Mar 100 Text
6.2935 1.0 5 5.7125 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
5.6711 2.0 10 5.4708 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
5.3937 3.0 15 5.2153 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
5.001 4.0 20 4.8025 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
4.8611 5.0 25 4.6390 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
4.6947 6.0 30 4.2093 0.0 0.0002 0.0 0.0 0.0 0.0021 0.0 0.0 0.0027 0.0 0.0 0.1 0.0 0.0 0.0001 0.0054
4.1881 7.0 35 3.8453 0.0003 0.0031 0.0 0.0 0.0006 0.0012 0.0 0.0014 0.0054 0.0 0.0021 0.15 0.0 0.0 0.0007 0.0108
3.9652 8.0 40 4.0073 0.003 0.0182 0.0 0.0 0.0052 0.0017 0.0014 0.0081 0.0122 0.0 0.0125 0.15 0.0 0.0 0.0059 0.0243
3.9875 9.0 45 4.0535 0.0001 0.0005 0.0001 0.0 0.0001 0.0019 0.0 0.0 0.0122 0.0 0.0063 0.3 0.0 0.0 0.0002 0.0243
4.041 10.0 50 4.0097 0.0002 0.0017 0.0 0.0 0.0004 0.0009 0.0 0.0 0.0081 0.0 0.0083 0.1 0.0 0.0 0.0003 0.0162
3.959 11.0 55 3.8365 0.0005 0.0021 0.0 0.0 0.0012 0.0003 0.0 0.0068 0.0122 0.0 0.0167 0.05 0.0 0.0 0.0009 0.0243
3.8681 12.0 60 3.6800 0.0007 0.0041 0.0 0.0 0.0014 0.001 0.0 0.0068 0.0149 0.0 0.0167 0.15 0.0 0.0 0.0013 0.0297
3.7309 13.0 65 3.6922 0.0029 0.0249 0.0 0.0 0.0068 0.0008 0.0014 0.0081 0.0189 0.0 0.025 0.1 0.0 0.0 0.0058 0.0378
3.647 14.0 70 3.7221 0.0024 0.0158 0.0 0.0 0.0045 0.0016 0.0014 0.0108 0.0162 0.0 0.0167 0.2 0.0 0.0 0.0048 0.0324
3.6564 15.0 75 3.6637 0.0007 0.0048 0.0 0.0 0.0012 0.0025 0.0014 0.0054 0.0189 0.0 0.0229 0.15 0.0 0.0 0.0014 0.0378
3.5905 16.0 80 3.6443 0.0078 0.0179 0.0 0.0 0.0116 0.0074 0.0135 0.0203 0.023 0.0 0.0312 0.1 0.0 0.0 0.0157 0.0459
3.8091 17.0 85 3.8066 0.0018 0.0122 0.0 0.0 0.003 0.0 0.0041 0.0108 0.0108 0.0 0.0167 0.0 0.0 0.0 0.0035 0.0216
3.8171 18.0 90 3.6044 0.0006 0.0032 0.0 0.0 0.0012 0.0 0.0 0.0054 0.0095 0.0 0.0146 0.0 0.0 0.0 0.0013 0.0189
3.7048 19.0 95 3.6161 0.0 0.0 0.0 0.0 0.0 0.0018 0.0 0.0 0.0068 0.0 0.0 0.25 0.0 0.0 0.0 0.0135
3.8635 20.0 100 3.6542 0.0004 0.0015 0.0 0.0 0.0009 0.0 0.0 0.0054 0.0095 0.0 0.0146 0.0 0.0 0.0 0.0009 0.0189
3.7021 21.0 105 4.0802 0.0032 0.016 0.0 0.0 0.0053 0.0 0.0027 0.0068 0.0108 0.0 0.0167 0.0 0.0 0.0 0.0064 0.0216
3.7035 22.0 110 4.0508 0.0006 0.0041 0.0 0.0 0.001 0.0 0.0 0.0014 0.0162 0.0 0.025 0.0 0.0 0.0 0.0012 0.0324
3.6404 23.0 115 4.1771 0.0013 0.0057 0.0005 0.0 0.0022 0.0 0.0068 0.0162 0.0189 0.0 0.0292 0.0 0.0 0.0 0.0027 0.0378
3.7922 24.0 120 4.4617 0.006 0.0149 0.0 0.0 0.0099 0.0 0.0054 0.0054 0.0095 0.0 0.0146 0.0 0.0 0.0 0.0119 0.0189
3.7619 25.0 125 3.8533 0.001 0.0026 0.0 0.0 0.0053 0.0008 0.0 0.0108 0.0135 0.0 0.0167 0.1 0.0 0.0 0.0019 0.027
3.804 26.0 130 3.7589 0.001 0.0073 0.0 0.0 0.0036 0.0 0.0 0.0081 0.0122 0.0 0.0188 0.0 0.0 0.0 0.0021 0.0243
3.7785 27.0 135 3.6378 0.003 0.0197 0.0 0.0 0.01 0.0003 0.0027 0.0162 0.0189 0.0 0.0271 0.05 0.0 0.0 0.006 0.0378
3.726 28.0 140 3.7696 0.0003 0.0021 0.0001 0.0 0.0009 0.0 0.0 0.0041 0.0149 0.0 0.0229 0.0 0.0 0.0 0.0006 0.0297
3.682 29.0 145 3.8782 0.0003 0.0006 0.0003 0.0 0.001 0.0 0.0 0.0081 0.0108 0.0 0.0167 0.0 0.0 0.0 0.0006 0.0216
3.7331 30.0 150 4.3841 0.0005 0.0027 0.0 0.0 0.002 0.0009 0.0 0.0081 0.0189 0.0 0.025 0.1 0.0 0.0 0.001 0.0378
3.7627 31.0 155 4.7688 0.0007 0.0054 0.0 0.0 0.003 0.0 0.0 0.0081 0.0189 0.0 0.0292 0.0 0.0 0.0 0.0015 0.0378
3.8656 32.0 160 5.0427 0.0009 0.0046 0.0 0.0 0.0039 0.0007 0.0 0.0027 0.0324 0.0 0.0417 0.2 0.0 0.0 0.0018 0.0649
3.8845 33.0 165 4.1952 0.0012 0.0056 0.0 0.0 0.0027 0.0 0.0 0.0108 0.0243 0.0 0.0375 0.0 0.0 0.0 0.0024 0.0486
3.8416 34.0 170 4.1403 0.0048 0.0128 0.0006 0.0 0.0118 0.0048 0.0 0.0311 0.0703 0.0 0.1021 0.15 0.0 0.0 0.0096 0.1405
3.807 35.0 175 4.6646 0.0031 0.0218 0.0002 0.0 0.0103 0.0125 0.0014 0.0176 0.05 0.0 0.075 0.05 0.0 0.0 0.0062 0.1
3.8728 36.0 180 4.5890 0.0009 0.0039 0.0004 0.0 0.002 0.0 0.0 0.0014 0.0432 0.0 0.0667 0.0 0.0 0.0 0.0018 0.0865
3.7952 37.0 185 4.0300 0.0013 0.0048 0.0007 0.0 0.0023 0.0 0.0 0.0176 0.023 0.0 0.0354 0.0 0.0 0.0 0.0025 0.0459
3.725 38.0 190 3.9829 0.0003 0.0025 0.0001 0.0 0.0005 0.0 0.0 0.0014 0.0135 0.0 0.0208 0.0 0.0 0.0 0.0006 0.027
3.6884 39.0 195 3.7402 0.0008 0.0036 0.0 0.0 0.0009 0.0119 0.0 0.0014 0.027 0.0 0.0333 0.2 0.0 0.0 0.0016 0.0541
3.6963 40.0 200 3.4723 0.0002 0.0015 0.0 0.0 0.0004 0.0 0.0 0.0 0.0081 0.0 0.0125 0.0 0.0 0.0 0.0004 0.0162
3.7368 41.0 205 3.6067 0.0 0.0004 0.0 0.0 0.0001 0.0 0.0 0.0 0.0027 0.0 0.0042 0.0 0.0 0.0 0.0001 0.0054
3.7349 42.0 210 3.6135 0.0 0.0002 0.0 0.0 0.0 0.0012 0.0 0.0014 0.0014 0.0 0.0 0.05 0.0 0.0 0.0 0.0027
3.6353 43.0 215 3.5835 0.0018 0.0062 0.0014 0.0 0.0074 0.0 0.0 0.0122 0.027 0.0 0.0417 0.0 0.0 0.0 0.0036 0.0541
3.6948 44.0 220 3.7714 0.0023 0.0077 0.0001 0.0 0.0045 0.0 0.0 0.0162 0.0324 0.0 0.05 0.0 0.0 0.0 0.0046 0.0649
3.7546 45.0 225 3.9600 0.0074 0.0207 0.0 0.0 0.0126 0.0 0.0054 0.0189 0.0243 0.0 0.0375 0.0 0.0 0.0 0.0149 0.0486
3.6691 46.0 230 3.8211 0.0018 0.0094 0.0 0.0 0.0037 0.0 0.0 0.0095 0.0297 0.0 0.0458 0.0 0.0 0.0 0.0037 0.0595
3.5355 47.0 235 3.6537 0.0006 0.0036 0.0001 0.0 0.0017 0.0 0.0 0.0068 0.0216 0.0 0.0333 0.0 0.0 0.0 0.0012 0.0432
3.5232 48.0 240 3.5552 0.0004 0.002 0.0 0.0 0.0018 0.0 0.0 0.0027 0.0149 0.0 0.0229 0.0 0.0 0.0 0.0008 0.0297
3.48 49.0 245 3.4924 0.0014 0.0126 0.0 0.0 0.0043 0.0017 0.0014 0.0108 0.0122 0.0 0.0167 0.05 0.0 0.0 0.0028 0.0243
3.6056 50.0 250 3.4697 0.0007 0.0024 0.0 0.0 0.0019 0.0002 0.0 0.0095 0.0122 0.0 0.0167 0.05 0.0 0.0 0.0014 0.0243
3.4806 51.0 255 3.5357 0.0045 0.0339 0.0 0.0 0.0129 0.0006 0.0014 0.023 0.0284 0.0 0.0417 0.05 0.0 0.0 0.0091 0.0568
3.5547 52.0 260 3.5216 0.0016 0.0141 0.0 0.0 0.004 0.0021 0.0 0.0108 0.0257 0.0 0.0292 0.25 0.0 0.0 0.0032 0.0514
3.483 53.0 265 3.4858 0.0048 0.028 0.0005 0.0 0.0077 0.0 0.0 0.0311 0.0405 0.0 0.0625 0.0 0.0 0.0 0.0095 0.0811
3.6952 54.0 270 3.4944 0.0083 0.0661 0.0 0.0 0.0131 0.0 0.0041 0.027 0.0284 0.0 0.0437 0.0 0.0 0.0 0.0165 0.0568
3.5447 55.0 275 3.5247 0.0049 0.0251 0.0 0.0 0.0079 0.0 0.0027 0.0176 0.0257 0.0 0.0396 0.0 0.0 0.0 0.0099 0.0514
3.6204 56.0 280 3.5457 0.0069 0.0452 0.0 0.0 0.0168 0.0 0.0068 0.0135 0.0257 0.0 0.0396 0.0 0.0 0.0 0.0137 0.0514
3.3873 57.0 285 3.4963 0.0134 0.0576 0.0 0.0 0.0218 0.0 0.0081 0.0365 0.0473 0.0 0.0729 0.0 0.0 0.0 0.0268 0.0946
3.5384 58.0 290 3.5618 0.0016 0.0083 0.0 0.0 0.0025 0.0 0.0 0.0122 0.0338 0.0 0.0521 0.0 0.0 0.0 0.0032 0.0676
3.6034 59.0 295 3.4600 0.0016 0.0102 0.0 0.0 0.0025 0.0 0.0 0.0081 0.0324 0.0 0.05 0.0 0.0 0.0 0.0032 0.0649
3.5524 60.0 300 3.4434 0.0012 0.0063 0.0004 0.0 0.0019 0.0 0.0 0.0041 0.0324 0.0 0.05 0.0 0.0 0.0 0.0024 0.0649
3.4222 61.0 305 3.5144 0.0043 0.0113 0.0002 0.0 0.0066 0.0 0.0 0.027 0.0419 0.0 0.0646 0.0 0.0 0.0 0.0086 0.0838
3.451 62.0 310 3.5189 0.0198 0.0582 0.0002 0.0 0.0304 0.0007 0.0189 0.0351 0.0446 0.0 0.0646 0.1 0.0 0.0 0.0396 0.0892
3.4485 63.0 315 3.4585 0.0125 0.0428 0.0 0.0 0.0202 0.0 0.0135 0.0297 0.0405 0.0 0.0625 0.0 0.0 0.0 0.0249 0.0811
3.475 64.0 320 3.3541 0.0063 0.0297 0.0012 0.0 0.0111 0.0016 0.0027 0.0216 0.0405 0.0 0.0583 0.1 0.0 0.0 0.0126 0.0811
3.4189 65.0 325 3.3517 0.0051 0.0315 0.0 0.0 0.0094 0.0 0.0014 0.0257 0.0378 0.0 0.0583 0.0 0.0 0.0 0.0102 0.0757
3.324 66.0 330 3.2557 0.0025 0.0134 0.0 0.0 0.0045 0.0 0.0 0.0216 0.0243 0.0 0.0375 0.0 0.0 0.0 0.005 0.0486
3.5049 67.0 335 3.3093 0.0031 0.0108 0.0 0.0 0.0051 0.0 0.0 0.027 0.027 0.0 0.0417 0.0 0.0 0.0 0.0061 0.0541
3.3219 68.0 340 3.3455 0.0119 0.0596 0.0008 0.0 0.0189 0.0 0.0095 0.0284 0.0432 0.0 0.0667 0.0 0.0 0.0 0.0239 0.0865
3.3633 69.0 345 3.4110 0.0268 0.0747 0.0034 0.0 0.0414 0.0 0.0203 0.0662 0.0662 0.0 0.1021 0.0 0.0 0.0 0.0536 0.1324
3.4628 70.0 350 3.3594 0.0024 0.0151 0.0002 0.0 0.0037 0.0 0.0014 0.0216 0.0405 0.0 0.0625 0.0 0.0 0.0 0.0048 0.0811
3.272 71.0 355 3.3655 0.0039 0.0174 0.0008 0.0 0.0061 0.0 0.0014 0.0149 0.0473 0.0 0.0729 0.0 0.0 0.0 0.0079 0.0946
3.3713 72.0 360 3.3455 0.0097 0.0609 0.0018 0.0 0.015 0.0 0.0108 0.0122 0.0473 0.0 0.0729 0.0 0.0 0.0 0.0195 0.0946
3.258 73.0 365 3.3679 0.0006 0.0025 0.0 0.0 0.0011 0.0 0.0 0.0068 0.023 0.0 0.0354 0.0 0.0 0.0 0.0013 0.0459
3.3666 74.0 370 3.3173 0.0004 0.0024 0.0 0.0 0.0008 0.0 0.0 0.0014 0.0216 0.0 0.0333 0.0 0.0 0.0 0.0009 0.0432
3.2339 75.0 375 3.3463 0.002 0.0103 0.0 0.0 0.0033 0.0 0.0 0.0081 0.0405 0.0 0.0625 0.0 0.0 0.0 0.004 0.0811
3.2601 76.0 380 3.3633 0.002 0.0111 0.0 0.0 0.0032 0.0 0.0 0.0081 0.0392 0.0 0.0604 0.0 0.0 0.0 0.0039 0.0784
3.3584 77.0 385 3.3331 0.0025 0.0108 0.0 0.0 0.004 0.0 0.0 0.0081 0.0486 0.0 0.075 0.0 0.0 0.0 0.0051 0.0973
3.2928 78.0 390 3.3012 0.0043 0.0139 0.0002 0.0 0.0066 0.0 0.0 0.0162 0.0635 0.0 0.0979 0.0 0.0 0.0 0.0085 0.127
3.2633 79.0 395 3.2913 0.0076 0.0226 0.0003 0.0 0.0117 0.0 0.0027 0.0392 0.0432 0.0 0.0667 0.0 0.0 0.0 0.0153 0.0865
3.3965 80.0 400 3.3141 0.008 0.0224 0.0051 0.0 0.0126 0.0 0.0014 0.0405 0.0419 0.0 0.0646 0.0 0.0 0.0 0.016 0.0838
3.2753 81.0 405 3.4127 0.0077 0.0231 0.0036 0.0 0.012 0.0 0.0 0.0405 0.0432 0.0 0.0667 0.0 0.0 0.0 0.0155 0.0865
3.3362 82.0 410 3.4194 0.0099 0.0365 0.0032 0.0 0.0152 0.0 0.0054 0.0473 0.0568 0.0 0.0875 0.0 0.0 0.0 0.0197 0.1135
3.3752 83.0 415 3.3421 0.0086 0.0411 0.0 0.0 0.0134 0.0 0.0014 0.0351 0.0432 0.0 0.0667 0.0 0.0 0.0 0.0173 0.0865
3.3222 84.0 420 3.2956 0.004 0.0234 0.0 0.0 0.0063 0.0 0.0014 0.0243 0.0243 0.0 0.0375 0.0 0.0 0.0 0.008 0.0486
3.3732 85.0 425 3.2515 0.0281 0.0614 0.0 0.0 0.0434 0.0 0.0257 0.0338 0.0459 0.0 0.0708 0.0 0.0 0.0 0.0562 0.0919
3.1932 86.0 430 3.2197 0.0263 0.0603 0.0 0.0 0.0409 0.0 0.0243 0.0297 0.0459 0.0 0.0708 0.0 0.0 0.0 0.0527 0.0919
3.3181 87.0 435 3.3373 0.0027 0.0142 0.0 0.0 0.0042 0.0 0.0 0.023 0.0297 0.0 0.0458 0.0 0.0 0.0 0.0055 0.0595
3.3382 88.0 440 3.3707 0.0019 0.0069 0.0 0.0 0.003 0.0 0.0 0.0243 0.0297 0.0 0.0458 0.0 0.0 0.0 0.0039 0.0595
3.1525 89.0 445 3.4728 0.0025 0.0056 0.0 0.0 0.0038 0.0 0.0 0.0324 0.0324 0.0 0.05 0.0 0.0 0.0 0.005 0.0649
3.1836 90.0 450 3.4726 0.0103 0.0357 0.0 0.0 0.0163 0.0 0.0041 0.0486 0.05 0.0 0.0771 0.0 0.0 0.0 0.0206 0.1
3.2294 91.0 455 3.4077 0.0116 0.0309 0.0 0.0 0.0183 0.0 0.0041 0.0581 0.0595 0.0 0.0917 0.0 0.0 0.0 0.0233 0.1189
3.2468 92.0 460 3.3905 0.0101 0.0324 0.0004 0.0 0.0158 0.0 0.0027 0.0581 0.0581 0.0 0.0896 0.0 0.0 0.0 0.0202 0.1162
3.2433 93.0 465 3.3989 0.0099 0.0309 0.0004 0.0 0.0155 0.0 0.0027 0.0581 0.0581 0.0 0.0896 0.0 0.0 0.0 0.0198 0.1162
3.152 94.0 470 3.3957 0.0113 0.0352 0.0003 0.0 0.0177 0.0 0.0027 0.0568 0.0568 0.0 0.0875 0.0 0.0 0.0 0.0226 0.1135
3.2372 95.0 475 3.3941 0.0105 0.0345 0.0003 0.0 0.0163 0.0 0.0027 0.0527 0.0527 0.0 0.0812 0.0 0.0 0.0 0.0209 0.1054
3.2734 96.0 480 3.3851 0.0103 0.0256 0.0003 0.0 0.016 0.0 0.0041 0.0568 0.0568 0.0 0.0875 0.0 0.0 0.0 0.0205 0.1135
3.2559 97.0 485 3.3807 0.0113 0.0308 0.0003 0.0 0.0174 0.0 0.0041 0.0595 0.0608 0.0 0.0938 0.0 0.0 0.0 0.0226 0.1216
3.3223 98.0 490 3.3773 0.0094 0.03 0.0003 0.0 0.0146 0.0 0.0027 0.0527 0.0608 0.0 0.0938 0.0 0.0 0.0 0.0189 0.1216
3.1813 99.0 495 3.3789 0.0084 0.025 0.0004 0.0 0.0129 0.0 0.0027 0.0486 0.0622 0.0 0.0958 0.0 0.0 0.0 0.0168 0.1243
3.1959 100.0 500 3.3774 0.0076 0.0237 0.0004 0.0 0.0117 0.0 0.0014 0.0459 0.0622 0.0 0.0958 0.0 0.0 0.0 0.0152 0.1243
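Validation loss bottoms out well before epoch 100 while mAP never improves consistently, so the final checkpoint is not obviously the best one. If intermediate checkpoints were saved, one could pick an epoch programmatically; a hypothetical sketch over a small excerpt of the rows above:

```python
# (epoch, validation_loss, map) — a small excerpt of the results table above
rows = [
    (66, 3.2557, 0.0025),
    (85, 3.2515, 0.0281),
    (86, 3.2197, 0.0263),
    (100, 3.3774, 0.0076),
]

best_by_loss = min(rows, key=lambda r: r[1])
best_by_map = max(rows, key=lambda r: r[2])
print(best_by_loss[0])  # epoch 86: lowest validation loss in this excerpt
print(best_by_map[0])   # epoch 85: highest mAP in this excerpt
```

Note the two criteria disagree even in this excerpt, which is common when detection metrics are this noisy.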

Framework versions

  • Transformers 4.44.0.dev0
  • Pytorch 2.3.1
  • Datasets 2.20.0
  • Tokenizers 0.19.1
Model size

  • 41.6M params (Safetensors, F32)