
fb_detr_finetuned_diana

This model is a fine-tuned version of facebook/detr-resnet-50 on a custom dataset loaded with the datasets imagefolder loader. It achieves the following results on the evaluation set (COCO-style detection metrics; a minimal inference example follows the list):

  • Loss: 0.3238
  • mAP: 0.7325
  • mAP@50: 0.8805
  • mAP@75: 0.8535
  • mAP (small): -1.0
  • mAP (medium): 0.7891
  • mAP (large): 0.7273
  • mAR@1: 0.0881
  • mAR@10: 0.7675
  • mAR@100: 0.8712
  • mAR (small): -1.0
  • mAR (medium): 0.8348
  • mAR (large): 0.886
  • mAP per class: -1.0
  • mAR@100 per class: -1.0
  • Classes: 0

A value of -1.0 means the metric was not computed, typically because the evaluation set contains no objects in that size bucket (here, no "small" boxes) or because per-class reporting was disabled. The Classes entry lists the class indices seen during evaluation; here a single class with index 0.
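For reference, a minimal inference sketch using the Transformers object-detection API. It assumes the checkpoint is hosted on the Hub as LLyq/fb_detr_finetuned_diana and uses example.jpg as a placeholder image path:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, DetrForObjectDetection

repo_id = "LLyq/fb_detr_finetuned_diana"  # Hub id of this checkpoint
processor = AutoImageProcessor.from_pretrained(repo_id)
model = DetrForObjectDetection.from_pretrained(repo_id)
model.eval()

image = Image.open("example.jpg")  # placeholder path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Rescale predicted boxes to the original image size and keep
# detections above a confidence threshold.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(
    detections["scores"], detections["labels"], detections["boxes"]
):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```

post_process_object_detection converts the raw class logits and normalized box predictions into thresholded detections in absolute pixel coordinates.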

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding TrainingArguments follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • num_epochs: 100
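
The original training script is not published, so the following is only a minimal sketch of how these hyperparameters map onto transformers.TrainingArguments; output_dir and any argument not listed above are assumptions:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="fb_detr_finetuned_diana",  # assumed; not stated in the card
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="cosine",
    num_train_epochs=100,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults,
    # so the optimizer settings above need no extra arguments.
)
```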

Training results

Columns that were constant for every epoch are omitted from the table below: mAP (small) = -1.0, mAR (small) = -1.0, mAP per class = -1.0, mAR@100 per class = -1.0, and Classes = 0.

| Training Loss | Epoch | Step | Validation Loss | mAP | mAP@50 | mAP@75 | mAP (medium) | mAP (large) | mAR@1 | mAR@10 | mAR@100 | mAR (medium) | mAR (large) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2.0159 | 1.0 | 10 | 1.9364 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.731 | 2.0 | 20 | 1.6869 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.5773 | 3.0 | 30 | 1.6082 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.7825 | 4.0 | 40 | 1.4512 | 0.0042 | 0.0099 | 0.0 | 0.0119 | 0.0025 | 0.0025 | 0.0056 | 0.0056 | 0.0087 | 0.0044 |
| 1.4306 | 5.0 | 50 | 1.2470 | 0.0424 | 0.075 | 0.0531 | 0.0473 | 0.0424 | 0.0431 | 0.0688 | 0.0688 | 0.0652 | 0.0702 |
| 1.2657 | 6.0 | 60 | 1.1537 | 0.0327 | 0.0521 | 0.0363 | 0.0436 | 0.0286 | 0.0281 | 0.0344 | 0.0344 | 0.0478 | 0.0289 |
| 1.3114 | 7.0 | 70 | 1.0515 | 0.0933 | 0.148 | 0.114 | 0.072 | 0.1029 | 0.0556 | 0.1156 | 0.1156 | 0.0739 | 0.1325 |
| 1.0453 | 8.0 | 80 | 0.9494 | 0.1865 | 0.2552 | 0.2174 | 0.1565 | 0.1983 | 0.0737 | 0.2219 | 0.2219 | 0.1674 | 0.2439 |
| 0.8954 | 9.0 | 90 | 0.8976 | 0.2919 | 0.4271 | 0.3403 | 0.2369 | 0.3188 | 0.0756 | 0.3906 | 0.4038 | 0.3522 | 0.4246 |
| 0.9844 | 10.0 | 100 | 0.9485 | 0.3314 | 0.5102 | 0.4147 | 0.3038 | 0.351 | 0.0688 | 0.4244 | 0.4412 | 0.3543 | 0.4763 |
| 0.8282 | 11.0 | 110 | 0.8403 | 0.4211 | 0.6222 | 0.5098 | 0.3505 | 0.4606 | 0.0819 | 0.5256 | 0.5694 | 0.413 | 0.6325 |
| 0.7106 | 12.0 | 120 | 0.7935 | 0.431 | 0.6385 | 0.5331 | 0.3495 | 0.4761 | 0.0794 | 0.5612 | 0.6281 | 0.5609 | 0.6553 |
| 0.6526 | 13.0 | 130 | 0.7319 | 0.5166 | 0.7537 | 0.6577 | 0.4806 | 0.5376 | 0.075 | 0.6 | 0.6762 | 0.6152 | 0.7009 |
| 0.7241 | 14.0 | 140 | 0.7162 | 0.5073 | 0.715 | 0.6429 | 0.4718 | 0.5377 | 0.0806 | 0.5688 | 0.6594 | 0.5696 | 0.6956 |
| 0.771 | 15.0 | 150 | 0.6368 | 0.5778 | 0.7847 | 0.716 | 0.5218 | 0.6118 | 0.0812 | 0.6325 | 0.7219 | 0.6261 | 0.7605 |
| 0.6217 | 16.0 | 160 | 0.6107 | 0.5797 | 0.8084 | 0.7379 | 0.5941 | 0.586 | 0.085 | 0.6313 | 0.7319 | 0.6913 | 0.7482 |
| 0.6656 | 17.0 | 170 | 0.6412 | 0.5931 | 0.8074 | 0.7118 | 0.5751 | 0.6115 | 0.0938 | 0.6562 | 0.7344 | 0.6391 | 0.7728 |
| 0.4162 | 18.0 | 180 | 0.6373 | 0.5991 | 0.8189 | 0.7538 | 0.6026 | 0.6037 | 0.0913 | 0.6438 | 0.7487 | 0.7 | 0.7684 |
| 0.538 | 19.0 | 190 | 0.5741 | 0.6192 | 0.8493 | 0.7732 | 0.6516 | 0.6082 | 0.0844 | 0.6363 | 0.7644 | 0.7587 | 0.7667 |
| 0.3672 | 20.0 | 200 | 0.5242 | 0.6256 | 0.8228 | 0.7781 | 0.6944 | 0.6089 | 0.0794 | 0.6694 | 0.7906 | 0.7957 | 0.7886 |
| 0.3685 | 21.0 | 210 | 0.4954 | 0.6548 | 0.8544 | 0.8291 | 0.6707 | 0.6608 | 0.0875 | 0.6856 | 0.7862 | 0.7217 | 0.8123 |
| 0.5199 | 22.0 | 220 | 0.4870 | 0.665 | 0.8679 | 0.8399 | 0.681 | 0.6694 | 0.09 | 0.7006 | 0.805 | 0.7565 | 0.8246 |
| 0.545 | 23.0 | 230 | 0.4580 | 0.6669 | 0.8667 | 0.8309 | 0.6986 | 0.6642 | 0.0913 | 0.7013 | 0.8025 | 0.787 | 0.8088 |
| 0.3932 | 24.0 | 240 | 0.4901 | 0.6494 | 0.8679 | 0.7951 | 0.6509 | 0.6644 | 0.0831 | 0.6775 | 0.7887 | 0.713 | 0.8193 |
| 0.5348 | 25.0 | 250 | 0.4542 | 0.6652 | 0.8623 | 0.8064 | 0.6761 | 0.6794 | 0.0962 | 0.7 | 0.805 | 0.7478 | 0.8281 |
| 0.4903 | 26.0 | 260 | 0.4622 | 0.6599 | 0.855 | 0.8212 | 0.6611 | 0.6745 | 0.0956 | 0.69 | 0.7981 | 0.7261 | 0.8272 |
| 0.586 | 27.0 | 270 | 0.4812 | 0.6611 | 0.8652 | 0.8148 | 0.6619 | 0.6732 | 0.0962 | 0.68 | 0.795 | 0.7413 | 0.8167 |
| 0.3898 | 28.0 | 280 | 0.4864 | 0.6606 | 0.8411 | 0.8023 | 0.6398 | 0.678 | 0.1037 | 0.69 | 0.8012 | 0.7522 | 0.8211 |
| 0.3738 | 29.0 | 290 | 0.5301 | 0.6366 | 0.8379 | 0.7826 | 0.6521 | 0.6493 | 0.0862 | 0.6737 | 0.7725 | 0.7196 | 0.7939 |
| 0.5067 | 30.0 | 300 | 0.4871 | 0.6597 | 0.8465 | 0.8017 | 0.65 | 0.6705 | 0.0956 | 0.7006 | 0.8031 | 0.7696 | 0.8167 |
| 0.4212 | 31.0 | 310 | 0.4221 | 0.7033 | 0.8798 | 0.8324 | 0.7001 | 0.7165 | 0.0887 | 0.7237 | 0.8338 | 0.7826 | 0.8544 |
| 0.4625 | 32.0 | 320 | 0.4304 | 0.7179 | 0.8835 | 0.8514 | 0.7284 | 0.721 | 0.1006 | 0.7275 | 0.8338 | 0.7848 | 0.8535 |
| 0.4318 | 33.0 | 330 | 0.4246 | 0.6794 | 0.8574 | 0.8077 | 0.6674 | 0.6904 | 0.0938 | 0.715 | 0.8206 | 0.7761 | 0.8386 |
| 0.5166 | 34.0 | 340 | 0.4357 | 0.6692 | 0.8476 | 0.8146 | 0.6649 | 0.6812 | 0.0862 | 0.71 | 0.8169 | 0.7543 | 0.8421 |
| 0.4435 | 35.0 | 350 | 0.4045 | 0.6748 | 0.8531 | 0.8211 | 0.662 | 0.6862 | 0.0938 | 0.7113 | 0.8163 | 0.7587 | 0.8395 |
| 0.3236 | 36.0 | 360 | 0.4110 | 0.6776 | 0.8527 | 0.8265 | 0.6929 | 0.6858 | 0.0831 | 0.7163 | 0.8256 | 0.7413 | 0.8596 |
| 0.4923 | 37.0 | 370 | 0.4340 | 0.6749 | 0.8531 | 0.8088 | 0.7001 | 0.6776 | 0.0938 | 0.7081 | 0.8144 | 0.7543 | 0.8386 |
| 0.4057 | 38.0 | 380 | 0.4256 | 0.69 | 0.8742 | 0.8284 | 0.7185 | 0.6876 | 0.0925 | 0.7119 | 0.8194 | 0.7739 | 0.8377 |
| 0.3833 | 39.0 | 390 | 0.4131 | 0.7058 | 0.8661 | 0.8408 | 0.703 | 0.7183 | 0.0981 | 0.7244 | 0.8238 | 0.7522 | 0.8526 |
| 0.3845 | 40.0 | 400 | 0.4475 | 0.6838 | 0.8654 | 0.8107 | 0.7109 | 0.6807 | 0.0975 | 0.7188 | 0.8188 | 0.787 | 0.8316 |
| 0.3881 | 41.0 | 410 | 0.4030 | 0.7142 | 0.8812 | 0.8569 | 0.7143 | 0.727 | 0.095 | 0.7362 | 0.8338 | 0.7826 | 0.8544 |
| 0.3441 | 42.0 | 420 | 0.3964 | 0.7087 | 0.8683 | 0.841 | 0.7332 | 0.7113 | 0.0938 | 0.7356 | 0.8406 | 0.8043 | 0.8553 |
| 0.3544 | 43.0 | 430 | 0.4157 | 0.6905 | 0.8643 | 0.8324 | 0.6984 | 0.6989 | 0.0938 | 0.7312 | 0.8231 | 0.7783 | 0.8412 |
| 0.3639 | 44.0 | 440 | 0.3973 | 0.6977 | 0.8656 | 0.8399 | 0.7067 | 0.7048 | 0.0931 | 0.7394 | 0.8369 | 0.7783 | 0.8605 |
| 0.2883 | 45.0 | 450 | 0.3685 | 0.7138 | 0.8743 | 0.8473 | 0.7572 | 0.7094 | 0.0931 | 0.7456 | 0.85 | 0.8 | 0.8702 |
| 0.3492 | 46.0 | 460 | 0.3839 | 0.7069 | 0.88 | 0.8323 | 0.7024 | 0.7238 | 0.0938 | 0.7469 | 0.8356 | 0.7413 | 0.8737 |
| 0.2949 | 47.0 | 470 | 0.3781 | 0.7074 | 0.8604 | 0.8427 | 0.7014 | 0.7273 | 0.1 | 0.7431 | 0.8388 | 0.7565 | 0.8719 |
| 0.4604 | 48.0 | 480 | 0.3945 | 0.7123 | 0.8803 | 0.8474 | 0.7103 | 0.722 | 0.1 | 0.73 | 0.84 | 0.7957 | 0.8579 |
| 0.321 | 49.0 | 490 | 0.4030 | 0.7034 | 0.8772 | 0.8431 | 0.7357 | 0.7052 | 0.0944 | 0.7425 | 0.8394 | 0.8 | 0.8553 |
| 0.3609 | 50.0 | 500 | 0.3718 | 0.7088 | 0.8648 | 0.839 | 0.751 | 0.7042 | 0.095 | 0.7462 | 0.8438 | 0.8065 | 0.8588 |
| 0.3086 | 51.0 | 510 | 0.3798 | 0.7087 | 0.8698 | 0.8387 | 0.7261 | 0.7106 | 0.0869 | 0.7469 | 0.8419 | 0.7957 | 0.8605 |
| 0.3784 | 52.0 | 520 | 0.3669 | 0.7224 | 0.8709 | 0.847 | 0.7484 | 0.7224 | 0.0913 | 0.7481 | 0.8506 | 0.8109 | 0.8667 |
| 0.3195 | 53.0 | 530 | 0.3884 | 0.719 | 0.8717 | 0.8384 | 0.72 | 0.7274 | 0.0887 | 0.7519 | 0.8494 | 0.7826 | 0.8763 |
| 0.3255 | 54.0 | 540 | 0.3421 | 0.7255 | 0.8763 | 0.8435 | 0.7377 | 0.7317 | 0.0956 | 0.7581 | 0.8544 | 0.8022 | 0.8754 |
| 0.4141 | 55.0 | 550 | 0.3541 | 0.7143 | 0.8728 | 0.8388 | 0.7587 | 0.7078 | 0.0887 | 0.7556 | 0.8537 | 0.8217 | 0.8667 |
| 0.2409 | 56.0 | 560 | 0.3493 | 0.7317 | 0.8747 | 0.8485 | 0.7642 | 0.7298 | 0.0975 | 0.7606 | 0.8644 | 0.8196 | 0.8825 |
| 0.2739 | 57.0 | 570 | 0.3364 | 0.7324 | 0.8825 | 0.8489 | 0.7642 | 0.7313 | 0.1 | 0.7619 | 0.8644 | 0.8283 | 0.8789 |
| 0.2322 | 58.0 | 580 | 0.3423 | 0.7279 | 0.8817 | 0.849 | 0.7463 | 0.7321 | 0.0994 | 0.7606 | 0.86 | 0.8283 | 0.8728 |
| 0.3043 | 59.0 | 590 | 0.3376 | 0.7285 | 0.8812 | 0.8455 | 0.7519 | 0.7362 | 0.0988 | 0.7644 | 0.8669 | 0.8043 | 0.8921 |
| 0.2806 | 60.0 | 600 | 0.3639 | 0.714 | 0.8667 | 0.8374 | 0.7335 | 0.7263 | 0.0969 | 0.7506 | 0.8481 | 0.7739 | 0.8781 |
| 0.318 | 61.0 | 610 | 0.3444 | 0.7169 | 0.8734 | 0.8391 | 0.7709 | 0.7108 | 0.0894 | 0.755 | 0.8531 | 0.8283 | 0.8632 |
| 0.3217 | 62.0 | 620 | 0.3392 | 0.7223 | 0.8816 | 0.8381 | 0.7649 | 0.7182 | 0.0913 | 0.7594 | 0.8525 | 0.8217 | 0.8649 |
| 0.2648 | 63.0 | 630 | 0.3447 | 0.7304 | 0.8981 | 0.8486 | 0.7663 | 0.7293 | 0.0913 | 0.7588 | 0.8562 | 0.8174 | 0.8719 |
| 0.3217 | 64.0 | 640 | 0.3501 | 0.7195 | 0.8944 | 0.8453 | 0.762 | 0.7178 | 0.0906 | 0.7538 | 0.8512 | 0.8196 | 0.864 |
| 0.2912 | 65.0 | 650 | 0.3418 | 0.7329 | 0.8944 | 0.8616 | 0.7591 | 0.7373 | 0.0925 | 0.7625 | 0.8625 | 0.813 | 0.8825 |
| 0.2591 | 66.0 | 660 | 0.3400 | 0.7314 | 0.8877 | 0.8562 | 0.7498 | 0.736 | 0.0913 | 0.7619 | 0.8637 | 0.8174 | 0.8825 |
| 0.2844 | 67.0 | 670 | 0.3459 | 0.7237 | 0.8786 | 0.8412 | 0.7613 | 0.7255 | 0.0925 | 0.7506 | 0.8587 | 0.8217 | 0.8737 |
| 0.3221 | 68.0 | 680 | 0.3366 | 0.7345 | 0.8877 | 0.8484 | 0.7735 | 0.7333 | 0.0913 | 0.7581 | 0.8669 | 0.8326 | 0.8807 |
| 0.2491 | 69.0 | 690 | 0.3325 | 0.7344 | 0.8858 | 0.8472 | 0.7777 | 0.7336 | 0.09 | 0.765 | 0.8675 | 0.8326 | 0.8816 |
| 0.2743 | 70.0 | 700 | 0.3317 | 0.7345 | 0.885 | 0.8475 | 0.7754 | 0.7333 | 0.0913 | 0.7638 | 0.8681 | 0.8304 | 0.8833 |
| 0.249 | 71.0 | 710 | 0.3184 | 0.7371 | 0.8827 | 0.8521 | 0.7685 | 0.7391 | 0.09 | 0.7669 | 0.8737 | 0.8283 | 0.8921 |
| 0.2721 | 72.0 | 720 | 0.3230 | 0.7393 | 0.8939 | 0.8545 | 0.7696 | 0.7442 | 0.0969 | 0.7588 | 0.8625 | 0.8109 | 0.8833 |
| 0.2454 | 73.0 | 730 | 0.3237 | 0.7362 | 0.8887 | 0.8515 | 0.7709 | 0.7396 | 0.0887 | 0.7688 | 0.8662 | 0.8174 | 0.886 |
| 0.2494 | 74.0 | 740 | 0.3218 | 0.7319 | 0.8865 | 0.8486 | 0.7672 | 0.735 | 0.0894 | 0.7719 | 0.8669 | 0.8152 | 0.8877 |
| 0.2481 | 75.0 | 750 | 0.3195 | 0.7349 | 0.8834 | 0.8549 | 0.782 | 0.7315 | 0.09 | 0.7781 | 0.8712 | 0.837 | 0.8851 |
| 0.2278 | 76.0 | 760 | 0.3284 | 0.7275 | 0.8751 | 0.8398 | 0.7878 | 0.7173 | 0.0894 | 0.7581 | 0.8625 | 0.837 | 0.8728 |
| 0.2639 | 77.0 | 770 | 0.3241 | 0.7257 | 0.8745 | 0.8468 | 0.7836 | 0.719 | 0.09 | 0.7613 | 0.865 | 0.8283 | 0.8798 |
| 0.2158 | 78.0 | 780 | 0.3249 | 0.7287 | 0.8809 | 0.8523 | 0.7779 | 0.7241 | 0.0906 | 0.7581 | 0.8619 | 0.8239 | 0.8772 |
| 0.2259 | 79.0 | 790 | 0.3289 | 0.7277 | 0.8803 | 0.8534 | 0.7718 | 0.7247 | 0.09 | 0.7544 | 0.8581 | 0.8174 | 0.8746 |
| 0.2361 | 80.0 | 800 | 0.3318 | 0.73 | 0.8758 | 0.8598 | 0.7768 | 0.7308 | 0.0906 | 0.7575 | 0.8612 | 0.813 | 0.8807 |
| 0.2708 | 81.0 | 810 | 0.3279 | 0.7327 | 0.8865 | 0.8588 | 0.7773 | 0.7344 | 0.0906 | 0.7638 | 0.8656 | 0.8152 | 0.886 |
| 0.2111 | 82.0 | 820 | 0.3223 | 0.7336 | 0.8849 | 0.859 | 0.7817 | 0.731 | 0.09 | 0.7681 | 0.8681 | 0.8261 | 0.8851 |
| 0.2663 | 83.0 | 830 | 0.3269 | 0.7337 | 0.885 | 0.8573 | 0.7893 | 0.727 | 0.0913 | 0.7644 | 0.8681 | 0.837 | 0.8807 |
| 0.2105 | 84.0 | 840 | 0.3259 | 0.7339 | 0.8851 | 0.859 | 0.7812 | 0.7296 | 0.0906 | 0.7625 | 0.8669 | 0.8326 | 0.8807 |
| 0.2211 | 85.0 | 850 | 0.3169 | 0.7355 | 0.8863 | 0.8611 | 0.7978 | 0.7258 | 0.0962 | 0.7606 | 0.8675 | 0.8457 | 0.8763 |
| 0.2243 | 86.0 | 860 | 0.3155 | 0.7369 | 0.8836 | 0.8592 | 0.7924 | 0.7307 | 0.0956 | 0.765 | 0.8712 | 0.8391 | 0.8842 |
| 0.1953 | 87.0 | 870 | 0.3225 | 0.7318 | 0.8818 | 0.8561 | 0.7817 | 0.7283 | 0.0906 | 0.7644 | 0.8681 | 0.8304 | 0.8833 |
| 0.2024 | 88.0 | 880 | 0.3245 | 0.7308 | 0.8832 | 0.8571 | 0.7827 | 0.7272 | 0.0906 | 0.7625 | 0.865 | 0.8261 | 0.8807 |
| 0.2507 | 89.0 | 890 | 0.3273 | 0.7262 | 0.8815 | 0.8459 | 0.7833 | 0.7205 | 0.0881 | 0.76 | 0.8631 | 0.8283 | 0.8772 |
| 0.2073 | 90.0 | 900 | 0.3236 | 0.7302 | 0.8798 | 0.8521 | 0.7872 | 0.7254 | 0.0887 | 0.7644 | 0.8687 | 0.8348 | 0.8825 |
| 0.1975 | 91.0 | 910 | 0.3268 | 0.7269 | 0.8788 | 0.8512 | 0.7827 | 0.7236 | 0.0881 | 0.7631 | 0.8669 | 0.8326 | 0.8807 |
| 0.2445 | 92.0 | 920 | 0.3260 | 0.7285 | 0.8797 | 0.8519 | 0.7872 | 0.724 | 0.0881 | 0.7638 | 0.8681 | 0.8348 | 0.8816 |
| 0.1991 | 93.0 | 930 | 0.3266 | 0.7296 | 0.8797 | 0.8522 | 0.7874 | 0.7251 | 0.0881 | 0.7644 | 0.8687 | 0.8348 | 0.8825 |
| 0.2979 | 94.0 | 940 | 0.3252 | 0.7318 | 0.8806 | 0.853 | 0.7925 | 0.7267 | 0.0881 | 0.7669 | 0.8712 | 0.837 | 0.8851 |
| 0.2172 | 95.0 | 950 | 0.3243 | 0.7317 | 0.8813 | 0.8542 | 0.7901 | 0.7267 | 0.0881 | 0.7663 | 0.87 | 0.8348 | 0.8842 |
| 0.2317 | 96.0 | 960 | 0.3236 | 0.733 | 0.8808 | 0.8539 | 0.7935 | 0.7269 | 0.0881 | 0.7681 | 0.8719 | 0.837 | 0.886 |
| 0.2501 | 97.0 | 970 | 0.3237 | 0.733 | 0.8805 | 0.8536 | 0.7927 | 0.7273 | 0.0881 | 0.7681 | 0.8719 | 0.837 | 0.886 |
| 0.204 | 98.0 | 980 | 0.3238 | 0.7325 | 0.8805 | 0.8535 | 0.7891 | 0.7273 | 0.0881 | 0.7675 | 0.8712 | 0.8348 | 0.886 |
| 0.3266 | 99.0 | 990 | 0.3238 | 0.7325 | 0.8805 | 0.8535 | 0.7891 | 0.7273 | 0.0881 | 0.7675 | 0.8712 | 0.8348 | 0.886 |
| 0.1921 | 100.0 | 1000 | 0.3238 | 0.7325 | 0.8805 | 0.8535 | 0.7891 | 0.7273 | 0.0881 | 0.7675 | 0.8712 | 0.8348 | 0.886 |
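
These are COCO-style detection metrics. The exact evaluation code is not published, but numbers of this kind can be computed with torchmetrics' MeanAveragePrecision; the following is an illustrative sketch with one predicted box against one ground-truth box:

```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

metric = MeanAveragePrecision(iou_type="bbox")
preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 50.0, 60.0]]),  # xyxy format
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),
}]
targets = [{
    "boxes": torch.tensor([[12.0, 11.0, 48.0, 62.0]]),
    "labels": torch.tensor([0]),
}]
metric.update(preds, targets)
print(metric.compute())  # keys include map, map_50, map_75, mar_1, mar_10, mar_100, ...
```

For size buckets with no ground-truth objects (as with "small" boxes here), compute() returns -1.0, which matches the -1.0 entries in the table and summary above.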

Framework versions

  • Transformers 4.41.1
  • Pytorch 2.2.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1
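
A matching environment can be installed with pip, e.g. pip install transformers==4.41.1 datasets==2.19.1 tokenizers==0.19.1 torch==2.2.1 (the +cu121 build of PyTorch comes from the CUDA 12.1 wheel index at https://download.pytorch.org/whl/cu121).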
Model size: 41.6M parameters (F32 tensors, stored in safetensors format).