
ms_detr_finetuned_diana

This model is a fine-tuned version of microsoft/conditional-detr-resnet-50, trained for object detection on a custom dataset loaded with the Hugging Face imagefolder loader. It achieves the following results on the evaluation set (a minimal inference sketch follows the list):

  • Loss: 0.3617
  • mAP: 0.6874
  • mAP@0.50: 0.789
  • mAP@0.75: 0.7871
  • mAP (small): -1.0
  • mAP (medium): 0.7147
  • mAP (large): 0.6892
  • mAR@1: 0.0969
  • mAR@10: 0.7163
  • mAR@100: 0.7819
  • mAR (small): -1.0
  • mAR (medium): 0.75
  • mAR (large): 0.7947
  • mAP per class: -1.0
  • mAR@100 per class: -1.0
  • Classes: 0

Values of -1.0 follow the COCO evaluation convention: the metric was not computed, either because no ground-truth objects fall into that size bucket (here, small objects) or because per-class metrics were disabled. "Classes: 0" lists the class ids present in the evaluation set, i.e., only class id 0, indicating a single-class dataset.
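The checkpoint can be used with the standard transformers object-detection API for conditional DETR. The sketch below is a minimal example; the repo id `your-username/ms_detr_finetuned_diana` and the image filename are placeholders, not values from this card:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Placeholder repo id / local path; substitute the actual checkpoint location.
checkpoint = "your-username/ms_detr_finetuned_diana"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg").convert("RGB")  # placeholder image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits and normalized boxes into thresholded detections
# in absolute (xmin, ymin, xmax, ymax) pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(
    detections["scores"], detections["labels"], detections["boxes"]
):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```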

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after the list):

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • num_epochs: 100
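These values map directly onto `transformers.TrainingArguments`; the Adam betas and epsilon are the Trainer defaults. A minimal sketch, assuming the standard Trainer setup (the output directory is an assumption, not from the card):

```python
from transformers import TrainingArguments

# output_dir is an assumption; all other values come from the list above.
training_args = TrainingArguments(
    output_dir="ms_detr_finetuned_diana",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="cosine",
    num_train_epochs=100,
    # Adam betas/epsilon below are the Trainer defaults, matching the card.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```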

Training results

| Training Loss | Epoch | Step | Validation Loss | mAP | mAP@0.50 | mAP@0.75 | mAP (S) | mAP (M) | mAP (L) | mAR@1 | mAR@10 | mAR@100 | mAR (S) | mAR (M) | mAR (L) | mAP/class | mAR@100/class | Classes |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2.892 | 1.0 | 10 | 2.2713 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 1.717 | 2.0 | 20 | 1.6999 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 1.5162 | 3.0 | 30 | 1.4320 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 1.3622 | 4.0 | 40 | 1.2202 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 1.1926 | 5.0 | 50 | 1.1617 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 1.2362 | 6.0 | 60 | 1.1772 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 1.3114 | 7.0 | 70 | 1.0437 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 1.1188 | 8.0 | 80 | 0.9656 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 0.9169 | 9.0 | 90 | 0.8787 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 0.7998 | 10.0 | 100 | 0.7928 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 0.7385 | 11.0 | 110 | 0.6800 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 0.6697 | 12.0 | 120 | 0.6025 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 0.4984 | 13.0 | 130 | 0.5722 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 0.5245 | 14.0 | 140 | 0.5460 | 0.0079 | 0.0099 | 0.0099 | -1.0 | 0.0238 | 0.0 | 0.005 | 0.005 | 0.005 | -1.0 | 0.0174 | 0.0 | -1.0 | -1.0 | 0 |
| 0.3993 | 15.0 | 150 | 0.5030 | 0.0168 | 0.0198 | 0.0198 | -1.0 | 0.0267 | 0.0079 | 0.0106 | 0.0106 | 0.0106 | -1.0 | 0.0196 | 0.007 | -1.0 | -1.0 | 0 |
| 0.4789 | 16.0 | 160 | 0.4662 | 0.2283 | 0.2665 | 0.2665 | -1.0 | 0.1539 | 0.2598 | 0.0794 | 0.2438 | 0.2438 | -1.0 | 0.1543 | 0.2798 | -1.0 | -1.0 | 0 |
| 0.5485 | 17.0 | 170 | 0.4586 | 0.2041 | 0.2369 | 0.2369 | -1.0 | 0.1679 | 0.2202 | 0.0731 | 0.2175 | 0.2175 | -1.0 | 0.1717 | 0.236 | -1.0 | -1.0 | 0 |
| 0.3419 | 18.0 | 180 | 0.4637 | 0.3674 | 0.4292 | 0.4292 | -1.0 | 0.2666 | 0.4049 | 0.0856 | 0.3994 | 0.3994 | -1.0 | 0.2696 | 0.4518 | -1.0 | -1.0 | 0 |
| 0.4885 | 19.0 | 190 | 0.5509 | 0.4468 | 0.5407 | 0.5314 | -1.0 | 0.354 | 0.4898 | 0.0925 | 0.5156 | 0.5156 | -1.0 | 0.3761 | 0.5719 | -1.0 | -1.0 | 0 |
| 0.3336 | 20.0 | 200 | 0.5122 | 0.1809 | 0.2149 | 0.2149 | -1.0 | 0.0802 | 0.2257 | 0.0763 | 0.195 | 0.195 | -1.0 | 0.0783 | 0.2421 | -1.0 | -1.0 | 0 |
| 0.3471 | 21.0 | 210 | 0.4619 | 0.4291 | 0.519 | 0.519 | -1.0 | 0.3115 | 0.4787 | 0.0906 | 0.465 | 0.4706 | -1.0 | 0.3174 | 0.5325 | -1.0 | -1.0 | 0 |
| 0.3953 | 22.0 | 220 | 0.4313 | 0.4938 | 0.5927 | 0.581 | -1.0 | 0.3728 | 0.5476 | 0.0887 | 0.5319 | 0.5437 | -1.0 | 0.3891 | 0.6061 | -1.0 | -1.0 | 0 |
| 0.5373 | 23.0 | 230 | 0.4308 | 0.5264 | 0.6339 | 0.623 | -1.0 | 0.4223 | 0.5733 | 0.0938 | 0.5744 | 0.5894 | -1.0 | 0.4391 | 0.65 | -1.0 | -1.0 | 0 |
| 0.3092 | 24.0 | 240 | 0.4193 | 0.5561 | 0.6661 | 0.6571 | -1.0 | 0.4775 | 0.5927 | 0.0944 | 0.5844 | 0.6169 | -1.0 | 0.5022 | 0.6632 | -1.0 | -1.0 | 0 |
| 0.477 | 25.0 | 250 | 0.4125 | 0.41 | 0.4769 | 0.4769 | -1.0 | 0.3283 | 0.4467 | 0.1006 | 0.4387 | 0.4487 | -1.0 | 0.3348 | 0.4947 | -1.0 | -1.0 | 0 |
| 0.3867 | 26.0 | 260 | 0.4114 | 0.6146 | 0.743 | 0.7168 | -1.0 | 0.6036 | 0.6294 | 0.0938 | 0.6381 | 0.6981 | -1.0 | 0.6304 | 0.7254 | -1.0 | -1.0 | 0 |
| 0.3658 | 27.0 | 270 | 0.4001 | 0.6643 | 0.7856 | 0.7727 | -1.0 | 0.6505 | 0.6817 | 0.1 | 0.6787 | 0.7456 | -1.0 | 0.6935 | 0.7667 | -1.0 | -1.0 | 0 |
| 0.3053 | 28.0 | 280 | 0.4282 | 0.5218 | 0.6326 | 0.6201 | -1.0 | 0.4163 | 0.5676 | 0.095 | 0.5631 | 0.5844 | -1.0 | 0.4261 | 0.6482 | -1.0 | -1.0 | 0 |
| 0.3105 | 29.0 | 290 | 0.4398 | 0.6187 | 0.7567 | 0.7331 | -1.0 | 0.6045 | 0.6336 | 0.0919 | 0.6413 | 0.6944 | -1.0 | 0.6478 | 0.7132 | -1.0 | -1.0 | 0 |
| 0.3713 | 30.0 | 300 | 0.4202 | 0.5643 | 0.6683 | 0.6673 | -1.0 | 0.4959 | 0.5996 | 0.0931 | 0.5969 | 0.6306 | -1.0 | 0.5348 | 0.6693 | -1.0 | -1.0 | 0 |
| 0.2874 | 31.0 | 310 | 0.4018 | 0.6327 | 0.7547 | 0.7357 | -1.0 | 0.5647 | 0.6713 | 0.0981 | 0.6575 | 0.715 | -1.0 | 0.5957 | 0.7632 | -1.0 | -1.0 | 0 |
| 0.2809 | 32.0 | 320 | 0.4007 | 0.6444 | 0.7618 | 0.7412 | -1.0 | 0.6189 | 0.667 | 0.1013 | 0.6694 | 0.7381 | -1.0 | 0.6587 | 0.7702 | -1.0 | -1.0 | 0 |
| 0.2801 | 33.0 | 330 | 0.4528 | 0.6125 | 0.7386 | 0.727 | -1.0 | 0.5399 | 0.6519 | 0.0938 | 0.6338 | 0.6994 | -1.0 | 0.5739 | 0.75 | -1.0 | -1.0 | 0 |
| 0.3484 | 34.0 | 340 | 0.4233 | 0.6226 | 0.7505 | 0.7381 | -1.0 | 0.6268 | 0.642 | 0.0988 | 0.64 | 0.7194 | -1.0 | 0.6543 | 0.7456 | -1.0 | -1.0 | 0 |
| 0.3591 | 35.0 | 350 | 0.4156 | 0.6475 | 0.759 | 0.7576 | -1.0 | 0.6733 | 0.6536 | 0.095 | 0.6712 | 0.74 | -1.0 | 0.7065 | 0.7535 | -1.0 | -1.0 | 0 |
| 0.2204 | 36.0 | 360 | 0.4363 | 0.5723 | 0.6577 | 0.6577 | -1.0 | 0.548 | 0.5962 | 0.0956 | 0.6112 | 0.6456 | -1.0 | 0.5717 | 0.6754 | -1.0 | -1.0 | 0 |
| 0.3912 | 37.0 | 370 | 0.4261 | 0.6855 | 0.8128 | 0.8032 | -1.0 | 0.6777 | 0.7012 | 0.1006 | 0.7063 | 0.7806 | -1.0 | 0.7217 | 0.8044 | -1.0 | -1.0 | 0 |
| 0.3377 | 38.0 | 380 | 0.4260 | 0.6743 | 0.7969 | 0.7847 | -1.0 | 0.6147 | 0.7086 | 0.1025 | 0.69 | 0.7444 | -1.0 | 0.6413 | 0.786 | -1.0 | -1.0 | 0 |
| 0.3047 | 39.0 | 390 | 0.4211 | 0.6519 | 0.7793 | 0.7647 | -1.0 | 0.6241 | 0.6742 | 0.1006 | 0.6681 | 0.7175 | -1.0 | 0.6457 | 0.7465 | -1.0 | -1.0 | 0 |
| 0.2563 | 40.0 | 400 | 0.4313 | 0.6441 | 0.769 | 0.769 | -1.0 | 0.724 | 0.6211 | 0.1006 | 0.6669 | 0.7275 | -1.0 | 0.7543 | 0.7167 | -1.0 | -1.0 | 0 |
| 0.3127 | 41.0 | 410 | 0.4297 | 0.5739 | 0.6679 | 0.6593 | -1.0 | 0.5456 | 0.5969 | 0.0975 | 0.5987 | 0.6438 | -1.0 | 0.563 | 0.6763 | -1.0 | -1.0 | 0 |
| 0.2782 | 42.0 | 420 | 0.4133 | 0.6234 | 0.7267 | 0.7179 | -1.0 | 0.6279 | 0.6346 | 0.0962 | 0.6394 | 0.7019 | -1.0 | 0.6457 | 0.7246 | -1.0 | -1.0 | 0 |
| 0.2965 | 43.0 | 430 | 0.4304 | 0.5708 | 0.684 | 0.664 | -1.0 | 0.5152 | 0.6021 | 0.0938 | 0.6225 | 0.6538 | -1.0 | 0.5326 | 0.7026 | -1.0 | -1.0 | 0 |
| 0.2599 | 44.0 | 440 | 0.4240 | 0.6451 | 0.7575 | 0.7445 | -1.0 | 0.67 | 0.6528 | 0.0975 | 0.6762 | 0.7506 | -1.0 | 0.6978 | 0.7719 | -1.0 | -1.0 | 0 |
| 0.2821 | 45.0 | 450 | 0.4361 | 0.695 | 0.8218 | 0.796 | -1.0 | 0.7193 | 0.6945 | 0.0969 | 0.7287 | 0.79 | -1.0 | 0.7587 | 0.8026 | -1.0 | -1.0 | 0 |
| 0.3117 | 46.0 | 460 | 0.4164 | 0.7032 | 0.8334 | 0.8051 | -1.0 | 0.7114 | 0.7094 | 0.1019 | 0.7331 | 0.7925 | -1.0 | 0.7391 | 0.814 | -1.0 | -1.0 | 0 |
| 0.2484 | 47.0 | 470 | 0.4007 | 0.6757 | 0.7805 | 0.7805 | -1.0 | 0.7268 | 0.6714 | 0.1037 | 0.7088 | 0.7656 | -1.0 | 0.7543 | 0.7702 | -1.0 | -1.0 | 0 |
| 0.3059 | 48.0 | 480 | 0.4269 | 0.651 | 0.7539 | 0.7496 | -1.0 | 0.599 | 0.6866 | 0.0956 | 0.6819 | 0.7294 | -1.0 | 0.6174 | 0.7746 | -1.0 | -1.0 | 0 |
| 0.2492 | 49.0 | 490 | 0.3877 | 0.6452 | 0.7506 | 0.7277 | -1.0 | 0.6343 | 0.6624 | 0.09 | 0.6794 | 0.7244 | -1.0 | 0.6565 | 0.7518 | -1.0 | -1.0 | 0 |
| 0.3828 | 50.0 | 500 | 0.4237 | 0.6721 | 0.7953 | 0.7744 | -1.0 | 0.7174 | 0.6672 | 0.0887 | 0.7056 | 0.7606 | -1.0 | 0.75 | 0.7649 | -1.0 | -1.0 | 0 |
| 0.2737 | 51.0 | 510 | 0.3713 | 0.6619 | 0.7594 | 0.7453 | -1.0 | 0.6988 | 0.6615 | 0.0925 | 0.6919 | 0.7419 | -1.0 | 0.7283 | 0.7474 | -1.0 | -1.0 | 0 |
| 0.3283 | 52.0 | 520 | 0.3737 | 0.6298 | 0.7286 | 0.7254 | -1.0 | 0.6199 | 0.6431 | 0.0894 | 0.6575 | 0.7019 | -1.0 | 0.6478 | 0.7237 | -1.0 | -1.0 | 0 |
| 0.2819 | 53.0 | 530 | 0.4077 | 0.6919 | 0.8094 | 0.8004 | -1.0 | 0.7656 | 0.6797 | 0.0894 | 0.7106 | 0.7831 | -1.0 | 0.7978 | 0.7772 | -1.0 | -1.0 | 0 |
| 0.2533 | 54.0 | 540 | 0.4056 | 0.7032 | 0.8249 | 0.8045 | -1.0 | 0.724 | 0.7105 | 0.0962 | 0.7381 | 0.7962 | -1.0 | 0.7522 | 0.814 | -1.0 | -1.0 | 0 |
| 0.3408 | 55.0 | 550 | 0.3916 | 0.667 | 0.771 | 0.7591 | -1.0 | 0.6257 | 0.6997 | 0.0969 | 0.6913 | 0.7356 | -1.0 | 0.65 | 0.7702 | -1.0 | -1.0 | 0 |
| 0.2069 | 56.0 | 560 | 0.3931 | 0.7054 | 0.8197 | 0.7998 | -1.0 | 0.7399 | 0.7061 | 0.0962 | 0.7188 | 0.785 | -1.0 | 0.7652 | 0.793 | -1.0 | -1.0 | 0 |
| 0.2572 | 57.0 | 570 | 0.4012 | 0.6993 | 0.8117 | 0.7932 | -1.0 | 0.729 | 0.699 | 0.1025 | 0.7275 | 0.7862 | -1.0 | 0.7609 | 0.7965 | -1.0 | -1.0 | 0 |
| 0.1786 | 58.0 | 580 | 0.3830 | 0.7114 | 0.8231 | 0.8047 | -1.0 | 0.7666 | 0.7065 | 0.0975 | 0.7325 | 0.8056 | -1.0 | 0.7935 | 0.8105 | -1.0 | -1.0 | 0 |
| 0.2185 | 59.0 | 590 | 0.3609 | 0.705 | 0.8153 | 0.806 | -1.0 | 0.7388 | 0.6997 | 0.0919 | 0.7194 | 0.7912 | -1.0 | 0.7783 | 0.7965 | -1.0 | -1.0 | 0 |
| 0.2219 | 60.0 | 600 | 0.3783 | 0.7086 | 0.8241 | 0.8092 | -1.0 | 0.7106 | 0.7192 | 0.0969 | 0.74 | 0.805 | -1.0 | 0.7391 | 0.8316 | -1.0 | -1.0 | 0 |
| 0.3003 | 61.0 | 610 | 0.4098 | 0.7118 | 0.8178 | 0.8085 | -1.0 | 0.7534 | 0.7084 | 0.0919 | 0.7394 | 0.8138 | -1.0 | 0.7891 | 0.8237 | -1.0 | -1.0 | 0 |
| 0.3144 | 62.0 | 620 | 0.4339 | 0.6867 | 0.7967 | 0.7794 | -1.0 | 0.7074 | 0.6913 | 0.0894 | 0.7294 | 0.7875 | -1.0 | 0.7435 | 0.8053 | -1.0 | -1.0 | 0 |
| 0.2323 | 63.0 | 630 | 0.4086 | 0.6733 | 0.7901 | 0.7787 | -1.0 | 0.692 | 0.6792 | 0.0944 | 0.715 | 0.7719 | -1.0 | 0.7239 | 0.7912 | -1.0 | -1.0 | 0 |
| 0.3114 | 64.0 | 640 | 0.3946 | 0.6801 | 0.7905 | 0.7658 | -1.0 | 0.6803 | 0.6931 | 0.0981 | 0.7188 | 0.7713 | -1.0 | 0.7043 | 0.7982 | -1.0 | -1.0 | 0 |
| 0.2579 | 65.0 | 650 | 0.3899 | 0.6658 | 0.7882 | 0.7637 | -1.0 | 0.6463 | 0.6861 | 0.0988 | 0.7013 | 0.7613 | -1.0 | 0.6739 | 0.7965 | -1.0 | -1.0 | 0 |
| 0.2312 | 66.0 | 660 | 0.3816 | 0.6567 | 0.7604 | 0.7548 | -1.0 | 0.6226 | 0.6793 | 0.0969 | 0.6919 | 0.7506 | -1.0 | 0.6478 | 0.7921 | -1.0 | -1.0 | 0 |
| 0.209 | 67.0 | 670 | 0.3967 | 0.699 | 0.8281 | 0.7953 | -1.0 | 0.7061 | 0.7074 | 0.0906 | 0.7394 | 0.7994 | -1.0 | 0.7348 | 0.8254 | -1.0 | -1.0 | 0 |
| 0.3113 | 68.0 | 680 | 0.4126 | 0.6991 | 0.8308 | 0.7838 | -1.0 | 0.7222 | 0.6999 | 0.0962 | 0.7337 | 0.7994 | -1.0 | 0.7522 | 0.8184 | -1.0 | -1.0 | 0 |
| 0.2045 | 69.0 | 690 | 0.4276 | 0.703 | 0.8227 | 0.7956 | -1.0 | 0.7344 | 0.7016 | 0.0838 | 0.735 | 0.8006 | -1.0 | 0.7652 | 0.8149 | -1.0 | -1.0 | 0 |
| 0.2002 | 70.0 | 700 | 0.4094 | 0.6843 | 0.7886 | 0.7836 | -1.0 | 0.6975 | 0.6908 | 0.0919 | 0.7175 | 0.7788 | -1.0 | 0.7239 | 0.8009 | -1.0 | -1.0 | 0 |
| 0.2065 | 71.0 | 710 | 0.4052 | 0.7065 | 0.8196 | 0.8056 | -1.0 | 0.7149 | 0.7146 | 0.0981 | 0.7337 | 0.8 | -1.0 | 0.7457 | 0.8219 | -1.0 | -1.0 | 0 |
| 0.2716 | 72.0 | 720 | 0.4000 | 0.7039 | 0.8294 | 0.8059 | -1.0 | 0.7229 | 0.7079 | 0.0956 | 0.7344 | 0.7987 | -1.0 | 0.7522 | 0.8175 | -1.0 | -1.0 | 0 |
| 0.2935 | 73.0 | 730 | 0.3905 | 0.652 | 0.7532 | 0.7475 | -1.0 | 0.6084 | 0.6786 | 0.0906 | 0.7038 | 0.7525 | -1.0 | 0.6326 | 0.8009 | -1.0 | -1.0 | 0 |
| 0.2137 | 74.0 | 740 | 0.3959 | 0.627 | 0.7307 | 0.725 | -1.0 | 0.6212 | 0.6476 | 0.0906 | 0.6794 | 0.7312 | -1.0 | 0.6522 | 0.7632 | -1.0 | -1.0 | 0 |
| 0.2075 | 75.0 | 750 | 0.3786 | 0.6542 | 0.769 | 0.7535 | -1.0 | 0.659 | 0.6667 | 0.0906 | 0.71 | 0.7619 | -1.0 | 0.6913 | 0.7904 | -1.0 | -1.0 | 0 |
| 0.1713 | 76.0 | 760 | 0.3836 | 0.6695 | 0.7851 | 0.7703 | -1.0 | 0.6683 | 0.6859 | 0.0906 | 0.7138 | 0.7781 | -1.0 | 0.7065 | 0.807 | -1.0 | -1.0 | 0 |
| 0.2233 | 77.0 | 770 | 0.3947 | 0.6659 | 0.775 | 0.7694 | -1.0 | 0.714 | 0.6616 | 0.0913 | 0.705 | 0.7763 | -1.0 | 0.7543 | 0.7851 | -1.0 | -1.0 | 0 |
| 0.2398 | 78.0 | 780 | 0.3835 | 0.6854 | 0.7997 | 0.7883 | -1.0 | 0.7067 | 0.6903 | 0.0906 | 0.7212 | 0.785 | -1.0 | 0.7391 | 0.8035 | -1.0 | -1.0 | 0 |
| 0.1906 | 79.0 | 790 | 0.3811 | 0.6901 | 0.8028 | 0.7974 | -1.0 | 0.7128 | 0.6948 | 0.0956 | 0.7231 | 0.7931 | -1.0 | 0.7478 | 0.8114 | -1.0 | -1.0 | 0 |
| 0.1823 | 80.0 | 800 | 0.3831 | 0.6721 | 0.7821 | 0.7709 | -1.0 | 0.7005 | 0.6764 | 0.0906 | 0.7094 | 0.775 | -1.0 | 0.737 | 0.7904 | -1.0 | -1.0 | 0 |
| 0.2266 | 81.0 | 810 | 0.3831 | 0.6973 | 0.8131 | 0.8007 | -1.0 | 0.6966 | 0.7103 | 0.0969 | 0.7319 | 0.7969 | -1.0 | 0.7326 | 0.8228 | -1.0 | -1.0 | 0 |
| 0.1812 | 82.0 | 820 | 0.3809 | 0.6934 | 0.8057 | 0.7933 | -1.0 | 0.6984 | 0.707 | 0.0962 | 0.7262 | 0.7919 | -1.0 | 0.7348 | 0.8149 | -1.0 | -1.0 | 0 |
| 0.1811 | 83.0 | 830 | 0.3820 | 0.6836 | 0.7953 | 0.7829 | -1.0 | 0.6797 | 0.7021 | 0.0969 | 0.7156 | 0.7819 | -1.0 | 0.7087 | 0.8114 | -1.0 | -1.0 | 0 |
| 0.1677 | 84.0 | 840 | 0.3851 | 0.6809 | 0.7891 | 0.7759 | -1.0 | 0.6854 | 0.6942 | 0.0962 | 0.7163 | 0.7763 | -1.0 | 0.7109 | 0.8026 | -1.0 | -1.0 | 0 |
| 0.159 | 85.0 | 850 | 0.3791 | 0.6802 | 0.7889 | 0.7765 | -1.0 | 0.6799 | 0.6954 | 0.0962 | 0.715 | 0.775 | -1.0 | 0.7087 | 0.8018 | -1.0 | -1.0 | 0 |
| 0.1646 | 86.0 | 860 | 0.3712 | 0.6856 | 0.7903 | 0.7785 | -1.0 | 0.6683 | 0.7073 | 0.0975 | 0.7212 | 0.7788 | -1.0 | 0.6978 | 0.8114 | -1.0 | -1.0 | 0 |
| 0.1618 | 87.0 | 870 | 0.3736 | 0.6817 | 0.7861 | 0.7753 | -1.0 | 0.7037 | 0.6883 | 0.0969 | 0.72 | 0.7781 | -1.0 | 0.7326 | 0.7965 | -1.0 | -1.0 | 0 |
| 0.144 | 88.0 | 880 | 0.3724 | 0.6804 | 0.7917 | 0.7688 | -1.0 | 0.7067 | 0.6832 | 0.0962 | 0.7138 | 0.7719 | -1.0 | 0.7304 | 0.7886 | -1.0 | -1.0 | 0 |
| 0.2508 | 89.0 | 890 | 0.3643 | 0.6792 | 0.7823 | 0.7692 | -1.0 | 0.7062 | 0.6828 | 0.0969 | 0.7125 | 0.7706 | -1.0 | 0.7283 | 0.7877 | -1.0 | -1.0 | 0 |
| 0.1579 | 90.0 | 900 | 0.3623 | 0.6996 | 0.8094 | 0.7962 | -1.0 | 0.7204 | 0.7044 | 0.0962 | 0.7256 | 0.7937 | -1.0 | 0.7522 | 0.8105 | -1.0 | -1.0 | 0 |
| 0.1625 | 91.0 | 910 | 0.3630 | 0.6985 | 0.8083 | 0.7959 | -1.0 | 0.718 | 0.7034 | 0.0962 | 0.7256 | 0.7937 | -1.0 | 0.7522 | 0.8105 | -1.0 | -1.0 | 0 |
| 0.1734 | 92.0 | 920 | 0.3626 | 0.6876 | 0.7894 | 0.787 | -1.0 | 0.7165 | 0.6891 | 0.0969 | 0.7169 | 0.7819 | -1.0 | 0.75 | 0.7947 | -1.0 | -1.0 | 0 |
| 0.1452 | 93.0 | 930 | 0.3619 | 0.6838 | 0.7887 | 0.7781 | -1.0 | 0.7163 | 0.684 | 0.0969 | 0.7119 | 0.7769 | -1.0 | 0.7522 | 0.7868 | -1.0 | -1.0 | 0 |
| 0.1764 | 94.0 | 940 | 0.3633 | 0.6833 | 0.7888 | 0.7782 | -1.0 | 0.7162 | 0.6831 | 0.0969 | 0.7113 | 0.7763 | -1.0 | 0.7522 | 0.786 | -1.0 | -1.0 | 0 |
| 0.1862 | 95.0 | 950 | 0.3633 | 0.6825 | 0.7889 | 0.7781 | -1.0 | 0.7153 | 0.6826 | 0.0969 | 0.71 | 0.7756 | -1.0 | 0.75 | 0.786 | -1.0 | -1.0 | 0 |
| 0.1855 | 96.0 | 960 | 0.3622 | 0.6878 | 0.7891 | 0.7873 | -1.0 | 0.7147 | 0.6899 | 0.0969 | 0.7163 | 0.7819 | -1.0 | 0.75 | 0.7947 | -1.0 | -1.0 | 0 |
| 0.2982 | 97.0 | 970 | 0.3622 | 0.6877 | 0.789 | 0.7873 | -1.0 | 0.7147 | 0.6896 | 0.0969 | 0.7163 | 0.7819 | -1.0 | 0.75 | 0.7947 | -1.0 | -1.0 | 0 |
| 0.1764 | 98.0 | 980 | 0.3620 | 0.6877 | 0.789 | 0.7873 | -1.0 | 0.7147 | 0.6896 | 0.0969 | 0.7163 | 0.7819 | -1.0 | 0.75 | 0.7947 | -1.0 | -1.0 | 0 |
| 0.2245 | 99.0 | 990 | 0.3618 | 0.6874 | 0.789 | 0.7871 | -1.0 | 0.7147 | 0.6892 | 0.0969 | 0.7163 | 0.7819 | -1.0 | 0.75 | 0.7947 | -1.0 | -1.0 | 0 |
| 0.1775 | 100.0 | 1000 | 0.3617 | 0.6874 | 0.789 | 0.7871 | -1.0 | 0.7147 | 0.6892 | 0.0969 | 0.7163 | 0.7819 | -1.0 | 0.75 | 0.7947 | -1.0 | -1.0 | 0 |
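The mAP/mAR columns follow the COCO evaluation protocol. The card does not include the evaluation code, but metrics with these names can be computed with torchmetrics' MeanAveragePrecision; the sketch below is an assumption about tooling and uses illustrative boxes, not data from this model:

```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

metric = MeanAveragePrecision(iou_type="bbox")

# Boxes are absolute (xmin, ymin, xmax, ymax); the values here are illustrative.
preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),
}]
targets = [{
    "boxes": torch.tensor([[12.0, 18.0, 108.0, 222.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
# compute() returns a dict with keys such as map, map_50, map_75,
# map_small, map_medium, map_large, mar_1, mar_10, mar_100, ...
print(metric.compute())
```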

Framework versions

  • Transformers 4.41.1
  • Pytorch 2.2.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1