---
license: apache-2.0
base_model: microsoft/conditional-detr-resnet-50
tags:
  - generated_from_trainer
datasets:
  - imagefolder
model-index:
  - name: ms_detr_finetuned_diana
    results: []
---

ms_detr_finetuned_diana

This model is a fine-tuned version of microsoft/conditional-detr-resnet-50, trained on a dataset loaded with the generic imagefolder builder. It achieves the following results on the evaluation set:

  • Loss: 0.3578
  • Map: 0.7134
  • Map 50: 0.8181
  • Map 75: 0.8181
  • Map Small: -1.0
  • Map Medium: 0.7864
  • Map Large: 0.7101
  • Mar 1: 0.1236
  • Mar 10: 0.7964
  • Mar 100: 0.825
  • Mar Small: -1.0
  • Mar Medium: 0.8
  • Mar Large: 0.8302
  • Map Per Class: -1.0
  • Mar 100 Per Class: -1.0
  • Classes: 0

Entries of -1.0 indicate undefined metrics: typically the evaluation set contains no ground-truth boxes in that size bucket, and the per-class breakdowns were not recorded during evaluation.
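
The checkpoint can be used with the standard Transformers object-detection API. Below is a minimal inference sketch; the repo id LLyq/ms_detr_finetuned_diana, the image path, and the 0.5 confidence threshold are assumptions for illustration, not taken from this card:

```python
# Minimal inference sketch. The repo id below is inferred from this card's
# name and may need adjusting; the image path and threshold are placeholders.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "LLyq/ms_detr_finetuned_diana"  # assumed repo id
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits and normalized boxes into thresholded (x0, y0, x1, y1)
# pixel detections for the original image size.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.3f} at {box.tolist()}")
```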

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
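
The card documents nothing beyond the generic imagefolder builder named in the metadata. A heavily hedged loading sketch, where the directory path is hypothetical:

```python
# Hypothetical sketch of loading data with the generic "imagefolder" builder
# named in this card's metadata; the data_dir below is made up.
from datasets import load_dataset

dataset = load_dataset("imagefolder", data_dir="path/to/diana_images")
print(dataset["train"][0])  # each example carries at least an "image" field
```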

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • num_epochs: 50
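
For reference, these settings map onto Transformers TrainingArguments roughly as sketched below; output_dir is illustrative, and the Adam betas/epsilon are spelled out even though they are the Trainer defaults:

```python
# Sketch of TrainingArguments matching the hyperparameters above; output_dir
# is a placeholder, not the directory actually used for this run.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ms_detr_finetuned_diana",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="cosine",
    num_train_epochs=50,
    adam_beta1=0.9,     # Trainer default, matches the card
    adam_beta2=0.999,   # Trainer default, matches the card
    adam_epsilon=1e-8,  # Trainer default, matches the card
)
```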

Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Per Class | Mar 100 Per Class | Classes |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-------------:|:-----------------:|:-------:|
| 2.1209 | 1.0 | 10 | 2.1209 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 1.4805 | 2.0 | 20 | 1.5062 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 1.205 | 3.0 | 30 | 1.3151 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 1.2767 | 4.0 | 40 | 1.1969 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 1.095 | 5.0 | 50 | 1.0561 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 0.8968 | 6.0 | 60 | 0.9283 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 0.8153 | 7.0 | 70 | 0.8459 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 0.7445 | 8.0 | 80 | 0.7019 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 0.5769 | 9.0 | 90 | 0.6067 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 0.5487 | 10.0 | 100 | 0.5308 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0 |
| 0.4492 | 11.0 | 110 | 0.5069 | 0.0089 | 0.0099 | 0.0099 | -1.0 | 0.0 | 0.0089 | 0.0064 | 0.0064 | 0.0064 | -1.0 | 0.0 | 0.0078 | -1.0 | -1.0 | 0 |
| 0.3912 | 12.0 | 120 | 0.5000 | 0.0385 | 0.0438 | 0.0438 | -1.0 | 0.0 | 0.0461 | 0.0314 | 0.0386 | 0.0386 | -1.0 | 0.0 | 0.0466 | -1.0 | -1.0 | 0 |
| 0.3875 | 13.0 | 130 | 0.4488 | 0.0901 | 0.1036 | 0.1036 | -1.0 | 0.0 | 0.1072 | 0.0571 | 0.0886 | 0.0886 | -1.0 | 0.0 | 0.1069 | -1.0 | -1.0 | 0 |
| 0.4356 | 14.0 | 140 | 0.4592 | 0.3255 | 0.3789 | 0.3789 | -1.0 | 0.1382 | 0.3656 | 0.1007 | 0.3657 | 0.3657 | -1.0 | 0.1417 | 0.4121 | -1.0 | -1.0 | 0 |
| 0.3536 | 15.0 | 150 | 0.4293 | 0.3127 | 0.3673 | 0.3584 | -1.0 | 0.1914 | 0.3399 | 0.1064 | 0.3629 | 0.3629 | -1.0 | 0.1958 | 0.3974 | -1.0 | -1.0 | 0 |
| 0.3617 | 16.0 | 160 | 0.4128 | 0.4625 | 0.5397 | 0.5288 | -1.0 | 0.3614 | 0.4897 | 0.1164 | 0.5329 | 0.5329 | -1.0 | 0.3625 | 0.5681 | -1.0 | -1.0 | 0 |
| 0.392 | 17.0 | 170 | 0.4258 | 0.4683 | 0.5332 | 0.5332 | -1.0 | 0.3911 | 0.4868 | 0.1179 | 0.5486 | 0.5486 | -1.0 | 0.4 | 0.5793 | -1.0 | -1.0 | 0 |
| 0.3694 | 18.0 | 180 | 0.4563 | 0.4006 | 0.4614 | 0.4614 | -1.0 | 0.3313 | 0.4267 | 0.1157 | 0.4714 | 0.4714 | -1.0 | 0.3333 | 0.5 | -1.0 | -1.0 | 0 |
| 0.3569 | 19.0 | 190 | 0.4160 | 0.4912 | 0.5672 | 0.5672 | -1.0 | 0.386 | 0.5185 | 0.1157 | 0.58 | 0.58 | -1.0 | 0.4 | 0.6172 | -1.0 | -1.0 | 0 |
| 0.3839 | 20.0 | 200 | 0.4665 | 0.5324 | 0.6311 | 0.6212 | -1.0 | 0.4719 | 0.5561 | 0.115 | 0.6114 | 0.6114 | -1.0 | 0.475 | 0.6397 | -1.0 | -1.0 | 0 |
| 0.3123 | 21.0 | 210 | 0.4144 | 0.4808 | 0.5519 | 0.5519 | -1.0 | 0.3279 | 0.5235 | 0.1164 | 0.5643 | 0.5643 | -1.0 | 0.3333 | 0.6121 | -1.0 | -1.0 | 0 |
| 0.2824 | 22.0 | 220 | 0.3918 | 0.5587 | 0.6403 | 0.6403 | -1.0 | 0.468 | 0.5874 | 0.1186 | 0.6557 | 0.6557 | -1.0 | 0.4792 | 0.6922 | -1.0 | -1.0 | 0 |
| 0.2545 | 23.0 | 230 | 0.3530 | 0.5577 | 0.6299 | 0.6299 | -1.0 | 0.448 | 0.5846 | 0.1179 | 0.645 | 0.6514 | -1.0 | 0.4542 | 0.6922 | -1.0 | -1.0 | 0 |
| 0.2716 | 24.0 | 240 | 0.3540 | 0.6501 | 0.7455 | 0.7369 | -1.0 | 0.6292 | 0.6653 | 0.1236 | 0.7486 | 0.77 | -1.0 | 0.6375 | 0.7974 | -1.0 | -1.0 | 0 |
| 0.2631 | 25.0 | 250 | 0.3608 | 0.5918 | 0.6733 | 0.6733 | -1.0 | 0.5879 | 0.6012 | 0.1193 | 0.6936 | 0.6936 | -1.0 | 0.6 | 0.7129 | -1.0 | -1.0 | 0 |
| 0.2628 | 26.0 | 260 | 0.3607 | 0.6089 | 0.6904 | 0.6904 | -1.0 | 0.6516 | 0.608 | 0.1193 | 0.6943 | 0.7114 | -1.0 | 0.6583 | 0.7224 | -1.0 | -1.0 | 0 |
| 0.2653 | 27.0 | 270 | 0.3692 | 0.6648 | 0.7623 | 0.7538 | -1.0 | 0.7795 | 0.6512 | 0.1171 | 0.75 | 0.7771 | -1.0 | 0.8042 | 0.7716 | -1.0 | -1.0 | 0 |
| 0.2272 | 28.0 | 280 | 0.3657 | 0.5998 | 0.6814 | 0.6814 | -1.0 | 0.614 | 0.602 | 0.12 | 0.695 | 0.7007 | -1.0 | 0.6292 | 0.7155 | -1.0 | -1.0 | 0 |
| 0.3795 | 29.0 | 290 | 0.3728 | 0.6409 | 0.7284 | 0.7277 | -1.0 | 0.6901 | 0.6407 | 0.1264 | 0.7364 | 0.7486 | -1.0 | 0.7042 | 0.7578 | -1.0 | -1.0 | 0 |
| 0.2568 | 30.0 | 300 | 0.3724 | 0.6933 | 0.7956 | 0.7854 | -1.0 | 0.7381 | 0.6926 | 0.1236 | 0.7821 | 0.8043 | -1.0 | 0.7542 | 0.8147 | -1.0 | -1.0 | 0 |
| 0.2632 | 31.0 | 310 | 0.3741 | 0.6626 | 0.7614 | 0.7522 | -1.0 | 0.7747 | 0.651 | 0.1243 | 0.7479 | 0.7671 | -1.0 | 0.7958 | 0.7612 | -1.0 | -1.0 | 0 |
| 0.3576 | 32.0 | 320 | 0.3649 | 0.6734 | 0.7746 | 0.7746 | -1.0 | 0.7503 | 0.6686 | 0.1236 | 0.7586 | 0.7757 | -1.0 | 0.7667 | 0.7776 | -1.0 | -1.0 | 0 |
| 0.2254 | 33.0 | 330 | 0.3683 | 0.6991 | 0.8085 | 0.8084 | -1.0 | 0.7949 | 0.691 | 0.1243 | 0.7929 | 0.8121 | -1.0 | 0.8125 | 0.8121 | -1.0 | -1.0 | 0 |
| 0.2495 | 34.0 | 340 | 0.3459 | 0.6975 | 0.811 | 0.8021 | -1.0 | 0.7652 | 0.6919 | 0.1257 | 0.7793 | 0.805 | -1.0 | 0.7833 | 0.8095 | -1.0 | -1.0 | 0 |
| 0.2051 | 35.0 | 350 | 0.3508 | 0.6903 | 0.7939 | 0.7845 | -1.0 | 0.797 | 0.6835 | 0.1243 | 0.7693 | 0.7943 | -1.0 | 0.8167 | 0.7897 | -1.0 | -1.0 | 0 |
| 0.2159 | 36.0 | 360 | 0.3510 | 0.693 | 0.7971 | 0.7971 | -1.0 | 0.7619 | 0.6898 | 0.1214 | 0.7807 | 0.7986 | -1.0 | 0.7792 | 0.8026 | -1.0 | -1.0 | 0 |
| 0.2234 | 37.0 | 370 | 0.3512 | 0.7033 | 0.8062 | 0.8062 | -1.0 | 0.7588 | 0.7014 | 0.1236 | 0.78 | 0.8036 | -1.0 | 0.7708 | 0.8103 | -1.0 | -1.0 | 0 |
| 0.2732 | 38.0 | 380 | 0.3603 | 0.6916 | 0.7917 | 0.7917 | -1.0 | 0.6964 | 0.7019 | 0.1236 | 0.7857 | 0.7993 | -1.0 | 0.7083 | 0.8181 | -1.0 | -1.0 | 0 |
| 0.2397 | 39.0 | 390 | 0.3633 | 0.7141 | 0.8125 | 0.804 | -1.0 | 0.7074 | 0.7255 | 0.1264 | 0.7971 | 0.8186 | -1.0 | 0.7167 | 0.8397 | -1.0 | -1.0 | 0 |
| 0.2534 | 40.0 | 400 | 0.3574 | 0.7115 | 0.8104 | 0.8104 | -1.0 | 0.705 | 0.722 | 0.1236 | 0.7979 | 0.8179 | -1.0 | 0.7125 | 0.8397 | -1.0 | -1.0 | 0 |
| 0.2168 | 41.0 | 410 | 0.3547 | 0.7106 | 0.8087 | 0.8087 | -1.0 | 0.7594 | 0.7141 | 0.1257 | 0.8007 | 0.8229 | -1.0 | 0.7708 | 0.8336 | -1.0 | -1.0 | 0 |
| 0.2237 | 42.0 | 420 | 0.3590 | 0.7055 | 0.8105 | 0.8105 | -1.0 | 0.759 | 0.7089 | 0.1243 | 0.7964 | 0.8186 | -1.0 | 0.7708 | 0.8284 | -1.0 | -1.0 | 0 |
| 0.2152 | 43.0 | 430 | 0.3582 | 0.7132 | 0.82 | 0.82 | -1.0 | 0.7865 | 0.7109 | 0.1243 | 0.7971 | 0.8243 | -1.0 | 0.8 | 0.8293 | -1.0 | -1.0 | 0 |
| 0.1932 | 44.0 | 440 | 0.3612 | 0.7056 | 0.8112 | 0.8112 | -1.0 | 0.7825 | 0.7023 | 0.1229 | 0.7857 | 0.815 | -1.0 | 0.8 | 0.8181 | -1.0 | -1.0 | 0 |
| 0.1897 | 45.0 | 450 | 0.3557 | 0.7077 | 0.8105 | 0.8105 | -1.0 | 0.755 | 0.7111 | 0.1229 | 0.7957 | 0.8186 | -1.0 | 0.7667 | 0.8293 | -1.0 | -1.0 | 0 |
| 0.213 | 46.0 | 460 | 0.3557 | 0.7136 | 0.8193 | 0.8193 | -1.0 | 0.786 | 0.7101 | 0.1236 | 0.7943 | 0.8229 | -1.0 | 0.8 | 0.8276 | -1.0 | -1.0 | 0 |
| 0.2169 | 47.0 | 470 | 0.3567 | 0.7142 | 0.8192 | 0.8192 | -1.0 | 0.7864 | 0.711 | 0.1243 | 0.7957 | 0.8243 | -1.0 | 0.8 | 0.8293 | -1.0 | -1.0 | 0 |
| 0.1971 | 48.0 | 480 | 0.3577 | 0.713 | 0.8181 | 0.8181 | -1.0 | 0.7864 | 0.7097 | 0.1236 | 0.7957 | 0.8243 | -1.0 | 0.8 | 0.8293 | -1.0 | -1.0 | 0 |
| 0.2515 | 49.0 | 490 | 0.3580 | 0.7134 | 0.8181 | 0.8181 | -1.0 | 0.7864 | 0.71 | 0.1236 | 0.7964 | 0.825 | -1.0 | 0.8 | 0.8302 | -1.0 | -1.0 | 0 |
| 0.2874 | 50.0 | 500 | 0.3578 | 0.7134 | 0.8181 | 0.8181 | -1.0 | 0.7864 | 0.7101 | 0.1236 | 0.7964 | 0.825 | -1.0 | 0.8 | 0.8302 | -1.0 | -1.0 | 0 |
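
The Map/Mar columns are COCO-style mean average precision and recall. Below is a sketch of computing such values with torchmetrics on toy boxes; this is one common implementation, not necessarily the evaluator used for this run:

```python
# Toy example of COCO-style mAP/mAR with torchmetrics (an assumption; the
# card does not say which evaluator produced the numbers above).
import torch
from torchmetrics.detection import MeanAveragePrecision

metric = MeanAveragePrecision(box_format="xyxy")
preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 50.0, 60.0]]),
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),
}]
targets = [{
    "boxes": torch.tensor([[12.0, 8.0, 48.0, 62.0]]),
    "labels": torch.tensor([0]),
}]
metric.update(preds, targets)
print(metric.compute())  # keys include map, map_50, map_75, mar_1, mar_10, mar_100
```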

Framework versions

  • Transformers 4.41.1
  • PyTorch 2.2.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1