panels_detection_rtdetr_augmented

This model is a fine-tuned version of PekingU/rtdetr_r50vd_coco_o365 on an unspecified dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the metric list):

  • Loss: 13.1458
  • Map: 0.2716
  • Map 50: 0.3676
  • Map 75: 0.2999
  • Map Small: -1.0
  • Map Medium: 0.339
  • Map Large: 0.2963
  • Mar 1: 0.4229
  • Mar 10: 0.5653
  • Mar 100: 0.5922
  • Mar Small: -1.0
  • Mar Medium: 0.4535
  • Mar Large: 0.6263
  • Map Radar (small): 0.1167
  • Mar 100 Radar (small): 0.6875
  • Map Ship management system (small): 0.615
  • Mar 100 Ship management system (small): 0.8077
  • Map Radar (large): 0.2059
  • Mar 100 Radar (large): 0.5217
  • Map Ship management system (large): 0.0695
  • Mar 100 Ship management system (large): 0.3702
  • Map Ship management system (top): 0.5528
  • Mar 100 Ship management system (top): 0.8087
  • Map Ecdis (large): 0.5496
  • Mar 100 Ecdis (large): 0.9193
  • Map Visual observation (small): 0.0041
  • Mar 100 Visual observation (small): 0.1021
  • Map Ecdis (small): 0.2262
  • Mar 100 Ecdis (small): 0.9154
  • Map Ship management system (table top): 0.3138
  • Mar 100 Ship management system (table top): 0.5657
  • Map Thruster control: 0.6685
  • Mar 100 Thruster control: 0.8256
  • Map Visual observation (left): 0.141
  • Mar 100 Visual observation (left): 0.7186
  • Map Visual observation (mid): 0.1513
  • Mar 100 Visual observation (mid): 0.5087
  • Map Visual observation (right): 0.0322
  • Mar 100 Visual observation (right): 0.2434
  • Map Bow thruster: 0.2794
  • Mar 100 Bow thruster: 0.5724
  • Map Me telegraph: 0.1478
  • Mar 100 Me telegraph: 0.3154
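
As a quick way to try the checkpoint, the snippet below runs single-image inference with the standard Transformers object-detection API. This is a minimal sketch rather than the original project code: the repository id cems-official/panels_detection_rtdetr_augmented is taken from this card, while the local image path and the 0.5 score threshold are placeholder assumptions.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "cems-official/panels_detection_rtdetr_augmented"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

# Placeholder image path; use your own bridge-panel photo.
image = Image.open("bridge_panel.jpg").convert("RGB")

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits and normalized boxes into thresholded detections
# in (x_min, y_min, x_max, y_max) pixel coordinates of the original image.
results = processor.post_process_object_detection(
    outputs,
    target_sizes=torch.tensor([image.size[::-1]]),  # (height, width)
    threshold=0.5,
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```

post_process_object_detection is what maps the model's raw outputs back to pixel coordinates, which is why the original image size is passed as target_sizes.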

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged configuration sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
  • lr_scheduler_type: cosine
  • num_epochs: 10
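
The sketch below shows one way the hyperparameters above map onto a Transformers TrainingArguments configuration. It is a hedged reconstruction, not the original training script: the output directory name, num_labels=15 (one per panel class reported in the evaluation metrics), and the omitted dataset/collator wiring are assumptions.

```python
from transformers import AutoModelForObjectDetection, TrainingArguments

# Base checkpoint named in this card.
checkpoint = "PekingU/rtdetr_r50vd_coco_o365"

# Assumption: 15 labels, one per panel class in the per-class metrics above.
model = AutoModelForObjectDetection.from_pretrained(
    checkpoint,
    num_labels=15,
    ignore_mismatched_sizes=True,  # reinitialize the classification head for the new label set
)

# Mirrors the hyperparameters listed above; output_dir is a placeholder, and the
# Trainer wiring (dataset, image processor, detection collate_fn) is not shown here.
args = TrainingArguments(
    output_dir="panels_detection_rtdetr_augmented",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=10,
    remove_unused_columns=False,  # keep the nested detection annotations intact
)
```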

Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Radar (small) | Mar 100 Radar (small) | Map Ship management system (small) | Mar 100 Ship management system (small) | Map Radar (large) | Mar 100 Radar (large) | Map Ship management system (large) | Mar 100 Ship management system (large) | Map Ship management system (top) | Mar 100 Ship management system (top) | Map Ecdis (large) | Mar 100 Ecdis (large) | Map Visual observation (small) | Mar 100 Visual observation (small) | Map Ecdis (small) | Mar 100 Ecdis (small) | Map Ship management system (table top) | Mar 100 Ship management system (table top) | Map Thruster control | Mar 100 Thruster control | Map Visual observation (left) | Mar 100 Visual observation (left) | Map Visual observation (mid) | Mar 100 Visual observation (mid) | Map Visual observation (right) | Mar 100 Visual observation (right) | Map Bow thruster | Mar 100 Bow thruster | Map Me telegraph | Mar 100 Me telegraph |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 8.4513 | 1.0 | 397 | 10.3922 | 0.4094 | 0.514 | 0.4547 | -1.0 | 0.295 | 0.416 | 0.5205 | 0.671 | 0.6975 | -1.0 | 0.5291 | 0.7247 | 0.8486 | 0.9393 | 0.7263 | 0.8862 | 0.7814 | 0.914 | 0.7258 | 0.9289 | 0.7534 | 0.8481 | 0.5119 | 0.9035 | 0.0874 | 0.4833 | 0.0535 | 0.8423 | 0.1808 | 0.3943 | 0.303 | 0.7333 | 0.0478 | 0.7629 | 0.7613 | 0.8965 | 0.0059 | 0.2189 | 0.2985 | 0.4655 | 0.0559 | 0.2462 |
| 8.1253 | 2.0 | 794 | 10.2868 | 0.493 | 0.6032 | 0.5285 | -1.0 | 0.2459 | 0.5437 | 0.6029 | 0.7667 | 0.7839 | -1.0 | 0.5496 | 0.8401 | 0.8156 | 0.9446 | 0.7535 | 0.9169 | 0.6787 | 0.9171 | 0.7445 | 0.9653 | 0.6818 | 0.8558 | 0.7943 | 0.9561 | 0.0721 | 0.6708 | 0.4244 | 0.9192 | 0.413 | 0.5286 | 0.2675 | 0.5487 | 0.2213 | 0.93 | 0.7938 | 0.9617 | 0.1889 | 0.7434 | 0.3784 | 0.5379 | 0.1678 | 0.3615 |
| 8.0068 | 3.0 | 1191 | 11.7780 | 0.3272 | 0.4283 | 0.3611 | -1.0 | 0.2647 | 0.3723 | 0.4608 | 0.5782 | 0.5984 | -1.0 | 0.4623 | 0.6375 | 0.1775 | 0.5857 | 0.4329 | 0.6215 | 0.4254 | 0.7295 | 0.3831 | 0.6066 | 0.7085 | 0.8644 | 0.6623 | 0.9061 | 0.0002 | 0.0167 | 0.3838 | 0.8 | 0.3677 | 0.7657 | 0.6698 | 0.7949 | 0.1571 | 0.7557 | 0.1336 | 0.5904 | 0.0578 | 0.2528 | 0.2839 | 0.4862 | 0.064 | 0.2 |
| 7.6432 | 4.0 | 1588 | 12.1826 | 0.2727 | 0.3822 | 0.3009 | -1.0 | 0.2386 | 0.2872 | 0.4224 | 0.5885 | 0.6213 | -1.0 | 0.3722 | 0.6607 | 0.1547 | 0.7696 | 0.5703 | 0.8585 | 0.2211 | 0.6302 | 0.068 | 0.5826 | 0.6042 | 0.85 | 0.4937 | 0.9105 | 0.0039 | 0.1208 | 0.1132 | 0.9 | 0.5012 | 0.6629 | 0.5743 | 0.6385 | 0.1246 | 0.8343 | 0.0925 | 0.6009 | 0.0052 | 0.1321 | 0.3535 | 0.4862 | 0.21 | 0.3423 |
| 7.2118 | 5.0 | 1985 | 10.7370 | 0.422 | 0.523 | 0.4602 | -1.0 | 0.4067 | 0.4716 | 0.5546 | 0.7167 | 0.7475 | -1.0 | 0.5951 | 0.8004 | 0.2887 | 0.8054 | 0.6827 | 0.9092 | 0.4278 | 0.8651 | 0.6697 | 0.9248 | 0.729 | 0.8788 | 0.6685 | 0.9649 | 0.0397 | 0.3521 | 0.6509 | 0.9731 | 0.6702 | 0.7971 | 0.5996 | 0.8103 | 0.0968 | 0.8471 | 0.4106 | 0.7904 | 0.0739 | 0.4698 | 0.2219 | 0.5207 | 0.0993 | 0.3038 |
| 6.8503 | 6.0 | 2382 | 12.6236 | 0.3114 | 0.4103 | 0.3386 | -1.0 | 0.3392 | 0.3656 | 0.4614 | 0.6187 | 0.6432 | -1.0 | 0.5065 | 0.6996 | 0.2113 | 0.7643 | 0.6179 | 0.8831 | 0.1871 | 0.676 | 0.3889 | 0.8248 | 0.5547 | 0.7596 | 0.5439 | 0.9439 | 0.0034 | 0.0812 | 0.3491 | 0.9654 | 0.4448 | 0.6686 | 0.6626 | 0.7692 | 0.1653 | 0.7714 | 0.145 | 0.5113 | 0.0273 | 0.2792 | 0.1843 | 0.4621 | 0.1852 | 0.2885 |
| 6.5273 | 7.0 | 2779 | 12.6545 | 0.3121 | 0.4213 | 0.3466 | -1.0 | 0.3182 | 0.3481 | 0.4635 | 0.626 | 0.649 | -1.0 | 0.5283 | 0.6854 | 0.176 | 0.8125 | 0.6625 | 0.8738 | 0.2449 | 0.6605 | 0.1453 | 0.562 | 0.5223 | 0.8067 | 0.6556 | 0.9509 | 0.0134 | 0.1937 | 0.3682 | 0.9462 | 0.4038 | 0.6657 | 0.7023 | 0.8333 | 0.1228 | 0.6743 | 0.1926 | 0.6252 | 0.0452 | 0.2925 | 0.2814 | 0.5759 | 0.1451 | 0.2615 |
| 6.2721 | 8.0 | 3176 | 13.0793 | 0.2565 | 0.3545 | 0.2826 | -1.0 | 0.284 | 0.286 | 0.4092 | 0.552 | 0.5818 | -1.0 | 0.4221 | 0.6166 | 0.1159 | 0.6643 | 0.5472 | 0.7862 | 0.2094 | 0.5047 | 0.0775 | 0.395 | 0.5442 | 0.8135 | 0.5517 | 0.9009 | 0.0029 | 0.0708 | 0.2641 | 0.9269 | 0.2066 | 0.4657 | 0.6357 | 0.8077 | 0.0902 | 0.7071 | 0.1688 | 0.5617 | 0.0261 | 0.2208 | 0.2688 | 0.5483 | 0.1376 | 0.3538 |
| 6.1623 | 9.0 | 3573 | 13.0892 | 0.2651 | 0.3555 | 0.3017 | -1.0 | 0.3047 | 0.2866 | 0.4149 | 0.5729 | 0.5997 | -1.0 | 0.4693 | 0.6365 | 0.1156 | 0.7214 | 0.5878 | 0.8062 | 0.1642 | 0.5109 | 0.055 | 0.3678 | 0.53 | 0.8269 | 0.5833 | 0.9175 | 0.0049 | 0.1083 | 0.1511 | 0.9269 | 0.34 | 0.5771 | 0.7039 | 0.8744 | 0.11 | 0.7414 | 0.1774 | 0.5478 | 0.0305 | 0.2358 | 0.2654 | 0.5517 | 0.157 | 0.2808 |
| 6.1458 | 10.0 | 3970 | 13.1458 | 0.2716 | 0.3676 | 0.2999 | -1.0 | 0.339 | 0.2963 | 0.4229 | 0.5653 | 0.5922 | -1.0 | 0.4535 | 0.6263 | 0.1167 | 0.6875 | 0.615 | 0.8077 | 0.2059 | 0.5217 | 0.0695 | 0.3702 | 0.5528 | 0.8087 | 0.5496 | 0.9193 | 0.0041 | 0.1021 | 0.2262 | 0.9154 | 0.3138 | 0.5657 | 0.6685 | 0.8256 | 0.141 | 0.7186 | 0.1513 | 0.5087 | 0.0322 | 0.2434 | 0.2794 | 0.5724 | 0.1478 | 0.3154 |
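
The per-class Map / Mar 100 columns follow COCO-style detection metrics; the -1.0 values in the Map Small / Mar Small columns are the conventional placeholder for a size bucket with no matching ground-truth boxes. The evaluation code behind these numbers is not included in this card; as an illustration only, the sketch below shows one common way to compute such metrics with torchmetrics' MeanAveragePrecision on toy data (the boxes, scores, and labels here are made up).

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# Toy prediction/target for a single image; a real evaluation would iterate
# over the validation set and call update() once per batch.
preds = [
    {
        "boxes": torch.tensor([[50.0, 40.0, 200.0, 180.0]]),
        "scores": torch.tensor([0.87]),
        "labels": torch.tensor([0]),
    }
]
targets = [
    {
        "boxes": torch.tensor([[55.0, 42.0, 195.0, 175.0]]),
        "labels": torch.tensor([0]),
    }
]

# class_metrics=True adds per-class map / mar_100 values,
# analogous to the per-panel columns reported above.
metric = MeanAveragePrecision(box_format="xyxy", iou_type="bbox", class_metrics=True)
metric.update(preds, targets)
results = metric.compute()

print(results["map"], results["map_50"], results["map_75"])
print(results["classes"], results["map_per_class"], results["mar_100_per_class"])
```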

Framework versions

  • Transformers 4.46.0
  • Pytorch 2.5.0+cu121
  • Datasets 3.0.2
  • Tokenizers 0.20.1