mAP Drop

#1
by mhyatt000 - opened

I tried to reproduce the results mentioned on this model card, but my numbers do not match the claimed mAP. I cannot figure out how to get the correct numbers; can you help me find my mistake?

  • Claimed mAP: 43.3
  • Received mAP: 41.2

Here are the details for my validation:

  • I instantiate the pre-trained model with transformers.pipeline() and use the COCO API to compute AP from the detected bounding boxes (a sketch of my evaluation loop follows this list).
  • Evaluation was performed on CPU (macOS).
  • The dataset was downloaded from cocodataset.org.
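For reference, here is a minimal sketch of what my evaluation does. The annotation path, image directory, model id, and the `threshold=0.0` argument are placeholders/assumptions for illustration, not values taken from the model card:

```python
from PIL import Image
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval
from transformers import pipeline

ANN_FILE = "annotations/instances_val2017.json"  # placeholder path
IMG_DIR = "val2017"                              # placeholder path
MODEL_ID = "facebook/detr-resnet-50"             # placeholder id, not necessarily this card's model

coco_gt = COCO(ANN_FILE)
detector = pipeline("object-detection", model=MODEL_ID)

# Map the pipeline's label strings back to COCO category ids.
name_to_id = {c["name"]: c["id"] for c in coco_gt.loadCats(coco_gt.getCatIds())}

results = []
for img_id in coco_gt.getImgIds():
    info = coco_gt.loadImgs(img_id)[0]
    image = Image.open(f"{IMG_DIR}/{info['file_name']}").convert("RGB")
    # Keep low-confidence detections too; the pipeline's score threshold
    # otherwise filters out boxes that would still count toward AP.
    for det in detector(image, threshold=0.0):
        box = det["box"]  # pixel coordinates: xmin, ymin, xmax, ymax
        results.append({
            "image_id": img_id,
            "category_id": name_to_id[det["label"]],
            # COCO expects [x, y, width, height]
            "bbox": [box["xmin"], box["ymin"],
                     box["xmax"] - box["xmin"], box["ymax"] - box["ymin"]],
            "score": det["score"],
        })

coco_dt = coco_gt.loadRes(results)
evaluator = COCOeval(coco_gt, coco_dt, iouType="bbox")
evaluator.evaluate()
evaluator.accumulate()
evaluator.summarize()
```

I pass threshold=0.0 because, as far as I understand, COCO AP integrates over all confidence levels, so dropping low-confidence boxes before evaluation can lower the number.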

 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.412
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.601
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.436
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.208
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.451
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.588
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.328
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.494
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.508
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.268
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.554
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.702

[Attached image: WhatsApp Image 2023-05-24 at 10.02.29.jpeg — screenshot of the evaluation output above, the result of my experiment]
