Evaluation

#12
by tigeu100 - opened

Hello,

I have a few questions regarding the evaluation. First of all, is the F1 score being used the macro-averaged F1 score, as stated in the "Dataset" tab, or the weighted macro-averaged F1 score, as stated in the "New Submission" tab?
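(For context, the two averages can give noticeably different results on imbalanced data. The sketch below illustrates the difference with hypothetical toy labels; the helper functions and example values are my own, not from the competition code.)

```python
def per_class_f1(y_true, y_pred):
    """Return {class: (f1, support)} computed from paired label lists."""
    out = {}
    for c in sorted(set(y_true)):
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        out[c] = (f1, sum(1 for t in y_true if t == c))
    return out

def macro_f1(y_true, y_pred):
    # Unweighted mean of per-class F1: every class counts equally.
    scores = per_class_f1(y_true, y_pred)
    return sum(f for f, _ in scores.values()) / len(scores)

def weighted_f1(y_true, y_pred):
    # Mean of per-class F1 weighted by class support (number of true samples).
    scores = per_class_f1(y_true, y_pred)
    total = sum(s for _, s in scores.values())
    return sum(f * s for f, s in scores.values()) / total

# Toy, imbalanced example (hypothetical labels for illustration only)
y_true = [0, 0, 0, 1, 1, 2]
y_pred = [0, 0, 1, 1, 2, 2]
print(macro_f1(y_true, y_pred), weighted_f1(y_true, y_pred))
```

On this toy data the two averages disagree, which is why it matters which definition the leaderboard actually uses.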
Is the metric used for "public_track1" the Top-1 accuracy or the described custom metric? If it is Top-1 accuracy, is there a possibility to also get the score under the custom metric? Additionally, this metric differs from the one described here (https://www.imageclef.org/SnakeCLEF2023). Which one will be used for the final evaluation?
Last but not least, do you know when the submission limit will be changed? It currently just says "before the end of the competition".

Thank you for your time.

Best regards
Tim

tigeu100 changed discussion status to closed
