
detr_finetuned_oculardataset

This model is a fine-tuned version of microsoft/conditional-detr-resnet-50 on the dsi dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0672
  • Map: 0.3032
  • Map 50: 0.4973
  • Map 75: 0.3701
  • Map Small: 0.2981
  • Map Medium: 0.6746
  • Map Large: -1.0
  • Mar 1: 0.1
  • Mar 10: 0.3678
  • Mar 100: 0.4114
  • Mar Small: 0.4054
  • Mar Medium: 0.7421
  • Mar Large: -1.0
  • Map Falciparum Trophozoite: 0.0156
  • Mar 100 Falciparum Trophozoite: 0.1511
  • Map Wbc: 0.5908
  • Mar 100 Wbc: 0.6716
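A minimal sketch of running this checkpoint for inference with Transformers follows. The repository id is taken from this card; the image path and the 0.5 score threshold are illustrative assumptions, not values documented by the authors.

```python
# Inference sketch (assumption, not the authors' script).
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "marthakk/detr_finetuned_oculardataset"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("blood_smear.jpg").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, box) triples in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```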

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • num_epochs: 30
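These settings map one-to-one onto Transformers' `TrainingArguments`. A sketch of an equivalent configuration is shown below; the `output_dir` is an illustrative assumption, and the argument names are the standard `Trainer` ones rather than anything copied from the authors' script.

```python
# Equivalent TrainingArguments sketch (assumption, not the authors' script).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr_finetuned_oculardataset",  # hypothetical output path
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,     # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
)
```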

Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Falciparum Trophozoite | Mar 100 Falciparum Trophozoite | Map Wbc | Mar 100 Wbc |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| No log | 1.0 | 86 | 1.6645 | 0.131 | 0.2562 | 0.1153 | 0.1289 | 0.3974 | -1.0 | 0.0647 | 0.2312 | 0.3164 | 0.314 | 0.6159 | -1.0 | 0.0004 | 0.0456 | 0.2616 | 0.5873 |
| No log | 2.0 | 172 | 1.4800 | 0.2028 | 0.4079 | 0.1766 | 0.1993 | 0.4876 | -1.0 | 0.0677 | 0.2725 | 0.3282 | 0.3251 | 0.628 | -1.0 | 0.0007 | 0.0648 | 0.405 | 0.5915 |
| No log | 3.0 | 258 | 1.3829 | 0.2264 | 0.4496 | 0.1936 | 0.2193 | 0.5542 | -1.0 | 0.0729 | 0.2807 | 0.3215 | 0.3168 | 0.629 | -1.0 | 0.0019 | 0.0706 | 0.451 | 0.5725 |
| No log | 4.0 | 344 | 1.3318 | 0.2089 | 0.4403 | 0.1427 | 0.2056 | 0.4726 | -1.0 | 0.0691 | 0.2751 | 0.3221 | 0.3116 | 0.6748 | -1.0 | 0.002 | 0.0941 | 0.4158 | 0.5502 |
| No log | 5.0 | 430 | 1.2739 | 0.2454 | 0.4562 | 0.2342 | 0.2354 | 0.614 | -1.0 | 0.0777 | 0.3046 | 0.3482 | 0.338 | 0.7262 | -1.0 | 0.002 | 0.0906 | 0.4888 | 0.6058 |
| 1.7665 | 6.0 | 516 | 1.2365 | 0.2599 | 0.4744 | 0.2599 | 0.2522 | 0.6258 | -1.0 | 0.0846 | 0.3217 | 0.361 | 0.354 | 0.7 | -1.0 | 0.005 | 0.1047 | 0.5149 | 0.6173 |
| 1.7665 | 7.0 | 602 | 1.2548 | 0.2488 | 0.4689 | 0.2302 | 0.2434 | 0.5622 | -1.0 | 0.0788 | 0.31 | 0.3519 | 0.3446 | 0.6888 | -1.0 | 0.0038 | 0.1012 | 0.4938 | 0.6026 |
| 1.7665 | 8.0 | 688 | 1.2031 | 0.2715 | 0.474 | 0.3074 | 0.2664 | 0.6153 | -1.0 | 0.0897 | 0.3309 | 0.3744 | 0.3723 | 0.657 | -1.0 | 0.0058 | 0.1164 | 0.5373 | 0.6325 |
| 1.7665 | 9.0 | 774 | 1.2492 | 0.2417 | 0.4715 | 0.2154 | 0.2349 | 0.5753 | -1.0 | 0.0789 | 0.3064 | 0.3503 | 0.342 | 0.686 | -1.0 | 0.0043 | 0.1129 | 0.4791 | 0.5877 |
| 1.7665 | 10.0 | 860 | 1.1861 | 0.2752 | 0.4772 | 0.2891 | 0.2683 | 0.6259 | -1.0 | 0.0872 | 0.3342 | 0.3823 | 0.379 | 0.6813 | -1.0 | 0.0061 | 0.1217 | 0.5443 | 0.6429 |
| 1.7665 | 11.0 | 946 | 1.1996 | 0.2607 | 0.4605 | 0.2779 | 0.2565 | 0.5972 | -1.0 | 0.085 | 0.326 | 0.3722 | 0.3669 | 0.6813 | -1.0 | 0.0041 | 0.1254 | 0.5173 | 0.6189 |
| 1.2663 | 12.0 | 1032 | 1.1664 | 0.2764 | 0.4753 | 0.3137 | 0.2718 | 0.6148 | -1.0 | 0.0892 | 0.333 | 0.3781 | 0.3741 | 0.685 | -1.0 | 0.0054 | 0.1188 | 0.5473 | 0.6375 |
| 1.2663 | 13.0 | 1118 | 1.1451 | 0.2804 | 0.4694 | 0.3212 | 0.2732 | 0.6595 | -1.0 | 0.092 | 0.3412 | 0.3852 | 0.3787 | 0.7187 | -1.0 | 0.0051 | 0.1282 | 0.5557 | 0.6421 |
| 1.2663 | 14.0 | 1204 | 1.1251 | 0.2889 | 0.4761 | 0.3401 | 0.2835 | 0.6619 | -1.0 | 0.0926 | 0.3496 | 0.3979 | 0.393 | 0.714 | -1.0 | 0.0091 | 0.1391 | 0.5687 | 0.6567 |
| 1.2663 | 15.0 | 1290 | 1.1493 | 0.2778 | 0.4695 | 0.3126 | 0.2706 | 0.6531 | -1.0 | 0.0911 | 0.3415 | 0.3881 | 0.3792 | 0.743 | -1.0 | 0.0054 | 0.1382 | 0.5502 | 0.6379 |
| 1.2663 | 16.0 | 1376 | 1.1125 | 0.2846 | 0.4799 | 0.3307 | 0.2804 | 0.6415 | -1.0 | 0.0926 | 0.3498 | 0.4005 | 0.3954 | 0.7159 | -1.0 | 0.0075 | 0.1452 | 0.5617 | 0.6558 |
| 1.2663 | 17.0 | 1462 | 1.1002 | 0.2909 | 0.4816 | 0.3471 | 0.2859 | 0.6545 | -1.0 | 0.0956 | 0.3554 | 0.4036 | 0.3969 | 0.7421 | -1.0 | 0.0077 | 0.145 | 0.5741 | 0.6622 |
| 1.1448 | 18.0 | 1548 | 1.1066 | 0.2853 | 0.484 | 0.3205 | 0.2796 | 0.6647 | -1.0 | 0.0918 | 0.3472 | 0.3944 | 0.3883 | 0.7196 | -1.0 | 0.0092 | 0.1415 | 0.5613 | 0.6474 |
| 1.1448 | 19.0 | 1634 | 1.0993 | 0.2933 | 0.4838 | 0.3441 | 0.2884 | 0.6683 | -1.0 | 0.0978 | 0.3581 | 0.401 | 0.3958 | 0.7252 | -1.0 | 0.0079 | 0.1374 | 0.5787 | 0.6645 |
| 1.1448 | 20.0 | 1720 | 1.0850 | 0.298 | 0.4855 | 0.3594 | 0.2923 | 0.6669 | -1.0 | 0.0963 | 0.3606 | 0.4011 | 0.3952 | 0.7374 | -1.0 | 0.0093 | 0.1348 | 0.5867 | 0.6675 |
| 1.1448 | 21.0 | 1806 | 1.0814 | 0.3006 | 0.4908 | 0.3618 | 0.2951 | 0.6868 | -1.0 | 0.0994 | 0.3628 | 0.4056 | 0.4001 | 0.7355 | -1.0 | 0.0117 | 0.1413 | 0.5896 | 0.67 |
| 1.1448 | 22.0 | 1892 | 1.0836 | 0.2975 | 0.495 | 0.3541 | 0.2924 | 0.6712 | -1.0 | 0.0989 | 0.3628 | 0.4084 | 0.4036 | 0.7196 | -1.0 | 0.0135 | 0.1534 | 0.5816 | 0.6633 |
| 1.1448 | 23.0 | 1978 | 1.0813 | 0.2996 | 0.4965 | 0.3567 | 0.2941 | 0.6792 | -1.0 | 0.0979 | 0.3625 | 0.408 | 0.402 | 0.7364 | -1.0 | 0.015 | 0.1505 | 0.5842 | 0.6655 |
| 1.0601 | 24.0 | 2064 | 1.0707 | 0.3048 | 0.4952 | 0.3624 | 0.2987 | 0.6876 | -1.0 | 0.0981 | 0.3659 | 0.4118 | 0.4054 | 0.7486 | -1.0 | 0.0144 | 0.1501 | 0.5951 | 0.6735 |
| 1.0601 | 25.0 | 2150 | 1.0736 | 0.2982 | 0.4935 | 0.3584 | 0.2931 | 0.6732 | -1.0 | 0.0992 | 0.3638 | 0.41 | 0.4053 | 0.7224 | -1.0 | 0.0126 | 0.1521 | 0.5839 | 0.6678 |
| 1.0601 | 26.0 | 2236 | 1.0717 | 0.3034 | 0.4978 | 0.3622 | 0.2986 | 0.6788 | -1.0 | 0.0995 | 0.3659 | 0.411 | 0.405 | 0.7421 | -1.0 | 0.015 | 0.1501 | 0.5918 | 0.6719 |
| 1.0601 | 27.0 | 2322 | 1.0688 | 0.3025 | 0.4978 | 0.3622 | 0.2975 | 0.6747 | -1.0 | 0.1 | 0.3674 | 0.4108 | 0.4047 | 0.7421 | -1.0 | 0.0161 | 0.1524 | 0.5888 | 0.6693 |
| 1.0601 | 28.0 | 2408 | 1.0679 | 0.3031 | 0.4968 | 0.3638 | 0.2976 | 0.6805 | -1.0 | 0.0999 | 0.3679 | 0.4106 | 0.4046 | 0.7421 | -1.0 | 0.0156 | 0.1507 | 0.5905 | 0.6705 |
| 1.0601 | 29.0 | 2494 | 1.0669 | 0.3035 | 0.4976 | 0.3717 | 0.2985 | 0.6751 | -1.0 | 0.0999 | 0.368 | 0.4115 | 0.4055 | 0.743 | -1.0 | 0.0156 | 0.1509 | 0.5915 | 0.6721 |
| 1.0103 | 30.0 | 2580 | 1.0672 | 0.3032 | 0.4973 | 0.3701 | 0.2981 | 0.6746 | -1.0 | 0.1 | 0.3678 | 0.4114 | 0.4054 | 0.7421 | -1.0 | 0.0156 | 0.1511 | 0.5908 | 0.6716 |

Rows showing "No log" in the Training Loss column predate the Trainer's first logging step, so no running training loss was available at those evaluation points.
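The metric names above are COCO-style: Map is mean average precision, Mar is mean average recall at the given detection budget, and they match the keys emitted by torchmetrics' `MeanAveragePrecision` with `class_metrics=True` (which yields the per-class Falciparum Trophozoite and Wbc values). A sketch of computing the same metrics on dummy boxes, under the assumption that this is how the card's numbers were produced:

```python
# Metric sketch (assumption): Map/Mar columns as MeanAveragePrecision keys.
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

preds = [  # dummy prediction for one image
    {
        "boxes": torch.tensor([[10.0, 10.0, 50.0, 50.0]]),
        "scores": torch.tensor([0.9]),
        "labels": torch.tensor([1]),  # class indices are illustrative
    }
]
targets = [  # dummy ground truth for the same image
    {
        "boxes": torch.tensor([[12.0, 8.0, 48.0, 52.0]]),
        "labels": torch.tensor([1]),
    }
]

metric.update(preds, targets)
results = metric.compute()
# results["map"], results["map_50"], results["map_75"], results["mar_100"], ...
# Size buckets with no ground-truth boxes are reported as -1.0, which is why
# Map Large / Mar Large read -1.0 throughout the table above.
print({k: v for k, v in results.items() if k.startswith(("map", "mar"))})
```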

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1
