conditional-detr-resnet-50-dsi-v1

This model is a fine-tuned version of microsoft/conditional-detr-resnet-50 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4557
  • Map: 0.464
  • Map 50: 0.8455
  • Map 75: 0.4707
  • Map Small: 0.2749
  • Map Medium: 0.5121
  • Map Large: 0.0
  • Mar 1: 0.107
  • Mar 10: 0.442
  • Mar 100: 0.5911
  • Mar Small: 0.4334
  • Mar Medium: 0.6371
  • Mar Large: 0.0
  • Map Falciparum Trophozoite: 0.33
  • Mar 100 Falciparum Trophozoite: 0.5045
  • Map Wbc: 0.598
  • Mar 100 Wbc: 0.6777
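
The checkpoint can be loaded with the standard transformers object-detection classes. The following is a minimal, untested inference sketch; it assumes the repository id Alvin-Nahabwe/conditional-detr-resnet-50-dsi-v1 for this card, and "sample.jpg" is only a placeholder for your own blood-smear image:

```python
# Minimal inference sketch (assumption: "sample.jpg" is a placeholder image path).
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo_id = "Alvin-Nahabwe/conditional-detr-resnet-50-dsi-v1"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForObjectDetection.from_pretrained(repo_id)

image = Image.open("sample.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes to (score, label, box) triples in image coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```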

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 20
  • eval_batch_size: 20
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 40
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
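
The original training script is not included in this card. The sketch below only shows how the hyperparameters listed above would map onto transformers.TrainingArguments; the output_dir and any dataset or collator wiring are placeholders, not taken from this card:

```python
# Hypothetical mapping of the listed hyperparameters onto TrainingArguments.
# output_dir is a placeholder; the actual training setup may differ.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="conditional-detr-resnet-50-dsi-v1",
    learning_rate=5e-5,
    per_device_train_batch_size=20,
    per_device_eval_batch_size=20,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 40
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```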

Training results

Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Falciparum Trophozoite | Mar 100 Falciparum Trophozoite | Map Wbc | Mar 100 Wbc
4.7221 1.0 92 1.7449 0.0002 0.001 0.0 0.0001 0.0003 0.0 0.0003 0.0067 0.0595 0.0175 0.0753 0.0 0.0 0.0039 0.0004 0.1151
0.9978 2.0 184 0.7737 0.0425 0.1154 0.02 0.0004 0.066 0.0 0.0096 0.0532 0.1973 0.0122 0.2436 0.0 0.0843 0.3754 0.0007 0.0192
0.7366 3.0 276 0.6922 0.0933 0.2277 0.0592 0.0045 0.1223 0.0 0.0367 0.1627 0.3295 0.0883 0.3992 0.0 0.1307 0.4129 0.056 0.246
0.675 4.0 368 0.6484 0.1838 0.4025 0.1488 0.0209 0.2377 0.0 0.0588 0.2531 0.4501 0.159 0.5383 0.0 0.1696 0.435 0.1981 0.4651
0.6388 5.0 460 0.6139 0.2377 0.494 0.2059 0.0667 0.2942 0.0 0.0761 0.3152 0.5005 0.2146 0.5866 0.0 0.2014 0.4489 0.2739 0.5522
0.6162 6.0 552 0.6061 0.3054 0.6205 0.2707 0.1176 0.3678 0.0 0.0809 0.3384 0.4981 0.2407 0.5756 0.0 0.2209 0.4398 0.3899 0.5565
0.5872 7.0 644 0.5732 0.3392 0.6905 0.3015 0.1437 0.396 0.0 0.0827 0.3605 0.5122 0.2921 0.5768 0.0 0.2449 0.4433 0.4336 0.5811
0.5694 8.0 736 0.5545 0.3571 0.7159 0.3163 0.1592 0.4164 0.0 0.0856 0.3778 0.5211 0.2874 0.5901 0.0 0.2518 0.4433 0.4625 0.5988
0.5566 9.0 828 0.5635 0.3532 0.7262 0.3138 0.1461 0.414 0.0 0.086 0.3607 0.4992 0.2613 0.5701 0.0 0.2531 0.4319 0.4532 0.5664
0.5462 10.0 920 0.5554 0.3748 0.7365 0.3441 0.1652 0.4372 0.0 0.0899 0.3838 0.5199 0.2816 0.5907 0.0 0.2596 0.4302 0.4901 0.6096
0.5402 11.0 1012 0.5313 0.3817 0.7606 0.3513 0.1725 0.4418 0.0 0.0925 0.386 0.5272 0.3002 0.5948 0.0 0.2723 0.4495 0.4912 0.605
0.5288 12.0 1104 0.5280 0.3855 0.7707 0.3457 0.1872 0.4383 0.0 0.0925 0.3898 0.5298 0.3371 0.5855 0.0 0.2687 0.4404 0.5023 0.6191
0.5243 13.0 1196 0.5206 0.3996 0.7756 0.3808 0.1875 0.4581 0.0 0.0952 0.4011 0.5401 0.3348 0.6008 0.0 0.28 0.4501 0.5192 0.6301
0.5235 14.0 1288 0.5228 0.3932 0.7769 0.3634 0.1787 0.4557 0.0 0.0956 0.3939 0.5325 0.3017 0.601 0.0 0.281 0.4491 0.5055 0.616
0.5173 15.0 1380 0.5254 0.3904 0.7797 0.3519 0.1856 0.4468 0.0 0.0916 0.3907 0.5314 0.3251 0.5917 0.0 0.2818 0.4452 0.4989 0.6177
0.5139 16.0 1472 0.5161 0.4111 0.7943 0.3818 0.1983 0.4686 0.0 0.0958 0.4023 0.5438 0.347 0.6021 0.0 0.2938 0.4551 0.5283 0.6325
0.506 17.0 1564 0.5176 0.395 0.7903 0.3601 0.1826 0.4533 0.0 0.0919 0.3932 0.5298 0.3205 0.5914 0.0 0.2779 0.444 0.5122 0.6156
0.5034 18.0 1656 0.5138 0.3825 0.7854 0.3312 0.1845 0.4381 0.0 0.088 0.3914 0.5284 0.3272 0.5885 0.0 0.2863 0.4509 0.4786 0.6058
0.5008 19.0 1748 0.5093 0.4148 0.7981 0.392 0.2019 0.4725 0.0 0.0974 0.4092 0.5484 0.3377 0.6107 0.0 0.2918 0.4554 0.5378 0.6415
0.4989 20.0 1840 0.5151 0.4172 0.8046 0.3923 0.2022 0.475 0.0 0.0963 0.4082 0.5464 0.3361 0.6076 0.0 0.2931 0.4545 0.5413 0.6383
0.5017 21.0 1932 0.5083 0.4049 0.7912 0.3744 0.1888 0.4647 0.0 0.0964 0.3993 0.5389 0.3225 0.6026 0.0 0.2873 0.4575 0.5224 0.6203
0.4945 22.0 2024 0.5225 0.408 0.8057 0.3732 0.1938 0.4662 0.0 0.093 0.3986 0.5361 0.3277 0.5974 0.0 0.2922 0.4491 0.5237 0.6231
0.4925 23.0 2116 0.5122 0.4203 0.8092 0.402 0.21 0.4768 0.0 0.1006 0.4106 0.5488 0.3525 0.6064 0.0 0.2917 0.4549 0.5489 0.6427
0.49 24.0 2208 0.4960 0.4204 0.8056 0.4077 0.2244 0.4713 0.0 0.0976 0.4124 0.5532 0.378 0.604 0.0 0.2994 0.4658 0.5413 0.6406
0.4876 25.0 2300 0.5023 0.4254 0.8086 0.4104 0.218 0.4794 0.0 0.0993 0.4129 0.5507 0.3687 0.604 0.0 0.2969 0.4602 0.554 0.6412
0.4879 26.0 2392 0.5008 0.4256 0.8071 0.4155 0.2167 0.4791 0.0 0.0991 0.4186 0.5612 0.3794 0.6146 0.0 0.2913 0.4658 0.5598 0.6566
0.4874 27.0 2484 0.5059 0.4122 0.8017 0.3874 0.2054 0.4665 0.0 0.097 0.4035 0.5407 0.3503 0.5965 0.0 0.2832 0.4487 0.5412 0.6327
0.4871 28.0 2576 0.4942 0.4217 0.8088 0.4083 0.225 0.4719 0.0 0.0993 0.4124 0.5534 0.3832 0.6031 0.0 0.2992 0.4666 0.5441 0.6402
0.4816 29.0 2668 0.5000 0.402 0.8012 0.3715 0.2032 0.4541 0.0 0.0942 0.3996 0.5358 0.3517 0.5897 0.0 0.2861 0.4511 0.518 0.6206
0.4822 30.0 2760 0.4866 0.4339 0.8138 0.4283 0.2312 0.4857 0.0 0.1023 0.4216 0.5668 0.3931 0.6173 0.0 0.3024 0.4774 0.5653 0.6561
0.4795 31.0 2852 0.4961 0.4278 0.8102 0.4104 0.2215 0.4834 0.0 0.0998 0.4183 0.5591 0.3702 0.6143 0.0 0.3028 0.4665 0.5528 0.6516
0.4821 32.0 2944 0.4932 0.4281 0.8132 0.417 0.2247 0.4814 0.0 0.1003 0.4171 0.5631 0.3847 0.6152 0.0 0.2988 0.4747 0.5574 0.6515
0.4782 33.0 3036 0.4968 0.4297 0.8166 0.4233 0.2252 0.4828 0.0 0.1007 0.4194 0.5653 0.3742 0.6211 0.0 0.3031 0.4792 0.5562 0.6515
0.4772 34.0 3128 0.4929 0.4273 0.8143 0.4142 0.2273 0.479 0.0 0.1001 0.4196 0.561 0.3861 0.612 0.0 0.2989 0.4685 0.5557 0.6535
0.4758 35.0 3220 0.4857 0.4245 0.8112 0.4139 0.2263 0.4756 0.0 0.0981 0.4164 0.5587 0.3831 0.6097 0.0 0.3034 0.4787 0.5457 0.6387
0.4762 36.0 3312 0.4878 0.4345 0.8166 0.4238 0.2319 0.4856 0.0 0.1006 0.4201 0.5657 0.3906 0.6167 0.0 0.3072 0.4813 0.5617 0.6502
0.4748 37.0 3404 0.4911 0.4278 0.8174 0.4038 0.2217 0.4808 0.0 0.0993 0.4133 0.5584 0.3703 0.6128 0.0 0.3087 0.4808 0.547 0.636
0.4746 38.0 3496 0.4900 0.4356 0.8275 0.4197 0.2275 0.4893 0.0 0.1002 0.418 0.5633 0.3826 0.6165 0.0 0.3101 0.4774 0.5611 0.6492
0.4712 39.0 3588 0.4903 0.4377 0.8223 0.4319 0.2414 0.4894 0.0 0.1025 0.4227 0.5673 0.3893 0.6193 0.0 0.3079 0.4776 0.5676 0.6571
0.4703 40.0 3680 0.4841 0.433 0.8224 0.4165 0.2357 0.4844 0.0 0.1011 0.4205 0.5656 0.3897 0.617 0.0 0.306 0.4754 0.5601 0.6557
0.4708 41.0 3772 0.4832 0.4335 0.8194 0.4177 0.2335 0.4839 0.0 0.1009 0.4178 0.5608 0.389 0.6102 0.0 0.3078 0.4793 0.5592 0.6424
0.4688 42.0 3864 0.4914 0.4404 0.8234 0.4312 0.2395 0.4919 0.0 0.1023 0.4253 0.57 0.3928 0.6218 0.0 0.3109 0.4805 0.5699 0.6594
0.4684 43.0 3956 0.4840 0.4328 0.8169 0.4191 0.2259 0.4876 0.0 0.1 0.4184 0.5617 0.3687 0.6188 0.0 0.3104 0.4796 0.5552 0.6437
0.4684 44.0 4048 0.4822 0.4439 0.8286 0.4327 0.2389 0.4975 0.0 0.1027 0.4252 0.5763 0.3942 0.6296 0.0 0.3144 0.4907 0.5735 0.6619
0.4685 45.0 4140 0.4875 0.4284 0.8204 0.4103 0.2268 0.4816 0.0 0.0995 0.4142 0.5562 0.3747 0.6092 0.0 0.3014 0.4751 0.5554 0.6373
0.4632 46.0 4232 0.4841 0.4279 0.8218 0.4144 0.2378 0.4771 0.0 0.0989 0.4155 0.5583 0.3818 0.6089 0.0 0.3043 0.4772 0.5515 0.6394
0.4652 47.0 4324 0.4863 0.4343 0.8225 0.4226 0.2398 0.4854 0.0 0.1008 0.4223 0.5655 0.3897 0.617 0.0 0.3058 0.476 0.5628 0.6551
0.4621 48.0 4416 0.4788 0.437 0.828 0.4177 0.2361 0.4884 0.0 0.102 0.4195 0.5663 0.3919 0.6176 0.0 0.3137 0.4858 0.5603 0.6469
0.4614 49.0 4508 0.4823 0.4426 0.8262 0.4386 0.2459 0.4925 0.0 0.1041 0.426 0.5746 0.4085 0.6224 0.0 0.3129 0.4891 0.5724 0.66
0.4591 50.0 4600 0.4780 0.4364 0.8246 0.4234 0.2398 0.4853 0.0 0.1028 0.4224 0.5649 0.4007 0.612 0.0 0.3055 0.4773 0.5673 0.6525
0.4616 51.0 4692 0.4813 0.4296 0.8242 0.4212 0.242 0.4757 0.0 0.1001 0.4143 0.5638 0.4068 0.6085 0.0 0.3133 0.4896 0.5459 0.638
0.4589 52.0 4784 0.4801 0.4431 0.8323 0.4359 0.2389 0.4978 0.0 0.1029 0.4242 0.5696 0.3743 0.6264 0.0 0.3105 0.4807 0.5756 0.6585
0.4583 53.0 4876 0.4794 0.4433 0.8271 0.432 0.2489 0.4934 0.0 0.1039 0.4268 0.5734 0.4038 0.6228 0.0 0.3133 0.4897 0.5734 0.6571
0.4587 54.0 4968 0.4830 0.4382 0.8273 0.4267 0.2432 0.4893 0.0 0.1016 0.4213 0.5697 0.3986 0.6198 0.0 0.3094 0.4862 0.567 0.6531
0.4557 55.0 5060 0.4717 0.4442 0.8298 0.4451 0.2502 0.4941 0.0 0.104 0.4281 0.572 0.4011 0.6223 0.0 0.3123 0.4836 0.5762 0.6605
0.4575 56.0 5152 0.4693 0.4499 0.837 0.4462 0.247 0.5025 0.0 0.104 0.431 0.5782 0.3988 0.6311 0.0 0.3185 0.4916 0.5813 0.6647
0.4568 57.0 5244 0.4795 0.4408 0.8316 0.4312 0.2423 0.4921 0.0 0.1027 0.4244 0.5721 0.3939 0.6245 0.0 0.3149 0.4926 0.5667 0.6516
0.4539 58.0 5336 0.4743 0.4513 0.8341 0.4535 0.2551 0.5011 0.0 0.1045 0.4319 0.5827 0.4107 0.6329 0.0 0.3172 0.4971 0.5853 0.6682
0.4517 59.0 5428 0.4707 0.4495 0.8346 0.4503 0.2513 0.5006 0.0 0.1049 0.4318 0.5799 0.4032 0.6313 0.0 0.3175 0.4925 0.5815 0.6673
0.4513 60.0 5520 0.4709 0.4507 0.8325 0.4551 0.2538 0.5012 0.0 0.1055 0.4341 0.581 0.4089 0.6309 0.0 0.315 0.4896 0.5864 0.6723
0.4519 61.0 5612 0.4681 0.4521 0.8357 0.449 0.2545 0.502 0.0 0.1052 0.4348 0.5838 0.4133 0.6338 0.0 0.3217 0.4964 0.5824 0.6712
0.4508 62.0 5704 0.4671 0.4535 0.8363 0.4519 0.2577 0.5033 0.0 0.1036 0.4356 0.5833 0.4153 0.6324 0.0 0.3215 0.4959 0.5855 0.6706
0.4461 63.0 5796 0.4665 0.4514 0.835 0.4462 0.2592 0.5014 0.0 0.1043 0.4331 0.5837 0.4153 0.633 0.0 0.3194 0.4995 0.5834 0.6679
0.4487 64.0 5888 0.4673 0.4557 0.8363 0.4549 0.2573 0.5069 0.0 0.1059 0.4354 0.5837 0.4092 0.635 0.0 0.3224 0.4976 0.589 0.6698
0.4494 65.0 5980 0.4644 0.4535 0.8374 0.4548 0.2607 0.5033 0.0 0.1061 0.4363 0.5858 0.4188 0.634 0.0 0.3206 0.496 0.5865 0.6756
0.4468 66.0 6072 0.4663 0.4534 0.8362 0.4472 0.2517 0.505 0.0 0.1061 0.4331 0.5829 0.4129 0.6329 0.0 0.3228 0.4989 0.584 0.6669
0.4447 67.0 6164 0.4651 0.4518 0.8374 0.4506 0.2593 0.4993 0.0 0.1061 0.4344 0.5832 0.4255 0.6285 0.0 0.3225 0.5006 0.5812 0.6657
0.4469 68.0 6256 0.4642 0.4538 0.8371 0.457 0.2595 0.503 0.0 0.1069 0.4363 0.5842 0.4162 0.6329 0.0 0.3199 0.4969 0.5876 0.6715
0.4444 69.0 6348 0.4660 0.454 0.8382 0.4567 0.2633 0.5035 0.0 0.105 0.4338 0.5846 0.4172 0.6333 0.0 0.3224 0.4983 0.5856 0.671
0.4426 70.0 6440 0.4661 0.4578 0.839 0.4569 0.2586 0.5083 0.0 0.1066 0.4378 0.5873 0.4205 0.6362 0.0 0.324 0.497 0.5915 0.6777
0.442 71.0 6532 0.4626 0.4593 0.8388 0.4612 0.2683 0.5075 0.0 0.1066 0.4397 0.5882 0.4276 0.6348 0.0 0.3261 0.5 0.5925 0.6765
0.4411 72.0 6624 0.4647 0.4579 0.8405 0.4585 0.2616 0.5085 0.0 0.1067 0.4377 0.5854 0.4128 0.6357 0.0 0.3259 0.4983 0.59 0.6725
0.44 73.0 6716 0.4634 0.4581 0.839 0.4582 0.2676 0.5057 0.0 0.1063 0.4371 0.5869 0.4298 0.6325 0.0 0.3267 0.5007 0.5894 0.6731
0.4399 74.0 6808 0.4631 0.4557 0.8384 0.4583 0.2643 0.5053 0.0 0.107 0.4365 0.585 0.4185 0.6338 0.0 0.3259 0.5012 0.5855 0.6688
0.4391 75.0 6900 0.4622 0.4562 0.8397 0.4525 0.2556 0.5081 0.0 0.106 0.4357 0.5856 0.4151 0.6359 0.0 0.3259 0.501 0.5865 0.6702
0.4379 76.0 6992 0.4614 0.4599 0.8423 0.4569 0.269 0.5075 0.0 0.108 0.4396 0.5875 0.4311 0.6328 0.0 0.3257 0.4997 0.5942 0.6753
0.4397 77.0 7084 0.4613 0.4594 0.8423 0.4577 0.2638 0.5092 0.0 0.1074 0.4381 0.5861 0.4201 0.6347 0.0 0.3284 0.5003 0.5905 0.6719
0.4368 78.0 7176 0.4607 0.4614 0.8422 0.4654 0.2677 0.511 0.0 0.1075 0.4396 0.5892 0.4236 0.6374 0.0 0.3287 0.5027 0.594 0.6756
0.4383 79.0 7268 0.4618 0.4603 0.8428 0.4636 0.2672 0.5094 0.0 0.1078 0.4402 0.5876 0.4233 0.6354 0.0 0.3278 0.5011 0.5929 0.6742
0.437 80.0 7360 0.4603 0.4602 0.843 0.4653 0.27 0.5086 0.0 0.1072 0.4401 0.5887 0.4306 0.6347 0.0 0.3263 0.5012 0.594 0.6761
0.4367 81.0 7452 0.4594 0.4606 0.8444 0.4621 0.2686 0.5099 0.0 0.1073 0.4398 0.5884 0.423 0.6369 0.0 0.3264 0.501 0.5948 0.6759
0.4363 82.0 7544 0.4607 0.4606 0.8422 0.4633 0.2723 0.5088 0.0 0.1071 0.4401 0.5892 0.4288 0.6359 0.0 0.326 0.5021 0.5951 0.6763
0.4355 83.0 7636 0.4560 0.4636 0.8447 0.4688 0.2763 0.5113 0.0 0.108 0.4422 0.5914 0.4348 0.637 0.0 0.3288 0.5047 0.5984 0.6782
0.4343 84.0 7728 0.4575 0.4628 0.845 0.4722 0.2767 0.5105 0.0 0.107 0.442 0.5908 0.4338 0.6364 0.0 0.3271 0.5043 0.5984 0.6772
0.4343 85.0 7820 0.4580 0.4632 0.8448 0.4678 0.2741 0.5114 0.0 0.1065 0.4413 0.5901 0.4332 0.636 0.0 0.3285 0.5037 0.5978 0.6765
0.4351 86.0 7912 0.4586 0.4624 0.8443 0.4689 0.2758 0.5097 0.0 0.1068 0.4417 0.5906 0.4355 0.6356 0.0 0.3274 0.5032 0.5974 0.678
0.4339 87.0 8004 0.4575 0.4627 0.8453 0.4665 0.2754 0.5108 0.0 0.1069 0.441 0.5903 0.4337 0.6359 0.0 0.3298 0.5049 0.5957 0.6757
0.4347 88.0 8096 0.4578 0.4641 0.8459 0.47 0.2737 0.5128 0.0 0.1076 0.4426 0.5923 0.4316 0.6394 0.0 0.3297 0.5064 0.5985 0.6783
0.434 89.0 8188 0.4563 0.4634 0.8453 0.4657 0.2724 0.5124 0.0 0.1074 0.442 0.5921 0.431 0.6392 0.0 0.3297 0.5057 0.597 0.6785
0.4332 90.0 8280 0.4562 0.4638 0.8448 0.4704 0.275 0.5123 0.0 0.1077 0.442 0.5914 0.4332 0.6375 0.0 0.3291 0.5051 0.5984 0.6777
0.4338 91.0 8372 0.4562 0.4633 0.8461 0.4709 0.275 0.5116 0.0 0.107 0.4422 0.5912 0.4329 0.6373 0.0 0.3289 0.5043 0.5978 0.6781
0.4355 92.0 8464 0.4553 0.4633 0.8452 0.4696 0.2743 0.5115 0.0 0.1069 0.4414 0.5915 0.4332 0.6376 0.0 0.33 0.5048 0.5966 0.6781
0.433 93.0 8556 0.4553 0.464 0.8466 0.4699 0.2742 0.5122 0.0 0.107 0.442 0.591 0.4329 0.6371 0.0 0.33 0.5041 0.5979 0.678
0.4313 94.0 8648 0.4556 0.4638 0.8457 0.4718 0.2752 0.5117 0.0 0.1067 0.4422 0.5914 0.4351 0.6369 0.0 0.3302 0.5047 0.5975 0.6781
0.433 95.0 8740 0.4558 0.464 0.8466 0.4697 0.2756 0.5118 0.0 0.1071 0.4417 0.5911 0.4344 0.6366 0.0 0.33 0.5044 0.598 0.6777
0.4333 96.0 8832 0.4555 0.4643 0.8457 0.4705 0.276 0.5123 0.0 0.1073 0.4422 0.5912 0.4341 0.6369 0.0 0.33 0.5044 0.5986 0.678
0.4325 97.0 8924 0.4557 0.4647 0.8459 0.4718 0.2762 0.5127 0.0 0.1074 0.4427 0.5915 0.434 0.6374 0.0 0.3303 0.5047 0.5991 0.6784
0.4337 98.0 9016 0.4558 0.4644 0.8456 0.4713 0.2755 0.5125 0.0 0.107 0.4423 0.5913 0.4338 0.6372 0.0 0.3303 0.5047 0.5985 0.678
0.4321 99.0 9108 0.4558 0.464 0.8455 0.4707 0.2749 0.5122 0.0 0.107 0.442 0.5911 0.4334 0.637 0.0 0.33 0.5045 0.598 0.6777
0.4327 100.0 9200 0.4557 0.464 0.8455 0.4707 0.2749 0.5121 0.0 0.107 0.442 0.5911 0.4334 0.6371 0.0 0.33 0.5045 0.598 0.6777

Framework versions

  • Transformers 4.41.1
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1
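
To reproduce the training environment, one option is to pin the versions listed above. The short check below is an illustrative snippet, not part of the original card:

```python
# Quick environment check against the pinned framework versions (illustrative only).
import transformers, torch, datasets, tokenizers

print("transformers:", transformers.__version__)  # expected 4.41.1
print("torch:", torch.__version__)                # expected 2.3.0+cu121
print("datasets:", datasets.__version__)          # expected 2.19.1
print("tokenizers:", tokenizers.__version__)      # expected 0.19.1
```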