detr_algae_0.25r_v0

This model is a fine-tuned version of facebook/detr-resnet-50 on an unspecified dataset. It achieves the following result on the evaluation set:

  • Loss: 1.1346

Model description

More information needed

Intended uses & limitations

More information needed
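
Although intended uses are not documented, the checkpoint follows the standard DETR object-detection interface in Transformers, so inference might look like the sketch below. This is a minimal sketch, not documented usage: the repo id `username/detr_algae_0.25r_v0`, the input image path, and the 0.5 confidence threshold are all placeholder assumptions.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "username/detr_algae_0.25r_v0"  # placeholder repo id; substitute the real checkpoint path
image_processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("sample.jpg")  # placeholder input image
inputs = image_processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits and normalized boxes into thresholded (score, label, box) detections.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = image_processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```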

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 200
  • mixed_precision_training: Native AMP
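
The original training script is not published with this card. As a rough sketch, the listed settings would map onto Hugging Face TrainingArguments roughly as follows; the output directory and per-epoch evaluation strategy are assumptions (the latter is consistent with the per-epoch results table below), while the Adam betas/epsilon shown above are the Trainer defaults.

```python
from transformers import TrainingArguments

# Configuration sketch only, reconstructed from the hyperparameter list above.
training_args = TrainingArguments(
    output_dir="detr_algae_0.25r_v0",  # placeholder output path
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=200,
    fp16=True,                         # native AMP mixed-precision training
    evaluation_strategy="epoch",       # assumption: validation once per epoch
)
```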

Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 5.1721 | 1.0 | 69 | 3.3857 |
| 3.3617 | 2.0 | 138 | 2.9312 |
| 3.0175 | 3.0 | 207 | 2.6671 |
| 2.9799 | 4.0 | 276 | 2.7044 |
| 2.835 | 5.0 | 345 | 2.7200 |
| 2.7184 | 6.0 | 414 | 2.4760 |
| 2.7039 | 7.0 | 483 | 2.5395 |
| 2.6316 | 8.0 | 552 | 2.4040 |
| 2.683 | 9.0 | 621 | 2.4622 |
| 2.6908 | 10.0 | 690 | 2.5163 |
| 2.5706 | 11.0 | 759 | 2.2428 |
| 2.5066 | 12.0 | 828 | 2.3053 |
| 2.5146 | 13.0 | 897 | 2.2936 |
| 2.446 | 14.0 | 966 | 2.1469 |
| 2.3444 | 15.0 | 1035 | 2.2260 |
| 2.3997 | 16.0 | 1104 | 2.2093 |
| 2.3823 | 17.0 | 1173 | 2.2598 |
| 2.3741 | 18.0 | 1242 | 2.1737 |
| 2.3437 | 19.0 | 1311 | 2.1670 |
| 2.3134 | 20.0 | 1380 | 2.2041 |
| 2.3636 | 21.0 | 1449 | 2.1305 |
| 2.2936 | 22.0 | 1518 | 2.0564 |
| 2.2117 | 23.0 | 1587 | 2.0024 |
| 2.1955 | 24.0 | 1656 | 2.0013 |
| 2.1724 | 25.0 | 1725 | 1.9785 |
| 2.1765 | 26.0 | 1794 | 2.0120 |
| 2.1489 | 27.0 | 1863 | 1.9891 |
| 2.157 | 28.0 | 1932 | 1.9267 |
| 2.1454 | 29.0 | 2001 | 1.9686 |
| 2.1545 | 30.0 | 2070 | 2.0207 |
| 2.0847 | 31.0 | 2139 | 1.8773 |
| 2.0731 | 32.0 | 2208 | 1.8358 |
| 2.0717 | 33.0 | 2277 | 1.9076 |
| 2.0616 | 34.0 | 2346 | 1.9287 |
| 2.1044 | 35.0 | 2415 | 1.8625 |
| 2.04 | 36.0 | 2484 | 1.8683 |
| 2.0489 | 37.0 | 2553 | 1.8393 |
| 2.0562 | 38.0 | 2622 | 1.9013 |
| 2.0433 | 39.0 | 2691 | 1.8472 |
| 2.0093 | 40.0 | 2760 | 1.8122 |
| 2.0149 | 41.0 | 2829 | 1.7668 |
| 2.049 | 42.0 | 2898 | 1.8419 |
| 1.9992 | 43.0 | 2967 | 1.8292 |
| 1.9494 | 44.0 | 3036 | 1.8329 |
| 2.0128 | 45.0 | 3105 | 1.8827 |
| 2.0712 | 46.0 | 3174 | 1.8425 |
| 1.9346 | 47.0 | 3243 | 1.8509 |
| 1.899 | 48.0 | 3312 | 1.7352 |
| 1.9576 | 49.0 | 3381 | 1.7825 |
| 1.9877 | 50.0 | 3450 | 1.7996 |
| 1.9176 | 51.0 | 3519 | 1.7754 |
| 1.9217 | 52.0 | 3588 | 1.7418 |
| 1.9365 | 53.0 | 3657 | 1.7711 |
| 1.9032 | 54.0 | 3726 | 1.7001 |
| 1.8404 | 55.0 | 3795 | 1.6628 |
| 1.8447 | 56.0 | 3864 | 1.6939 |
| 1.8418 | 57.0 | 3933 | 1.7099 |
| 1.7911 | 58.0 | 4002 | 1.6751 |
| 1.7899 | 59.0 | 4071 | 1.7471 |
| 1.8368 | 60.0 | 4140 | 1.7111 |
| 1.853 | 61.0 | 4209 | 1.7785 |
| 1.88 | 62.0 | 4278 | 1.7709 |
| 1.8734 | 63.0 | 4347 | 1.6597 |
| 1.8107 | 64.0 | 4416 | 1.6720 |
| 1.8329 | 65.0 | 4485 | 1.6868 |
| 1.8129 | 66.0 | 4554 | 1.6611 |
| 1.7972 | 67.0 | 4623 | 1.6452 |
| 1.7828 | 68.0 | 4692 | 1.6538 |
| 1.7653 | 69.0 | 4761 | 1.6246 |
| 1.7343 | 70.0 | 4830 | 1.5364 |
| 1.6567 | 71.0 | 4899 | 1.5308 |
| 1.6873 | 72.0 | 4968 | 1.5473 |
| 1.7233 | 73.0 | 5037 | 1.6096 |
| 1.6934 | 74.0 | 5106 | 1.5679 |
| 1.7263 | 75.0 | 5175 | 1.6542 |
| 1.7109 | 76.0 | 5244 | 1.5674 |
| 1.6977 | 77.0 | 5313 | 1.5367 |
| 1.6761 | 78.0 | 5382 | 1.5456 |
| 1.69 | 79.0 | 5451 | 1.5624 |
| 1.7241 | 80.0 | 5520 | 1.5067 |
| 1.643 | 81.0 | 5589 | 1.5723 |
| 1.6358 | 82.0 | 5658 | 1.5349 |
| 1.6511 | 83.0 | 5727 | 1.5321 |
| 1.6932 | 84.0 | 5796 | 1.5640 |
| 1.7214 | 85.0 | 5865 | 1.5118 |
| 1.6988 | 86.0 | 5934 | 1.5471 |
| 1.6697 | 87.0 | 6003 | 1.5650 |
| 1.6828 | 88.0 | 6072 | 1.5087 |
| 1.7211 | 89.0 | 6141 | 1.5302 |
| 1.6195 | 90.0 | 6210 | 1.5018 |
| 1.5924 | 91.0 | 6279 | 1.4886 |
| 1.5746 | 92.0 | 6348 | 1.4365 |
| 1.6277 | 93.0 | 6417 | 1.4995 |
| 1.5936 | 94.0 | 6486 | 1.4569 |
| 1.6132 | 95.0 | 6555 | 1.4982 |
| 1.5637 | 96.0 | 6624 | 1.4032 |
| 1.5502 | 97.0 | 6693 | 1.4388 |
| 1.5535 | 98.0 | 6762 | 1.4101 |
| 1.5306 | 99.0 | 6831 | 1.4048 |
| 1.5425 | 100.0 | 6900 | 1.4133 |
| 1.529 | 101.0 | 6969 | 1.4244 |
| 1.5659 | 102.0 | 7038 | 1.4268 |
| 1.5234 | 103.0 | 7107 | 1.3829 |
| 1.498 | 104.0 | 7176 | 1.3884 |
| 1.4838 | 105.0 | 7245 | 1.3627 |
| 1.4774 | 106.0 | 7314 | 1.3501 |
| 1.479 | 107.0 | 7383 | 1.3738 |
| 1.475 | 108.0 | 7452 | 1.3537 |
| 1.4592 | 109.0 | 7521 | 1.3621 |
| 1.5015 | 110.0 | 7590 | 1.4022 |
| 1.4948 | 111.0 | 7659 | 1.4069 |
| 1.4875 | 112.0 | 7728 | 1.3325 |
| 1.437 | 113.0 | 7797 | 1.3080 |
| 1.4276 | 114.0 | 7866 | 1.3100 |
| 1.4489 | 115.0 | 7935 | 1.3712 |
| 1.4951 | 116.0 | 8004 | 1.4256 |
| 1.4585 | 117.0 | 8073 | 1.3720 |
| 1.4736 | 118.0 | 8142 | 1.4397 |
| 1.4664 | 119.0 | 8211 | 1.4036 |
| 1.4569 | 120.0 | 8280 | 1.3672 |
| 1.4627 | 121.0 | 8349 | 1.3809 |
| 1.4924 | 122.0 | 8418 | 1.3420 |
| 1.4487 | 123.0 | 8487 | 1.3142 |
| 1.4341 | 124.0 | 8556 | 1.3238 |
| 1.4025 | 125.0 | 8625 | 1.2891 |
| 1.4013 | 126.0 | 8694 | 1.3140 |
| 1.3909 | 127.0 | 8763 | 1.3329 |
| 1.4305 | 128.0 | 8832 | 1.3489 |
| 1.3771 | 129.0 | 8901 | 1.3640 |
| 1.4442 | 130.0 | 8970 | 1.3695 |
| 1.4272 | 131.0 | 9039 | 1.3752 |
| 1.4087 | 132.0 | 9108 | 1.3145 |
| 1.3648 | 133.0 | 9177 | 1.3222 |
| 1.3981 | 134.0 | 9246 | 1.3483 |
| 1.4116 | 135.0 | 9315 | 1.2986 |
| 1.4117 | 136.0 | 9384 | 1.3789 |
| 1.4416 | 137.0 | 9453 | 1.3491 |
| 1.3753 | 138.0 | 9522 | 1.3026 |
| 1.3721 | 139.0 | 9591 | 1.3292 |
| 1.3951 | 140.0 | 9660 | 1.2946 |
| 1.3406 | 141.0 | 9729 | 1.2646 |
| 1.3336 | 142.0 | 9798 | 1.3247 |
| 1.3182 | 143.0 | 9867 | 1.2960 |
| 1.3293 | 144.0 | 9936 | 1.2845 |
| 1.3242 | 145.0 | 10005 | 1.2849 |
| 1.3171 | 146.0 | 10074 | 1.2662 |
| 1.3193 | 147.0 | 10143 | 1.2827 |
| 1.3298 | 148.0 | 10212 | 1.2776 |
| 1.3014 | 149.0 | 10281 | 1.2603 |
| 1.3419 | 150.0 | 10350 | 1.2484 |
| 1.3385 | 151.0 | 10419 | 1.2477 |
| 1.3029 | 152.0 | 10488 | 1.2408 |
| 1.2803 | 153.0 | 10557 | 1.2191 |
| 1.2562 | 154.0 | 10626 | 1.2264 |
| 1.2667 | 155.0 | 10695 | 1.2250 |
| 1.2669 | 156.0 | 10764 | 1.2095 |
| 1.2643 | 157.0 | 10833 | 1.1964 |
| 1.2548 | 158.0 | 10902 | 1.2019 |
| 1.262 | 159.0 | 10971 | 1.2393 |
| 1.2596 | 160.0 | 11040 | 1.2018 |
| 1.2476 | 161.0 | 11109 | 1.2308 |
| 1.2755 | 162.0 | 11178 | 1.2103 |
| 1.237 | 163.0 | 11247 | 1.2020 |
| 1.226 | 164.0 | 11316 | 1.2173 |
| 1.2278 | 165.0 | 11385 | 1.1912 |
| 1.244 | 166.0 | 11454 | 1.2036 |
| 1.2467 | 167.0 | 11523 | 1.1831 |
| 1.2063 | 168.0 | 11592 | 1.1720 |
| 1.2141 | 169.0 | 11661 | 1.1580 |
| 1.224 | 170.0 | 11730 | 1.1897 |
| 1.2171 | 171.0 | 11799 | 1.1546 |
| 1.2151 | 172.0 | 11868 | 1.1621 |
| 1.1623 | 173.0 | 11937 | 1.1899 |
| 1.2037 | 174.0 | 12006 | 1.1649 |
| 1.1741 | 175.0 | 12075 | 1.1794 |
| 1.1921 | 176.0 | 12144 | 1.1584 |
| 1.1811 | 177.0 | 12213 | 1.1589 |
| 1.1956 | 178.0 | 12282 | 1.1555 |
| 1.1703 | 179.0 | 12351 | 1.1510 |
| 1.1727 | 180.0 | 12420 | 1.1363 |
| 1.1747 | 181.0 | 12489 | 1.1570 |
| 1.1524 | 182.0 | 12558 | 1.1655 |
| 1.1645 | 183.0 | 12627 | 1.1324 |
| 1.1549 | 184.0 | 12696 | 1.1529 |
| 1.1432 | 185.0 | 12765 | 1.1396 |
| 1.1552 | 186.0 | 12834 | 1.1406 |
| 1.1568 | 187.0 | 12903 | 1.1585 |
| 1.1407 | 188.0 | 12972 | 1.1417 |
| 1.1419 | 189.0 | 13041 | 1.1542 |
| 1.1451 | 190.0 | 13110 | 1.1330 |
| 1.1421 | 191.0 | 13179 | 1.1309 |
| 1.1283 | 192.0 | 13248 | 1.1271 |
| 1.1528 | 193.0 | 13317 | 1.1195 |
| 1.1367 | 194.0 | 13386 | 1.1300 |
| 1.1407 | 195.0 | 13455 | 1.1144 |
| 1.1456 | 196.0 | 13524 | 1.1584 |
| 1.1072 | 197.0 | 13593 | 1.1334 |
| 1.1081 | 198.0 | 13662 | 1.1378 |
| 1.1205 | 199.0 | 13731 | 1.1327 |
| 1.1275 | 200.0 | 13800 | 1.1346 |
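
Validation loss fell fairly steadily from 3.3857 after the first epoch to 1.1346 at epoch 200, with the best value of 1.1144 reached at epoch 195, suggesting the run had largely plateaued by the end of training.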

Framework versions

  • Transformers 4.26.0
  • Pytorch 1.13.1+cu117
  • Datasets 2.9.0
  • Tokenizers 0.13.2