---
library_name: transformers
license: apache-2.0
base_model: facebook/detr-resnet-50
tags:
- generated_from_trainer
model-index:
- name: chickens
  results: []
---

# chickens

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1196

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 120
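As a rough illustration, the settings above map onto the `transformers` `TrainingArguments` below. This is a hedged sketch rather than the original training script (which is not included in this repository); the output directory and the per-epoch evaluation strategy are assumptions.

```python
from transformers import TrainingArguments

# Approximate reconstruction of the hyperparameters listed above.
# output_dir and eval_strategy are assumptions; the rest mirrors the list.
training_args = TrainingArguments(
    output_dir="chickens",        # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="cosine",
    num_train_epochs=120,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="epoch",        # assumption: validation loss was logged once per epoch
    remove_unused_columns=False,  # typically needed when training DETR with a custom collator
)
```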
### Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 0.9349        | 1.0   | 116   | 0.7539          |
| 0.5649        | 2.0   | 232   | 0.3776          |
| 0.4438        | 3.0   | 348   | 0.6132          |
| 0.4345        | 4.0   | 464   | 0.3365          |
| 0.3976        | 5.0   | 580   | 0.3252          |
| 0.403         | 6.0   | 696   | 0.3667          |
| 0.3751        | 7.0   | 812   | 0.2778          |
| 0.3511        | 8.0   | 928   | 0.2519          |
| 0.3696        | 9.0   | 1044  | 0.2642          |
| 0.3251        | 10.0  | 1160  | 0.3030          |
| 0.3524        | 11.0  | 1276  | 0.2610          |
| 0.3643        | 12.0  | 1392  | 0.2414          |
| 0.3125        | 13.0  | 1508  | 0.2545          |
| 0.325         | 14.0  | 1624  | 0.3899          |
| 0.3124        | 15.0  | 1740  | 0.2996          |
| 0.328         | 16.0  | 1856  | 0.2277          |
| 0.2735        | 17.0  | 1972  | 0.2658          |
| 0.3105        | 18.0  | 2088  | 0.2262          |
| 0.2852        | 19.0  | 2204  | 0.2434          |
| 0.3124        | 20.0  | 2320  | 0.3230          |
| 0.2979        | 21.0  | 2436  | 0.2338          |
| 0.2857        | 22.0  | 2552  | 0.2319          |
| 0.3598        | 23.0  | 2668  | 0.2157          |
| 0.352         | 24.0  | 2784  | 0.2456          |
| 0.3087        | 25.0  | 2900  | 0.2397          |
| 0.3206        | 26.0  | 3016  | 0.2461          |
| 0.2837        | 27.0  | 3132  | 0.1970          |
| 0.2981        | 28.0  | 3248  | 0.2741          |
| 0.2845        | 29.0  | 3364  | 0.2312          |
| 0.2768        | 30.0  | 3480  | 0.2197          |
| 0.2806        | 31.0  | 3596  | 0.2411          |
| 0.2635        | 32.0  | 3712  | 0.2112          |
| 0.2908        | 33.0  | 3828  | 0.2023          |
| 0.2704        | 34.0  | 3944  | 0.2480          |
| 0.349         | 35.0  | 4060  | 0.2171          |
| 0.2829        | 36.0  | 4176  | 0.2516          |
| 0.3237        | 37.0  | 4292  | 0.2373          |
| 0.2747        | 38.0  | 4408  | 0.2233          |
| 0.3058        | 39.0  | 4524  | 0.2511          |
| 0.4718        | 40.0  | 4640  | 0.3368          |
| 0.2992        | 41.0  | 4756  | 0.2639          |
| 0.285         | 42.0  | 4872  | 0.2716          |
| 0.2702        | 43.0  | 4988  | 0.2264          |
| 0.2905        | 44.0  | 5104  | 0.1958          |
| 0.2815        | 45.0  | 5220  | 0.2076          |
| 0.2806        | 46.0  | 5336  | 0.2315          |
| 0.2503        | 47.0  | 5452  | 0.1862          |
| 0.258         | 48.0  | 5568  | 0.2162          |
| 0.2413        | 49.0  | 5684  | 0.1840          |
| 0.2348        | 50.0  | 5800  | 0.1666          |
| 0.2374        | 51.0  | 5916  | 0.2053          |
| 0.24          | 52.0  | 6032  | 0.1717          |
| 0.2306        | 53.0  | 6148  | 0.1881          |
| 0.2398        | 54.0  | 6264  | 0.1845          |
| 0.2363        | 55.0  | 6380  | 0.1764          |
| 0.2249        | 56.0  | 6496  | 0.1942          |
| 0.2154        | 57.0  | 6612  | 0.1945          |
| 0.2348        | 58.0  | 6728  | 0.2108          |
| 0.2349        | 59.0  | 6844  | 0.1930          |
| 0.2294        | 60.0  | 6960  | 0.1902          |
| 0.2155        | 61.0  | 7076  | 0.2001          |
| 0.2197        | 62.0  | 7192  | 0.1737          |
| 0.2271        | 63.0  | 7308  | 0.1624          |
| 0.215         | 64.0  | 7424  | 0.1705          |
| 0.2284        | 65.0  | 7540  | 0.1554          |
| 0.2134        | 66.0  | 7656  | 0.1680          |
| 0.2182        | 67.0  | 7772  | 0.1682          |
| 0.2088        | 68.0  | 7888  | 0.1448          |
| 0.2023        | 69.0  | 8004  | 0.1507          |
| 0.2115        | 70.0  | 8120  | 0.1836          |
| 0.202         | 71.0  | 8236  | 0.1779          |
| 0.1923        | 72.0  | 8352  | 0.1594          |
| 0.1993        | 73.0  | 8468  | 0.1700          |
| 0.2003        | 74.0  | 8584  | 0.1587          |
| 0.1975        | 75.0  | 8700  | 0.1667          |
| 0.1996        | 76.0  | 8816  | 0.1637          |
| 0.1933        | 77.0  | 8932  | 0.1344          |
| 0.1884        | 78.0  | 9048  | 0.1497          |
| 0.1912        | 79.0  | 9164  | 0.1571          |
| 0.191         | 80.0  | 9280  | 0.1426          |
| 0.1866        | 81.0  | 9396  | 0.1529          |
| 0.1859        | 82.0  | 9512  | 0.1494          |
| 0.183         | 83.0  | 9628  | 0.1508          |
| 0.182         | 84.0  | 9744  | 0.1482          |
| 0.171         | 85.0  | 9860  | 0.1662          |
| 0.1773        | 86.0  | 9976  | 0.1561          |
| 0.1742        | 87.0  | 10092 | 0.1514          |
| 0.1778        | 88.0  | 10208 | 0.1371          |
| 0.1721        | 89.0  | 10324 | 0.1426          |
| 0.1725        | 90.0  | 10440 | 0.1554          |
| 0.1665        | 91.0  | 10556 | 0.1494          |
| 0.1739        | 92.0  | 10672 | 0.1423          |
| 0.1688        | 93.0  | 10788 | 0.1467          |
| 0.1706        | 94.0  | 10904 | 0.1267          |
| 0.1715        | 95.0  | 11020 | 0.1383          |
| 0.1684        | 96.0  | 11136 | 0.1357          |
| 0.1699        | 97.0  | 11252 | 0.1464          |
| 0.172         | 98.0  | 11368 | 0.1429          |
| 0.1673        | 99.0  | 11484 | 0.1387          |
| 0.166         | 100.0 | 11600 | 0.1369          |
| 0.1655        | 101.0 | 11716 | 0.1272          |
| 0.1654        | 102.0 | 11832 | 0.1237          |
| 0.1625        | 103.0 | 11948 | 0.1321          |
| 0.1622        | 104.0 | 12064 | 0.1275          |
| 0.1606        | 105.0 | 12180 | 0.1250          |
| 0.1603        | 106.0 | 12296 | 0.1293          |
| 0.1622        | 107.0 | 12412 | 0.1275          |
| 0.1607        | 108.0 | 12528 | 0.1238          |
| 0.1593        | 109.0 | 12644 | 0.1264          |
| 0.1568        | 110.0 | 12760 | 0.1251          |
| 0.1589        | 111.0 | 12876 | 0.1253          |
| 0.1576        | 112.0 | 12992 | 0.1246          |
| 0.1539        | 113.0 | 13108 | 0.1223          |
| 0.1583        | 114.0 | 13224 | 0.1228          |
| 0.1601        | 115.0 | 13340 | 0.1217          |
| 0.1628        | 116.0 | 13456 | 0.1222          |
| 0.1542        | 117.0 | 13572 | 0.1214          |
| 0.1559        | 118.0 | 13688 | 0.1213          |
| 0.1598        | 119.0 | 13804 | 0.1217          |
| 0.1554        | 120.0 | 13920 | 0.1196          |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 2.14.4
- Tokenizers 0.19.1
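As a starting point, the snippet below shows one way to load this checkpoint for object detection inference with the `transformers` Auto classes. It is a hedged sketch: the repository id `chickens` is a placeholder for wherever this checkpoint is hosted, and the image path is only an example.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Placeholder repo id; substitute the actual location of this checkpoint.
checkpoint = "chickens"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg")  # example input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Keep detections above a confidence threshold and map boxes back to pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```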