---
library_name: transformers
license: apache-2.0
base_model: facebook/detr-resnet-50
tags:
- generated_from_trainer
model-index:
- name: chickens
  results: []
---

# chickens

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.2663
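
The snippet below is a minimal inference sketch, assuming the checkpoint is published on the Hub under the id `joe611/chickens` (swap in the actual repo id or a local checkpoint path if that differs):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Hypothetical hub id; replace with the real repo id or a local path.
checkpoint = "joe611/chickens"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("flock.jpg")  # any RGB image to run detection on

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw DETR outputs (logits + normalized boxes) into absolute
# pixel-coordinate boxes, scores, and labels above a confidence threshold.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```

The 0.5 threshold is only an example; tune it to trade precision against recall for your images.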

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `Trainer` sketch using these values follows the list):

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 100
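
For reference, here is a sketch of how these settings would map onto `TrainingArguments` for the Hugging Face `Trainer`. The dataset, label mapping, and collator are placeholders, since they are not specified in this card, and would need to match the original training script:

```python
from transformers import (
    AutoModelForObjectDetection,
    Trainer,
    TrainingArguments,
)

model = AutoModelForObjectDetection.from_pretrained(
    "facebook/detr-resnet-50",
    # id2label / label2id for the target classes would normally be passed
    # here; they are not listed in this card.
    ignore_mismatched_sizes=True,
)

training_args = TrainingArguments(
    output_dir="chickens",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=100,
    eval_strategy="epoch",
    remove_unused_columns=False,  # keep image/annotation columns for the collator
)

# trainer = Trainer(
#     model=model,
#     args=training_args,
#     train_dataset=train_dataset,  # placeholder: dataset not specified in this card
#     eval_dataset=eval_dataset,    # placeholder
#     data_collator=collate_fn,     # placeholder: DETR-style batch collator
# )
# trainer.train()
```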

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.5127 | 1.0 | 57 | 1.1473 |
| 1.2464 | 2.0 | 114 | 1.1654 |
| 1.1597 | 3.0 | 171 | 0.9371 |
| 1.0188 | 4.0 | 228 | 0.8206 |
| 0.892 | 5.0 | 285 | 0.6825 |
| 0.8044 | 6.0 | 342 | 0.6275 |
| 0.8262 | 7.0 | 399 | 0.5388 |
| 0.7248 | 8.0 | 456 | 0.5150 |
| 0.7478 | 9.0 | 513 | 0.5798 |
| 0.7335 | 10.0 | 570 | 0.6344 |
| 0.7018 | 11.0 | 627 | 0.5152 |
| 0.6755 | 12.0 | 684 | 0.6301 |
| 0.6846 | 13.0 | 741 | 0.6130 |
| 0.7388 | 14.0 | 798 | 0.5609 |
| 0.678 | 15.0 | 855 | 0.5973 |
| 0.7463 | 16.0 | 912 | 0.6333 |
| 0.7194 | 17.0 | 969 | 0.5422 |
| 0.6592 | 18.0 | 1026 | 0.4144 |
| 0.6508 | 19.0 | 1083 | 0.5325 |
| 0.6662 | 20.0 | 1140 | 0.5498 |
| 0.6588 | 21.0 | 1197 | 0.4653 |
| 0.6143 | 22.0 | 1254 | 0.4947 |
| 0.5999 | 23.0 | 1311 | 0.5844 |
| 0.5749 | 24.0 | 1368 | 0.3780 |
| 0.5767 | 25.0 | 1425 | 0.4995 |
| 0.5922 | 26.0 | 1482 | 0.4711 |
| 0.5968 | 27.0 | 1539 | 0.4946 |
| 0.5697 | 28.0 | 1596 | 0.4763 |
| 0.5413 | 29.0 | 1653 | 0.4314 |
| 0.5774 | 30.0 | 1710 | 0.4891 |
| 0.5531 | 31.0 | 1767 | 0.4204 |
| 0.5571 | 32.0 | 1824 | 0.4180 |
| 0.5464 | 33.0 | 1881 | 0.3826 |
| 0.5158 | 34.0 | 1938 | 0.3677 |
| 0.5061 | 35.0 | 1995 | 0.4286 |
| 0.5138 | 36.0 | 2052 | 0.4227 |
| 0.509 | 37.0 | 2109 | 0.3925 |
| 0.5201 | 38.0 | 2166 | 0.4016 |
| 0.5121 | 39.0 | 2223 | 0.3852 |
| 0.4911 | 40.0 | 2280 | 0.4250 |
| 0.5183 | 41.0 | 2337 | 0.4073 |
| 0.5198 | 42.0 | 2394 | 0.4061 |
| 0.4979 | 43.0 | 2451 | 0.4376 |
| 0.4898 | 44.0 | 2508 | 0.3740 |
| 0.4804 | 45.0 | 2565 | 0.3759 |
| 0.4957 | 46.0 | 2622 | 0.3428 |
| 0.4945 | 47.0 | 2679 | 0.3797 |
| 0.4959 | 48.0 | 2736 | 0.3737 |
| 0.4742 | 49.0 | 2793 | 0.3642 |
| 0.4478 | 50.0 | 2850 | 0.3605 |
| 0.4356 | 51.0 | 2907 | 0.3320 |
| 0.4449 | 52.0 | 2964 | 0.3693 |
| 0.4538 | 53.0 | 3021 | 0.3366 |
| 0.446 | 54.0 | 3078 | 0.4385 |
| 0.4545 | 55.0 | 3135 | 0.3581 |
| 0.4294 | 56.0 | 3192 | 0.3546 |
| 0.4384 | 57.0 | 3249 | 0.3595 |
| 0.4349 | 58.0 | 3306 | 0.3955 |
| 0.4272 | 59.0 | 3363 | 0.3247 |
| 0.4338 | 60.0 | 3420 | 0.3456 |
| 0.4309 | 61.0 | 3477 | 0.2937 |
| 0.4193 | 62.0 | 3534 | 0.3381 |
| 0.4253 | 63.0 | 3591 | 0.2959 |
| 0.4223 | 64.0 | 3648 | 0.3100 |
| 0.3973 | 65.0 | 3705 | 0.3546 |
| 0.4097 | 66.0 | 3762 | 0.3492 |
| 0.3966 | 67.0 | 3819 | 0.2990 |
| 0.4008 | 68.0 | 3876 | 0.3290 |
| 0.3955 | 69.0 | 3933 | 0.3070 |
| 0.3986 | 70.0 | 3990 | 0.3562 |
| 0.3971 | 71.0 | 4047 | 0.3074 |
| 0.3999 | 72.0 | 4104 | 0.3278 |
| 0.3933 | 73.0 | 4161 | 0.3116 |
| 0.387 | 74.0 | 4218 | 0.2914 |
| 0.3873 | 75.0 | 4275 | 0.3302 |
| 0.3814 | 76.0 | 4332 | 0.2788 |
| 0.3744 | 77.0 | 4389 | 0.2938 |
| 0.3701 | 78.0 | 4446 | 0.2761 |
| 0.3767 | 79.0 | 4503 | 0.2760 |
| 0.3644 | 80.0 | 4560 | 0.2855 |
| 0.366 | 81.0 | 4617 | 0.2659 |
| 0.3602 | 82.0 | 4674 | 0.2695 |
| 0.3634 | 83.0 | 4731 | 0.2927 |
| 0.3533 | 84.0 | 4788 | 0.2712 |
| 0.3608 | 85.0 | 4845 | 0.2708 |
| 0.3652 | 86.0 | 4902 | 0.2754 |
| 0.3576 | 87.0 | 4959 | 0.2589 |
| 0.3662 | 88.0 | 5016 | 0.2700 |
| 0.3568 | 89.0 | 5073 | 0.2602 |
| 0.3561 | 90.0 | 5130 | 0.2726 |
| 0.3568 | 91.0 | 5187 | 0.2714 |
| 0.3564 | 92.0 | 5244 | 0.2703 |
| 0.3622 | 93.0 | 5301 | 0.2737 |
| 0.3501 | 94.0 | 5358 | 0.2643 |
| 0.3514 | 95.0 | 5415 | 0.2639 |
| 0.3574 | 96.0 | 5472 | 0.2710 |
| 0.3502 | 97.0 | 5529 | 0.2672 |
| 0.3472 | 98.0 | 5586 | 0.2658 |
| 0.3489 | 99.0 | 5643 | 0.2662 |
| 0.36 | 100.0 | 5700 | 0.2663 |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 2.14.4
- Tokenizers 0.19.1