---
library_name: transformers
license: apache-2.0
base_model: microsoft/resnet-50
tags:
- image-classification
- vision
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: resnet50-cocoa
  results: []
---
# resnet50-cocoa
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the SemilleroCV/Cocoa-dataset dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3381
- Accuracy: 0.8917
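
For reference, here is a minimal inference sketch using the Transformers `pipeline` API. The repo id `SemilleroCV/resnet50-cocoa` is an assumption; substitute the actual model id or a local path to the trained checkpoint.

```python
from transformers import pipeline
from PIL import Image

# Hypothetical repo id; replace with the actual model id or a local checkpoint path.
classifier = pipeline("image-classification", model="SemilleroCV/resnet50-cocoa")

image = Image.open("cocoa_pod.jpg").convert("RGB")  # any cocoa image
predictions = classifier(image)
print(predictions)  # list of {"label": ..., "score": ...} dicts, best score first
```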
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
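
Although no further details are documented, the card references the SemilleroCV/Cocoa-dataset dataset above. Below is a minimal loading sketch with the `datasets` library, assuming the dataset is hosted on the Hugging Face Hub under that id with standard split names.

```python
from datasets import load_dataset

# Assumption: the dataset is available on the Hub under this id.
dataset = load_dataset("SemilleroCV/Cocoa-dataset")

print(dataset)              # DatasetDict listing the available splits
print(dataset["train"][0])  # first example, typically an image plus a class label
```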
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100.0
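
The sketch below shows one way these hyperparameters map onto `TrainingArguments`. The number of classes and the image preprocessing are not documented in this card, so they are left as placeholders.

```python
from transformers import AutoModelForImageClassification, TrainingArguments

# Placeholder: set to the actual number of cocoa classes in the dataset.
NUM_LABELS = 2

# Re-create the base model with a fresh classification head.
model = AutoModelForImageClassification.from_pretrained(
    "microsoft/resnet-50",
    num_labels=NUM_LABELS,
    ignore_mismatched_sizes=True,  # discard the 1000-class ImageNet head
)

# Training arguments mirroring the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="resnet50-cocoa",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=1337,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="epoch",  # evaluate once per epoch, matching the results table
)

# These arguments would then be passed to a `Trainer` together with the
# preprocessed train/eval datasets (not shown here).
```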
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
1.3793 | 1.0 | 196 | 1.4452 | 0.8628 |
0.9417 | 2.0 | 392 | 1.0832 | 0.8628 |
0.8546 | 3.0 | 588 | 0.7324 | 0.8628 |
0.6067 | 4.0 | 784 | 0.5761 | 0.8628 |
0.5583 | 5.0 | 980 | 0.5221 | 0.8628 |
0.6819 | 6.0 | 1176 | 0.4618 | 0.8628 |
0.4154 | 7.0 | 1372 | 0.4545 | 0.8628 |
0.4997 | 8.0 | 1568 | 0.4556 | 0.8628 |
0.6623 | 9.0 | 1764 | 0.4483 | 0.8628 |
0.8141 | 10.0 | 1960 | 0.4494 | 0.8628 |
0.5514 | 11.0 | 2156 | 0.4437 | 0.8628 |
0.6831 | 12.0 | 2352 | 0.4407 | 0.8664 |
0.2799 | 13.0 | 2548 | 0.4459 | 0.8700 |
0.451 | 14.0 | 2744 | 0.4313 | 0.8809 |
0.3901 | 15.0 | 2940 | 0.4340 | 0.8845 |
0.4778 | 16.0 | 3136 | 0.4219 | 0.8845 |
0.5531 | 17.0 | 3332 | 0.4304 | 0.8845 |
0.4904 | 18.0 | 3528 | 0.4429 | 0.8845 |
0.5398 | 19.0 | 3724 | 0.4144 | 0.8917 |
0.8024 | 20.0 | 3920 | 0.4253 | 0.8881 |
0.7022 | 21.0 | 4116 | 0.4232 | 0.8917 |
0.3868 | 22.0 | 4312 | 0.4167 | 0.8917 |
0.4075 | 23.0 | 4508 | 0.3917 | 0.8917 |
0.3873 | 24.0 | 4704 | 0.4269 | 0.8881 |
0.2382 | 25.0 | 4900 | 0.3913 | 0.8845 |
0.6525 | 26.0 | 5096 | 0.3949 | 0.8881 |
0.3207 | 27.0 | 5292 | 0.3967 | 0.8881 |
0.4569 | 28.0 | 5488 | 0.3901 | 0.8845 |
0.6184 | 29.0 | 5684 | 0.4114 | 0.8917 |
0.6055 | 30.0 | 5880 | 0.4342 | 0.8881 |
0.47 | 31.0 | 6076 | 0.4071 | 0.8917 |
0.3507 | 32.0 | 6272 | 0.3838 | 0.8881 |
0.4888 | 33.0 | 6468 | 0.4006 | 0.8881 |
0.4276 | 34.0 | 6664 | 0.3909 | 0.8881 |
0.5371 | 35.0 | 6860 | 0.4238 | 0.8917 |
0.4826 | 36.0 | 7056 | 0.3843 | 0.8917 |
0.5119 | 37.0 | 7252 | 0.3747 | 0.8845 |
0.4192 | 38.0 | 7448 | 0.4232 | 0.8881 |
1.1545 | 39.0 | 7644 | 0.4415 | 0.8881 |
0.3206 | 40.0 | 7840 | 0.3937 | 0.8881 |
0.3464 | 41.0 | 8036 | 0.3678 | 0.8881 |
0.4016 | 42.0 | 8232 | 0.3849 | 0.8881 |
0.2037 | 43.0 | 8428 | 0.3487 | 0.8881 |
0.3795 | 44.0 | 8624 | 0.4298 | 0.8881 |
0.403 | 45.0 | 8820 | 0.3966 | 0.8881 |
0.2754 | 46.0 | 9016 | 0.3785 | 0.8845 |
0.5228 | 47.0 | 9212 | 0.4117 | 0.8881 |
0.7263 | 48.0 | 9408 | 0.3726 | 0.8845 |
0.8995 | 49.0 | 9604 | 0.4559 | 0.8917 |
0.6844 | 50.0 | 9800 | 0.4164 | 0.8881 |
0.2734 | 51.0 | 9996 | 0.3862 | 0.8881 |
0.4179 | 52.0 | 10192 | 0.4386 | 0.8917 |
0.3354 | 53.0 | 10388 | 0.3949 | 0.8881 |
0.7031 | 54.0 | 10584 | 0.3910 | 0.8881 |
0.586 | 55.0 | 10780 | 0.4216 | 0.8881 |
0.3601 | 56.0 | 10976 | 0.4545 | 0.8881 |
0.362 | 57.0 | 11172 | 0.3760 | 0.8845 |
0.6132 | 58.0 | 11368 | 0.4258 | 0.8881 |
0.5605 | 59.0 | 11564 | 0.3972 | 0.8881 |
0.5071 | 60.0 | 11760 | 0.3873 | 0.8917 |
0.458 | 61.0 | 11956 | 0.4098 | 0.8881 |
0.4401 | 62.0 | 12152 | 0.3859 | 0.8845 |
0.5439 | 63.0 | 12348 | 0.4142 | 0.8917 |
0.6099 | 64.0 | 12544 | 0.3970 | 0.8881 |
0.2749 | 65.0 | 12740 | 0.3656 | 0.8809 |
0.581 | 66.0 | 12936 | 0.4203 | 0.8881 |
0.6009 | 67.0 | 13132 | 0.4074 | 0.8917 |
0.2388 | 68.0 | 13328 | 0.3594 | 0.8845 |
0.6006 | 69.0 | 13524 | 0.4045 | 0.8845 |
0.388 | 70.0 | 13720 | 0.3717 | 0.8881 |
0.552 | 71.0 | 13916 | 0.4239 | 0.8881 |
0.3875 | 72.0 | 14112 | 0.3731 | 0.8881 |
0.3105 | 73.0 | 14308 | 0.3434 | 0.8845 |
0.4627 | 74.0 | 14504 | 0.3946 | 0.8881 |
0.2931 | 75.0 | 14700 | 0.3950 | 0.8845 |
0.4639 | 76.0 | 14896 | 0.3875 | 0.8881 |
0.3534 | 77.0 | 15092 | 0.4009 | 0.8881 |
0.3175 | 78.0 | 15288 | 0.4109 | 0.8881 |
0.5334 | 79.0 | 15484 | 0.3918 | 0.8881 |
0.4827 | 80.0 | 15680 | 0.3807 | 0.8881 |
0.5162 | 81.0 | 15876 | 0.3624 | 0.8845 |
0.4377 | 82.0 | 16072 | 0.3729 | 0.8881 |
0.4487 | 83.0 | 16268 | 0.3981 | 0.8917 |
0.5057 | 84.0 | 16464 | 0.3995 | 0.8917 |
0.3421 | 85.0 | 16660 | 0.3554 | 0.8881 |
0.4083 | 86.0 | 16856 | 0.3634 | 0.8845 |
0.7634 | 87.0 | 17052 | 0.3970 | 0.8881 |
0.2588 | 88.0 | 17248 | 0.4121 | 0.8917 |
0.1584 | 89.0 | 17444 | 0.3711 | 0.8881 |
0.2643 | 90.0 | 17640 | 0.3743 | 0.8881 |
0.2771 | 91.0 | 17836 | 0.3726 | 0.8881 |
0.336 | 92.0 | 18032 | 0.3758 | 0.8845 |
0.3283 | 93.0 | 18228 | 0.4397 | 0.8917 |
0.7224 | 94.0 | 18424 | 0.3869 | 0.8917 |
0.1575 | 95.0 | 18620 | 0.3381 | 0.8917 |
0.4062 | 96.0 | 18816 | 0.3684 | 0.8845 |
0.3849 | 97.0 | 19012 | 0.3887 | 0.8881 |
0.2755 | 98.0 | 19208 | 0.3725 | 0.8881 |
0.4952 | 99.0 | 19404 | 0.4137 | 0.8917 |
0.3807 | 100.0 | 19600 | 0.3923 | 0.8881 |
### Framework versions
- Transformers 4.48.0.dev0
- PyTorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.21.0