
microsoft-resnet-50-cartoon-face-recognition

This model is a fine-tuned version of microsoft/resnet-50 on a cartoon-face image dataset loaded in the imagefolder format. It achieves the following results on the evaluation set (a minimal inference sketch follows the list):

  • Loss: 0.8508
  • Accuracy: 0.7755
  • Precision: 0.7715
  • Recall: 0.7755
  • F1: 0.7676
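
The snippet below is a minimal inference sketch for this checkpoint. The repository id and image path are placeholders rather than details from this card, and the predicted label names depend on the classes used during fine-tuning; on Transformers 4.25.x, AutoFeatureExtractor can be used in place of AutoImageProcessor.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repo id -- replace with the actual Hub repository of this model.
model_id = "microsoft-resnet-50-cartoon-face-recognition"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

# Placeholder image path; any RGB cartoon-face image works.
image = Image.open("cartoon_face.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```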

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
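
Although the dataset itself is not documented, the card states that training used an imagefolder dataset, i.e. a directory of images with one sub-folder per class. A minimal loading sketch, assuming a local directory whose path is a placeholder:

```python
from datasets import load_dataset

# Placeholder path: point data_dir at a folder with one sub-directory per class,
# e.g. cartoon_faces/character_a/*.png, cartoon_faces/character_b/*.png, ...
dataset = load_dataset("imagefolder", data_dir="path/to/cartoon_faces")

labels = dataset["train"].features["label"].names
print(f"{len(labels)} classes, {len(dataset['train'])} training images")
```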

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 0.00012
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 256
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 120
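
These settings map directly onto transformers.TrainingArguments. The sketch below is a hedged reconstruction: the output directory and the evaluation/logging strategies are assumptions, and no explicit optimizer argument is passed because the listed Adam betas and epsilon match the Trainer's default optimizer settings.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="resnet-50-cartoon-face",   # placeholder, not from the original run
    learning_rate=1.2e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=4,          # 64 * 4 = 256 effective train batch size
    num_train_epochs=120,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    evaluation_strategy="epoch",            # assumption; the table reports per-epoch eval
    logging_strategy="epoch",               # assumption
)
```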

Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1
No log 0.89 6 3.1774 0.0370 0.0069 0.0370 0.0098
3.4185 1.89 12 3.1739 0.0301 0.0100 0.0301 0.0126
3.4185 2.89 18 3.1668 0.0440 0.0805 0.0440 0.0340
3.6463 3.89 24 3.1583 0.0370 0.0180 0.0370 0.0151
3.3899 4.89 30 3.1425 0.0741 0.0610 0.0741 0.0453
3.3899 5.89 36 3.1262 0.0856 0.0334 0.0856 0.0405
3.5947 6.89 42 3.1055 0.1019 0.0784 0.1019 0.0481
3.5947 7.89 48 3.0841 0.1181 0.1071 0.1181 0.0500
3.5553 8.89 54 3.0650 0.1065 0.0216 0.1065 0.0343
3.2713 9.89 60 3.0351 0.1273 0.0323 0.1273 0.0418
3.2713 10.89 66 3.0069 0.1227 0.0311 0.1227 0.0390
3.4382 11.89 72 2.9754 0.1204 0.0353 0.1204 0.0366
3.4382 12.89 78 2.9455 0.1227 0.0224 0.1227 0.0338
3.3573 13.89 84 2.9167 0.1204 0.0213 0.1204 0.0332
3.0549 14.89 90 2.8841 0.1227 0.0474 0.1227 0.0408
3.0549 15.89 96 2.8534 0.1412 0.1174 0.1412 0.0540
3.1853 16.89 102 2.8143 0.1505 0.1595 0.1505 0.0667
3.1853 17.89 108 2.7771 0.1667 0.1693 0.1667 0.0719
3.0871 18.89 114 2.7400 0.1759 0.1454 0.1759 0.0896
2.7666 19.89 120 2.7048 0.2014 0.0927 0.2014 0.1051
2.7666 20.89 126 2.6458 0.2315 0.1622 0.2315 0.1250
2.846 21.89 132 2.5803 0.2569 0.2386 0.2569 0.1470
2.846 22.89 138 2.5291 0.2639 0.2725 0.2639 0.1523
2.7428 23.89 144 2.4916 0.2870 0.2114 0.2870 0.1811
2.4183 24.89 150 2.4273 0.3079 0.2322 0.3079 0.2048
2.4183 25.89 156 2.3923 0.3194 0.2937 0.3194 0.2238
2.5064 26.89 162 2.3349 0.3403 0.3183 0.3403 0.2494
2.5064 27.89 168 2.2977 0.3542 0.3554 0.3542 0.2663
2.4046 28.89 174 2.2363 0.3773 0.3214 0.3773 0.2981
2.1201 29.89 180 2.1791 0.3889 0.4024 0.3889 0.3179
2.1201 30.89 186 2.1448 0.4144 0.4079 0.4144 0.3455
2.1705 31.89 192 2.0969 0.4306 0.4214 0.4306 0.3583
2.1705 32.89 198 2.0535 0.4468 0.4448 0.4468 0.3797
2.0295 33.89 204 1.9940 0.4745 0.4877 0.4745 0.4133
1.8114 34.89 210 1.9467 0.4861 0.4952 0.4861 0.4261
1.8114 35.89 216 1.8896 0.4931 0.4510 0.4931 0.4321
1.8048 36.89 222 1.8404 0.5046 0.4859 0.5046 0.4507
1.8048 37.89 228 1.7999 0.5278 0.5142 0.5278 0.4816
1.6862 38.89 234 1.7363 0.5324 0.5169 0.5324 0.4844
1.4545 39.89 240 1.7104 0.5440 0.5100 0.5440 0.4971
1.4545 40.89 246 1.6492 0.5648 0.5239 0.5648 0.5138
1.4444 41.89 252 1.6076 0.5671 0.5329 0.5671 0.5260
1.4444 42.89 258 1.5784 0.5741 0.5708 0.5741 0.5424
1.3124 43.89 264 1.5259 0.6019 0.5977 0.6019 0.5619
1.1645 44.89 270 1.4814 0.6181 0.6033 0.6181 0.5880
1.1645 45.89 276 1.4697 0.6088 0.6033 0.6088 0.5803
1.1307 46.89 282 1.4380 0.6088 0.6015 0.6088 0.5769
1.1307 47.89 288 1.3872 0.6227 0.6085 0.6227 0.5917
1.0347 48.89 294 1.3709 0.6157 0.6039 0.6157 0.5880
0.8962 49.89 300 1.3415 0.6296 0.6120 0.6296 0.6057
0.8962 50.89 306 1.3290 0.6389 0.6327 0.6389 0.6134
0.8898 51.89 312 1.2836 0.6389 0.6192 0.6389 0.6119
0.8898 52.89 318 1.2665 0.6412 0.6186 0.6412 0.6162
0.7886 53.89 324 1.2272 0.6551 0.6431 0.6551 0.6319
0.6794 54.89 330 1.2144 0.6806 0.6643 0.6806 0.6629
0.6794 55.89 336 1.1817 0.6806 0.6666 0.6806 0.6642
0.6459 56.89 342 1.1702 0.6782 0.6591 0.6782 0.6574
0.6459 57.89 348 1.0947 0.7037 0.6863 0.7037 0.6883
0.6075 58.89 354 1.1227 0.7037 0.6874 0.7037 0.6867
0.4979 59.89 360 1.0849 0.7083 0.6813 0.7083 0.6895
0.4979 60.89 366 1.0742 0.7153 0.6924 0.7153 0.6976
0.4895 61.89 372 1.0452 0.7245 0.7020 0.7245 0.7057
0.4895 62.89 378 1.0435 0.7361 0.7316 0.7361 0.7235
0.456 63.89 384 1.0698 0.6921 0.6835 0.6921 0.6783
0.3816 64.89 390 1.0126 0.7222 0.7064 0.7222 0.7091
0.3816 65.89 396 0.9934 0.7361 0.7247 0.7361 0.7205
0.3599 66.89 402 0.9960 0.7292 0.7213 0.7292 0.7170
0.3599 67.89 408 1.0141 0.7222 0.7148 0.7222 0.7087
0.3484 68.89 414 0.9934 0.7222 0.7125 0.7222 0.7107
0.2939 69.89 420 0.9835 0.7431 0.7417 0.7431 0.7349
0.2939 70.89 426 0.9870 0.7315 0.7275 0.7315 0.7217
0.285 71.89 432 0.9656 0.7431 0.7411 0.7431 0.7340
0.285 72.89 438 0.9462 0.7338 0.7320 0.7338 0.7267
0.2463 73.89 444 0.9513 0.7454 0.7467 0.7454 0.7384
0.2328 74.89 450 0.9334 0.7361 0.7389 0.7361 0.7286
0.2328 75.89 456 0.9375 0.7384 0.7278 0.7384 0.7291
0.2208 76.89 462 0.9332 0.7407 0.7357 0.7407 0.7322
0.2208 77.89 468 0.9408 0.7384 0.7406 0.7384 0.7346
0.2177 78.89 474 0.9059 0.7222 0.7183 0.7222 0.7136
0.1734 79.89 480 0.9517 0.7315 0.7371 0.7315 0.7257
0.1734 80.89 486 0.9063 0.7523 0.7462 0.7523 0.7424
0.1791 81.89 492 0.9171 0.7454 0.7461 0.7454 0.7386
0.1791 82.89 498 0.8846 0.7523 0.7561 0.7523 0.7485
0.1681 83.89 504 0.8871 0.7384 0.7431 0.7384 0.7320
0.1573 84.89 510 0.9118 0.7454 0.7474 0.7454 0.7395
0.1573 85.89 516 0.9006 0.7407 0.7432 0.7407 0.7366
0.1439 86.89 522 0.8703 0.7616 0.7693 0.7616 0.7579
0.1439 87.89 528 0.8988 0.7454 0.7570 0.7454 0.7401
0.1362 88.89 534 0.9234 0.7454 0.7477 0.7454 0.7396
0.1249 89.89 540 0.8860 0.7500 0.7473 0.7500 0.7425
0.1249 90.89 546 0.8608 0.7546 0.7601 0.7546 0.7513
0.1264 91.89 552 0.8871 0.7593 0.7640 0.7593 0.7560
0.1264 92.89 558 0.8432 0.7639 0.7727 0.7639 0.7599
0.1201 93.89 564 0.8654 0.7639 0.7698 0.7639 0.7569
0.1117 94.89 570 0.8856 0.7454 0.7569 0.7454 0.7415
0.1117 95.89 576 0.8668 0.7546 0.7686 0.7546 0.7535
0.1128 96.89 582 0.8630 0.7662 0.7698 0.7662 0.7619
0.1128 97.89 588 0.8551 0.7731 0.7826 0.7731 0.7696
0.1155 98.89 594 0.8697 0.7708 0.7738 0.7708 0.7643
0.0987 99.89 600 0.8613 0.7546 0.7518 0.7546 0.7484
0.0987 100.89 606 0.8742 0.7569 0.7597 0.7569 0.7524
0.1063 101.89 612 0.8498 0.7755 0.7807 0.7755 0.7712
0.1063 102.89 618 0.8557 0.7708 0.7749 0.7708 0.7655
0.097 103.89 624 0.8764 0.7546 0.7634 0.7546 0.7527
0.0947 104.89 630 0.8677 0.7616 0.7628 0.7616 0.7572
0.0947 105.89 636 0.8909 0.7500 0.7614 0.7500 0.7469
0.1013 106.89 642 0.8283 0.7639 0.7621 0.7639 0.7580
0.1013 107.89 648 0.8471 0.7662 0.7864 0.7662 0.7651
0.0963 108.89 654 0.8653 0.7593 0.7701 0.7593 0.7558
0.0874 109.89 660 0.8479 0.7731 0.7834 0.7731 0.7692
0.0874 110.89 666 0.8584 0.7639 0.7719 0.7639 0.7620
0.0876 111.89 672 0.8714 0.7616 0.7600 0.7616 0.7550
0.0876 112.89 678 0.8509 0.7731 0.7847 0.7731 0.7727
0.0974 113.89 684 0.8688 0.7685 0.7741 0.7685 0.7648
0.0869 114.89 690 0.8590 0.7847 0.7932 0.7847 0.7794
0.0869 115.89 696 0.8687 0.7593 0.7703 0.7593 0.7579
0.0877 116.89 702 0.8735 0.7593 0.7698 0.7593 0.7554
0.0877 117.89 708 0.8566 0.7546 0.7732 0.7546 0.7518
0.0883 118.89 714 0.8681 0.7569 0.7591 0.7569 0.7525
0.0762 119.89 720 0.8508 0.7755 0.7715 0.7755 0.7676
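
In every row of the table, recall equals accuracy, which is consistent with weighted averaging of the per-class metrics. Under that assumption, the reported metrics could be produced by a compute_metrics function along these lines (a scikit-learn-based sketch, not the original training code):

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Return accuracy, weighted precision, recall and F1 for a Trainer evaluation step."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```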

Framework versions

  • Transformers 4.25.1
  • PyTorch 1.13.1+cu117
  • Datasets 2.8.0
  • Tokenizers 0.11.0