
resnet-50-finetuned-pokemon-finetuned-pokemon

This model is a fine-tuned version of TeeA/resnet-50-finetuned-pokemon on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 14.1746
  • Accuracy: 0.0849

Model description

More information needed

Intended uses & limitations

More information needed
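
Although this section is left unfilled in the source card, a minimal inference sketch is shown below. It assumes the checkpoint is a standard `transformers` image-classification model; the repo id and the image path are placeholders to substitute with real values.

```python
from transformers import pipeline

# Hypothetical repo id; replace with the actual location of this checkpoint.
classifier = pipeline(
    "image-classification",
    model="TeeA/resnet-50-finetuned-pokemon-finetuned-pokemon",
)

# "pikachu.png" is a placeholder; any local path, URL, or PIL image works.
predictions = classifier("pikachu.png")
print(predictions)  # e.g. [{"label": ..., "score": ...}, ...]
```

Given the evaluation accuracy reported above (0.0849), predictions from this checkpoint should be treated with caution.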

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
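
For reference, a minimal sketch of a `TrainingArguments` configuration matching the values above (the `output_dir` is a placeholder, and the Adam settings listed above are the `Trainer` defaults):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="resnet-50-finetuned-pokemon-finetuned-pokemon",  # placeholder
    learning_rate=5e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 32 * 4 = 128
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```

Note that `total_train_batch_size: 128` is not set directly; it is the product of `train_batch_size` (32) and `gradient_accumulation_steps` (4).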

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.1894 | 0.99 | 38 | 9.2115 | 0.0137 |
| 1.1389 | 1.99 | 76 | 9.2521 | 0.0129 |
| 1.0432 | 2.98 | 114 | 9.4765 | 0.0144 |
| 1.0625 | 4.0 | 153 | 9.7668 | 0.0137 |
| 1.0805 | 4.99 | 191 | 10.2526 | 0.0137 |
| 1.0353 | 5.99 | 229 | 10.3238 | 0.0129 |
| 0.9747 | 6.98 | 267 | 10.5779 | 0.0165 |
| 0.9708 | 8.0 | 306 | 10.7458 | 0.0180 |
| 0.8886 | 8.99 | 344 | 11.0072 | 0.0194 |
| 0.8408 | 9.99 | 382 | 11.3171 | 0.0223 |
| 0.802 | 10.98 | 420 | 11.5545 | 0.0245 |
| 0.7903 | 12.0 | 459 | 11.7722 | 0.0288 |
| 0.7553 | 12.99 | 497 | 11.9834 | 0.0353 |
| 0.7413 | 13.99 | 535 | 11.9815 | 0.0446 |
| 0.6272 | 14.98 | 573 | 12.0871 | 0.0496 |
| 0.6944 | 16.0 | 612 | 12.3713 | 0.0590 |
| 0.6322 | 16.99 | 650 | 12.6826 | 0.0554 |
| 0.6131 | 17.99 | 688 | 12.4819 | 0.0612 |
| 0.5916 | 18.98 | 726 | 12.6246 | 0.0647 |
| 0.5094 | 20.0 | 765 | 12.6641 | 0.0669 |
| 0.5201 | 20.99 | 803 | 12.8861 | 0.0662 |
| 0.4731 | 21.99 | 841 | 12.7431 | 0.0655 |
| 0.5132 | 22.98 | 879 | 12.7786 | 0.0705 |
| 0.5036 | 24.0 | 918 | 12.9990 | 0.0727 |
| 0.4863 | 24.99 | 956 | 13.0419 | 0.0727 |
| 0.4852 | 25.99 | 994 | 13.0573 | 0.0734 |
| 0.4983 | 26.98 | 1032 | 13.1310 | 0.0719 |
| 0.459 | 28.0 | 1071 | 13.0688 | 0.0748 |
| 0.4556 | 28.99 | 1109 | 13.4128 | 0.0748 |
| 0.4729 | 29.99 | 1147 | 13.3530 | 0.0741 |
| 0.4659 | 30.98 | 1185 | 13.2308 | 0.0763 |
| 0.4337 | 32.0 | 1224 | 13.3264 | 0.0748 |
| 0.456 | 32.99 | 1262 | 13.3506 | 0.0741 |
| 0.4423 | 33.99 | 1300 | 13.3607 | 0.0784 |
| 0.4037 | 34.98 | 1338 | 13.2521 | 0.0734 |
| 0.3891 | 36.0 | 1377 | 13.3702 | 0.0777 |
| 0.3992 | 36.99 | 1415 | 13.4762 | 0.0777 |
| 0.4014 | 37.99 | 1453 | 13.5382 | 0.0791 |
| 0.3549 | 38.98 | 1491 | 13.5550 | 0.0791 |
| 0.4048 | 40.0 | 1530 | 13.6406 | 0.0799 |
| 0.3711 | 40.99 | 1568 | 13.5120 | 0.0777 |
| 0.3834 | 41.99 | 1606 | 13.9230 | 0.0799 |
| 0.3475 | 42.98 | 1644 | 13.8602 | 0.0791 |
| 0.3465 | 44.0 | 1683 | 13.6931 | 0.0806 |
| 0.3682 | 44.99 | 1721 | 13.7774 | 0.0784 |
| 0.3613 | 45.99 | 1759 | 14.0235 | 0.0791 |
| 0.368 | 46.98 | 1797 | 13.9289 | 0.0813 |
| 0.3961 | 48.0 | 1836 | 14.2549 | 0.0806 |
| 0.365 | 48.99 | 1874 | 14.1114 | 0.0813 |
| 0.3259 | 49.99 | 1912 | 13.9710 | 0.0806 |
| 0.2998 | 50.98 | 1950 | 14.0288 | 0.0806 |
| 0.3203 | 52.0 | 1989 | 13.9398 | 0.0813 |
| 0.3104 | 52.99 | 2027 | 14.0255 | 0.0820 |
| 0.3232 | 53.99 | 2065 | 13.9355 | 0.0827 |
| 0.3521 | 54.98 | 2103 | 13.8627 | 0.0806 |
| 0.3322 | 56.0 | 2142 | 14.0179 | 0.0806 |
| 0.3129 | 56.99 | 2180 | 13.9640 | 0.0820 |
| 0.3159 | 57.99 | 2218 | 14.1997 | 0.0799 |
| 0.3118 | 58.98 | 2256 | 14.1639 | 0.0820 |
| 0.3196 | 60.0 | 2295 | 14.0334 | 0.0806 |
| 0.301 | 60.99 | 2333 | 13.9954 | 0.0820 |
| 0.3142 | 61.99 | 2371 | 14.1432 | 0.0799 |
| 0.3192 | 62.98 | 2409 | 14.0269 | 0.0784 |
| 0.3342 | 64.0 | 2448 | 14.0450 | 0.0806 |
| 0.3045 | 64.99 | 2486 | 14.1746 | 0.0849 |
| 0.2991 | 65.99 | 2524 | 14.3192 | 0.0806 |
| 0.3228 | 66.98 | 2562 | 14.1782 | 0.0784 |
| 0.2711 | 68.0 | 2601 | 14.4261 | 0.0849 |
| 0.2473 | 68.99 | 2639 | 14.2303 | 0.0827 |
| 0.3287 | 69.99 | 2677 | 14.2750 | 0.0827 |
| 0.2673 | 70.98 | 2715 | 14.2303 | 0.0820 |
| 0.2843 | 72.0 | 2754 | 14.4086 | 0.0806 |
| 0.3099 | 72.99 | 2792 | 14.5184 | 0.0827 |
| 0.3102 | 73.99 | 2830 | 14.2768 | 0.0835 |
| 0.2911 | 74.98 | 2868 | 14.1010 | 0.0835 |
| 0.2927 | 76.0 | 2907 | 14.4618 | 0.0813 |
| 0.2967 | 76.99 | 2945 | 14.3581 | 0.0820 |
| 0.2446 | 77.99 | 2983 | 14.4562 | 0.0835 |
| 0.3035 | 78.98 | 3021 | 14.2681 | 0.0835 |
| 0.2989 | 80.0 | 3060 | 14.2768 | 0.0827 |
| 0.2486 | 80.99 | 3098 | 14.4242 | 0.0820 |
| 0.2622 | 81.99 | 3136 | 14.3810 | 0.0835 |
| 0.2892 | 82.98 | 3174 | 14.4637 | 0.0827 |
| 0.2668 | 84.0 | 3213 | 14.4597 | 0.0835 |
| 0.2527 | 84.99 | 3251 | 14.3098 | 0.0820 |
| 0.2636 | 85.99 | 3289 | 14.3741 | 0.0835 |
| 0.247 | 86.98 | 3327 | 14.5369 | 0.0842 |
| 0.2693 | 88.0 | 3366 | 14.4039 | 0.0835 |
| 0.2692 | 88.99 | 3404 | 14.6161 | 0.0835 |
| 0.28 | 89.99 | 3442 | 14.5244 | 0.0835 |
| 0.2535 | 90.98 | 3480 | 14.4062 | 0.0842 |
| 0.2887 | 92.0 | 3519 | 14.4113 | 0.0806 |
| 0.257 | 92.99 | 3557 | 14.3442 | 0.0842 |
| 0.2627 | 93.99 | 3595 | 14.4693 | 0.0835 |
| 0.2804 | 94.98 | 3633 | 14.3223 | 0.0835 |
| 0.2529 | 96.0 | 3672 | 14.3844 | 0.0835 |
| 0.2327 | 96.99 | 3710 | 14.4284 | 0.0835 |
| 0.2643 | 97.99 | 3748 | 14.5567 | 0.0835 |
| 0.284 | 98.98 | 3786 | 14.6738 | 0.0813 |
| 0.2503 | 99.35 | 3800 | 14.5363 | 0.0842 |

Framework versions

  • Transformers 4.36.2
  • Pytorch 2.1.0+cu121
  • Datasets 2.16.1
  • Tokenizers 0.15.0
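
To check a local environment against these versions, a minimal sketch (the expected values come from the list above; nearby versions will usually also work):

```python
import datasets
import tokenizers
import torch
import transformers

# Versions this card was produced with.
print(transformers.__version__)  # expected: 4.36.2
print(torch.__version__)         # expected: 2.1.0+cu121
print(datasets.__version__)      # expected: 2.16.1
print(tokenizers.__version__)    # expected: 0.15.0
```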