
swin-tiny-patch4-window7-224-finetuned-woody_LeftGR_clean_130epochs

This model is a fine-tuned version of microsoft/swin-tiny-patch4-window7-224 on an image dataset loaded with the imagefolder loader. It achieves the following results on the evaluation set, corresponding to the best checkpoint (epoch ~97, step 5044) in the training table below:

  • Loss: 0.4504
  • Accuracy: 0.9023

Model description

More information needed

Intended uses & limitations

More information needed
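
The card does not yet document intended uses, but as a fine-tuned Swin image classifier the model can be exercised through the standard transformers image-classification pipeline. A minimal inference sketch follows; the repository id and image path are placeholders rather than values from this card:

```python
from transformers import pipeline

# Placeholder repository id: substitute the actual namespace/name of this checkpoint.
classifier = pipeline(
    "image-classification",
    model="your-namespace/swin-tiny-patch4-window7-224-finetuned-woody_LeftGR_clean_130epochs",
)

# The pipeline resizes inputs to the model's 224x224 resolution automatically.
for prediction in classifier("example.jpg"):
    print(f"{prediction['label']}: {prediction['score']:.4f}")
```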

Training and evaluation data

More information needed
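
Although the card leaves this section empty, the imagefolder loader named above builds a labeled dataset from a directory tree with one subdirectory per class. A minimal sketch, using a hypothetical data_dir (the actual path and class names are not stated in this card):

```python
from datasets import load_dataset

# Hypothetical layout: data/train/<class>/*.jpg and data/validation/<class>/*.jpg;
# the "imagefolder" builder infers class labels from the subdirectory names.
dataset = load_dataset("imagefolder", data_dir="data")
print(dataset)                    # DatasetDict with the splits found under data/
print(dataset["train"].features)  # an 'image' column plus a ClassLabel 'label' column
```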

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a matching TrainingArguments sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 130
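
As a rough guide, the settings above map onto a transformers.TrainingArguments configuration along the following lines. This is a minimal sketch assuming the standard Trainer recipe for image classification; the output directory and the evaluation/checkpointing flags are assumptions, not values from this card:

```python
from transformers import TrainingArguments

# total_train_batch_size = train_batch_size * gradient_accumulation_steps = 32 * 4 = 128
training_args = TrainingArguments(
    output_dir="swin-tiny-finetuned",  # assumed; not stated in the card
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,
    num_train_epochs=130,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the Trainer default optimizer.
    evaluation_strategy="epoch",       # assumed from the per-epoch validation table below
    save_strategy="epoch",             # assumed
    load_best_model_at_end=True,       # assumed: reported metrics match epoch ~97, not epoch 130
    metric_for_best_model="accuracy",  # assumed
)
```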

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6569 | 0.99 | 52 | 0.6227 | 0.6720 |
| 0.6069 | 1.99 | 104 | 0.5891 | 0.6934 |
| 0.6044 | 2.99 | 156 | 0.5543 | 0.7202 |
| 0.5898 | 3.99 | 208 | 0.5440 | 0.7229 |
| 0.5774 | 4.99 | 260 | 0.5360 | 0.7282 |
| 0.5912 | 5.99 | 312 | 0.5466 | 0.7189 |
| 0.5685 | 6.99 | 364 | 0.5184 | 0.7336 |
| 0.5604 | 7.99 | 416 | 0.5138 | 0.7550 |
| 0.5455 | 8.99 | 468 | 0.5157 | 0.7376 |
| 0.5462 | 9.99 | 520 | 0.5078 | 0.7657 |
| 0.5729 | 10.99 | 572 | 0.4957 | 0.7523 |
| 0.5555 | 11.99 | 624 | 0.5016 | 0.7564 |
| 0.5291 | 12.99 | 676 | 0.5665 | 0.7323 |
| 0.524 | 13.99 | 728 | 0.5431 | 0.7390 |
| 0.5194 | 14.99 | 780 | 0.5019 | 0.7430 |
| 0.5368 | 15.99 | 832 | 0.4810 | 0.7724 |
| 0.4917 | 16.99 | 884 | 0.4793 | 0.7711 |
| 0.4892 | 17.99 | 936 | 0.4981 | 0.7631 |
| 0.5117 | 18.99 | 988 | 0.4969 | 0.7510 |
| 0.5033 | 19.99 | 1040 | 0.4711 | 0.7671 |
| 0.4807 | 20.99 | 1092 | 0.4959 | 0.7724 |
| 0.493 | 21.99 | 1144 | 0.4509 | 0.7898 |
| 0.4887 | 22.99 | 1196 | 0.4791 | 0.7738 |
| 0.4517 | 23.99 | 1248 | 0.4722 | 0.7831 |
| 0.4617 | 24.99 | 1300 | 0.4344 | 0.7992 |
| 0.4609 | 25.99 | 1352 | 0.4647 | 0.7952 |
| 0.4365 | 26.99 | 1404 | 0.4459 | 0.7912 |
| 0.4515 | 27.99 | 1456 | 0.5217 | 0.7644 |
| 0.4538 | 28.99 | 1508 | 0.4375 | 0.8166 |
| 0.4371 | 29.99 | 1560 | 0.4406 | 0.8005 |
| 0.4228 | 30.99 | 1612 | 0.4383 | 0.7912 |
| 0.4347 | 31.99 | 1664 | 0.4246 | 0.8153 |
| 0.4354 | 32.99 | 1716 | 0.4606 | 0.8112 |
| 0.4194 | 33.99 | 1768 | 0.4371 | 0.8112 |
| 0.4073 | 34.99 | 1820 | 0.4436 | 0.8126 |
| 0.3935 | 35.99 | 1872 | 0.4255 | 0.8273 |
| 0.3862 | 36.99 | 1924 | 0.4054 | 0.8233 |
| 0.3739 | 37.99 | 1976 | 0.4206 | 0.8126 |
| 0.3794 | 38.99 | 2028 | 0.4075 | 0.8220 |
| 0.3713 | 39.99 | 2080 | 0.3787 | 0.8353 |
| 0.3901 | 40.99 | 2132 | 0.3840 | 0.8246 |
| 0.3514 | 41.99 | 2184 | 0.4136 | 0.8367 |
| 0.3718 | 42.99 | 2236 | 0.3867 | 0.8394 |
| 0.3699 | 43.99 | 2288 | 0.3737 | 0.8487 |
| 0.3314 | 44.99 | 2340 | 0.3756 | 0.8527 |
| 0.3167 | 45.99 | 2392 | 0.4211 | 0.8474 |
| 0.301 | 46.99 | 2444 | 0.3870 | 0.8434 |
| 0.3048 | 47.99 | 2496 | 0.4236 | 0.8461 |
| 0.2735 | 48.99 | 2548 | 0.4122 | 0.8380 |
| 0.3003 | 49.99 | 2600 | 0.3609 | 0.8568 |
| 0.3147 | 50.99 | 2652 | 0.4258 | 0.8367 |
| 0.288 | 51.99 | 2704 | 0.3855 | 0.8394 |
| 0.2895 | 52.99 | 2756 | 0.3543 | 0.8527 |
| 0.2685 | 53.99 | 2808 | 0.3668 | 0.8541 |
| 0.2931 | 54.99 | 2860 | 0.3565 | 0.8541 |
| 0.2966 | 55.99 | 2912 | 0.3985 | 0.8568 |
| 0.2737 | 56.99 | 2964 | 0.4100 | 0.8581 |
| 0.2892 | 57.99 | 3016 | 0.3480 | 0.8768 |
| 0.2753 | 58.99 | 3068 | 0.3726 | 0.8661 |
| 0.2831 | 59.99 | 3120 | 0.3981 | 0.8635 |
| 0.261 | 60.99 | 3172 | 0.4217 | 0.8635 |
| 0.2662 | 61.99 | 3224 | 0.3516 | 0.8728 |
| 0.2464 | 62.99 | 3276 | 0.3821 | 0.8648 |
| 0.256 | 63.99 | 3328 | 0.3970 | 0.8688 |
| 0.2755 | 64.99 | 3380 | 0.4765 | 0.8541 |
| 0.2339 | 65.99 | 3432 | 0.5616 | 0.8541 |
| 0.2344 | 66.99 | 3484 | 0.3887 | 0.8648 |
| 0.1995 | 67.99 | 3536 | 0.4400 | 0.8675 |
| 0.2297 | 68.99 | 3588 | 0.4290 | 0.8688 |
| 0.227 | 69.99 | 3640 | 0.4521 | 0.8701 |
| 0.2084 | 70.99 | 3692 | 0.3855 | 0.8782 |
| 0.2225 | 71.99 | 3744 | 0.4201 | 0.8742 |
| 0.1897 | 72.99 | 3796 | 0.5138 | 0.8501 |
| 0.2136 | 73.99 | 3848 | 0.4111 | 0.8849 |
| 0.2155 | 74.99 | 3900 | 0.3800 | 0.8862 |
| 0.2338 | 75.99 | 3952 | 0.4014 | 0.8835 |
| 0.2021 | 76.99 | 4004 | 0.4214 | 0.8929 |
| 0.2028 | 77.99 | 4056 | 0.3997 | 0.8795 |
| 0.2162 | 78.99 | 4108 | 0.4911 | 0.8782 |
| 0.1889 | 79.99 | 4160 | 0.4651 | 0.8701 |
| 0.2056 | 80.99 | 4212 | 0.4156 | 0.8862 |
| 0.206 | 81.99 | 4264 | 0.4330 | 0.8742 |
| 0.1919 | 82.99 | 4316 | 0.4199 | 0.8956 |
| 0.1967 | 83.99 | 4368 | 0.4615 | 0.8822 |
| 0.2083 | 84.99 | 4420 | 0.4585 | 0.8715 |
| 0.1888 | 85.99 | 4472 | 0.5748 | 0.8728 |
| 0.1744 | 86.99 | 4524 | 0.4458 | 0.8902 |
| 0.1789 | 87.99 | 4576 | 0.4858 | 0.8688 |
| 0.1992 | 88.99 | 4628 | 0.5018 | 0.8715 |
| 0.1742 | 89.99 | 4680 | 0.5066 | 0.8755 |
| 0.1822 | 90.99 | 4732 | 0.4269 | 0.8929 |
| 0.1883 | 91.99 | 4784 | 0.4550 | 0.8795 |
| 0.1741 | 92.99 | 4836 | 0.4107 | 0.8942 |
| 0.1574 | 93.99 | 4888 | 0.5604 | 0.8809 |
| 0.193 | 94.99 | 4940 | 0.4775 | 0.8889 |
| 0.2018 | 95.99 | 4992 | 0.4200 | 0.8996 |
| 0.1832 | 96.99 | 5044 | 0.4504 | 0.9023 |
| 0.1624 | 97.99 | 5096 | 0.4859 | 0.8889 |
| 0.1739 | 98.99 | 5148 | 0.4955 | 0.8849 |
| 0.1439 | 99.99 | 5200 | 0.4792 | 0.8942 |
| 0.1716 | 100.99 | 5252 | 0.5112 | 0.8862 |
| 0.1537 | 101.99 | 5304 | 0.4572 | 0.8916 |
| 0.1655 | 102.99 | 5356 | 0.4774 | 0.8809 |
| 0.1515 | 103.99 | 5408 | 0.4635 | 0.8889 |
| 0.1594 | 104.99 | 5460 | 0.4794 | 0.8929 |
| 0.1488 | 105.99 | 5512 | 0.4941 | 0.8969 |
| 0.1634 | 106.99 | 5564 | 0.4841 | 0.8916 |
| 0.1471 | 107.99 | 5616 | 0.4919 | 0.9009 |
| 0.1453 | 108.99 | 5668 | 0.4617 | 0.9009 |
| 0.1578 | 109.99 | 5720 | 0.4328 | 0.9009 |
| 0.1754 | 110.99 | 5772 | 0.5240 | 0.8956 |
| 0.1657 | 111.99 | 5824 | 0.4821 | 0.8969 |
| 0.1516 | 112.99 | 5876 | 0.4411 | 0.9023 |
| 0.1542 | 113.99 | 5928 | 0.5313 | 0.8822 |
| 0.1496 | 114.99 | 5980 | 0.5038 | 0.8862 |
| 0.1597 | 115.99 | 6032 | 0.4908 | 0.8876 |
| 0.1175 | 116.99 | 6084 | 0.5504 | 0.8862 |
| 0.1415 | 117.99 | 6136 | 0.5018 | 0.8916 |
| 0.1614 | 118.99 | 6188 | 0.5221 | 0.8902 |
| 0.1396 | 119.99 | 6240 | 0.5042 | 0.8902 |
| 0.1673 | 120.99 | 6292 | 0.5078 | 0.8876 |
| 0.1303 | 121.99 | 6344 | 0.4994 | 0.8942 |
| 0.1355 | 122.99 | 6396 | 0.4834 | 0.8942 |
| 0.1452 | 123.99 | 6448 | 0.5145 | 0.8889 |
| 0.142 | 124.99 | 6500 | 0.5480 | 0.8822 |
| 0.1318 | 125.99 | 6552 | 0.5099 | 0.8916 |
| 0.122 | 126.99 | 6604 | 0.5159 | 0.8876 |
| 0.1678 | 127.99 | 6656 | 0.5080 | 0.8916 |
| 0.1444 | 128.99 | 6708 | 0.5114 | 0.8902 |
| 0.1282 | 129.99 | 6760 | 0.5224 | 0.8889 |

Framework versions

  • Transformers 4.24.0
  • PyTorch 1.12.1+cu113
  • Datasets 2.7.0
  • Tokenizers 0.13.2