swin-tiny-patch4-window7-224-finetuned-woody_130epochs

This model is a fine-tuned version of microsoft/swin-tiny-patch4-window7-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4550
  • Accuracy: 0.8921
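Once published on the Hub, the checkpoint can be loaded with the Transformers image-classification pipeline. A minimal sketch, assuming a Hub repo under a placeholder namespace and a local image file (both are assumptions, not taken from this card):

```python
from transformers import pipeline

# "<namespace>" is a placeholder: substitute the actual Hub user or organization.
classifier = pipeline(
    "image-classification",
    model="<namespace>/swin-tiny-patch4-window7-224-finetuned-woody_130epochs",
)

# Returns a list of {"label": ..., "score": ...} dicts, highest score first.
print(classifier("path/to/image.jpg"))
```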

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 130
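As a rough illustration, these values map onto `TrainingArguments` as sketched below; `output_dir` and the surrounding `Trainer` wiring (model, datasets, metrics) are assumptions, not taken from this card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-finetuned-woody_130epochs",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 * 4 = 128 effective train batch size
    adam_beta1=0.9,                 # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=130,
)
```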

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6694 | 1.0 | 58 | 0.6370 | 0.6594 |
| 0.6072 | 2.0 | 116 | 0.5813 | 0.7030 |
| 0.6048 | 3.0 | 174 | 0.5646 | 0.7030 |
| 0.5849 | 4.0 | 232 | 0.5778 | 0.6970 |
| 0.5671 | 5.0 | 290 | 0.5394 | 0.7236 |
| 0.5575 | 6.0 | 348 | 0.5212 | 0.7382 |
| 0.5680 | 7.0 | 406 | 0.5218 | 0.7358 |
| 0.5607 | 8.0 | 464 | 0.5183 | 0.7527 |
| 0.5351 | 9.0 | 522 | 0.5138 | 0.7467 |
| 0.5459 | 10.0 | 580 | 0.5290 | 0.7394 |
| 0.5454 | 11.0 | 638 | 0.5212 | 0.7345 |
| 0.5291 | 12.0 | 696 | 0.5130 | 0.7576 |
| 0.5378 | 13.0 | 754 | 0.5372 | 0.7503 |
| 0.5264 | 14.0 | 812 | 0.6089 | 0.6861 |
| 0.4909 | 15.0 | 870 | 0.4852 | 0.7636 |
| 0.5591 | 16.0 | 928 | 0.4817 | 0.7600 |
| 0.4966 | 17.0 | 986 | 0.5673 | 0.6933 |
| 0.4988 | 18.0 | 1044 | 0.5131 | 0.7418 |
| 0.5339 | 19.0 | 1102 | 0.4998 | 0.7394 |
| 0.4804 | 20.0 | 1160 | 0.4655 | 0.7733 |
| 0.5030 | 21.0 | 1218 | 0.4554 | 0.7685 |
| 0.4859 | 22.0 | 1276 | 0.4713 | 0.7770 |
| 0.5040 | 23.0 | 1334 | 0.4545 | 0.7721 |
| 0.4780 | 24.0 | 1392 | 0.4658 | 0.7830 |
| 0.4759 | 25.0 | 1450 | 0.4365 | 0.8012 |
| 0.4686 | 26.0 | 1508 | 0.4452 | 0.7855 |
| 0.4668 | 27.0 | 1566 | 0.4427 | 0.7879 |
| 0.4615 | 28.0 | 1624 | 0.4439 | 0.7685 |
| 0.4588 | 29.0 | 1682 | 0.4378 | 0.7830 |
| 0.4588 | 30.0 | 1740 | 0.4229 | 0.7988 |
| 0.4296 | 31.0 | 1798 | 0.4188 | 0.7976 |
| 0.4208 | 32.0 | 1856 | 0.4316 | 0.7891 |
| 0.4481 | 33.0 | 1914 | 0.4331 | 0.7891 |
| 0.4253 | 34.0 | 1972 | 0.4524 | 0.7879 |
| 0.4117 | 35.0 | 2030 | 0.4570 | 0.7952 |
| 0.4405 | 36.0 | 2088 | 0.4307 | 0.7927 |
| 0.4154 | 37.0 | 2146 | 0.4257 | 0.8024 |
| 0.3962 | 38.0 | 2204 | 0.5077 | 0.7818 |
| 0.4140 | 39.0 | 2262 | 0.4602 | 0.8012 |
| 0.3937 | 40.0 | 2320 | 0.4741 | 0.7770 |
| 0.4186 | 41.0 | 2378 | 0.4250 | 0.8000 |
| 0.4076 | 42.0 | 2436 | 0.4353 | 0.7988 |
| 0.3777 | 43.0 | 2494 | 0.4442 | 0.7879 |
| 0.3968 | 44.0 | 2552 | 0.4525 | 0.7879 |
| 0.3770 | 45.0 | 2610 | 0.4198 | 0.7988 |
| 0.3780 | 46.0 | 2668 | 0.4297 | 0.8097 |
| 0.3675 | 47.0 | 2726 | 0.4435 | 0.8085 |
| 0.3562 | 48.0 | 2784 | 0.4477 | 0.7952 |
| 0.3810 | 49.0 | 2842 | 0.4206 | 0.8255 |
| 0.3603 | 50.0 | 2900 | 0.4136 | 0.8109 |
| 0.3331 | 51.0 | 2958 | 0.4141 | 0.8230 |
| 0.3471 | 52.0 | 3016 | 0.4253 | 0.8109 |
| 0.3460 | 53.0 | 3074 | 0.5203 | 0.8048 |
| 0.3481 | 54.0 | 3132 | 0.4288 | 0.8242 |
| 0.3411 | 55.0 | 3190 | 0.4416 | 0.8194 |
| 0.3275 | 56.0 | 3248 | 0.4149 | 0.8291 |
| 0.3067 | 57.0 | 3306 | 0.4623 | 0.8218 |
| 0.3166 | 58.0 | 3364 | 0.4432 | 0.8255 |
| 0.3294 | 59.0 | 3422 | 0.4599 | 0.8267 |
| 0.3146 | 60.0 | 3480 | 0.4266 | 0.8291 |
| 0.3091 | 61.0 | 3538 | 0.4318 | 0.8315 |
| 0.3277 | 62.0 | 3596 | 0.4252 | 0.8242 |
| 0.2960 | 63.0 | 3654 | 0.4332 | 0.8436 |
| 0.3241 | 64.0 | 3712 | 0.4729 | 0.8194 |
| 0.3104 | 65.0 | 3770 | 0.4228 | 0.8448 |
| 0.2878 | 66.0 | 3828 | 0.4173 | 0.8388 |
| 0.2650 | 67.0 | 3886 | 0.4210 | 0.8497 |
| 0.3011 | 68.0 | 3944 | 0.4276 | 0.8436 |
| 0.2861 | 69.0 | 4002 | 0.4923 | 0.8315 |
| 0.2994 | 70.0 | 4060 | 0.4472 | 0.8182 |
| 0.2760 | 71.0 | 4118 | 0.4541 | 0.8315 |
| 0.2796 | 72.0 | 4176 | 0.4218 | 0.8521 |
| 0.2727 | 73.0 | 4234 | 0.4053 | 0.8448 |
| 0.2550 | 74.0 | 4292 | 0.4356 | 0.8376 |
| 0.2760 | 75.0 | 4350 | 0.4193 | 0.8436 |
| 0.2610 | 76.0 | 4408 | 0.4484 | 0.8533 |
| 0.2416 | 77.0 | 4466 | 0.4722 | 0.8194 |
| 0.2602 | 78.0 | 4524 | 0.4431 | 0.8533 |
| 0.2591 | 79.0 | 4582 | 0.4269 | 0.8606 |
| 0.2613 | 80.0 | 4640 | 0.4335 | 0.8485 |
| 0.2555 | 81.0 | 4698 | 0.4269 | 0.8594 |
| 0.2832 | 82.0 | 4756 | 0.3968 | 0.8715 |
| 0.2640 | 83.0 | 4814 | 0.4173 | 0.8703 |
| 0.2462 | 84.0 | 4872 | 0.4150 | 0.8606 |
| 0.2424 | 85.0 | 4930 | 0.4377 | 0.8630 |
| 0.2574 | 86.0 | 4988 | 0.4120 | 0.8679 |
| 0.2273 | 87.0 | 5046 | 0.4393 | 0.8533 |
| 0.2334 | 88.0 | 5104 | 0.4366 | 0.8630 |
| 0.2258 | 89.0 | 5162 | 0.4189 | 0.8630 |
| 0.2153 | 90.0 | 5220 | 0.4474 | 0.8630 |
| 0.2462 | 91.0 | 5278 | 0.4362 | 0.8642 |
| 0.2356 | 92.0 | 5336 | 0.4454 | 0.8715 |
| 0.2019 | 93.0 | 5394 | 0.4413 | 0.8800 |
| 0.2090 | 94.0 | 5452 | 0.4410 | 0.8703 |
| 0.2201 | 95.0 | 5510 | 0.4323 | 0.8691 |
| 0.2245 | 96.0 | 5568 | 0.4999 | 0.8618 |
| 0.2178 | 97.0 | 5626 | 0.4612 | 0.8655 |
| 0.2163 | 98.0 | 5684 | 0.4340 | 0.8703 |
| 0.2228 | 99.0 | 5742 | 0.4504 | 0.8788 |
| 0.2151 | 100.0 | 5800 | 0.4602 | 0.8703 |
| 0.1988 | 101.0 | 5858 | 0.4414 | 0.8812 |
| 0.2227 | 102.0 | 5916 | 0.4392 | 0.8824 |
| 0.1772 | 103.0 | 5974 | 0.5069 | 0.8630 |
| 0.2199 | 104.0 | 6032 | 0.4648 | 0.8667 |
| 0.1936 | 105.0 | 6090 | 0.4806 | 0.8691 |
| 0.1990 | 106.0 | 6148 | 0.4569 | 0.8764 |
| 0.2149 | 107.0 | 6206 | 0.4445 | 0.8739 |
| 0.1917 | 108.0 | 6264 | 0.4444 | 0.8727 |
| 0.2010 | 109.0 | 6322 | 0.4594 | 0.8727 |
| 0.1938 | 110.0 | 6380 | 0.4564 | 0.8764 |
| 0.1977 | 111.0 | 6438 | 0.4398 | 0.8739 |
| 0.1776 | 112.0 | 6496 | 0.4356 | 0.8800 |
| 0.1939 | 113.0 | 6554 | 0.4412 | 0.8848 |
| 0.1780 | 114.0 | 6612 | 0.4373 | 0.8800 |
| 0.1926 | 115.0 | 6670 | 0.4508 | 0.8812 |
| 0.1979 | 116.0 | 6728 | 0.4477 | 0.8848 |
| 0.1958 | 117.0 | 6786 | 0.4488 | 0.8897 |
| 0.1890 | 118.0 | 6844 | 0.4553 | 0.8836 |
| 0.1838 | 119.0 | 6902 | 0.4605 | 0.8848 |
| 0.1755 | 120.0 | 6960 | 0.4463 | 0.8836 |
| 0.1958 | 121.0 | 7018 | 0.4474 | 0.8861 |
| 0.1857 | 122.0 | 7076 | 0.4550 | 0.8921 |
| 0.1466 | 123.0 | 7134 | 0.4494 | 0.8885 |
| 0.1751 | 124.0 | 7192 | 0.4560 | 0.8873 |
| 0.1750 | 125.0 | 7250 | 0.4383 | 0.8897 |
| 0.2070 | 126.0 | 7308 | 0.4601 | 0.8873 |
| 0.1756 | 127.0 | 7366 | 0.4425 | 0.8897 |
| 0.1695 | 128.0 | 7424 | 0.4533 | 0.8909 |
| 0.1873 | 129.0 | 7482 | 0.4510 | 0.8897 |
| 0.1726 | 130.0 | 7540 | 0.4463 | 0.8909 |
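The evaluation metrics reported at the top of this card (loss 0.4550, accuracy 0.8921) match the epoch-122 row above.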

Framework versions

  • Transformers 4.23.1
  • Pytorch 1.12.1+cu113
  • Datasets 2.6.1
  • Tokenizers 0.13.1