cards-top_left_swin-tiny-patch4-window7-224-finetuned-dough_100_epoch

This model is a fine-tuned version of microsoft/swin-tiny-patch4-window7-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0369
  • Accuracy: 0.5816

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
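
How these settings combine can be sketched in plain Python: gradient accumulation multiplies the per-device batch size up to the reported total, and the `linear` scheduler with `warmup_ratio: 0.1` ramps the learning rate up over the first 10% of steps and then decays it linearly to zero. The helper below is a hypothetical illustration of that schedule (mirroring the shape of the `transformers` linear schedule), not code from the actual training run:

```python
train_batch_size = 32
gradient_accumulation_steps = 4

# Gradients are accumulated over 4 micro-batches of 32 images each,
# so each optimizer step sees an effective batch of 128.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
assert total_train_batch_size == 128

def linear_schedule_lr(step, total_steps, base_lr=5e-5, warmup_ratio=0.1):
    """Linear warmup for the first warmup_ratio of steps, then linear decay to 0."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)

total_steps = 124000  # final step count reported in the training results
print(linear_schedule_lr(0, total_steps))       # 0.0 (start of warmup)
print(linear_schedule_lr(12400, total_steps))   # 5e-05 (peak, end of warmup)
print(linear_schedule_lr(124000, total_steps))  # 0.0 (end of training)
```
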

Training results

| Training Loss | Epoch | Step   | Accuracy | Validation Loss |
|:-------------:|:-----:|:------:|:--------:|:---------------:|
| 1.2196        | 1.0   | 1240   | 0.5816   | 1.0369          |
| 1.2491        | 2.0   | 2481   | 0.5752   | 1.0638          |
| 1.2016        | 3.0   | 3721   | 0.5792   | 1.0546          |
| 1.2234        | 4.0   | 4962   | 0.5810   | 1.0560          |
| 1.2298        | 5.0   | 6202   | 0.5725   | 1.0795          |
| 1.287         | 6.0   | 7443   | 0.5731   | 1.0763          |
| 1.2472        | 7.0   | 8683   | 0.5635   | 1.1067          |
| 1.2171        | 8.0   | 9924   | 0.5775   | 1.0671          |
| 1.3164        | 9.0   | 11164  | 0.5701   | 1.0681          |
| 1.3019        | 10.0  | 12405  | 0.5698   | 1.0824          |
| 1.2977        | 11.0  | 13645  | 0.5694   | 1.0721          |
| 1.2587        | 12.0  | 14886  | 0.5704   | 1.0833          |
| 1.2704        | 13.0  | 16126  | 0.5675   | 1.0934          |
| 1.2604        | 14.0  | 17367  | 0.5730   | 1.0739          |
| 1.2834        | 15.0  | 18607  | 0.5524   | 1.1210          |
| 1.2082        | 16.0  | 19848  | 0.5611   | 1.1271          |
| 1.2307        | 17.0  | 21088  | 0.5720   | 1.1013          |
| 1.2136        | 18.0  | 22329  | 0.5753   | 1.1036          |
| 1.2133        | 19.0  | 23569  | 0.5610   | 1.1350          |
| 1.2478        | 20.0  | 24810  | 0.5676   | 1.1256          |
| 1.2006        | 21.0  | 26050  | 0.5682   | 1.1288          |
| 1.1934        | 22.0  | 27291  | 0.5619   | 1.1472          |
| 1.2136        | 23.0  | 28531  | 0.5713   | 1.1304          |
| 1.2449        | 24.0  | 29772  | 0.5581   | 1.1893          |
| 1.1968        | 25.0  | 31012  | 0.5633   | 1.1754          |
| 1.1582        | 26.0  | 32253  | 0.5651   | 1.1735          |
| 1.1404        | 27.0  | 33493  | 0.5642   | 1.1752          |
| 1.2011        | 28.0  | 34734  | 0.5538   | 1.2227          |
| 1.1223        | 29.0  | 35974  | 0.5578   | 1.2200          |
| 1.1427        | 30.0  | 37215  | 0.5608   | 1.2028          |
| 1.1751        | 31.0  | 38455  | 0.5635   | 1.2253          |
| 1.1012        | 32.0  | 39696  | 0.5543   | 1.2473          |
| 1.0912        | 33.0  | 40936  | 0.5673   | 1.2370          |
| 1.1085        | 34.0  | 42177  | 0.5534   | 1.2838          |
| 1.099         | 35.0  | 43417  | 0.5526   | 1.2760          |
| 1.1092        | 36.0  | 44658  | 0.5547   | 1.2769          |
| 1.0655        | 37.0  | 45898  | 0.5534   | 1.3178          |
| 1.0861        | 38.0  | 47139  | 0.5585   | 1.2943          |
| 1.0917        | 39.0  | 48379  | 0.5518   | 1.3659          |
| 1.0791        | 40.0  | 49620  | 0.5541   | 1.3413          |
| 1.0356        | 41.0  | 50860  | 0.5495   | 1.3567          |
| 1.0394        | 42.0  | 52101  | 0.5491   | 1.3648          |
| 1.0096        | 43.0  | 53341  | 0.5574   | 1.3671          |
| 1.0736        | 44.0  | 54582  | 0.5468   | 1.4142          |
| 1.0145        | 45.0  | 55822  | 0.5462   | 1.4340          |
| 1.0437        | 46.0  | 57063  | 0.5442   | 1.4734          |
| 0.9771        | 47.0  | 58303  | 0.5446   | 1.4496          |
| 0.9758        | 48.0  | 59544  | 0.5397   | 1.5071          |
| 1.0199        | 49.0  | 60784  | 0.5437   | 1.5119          |
| 0.9898        | 50.0  | 62025  | 0.5428   | 1.5066          |
| 1.0139        | 51.0  | 63265  | 0.5375   | 1.5314          |
| 1.0035        | 52.0  | 64506  | 0.5427   | 1.5604          |
| 0.9786        | 53.0  | 65746  | 0.5396   | 1.5899          |
| 0.9768        | 54.0  | 66987  | 0.5449   | 1.5642          |
| 0.968         | 55.0  | 68227  | 0.5394   | 1.6056          |
| 0.9254        | 56.0  | 69468  | 0.5380   | 1.6091          |
| 0.9764        | 57.0  | 70680  | 0.5340   | 1.6646          |
| 0.8998        | 58.0  | 71921  | 0.5323   | 1.6692          |
| 0.9592        | 59.0  | 73161  | 0.5353   | 1.6395          |
| 0.8722        | 60.0  | 74402  | 0.5393   | 1.6702          |
| 0.888         | 61.0  | 75642  | 0.5336   | 1.6771          |
| 0.872         | 62.0  | 76883  | 0.5331   | 1.6873          |
| 0.9133        | 63.0  | 78123  | 0.5325   | 1.7182          |
| 0.8815        | 64.0  | 79364  | 0.5310   | 1.7375          |
| 0.9144        | 65.0  | 80604  | 0.5337   | 1.7263          |
| 0.8712        | 66.0  | 81845  | 0.5284   | 1.7628          |
| 0.8576        | 67.0  | 83080  | 0.5322   | 1.7786          |
| 0.8677        | 68.0  | 84321  | 0.5327   | 1.7947          |
| 0.8448        | 69.0  | 85561  | 0.5314   | 1.8100          |
| 0.8102        | 70.0  | 86802  | 0.5313   | 1.8256          |
| 0.8438        | 71.0  | 88042  | 0.5273   | 1.8325          |
| 0.8015        | 72.0  | 89283  | 0.5311   | 1.8564          |
| 0.8025        | 73.0  | 90523  | 0.5342   | 1.8451          |
| 0.8295        | 74.0  | 91764  | 0.5305   | 1.8748          |
| 0.8101        | 75.0  | 93004  | 0.5297   | 1.8884          |
| 0.7883        | 76.0  | 94245  | 0.5297   | 1.8777          |
| 0.7989        | 77.0  | 95485  | 0.5262   | 1.9185          |
| 0.7791        | 78.0  | 96726  | 0.5246   | 1.9436          |
| 0.7197        | 79.0  | 97966  | 0.5222   | 1.9615          |
| 0.7639        | 80.0  | 99207  | 0.5213   | 1.9567          |
| 0.7922        | 81.0  | 100447 | 0.5248   | 1.9746          |
| 0.7874        | 82.0  | 101688 | 0.5206   | 1.9960          |
| 0.8155        | 83.0  | 102928 | 0.5211   | 2.0131          |
| 0.7791        | 84.0  | 104169 | 0.5196   | 2.0559          |
| 0.7731        | 85.0  | 105409 | 0.5192   | 2.0255          |
| 0.8018        | 86.0  | 106650 | 0.5216   | 2.0784          |
| 0.777         | 87.0  | 107890 | 0.5224   | 2.0482          |
| 0.7637        | 88.0  | 109131 | 0.5201   | 2.0889          |
| 0.7783        | 89.0  | 110371 | 0.5222   | 2.0663          |
| 0.7156        | 90.0  | 111612 | 0.5200   | 2.0884          |
| 0.702         | 91.0  | 112852 | 0.5215   | 2.1034          |
| 0.7136        | 92.0  | 114093 | 0.5164   | 2.1380          |
| 0.6889        | 93.0  | 115333 | 0.5198   | 2.1321          |
| 0.7117        | 94.0  | 116574 | 0.5186   | 2.1175          |
| 0.6903        | 95.0  | 117814 | 0.5187   | 2.1155          |
| 0.7334        | 96.0  | 119055 | 0.5200   | 2.1197          |
| 0.6684        | 97.0  | 120295 | 0.5192   | 2.1435          |
| 0.7471        | 98.0  | 121536 | 0.5196   | 2.1403          |
| 0.7197        | 99.0  | 122776 | 0.5182   | 2.1465          |
| 0.7026        | 99.99 | 124000 | 0.5186   | 2.1492          |
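
The table shows that validation loss is lowest at the very first epoch and degrades steadily afterwards while training loss keeps falling, which is why the headline evaluation numbers correspond to epoch 1. Selecting the best epoch by validation loss can be sketched as follows (a minimal illustration using a few rows excerpted from the table, in the spirit of a `load_best_model_at_end`-style selection):

```python
# A few (epoch, validation_loss, accuracy) rows excerpted from the table above.
history = [
    (1.0, 1.0369, 0.5816),
    (2.0, 1.0638, 0.5752),
    (50.0, 1.5066, 0.5428),
    (99.99, 2.1492, 0.5186),
]

# Pick the checkpoint with the lowest validation loss.
best_epoch, best_loss, best_acc = min(history, key=lambda row: row[1])
print(best_epoch, best_loss, best_acc)  # 1.0 1.0369 0.5816
```
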

Framework versions

  • Transformers 4.37.2
  • PyTorch 2.0.1+cu117
  • Datasets 2.17.0
  • Tokenizers 0.15.2
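
To reproduce this environment, the versions above can be pinned directly (package names are assumed from the list; the `+cu117` PyTorch build is served from the PyTorch wheel index rather than PyPI):

```shell
pip install "transformers==4.37.2" "datasets==2.17.0" "tokenizers==0.15.2"
# The CUDA 11.7 build of PyTorch comes from the dedicated wheel index:
pip install "torch==2.0.1" --index-url https://download.pytorch.org/whl/cu117
```
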