
swin-tiny-patch4-window7-224-finetuned-st-wsdmhar-xyz-auc

This model is a fine-tuned version of microsoft/swin-tiny-patch4-window7-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1060
  • Accuracy: 0.9766
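
The card does not document usage, but the checkpoint can be loaded like any other Swin image-classification model. Below is a minimal sketch using the transformers pipeline API; the image path is a placeholder, and inputs are assumed to be the same kind of 224x224 images the model was fine-tuned on.

```python
from transformers import pipeline

# Load this fine-tuned checkpoint for image classification.
classifier = pipeline(
    "image-classification",
    model="ayubkfupm/swin-tiny-patch4-window7-224-finetuned-st-wsdmhar-xyz-auc",
)

# "example.png" is a placeholder path; replace it with an image of the
# same kind the model was fine-tuned on.
print(classifier("example.png"))  # list of {"label": ..., "score": ...} dicts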

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
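
A minimal sketch of how these settings map onto transformers.TrainingArguments. The output_dir and the per-epoch evaluation strategy are assumptions, not values reported on this card; the total train batch size of 128 follows from the per-device batch size of 32 combined with 4 gradient-accumulation steps on a single device.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-tiny-finetuned",  # placeholder, not from the card
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,     # 32 * 4 = 128 effective train batch size
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    eval_strategy="epoch",             # assumption: the table below reports per-epoch validation
)
```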

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---------------|-------|------|-----------------|----------|
| 1.586 | 1.0 | 53 | 1.4366 | 0.4559 |
| 0.9024 | 2.0 | 106 | 0.7526 | 0.7018 |
| 0.6157 | 3.0 | 159 | 0.5375 | 0.7944 |
| 0.5165 | 4.0 | 212 | 0.4298 | 0.8306 |
| 0.4315 | 5.0 | 265 | 0.3646 | 0.8609 |
| 0.3687 | 6.0 | 318 | 0.3054 | 0.8877 |
| 0.3352 | 7.0 | 371 | 0.2822 | 0.9005 |
| 0.3186 | 8.0 | 424 | 0.2764 | 0.9043 |
| 0.3155 | 9.0 | 477 | 0.2409 | 0.9215 |
| 0.2824 | 10.0 | 530 | 0.2459 | 0.9236 |
| 0.2575 | 11.0 | 583 | 0.2346 | 0.9129 |
| 0.2384 | 12.0 | 636 | 0.2445 | 0.9012 |
| 0.2117 | 13.0 | 689 | 0.1838 | 0.9342 |
| 0.2172 | 14.0 | 742 | 0.1789 | 0.9384 |
| 0.1918 | 15.0 | 795 | 0.1615 | 0.9480 |
| 0.1909 | 16.0 | 848 | 0.1516 | 0.9473 |
| 0.1911 | 17.0 | 901 | 0.1513 | 0.9494 |
| 0.2124 | 18.0 | 954 | 0.1524 | 0.9494 |
| 0.1631 | 19.0 | 1007 | 0.1729 | 0.9339 |
| 0.1766 | 20.0 | 1060 | 0.1329 | 0.9539 |
| 0.168 | 21.0 | 1113 | 0.1235 | 0.9590 |
| 0.1227 | 22.0 | 1166 | 0.1390 | 0.9483 |
| 0.1705 | 23.0 | 1219 | 0.1290 | 0.9566 |
| 0.1296 | 24.0 | 1272 | 0.1119 | 0.9621 |
| 0.1201 | 25.0 | 1325 | 0.1452 | 0.9497 |
| 0.1233 | 26.0 | 1378 | 0.1440 | 0.9487 |
| 0.1412 | 27.0 | 1431 | 0.1206 | 0.9573 |
| 0.1031 | 28.0 | 1484 | 0.1235 | 0.9580 |
| 0.1131 | 29.0 | 1537 | 0.1377 | 0.9501 |
| 0.1157 | 30.0 | 1590 | 0.1308 | 0.9580 |
| 0.0925 | 31.0 | 1643 | 0.1172 | 0.9601 |
| 0.0864 | 32.0 | 1696 | 0.1135 | 0.9621 |
| 0.0748 | 33.0 | 1749 | 0.0987 | 0.9656 |
| 0.1004 | 34.0 | 1802 | 0.0924 | 0.9728 |
| 0.0858 | 35.0 | 1855 | 0.1058 | 0.9659 |
| 0.0976 | 36.0 | 1908 | 0.1180 | 0.9587 |
| 0.0797 | 37.0 | 1961 | 0.1035 | 0.9676 |
| 0.0884 | 38.0 | 2014 | 0.0909 | 0.9707 |
| 0.0841 | 39.0 | 2067 | 0.0979 | 0.9707 |
| 0.0633 | 40.0 | 2120 | 0.0943 | 0.9697 |
| 0.0601 | 41.0 | 2173 | 0.1017 | 0.9687 |
| 0.0693 | 42.0 | 2226 | 0.1160 | 0.9652 |
| 0.0715 | 43.0 | 2279 | 0.0980 | 0.9704 |
| 0.0807 | 44.0 | 2332 | 0.1030 | 0.9711 |
| 0.0614 | 45.0 | 2385 | 0.0999 | 0.9707 |
| 0.0639 | 46.0 | 2438 | 0.1265 | 0.9632 |
| 0.0623 | 47.0 | 2491 | 0.1195 | 0.9614 |
| 0.0444 | 48.0 | 2544 | 0.1338 | 0.9659 |
| 0.0551 | 49.0 | 2597 | 0.1042 | 0.9728 |
| 0.0588 | 50.0 | 2650 | 0.0987 | 0.9731 |
| 0.0421 | 51.0 | 2703 | 0.1306 | 0.9607 |
| 0.0446 | 52.0 | 2756 | 0.1035 | 0.9718 |
| 0.0489 | 53.0 | 2809 | 0.1084 | 0.9714 |
| 0.0529 | 54.0 | 2862 | 0.1225 | 0.9663 |
| 0.0403 | 55.0 | 2915 | 0.1053 | 0.9711 |
| 0.0455 | 56.0 | 2968 | 0.1436 | 0.9645 |
| 0.0436 | 57.0 | 3021 | 0.1052 | 0.9714 |
| 0.0416 | 58.0 | 3074 | 0.1132 | 0.9666 |
| 0.0378 | 59.0 | 3127 | 0.1055 | 0.9721 |
| 0.0545 | 60.0 | 3180 | 0.1166 | 0.9704 |
| 0.0315 | 61.0 | 3233 | 0.1073 | 0.9711 |
| 0.0433 | 62.0 | 3286 | 0.1012 | 0.9735 |
| 0.0577 | 63.0 | 3339 | 0.1117 | 0.9714 |
| 0.0369 | 64.0 | 3392 | 0.1150 | 0.9697 |
| 0.0459 | 65.0 | 3445 | 0.1054 | 0.9731 |
| 0.0458 | 66.0 | 3498 | 0.1045 | 0.9745 |
| 0.0374 | 67.0 | 3551 | 0.1105 | 0.9725 |
| 0.0318 | 68.0 | 3604 | 0.1138 | 0.9718 |
| 0.0337 | 69.0 | 3657 | 0.1053 | 0.9728 |
| 0.0337 | 70.0 | 3710 | 0.1011 | 0.9738 |
| 0.0329 | 71.0 | 3763 | 0.1067 | 0.9738 |
| 0.0313 | 72.0 | 3816 | 0.1003 | 0.9756 |
| 0.0446 | 73.0 | 3869 | 0.1125 | 0.9714 |
| 0.047 | 74.0 | 3922 | 0.1040 | 0.9707 |
| 0.0256 | 75.0 | 3975 | 0.1165 | 0.9700 |
| 0.0535 | 76.0 | 4028 | 0.1129 | 0.9697 |
| 0.029 | 77.0 | 4081 | 0.1040 | 0.9752 |
| 0.044 | 78.0 | 4134 | 0.1116 | 0.9718 |
| 0.0405 | 79.0 | 4187 | 0.1130 | 0.9725 |
| 0.0417 | 80.0 | 4240 | 0.1094 | 0.9735 |
| 0.0257 | 81.0 | 4293 | 0.1143 | 0.9697 |
| 0.0293 | 82.0 | 4346 | 0.1111 | 0.9735 |
| 0.0234 | 83.0 | 4399 | 0.1253 | 0.9704 |
| 0.0295 | 84.0 | 4452 | 0.1133 | 0.9749 |
| 0.0261 | 85.0 | 4505 | 0.1048 | 0.9738 |
| 0.0215 | 86.0 | 4558 | 0.1072 | 0.9728 |
| 0.0304 | 87.0 | 4611 | 0.1061 | 0.9731 |
| 0.02 | 88.0 | 4664 | 0.1072 | 0.9742 |
| 0.0353 | 89.0 | 4717 | 0.1096 | 0.9738 |
| 0.0317 | 90.0 | 4770 | 0.1097 | 0.9745 |
| 0.0441 | 91.0 | 4823 | 0.1080 | 0.9745 |
| 0.0262 | 92.0 | 4876 | 0.1051 | 0.9752 |
| 0.0312 | 93.0 | 4929 | 0.1089 | 0.9738 |
| 0.025 | 94.0 | 4982 | 0.1094 | 0.9738 |
| 0.0243 | 95.0 | 5035 | 0.1106 | 0.9745 |
| 0.0245 | 96.0 | 5088 | 0.1076 | 0.9752 |
| 0.0233 | 97.0 | 5141 | 0.1068 | 0.9762 |
| 0.0279 | 98.0 | 5194 | 0.1063 | 0.9759 |
| 0.0285 | 99.0 | 5247 | 0.1057 | 0.9766 |
| 0.022 | 100.0 | 5300 | 0.1060 | 0.9766 |
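
Accuracy in this table is presumably top-1 accuracy over the validation split. The sketch below shows the conventional compute_metrics hook used with Trainer for image classification; the use of the evaluate library is an assumption, not something stated on this card.

```python
import numpy as np
import evaluate

# Assumption: top-1 accuracy, as in the standard image-classification
# fine-tuning examples.
accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair supplied by Trainer.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```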

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.5.0.dev20240829+cu118
  • Datasets 2.19.2
  • Tokenizers 0.19.1
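
To reproduce these results it may help to match the environment above. The snippet below simply reports installed versions against the ones on the card; the listed PyTorch build is a nightly, so a nearby stable 2.5.x release is a reasonable substitute.

```python
import datasets
import tokenizers
import torch
import transformers

# Versions reported on this card.
expected = {
    "transformers": "4.44.2",
    "datasets": "2.19.2",
    "tokenizers": "0.19.1",
    "torch": "2.5.0.dev20240829+cu118",  # nightly build; an exact match is optional
}
installed = {
    "transformers": transformers.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
    "torch": torch.__version__,
}
for name, want in expected.items():
    print(f"{name}: installed {installed[name]}, card reports {want}")
```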