swin-tiny-patch4-window7-224-seg-swin-amal-finetuned-eurosat

This model is a fine-tuned version of microsoft/swin-tiny-patch4-window7-224 on an image dataset loaded with the imagefolder builder (EuroSAT, per the model name). It achieves the following results on the evaluation set (an inference sketch follows the list):

  • Loss: 5.6236
  • Accuracy: 0.4528
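
For illustration only, here is a minimal sketch of running inference with this checkpoint through the transformers API. The repository id is assumed from the card title and the image path is a placeholder; AutoFeatureExtractor is used to match the Transformers 4.20.x version listed under Framework versions.

```python
from PIL import Image
import torch
from transformers import AutoFeatureExtractor, AutoModelForImageClassification

# Assumed hub path, taken from the card title; replace with the real repo id.
model_id = "swin-tiny-patch4-window7-224-seg-swin-amal-finetuned-eurosat"

extractor = AutoFeatureExtractor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

# Placeholder image path; any RGB image works.
image = Image.open("example.jpg").convert("RGB")
inputs = extractor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label[pred])
```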

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 200
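
The list above corresponds roughly to the transformers.TrainingArguments below. This is a reconstruction rather than the original training script: output_dir is a placeholder, and the Adam betas and epsilon listed above are the library defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the listed hyperparameters.
training_args = TrainingArguments(
    output_dir="swin-finetuned-eurosat",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 * 4 = 128 total train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=200,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the defaults.
)
```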

Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy
2.1452 1.0 268 2.2034 0.2247
0.9905 2.0 536 0.9729 0.7126
0.5262 3.0 804 0.5282 0.8314
0.36 4.0 1072 0.5618 0.8337
0.305 5.0 1340 0.9210 0.6535
0.2669 6.0 1608 1.1776 0.6317
0.2663 7.0 1876 1.2129 0.6290
0.2207 8.0 2144 2.2039 0.4068
0.2178 9.0 2412 1.9747 0.4740
0.1822 10.0 2680 1.4390 0.5526
0.1691 11.0 2948 2.1020 0.3814
0.1731 12.0 3216 2.0999 0.4251
0.1705 13.0 3484 2.4643 0.3700
0.1286 14.0 3752 2.7902 0.4345
0.1511 15.0 4020 2.5151 0.4165
0.1403 16.0 4288 4.4323 0.3099
0.1562 17.0 4556 2.0293 0.5096
0.1233 18.0 4824 2.5863 0.4236
0.1293 19.0 5092 2.6533 0.4506
0.1268 20.0 5360 2.1429 0.4998
0.1464 21.0 5628 2.3014 0.5470
0.1507 22.0 5896 2.3857 0.4911
0.1285 23.0 6164 1.4228 0.6406
0.1364 24.0 6432 3.6147 0.4842
0.1209 25.0 6700 2.4210 0.4896
0.1321 26.0 6968 2.7809 0.5344
0.0944 27.0 7236 3.5598 0.4226
0.1013 28.0 7504 4.0793 0.3905
0.1243 29.0 7772 4.5733 0.3443
0.0962 30.0 8040 2.9494 0.4199
0.0974 31.0 8308 3.1012 0.4496
0.113 32.0 8576 3.9522 0.3764
0.1067 33.0 8844 1.9792 0.6053
0.095 34.0 9112 2.8795 0.5302
0.1015 35.0 9380 5.9943 0.2941
0.0912 36.0 9648 2.9536 0.5242
0.1193 37.0 9916 3.5187 0.4226
0.0906 38.0 10184 3.0049 0.5114
0.1109 39.0 10452 2.6823 0.5675
0.0903 40.0 10720 4.7151 0.3109
0.0846 41.0 10988 3.1118 0.3880
0.0986 42.0 11256 3.9827 0.4792
0.1244 43.0 11524 4.7544 0.2860
0.1039 44.0 11792 4.4297 0.3178
0.077 45.0 12060 5.8973 0.3524
0.0718 46.0 12328 6.0338 0.3033
0.0838 47.0 12596 6.3524 0.3507
0.0935 48.0 12864 3.8675 0.4194
0.0922 49.0 13132 4.7731 0.3129
0.0903 50.0 13400 3.5435 0.4115
0.0927 51.0 13668 4.7606 0.4234
0.0757 52.0 13936 3.4110 0.4436
0.0738 53.0 14204 6.3143 0.3648
0.076 54.0 14472 4.9524 0.3604
0.0951 55.0 14740 5.5633 0.3680
0.1078 56.0 15008 5.9219 0.3082
0.0991 57.0 15276 4.9457 0.3344
0.0968 58.0 15544 4.0270 0.4271
0.0883 59.0 15812 5.3006 0.3574
0.0728 60.0 16080 6.9527 0.3119
0.0803 61.0 16348 2.9117 0.5000
0.1022 62.0 16616 5.1631 0.3487
0.1155 63.0 16884 5.2602 0.3453
0.0737 64.0 17152 6.5281 0.3129
0.0735 65.0 17420 4.9847 0.3945
0.0948 66.0 17688 3.6684 0.4330
0.0765 67.0 17956 4.2188 0.4076
0.0597 68.0 18224 3.0067 0.5208
0.0866 69.0 18492 3.8993 0.4412
0.0825 70.0 18760 3.9058 0.3945
0.0897 71.0 19028 4.5870 0.3932
0.0687 72.0 19296 4.2837 0.3744
0.0774 73.0 19564 4.9028 0.3596
0.0755 74.0 19832 5.1321 0.3356
0.0728 75.0 20100 4.5533 0.3851
0.0753 76.0 20368 4.9765 0.3898
0.0582 77.0 20636 5.1959 0.3777
0.0714 78.0 20904 4.6735 0.3707
0.0928 79.0 21172 3.6359 0.4639
0.0593 80.0 21440 5.1507 0.3841
0.0972 81.0 21708 5.3122 0.3356
0.0903 82.0 21976 3.5833 0.4310
0.074 83.0 22244 2.3014 0.6349
0.0651 84.0 22512 3.8229 0.4387
0.0682 85.0 22780 3.5292 0.4627
0.0543 86.0 23048 4.0542 0.4266
0.0776 87.0 23316 3.8799 0.5240
0.0868 88.0 23584 4.1896 0.4750
0.0711 89.0 23852 3.1013 0.5381
0.077 90.0 24120 2.9132 0.5650
0.0672 91.0 24388 4.4834 0.3806
0.0737 92.0 24656 4.0161 0.5116
0.0868 93.0 24924 2.9386 0.4956
0.0778 94.0 25192 4.4806 0.4478
0.0586 95.0 25460 5.0668 0.4313
0.0713 96.0 25728 6.4632 0.3043
0.0897 97.0 25996 5.0227 0.4674
0.073 98.0 26264 3.6177 0.4854
0.0775 99.0 26532 5.5003 0.3702
0.0709 100.0 26800 5.6101 0.3863
0.078 101.0 27068 4.3187 0.4338
0.0702 102.0 27336 4.8467 0.4545
0.0498 103.0 27604 3.9094 0.4511
0.0785 104.0 27872 4.0952 0.3836
0.0767 105.0 28140 3.2816 0.4909
0.0611 106.0 28408 5.2239 0.4221
0.0753 107.0 28676 4.2586 0.4493
0.0758 108.0 28944 3.6094 0.4938
0.0951 109.0 29212 6.1982 0.3453
0.086 110.0 29480 6.4891 0.3191
0.0701 111.0 29748 5.8145 0.3235
0.0772 112.0 30016 3.7809 0.5133
0.0705 113.0 30284 4.9590 0.4372
0.0602 114.0 30552 5.5669 0.3959
0.0671 115.0 30820 4.4897 0.4429
0.0692 116.0 31088 5.1358 0.3319
0.0675 117.0 31356 5.0169 0.4226
0.0626 118.0 31624 5.6420 0.4170
0.0537 119.0 31892 5.1601 0.3683
0.0543 120.0 32160 5.4460 0.3663
0.0601 121.0 32428 7.2877 0.2981
0.0743 122.0 32696 6.5134 0.3337
0.0558 123.0 32964 4.4690 0.4469
0.0396 124.0 33232 4.4964 0.4212
0.0704 125.0 33500 4.5766 0.4011
0.0547 126.0 33768 4.0679 0.4538
0.0643 127.0 34036 3.3335 0.4545
0.0709 128.0 34304 3.6568 0.4750
0.0932 129.0 34572 4.7978 0.4614
0.0522 130.0 34840 6.1548 0.3366
0.0592 131.0 35108 5.0728 0.4409
0.0528 132.0 35376 5.5127 0.4088
0.087 133.0 35644 4.5838 0.3900
0.0566 134.0 35912 4.8733 0.3683
0.0474 135.0 36180 3.4370 0.4348
0.0517 136.0 36448 4.5547 0.3908
0.0627 137.0 36716 4.7011 0.4048
0.0693 138.0 36984 4.8039 0.4419
0.0753 139.0 37252 4.6905 0.4674
0.0542 140.0 37520 4.4103 0.4278
0.0629 141.0 37788 4.5332 0.4402
0.0636 142.0 38056 4.4822 0.4288
0.0551 143.0 38324 5.3970 0.3885
0.0677 144.0 38592 4.9337 0.3811
0.037 145.0 38860 4.7588 0.3979
0.0426 146.0 39128 4.5055 0.4110
0.0624 147.0 39396 4.9575 0.3722
0.0799 148.0 39664 3.9235 0.4350
0.0643 149.0 39932 3.2063 0.5297
0.0687 150.0 40200 3.1733 0.5692
0.0652 151.0 40468 3.8738 0.5178
0.078 152.0 40736 2.7892 0.5319
0.0644 153.0 41004 3.4909 0.5185
0.0639 154.0 41272 3.7233 0.5005
0.0517 155.0 41540 4.9475 0.4152
0.0546 156.0 41808 5.0784 0.4251
0.0704 157.0 42076 5.3511 0.3987
0.0753 158.0 42344 5.0345 0.4538
0.0504 159.0 42612 4.1655 0.4701
0.0645 160.0 42880 3.9242 0.4936
0.0543 161.0 43148 4.8499 0.4533
0.0592 162.0 43416 5.1871 0.4345
0.0716 163.0 43684 5.4487 0.4325
0.0613 164.0 43952 4.3626 0.4711
0.0616 165.0 44220 4.8649 0.4807
0.0506 166.0 44488 4.1038 0.5133
0.0802 167.0 44756 5.0038 0.4889
0.0672 168.0 45024 6.3643 0.4009
0.0562 169.0 45292 6.1359 0.4372
0.0367 170.0 45560 5.6726 0.4340
0.0687 171.0 45828 5.2015 0.4254
0.061 172.0 46096 5.0398 0.4491
0.0444 173.0 46364 5.8819 0.4414
0.0685 174.0 46632 6.0729 0.4263
0.0548 175.0 46900 5.6388 0.4298
0.084 176.0 47168 6.3042 0.4090
0.0575 177.0 47436 6.3381 0.4019
0.0678 178.0 47704 6.3679 0.4100
0.0445 179.0 47972 6.3634 0.4152
0.081 180.0 48240 6.4057 0.4051
0.0643 181.0 48508 6.6593 0.3648
0.0497 182.0 48776 6.7469 0.3799
0.0568 183.0 49044 5.9056 0.4221
0.0513 184.0 49312 6.4656 0.4046
0.0496 185.0 49580 6.1444 0.4140
0.0524 186.0 49848 5.9295 0.4357
0.0746 187.0 50116 5.6245 0.4612
0.0489 188.0 50384 5.6278 0.4476
0.0589 189.0 50652 5.6629 0.4595
0.0365 190.0 50920 5.9882 0.4392
0.0456 191.0 51188 6.0186 0.4496
0.0486 192.0 51456 5.6916 0.4427
0.0658 193.0 51724 5.7638 0.4461
0.0599 194.0 51992 5.7886 0.4387
0.0522 195.0 52260 5.7112 0.4464
0.0556 196.0 52528 5.7411 0.4419
0.0681 197.0 52796 5.6449 0.4516
0.0649 198.0 53064 5.6714 0.4508
0.0582 199.0 53332 5.6241 0.4521
0.0727 200.0 53600 5.6236 0.4528

Framework versions

  • Transformers 4.20.1
  • Pytorch 1.12.1+cu102
  • Datasets 2.3.2
  • Tokenizers 0.12.1
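
Results may differ under newer library versions; a quick sanity check against the pinned versions (assuming the packages are installed) might look like:

```python
import datasets
import tokenizers
import torch
import transformers

# Versions this card was generated with; newer releases may change behavior.
expected = {
    "transformers": "4.20.1",
    "torch": "1.12.1+cu102",
    "datasets": "2.3.2",
    "tokenizers": "0.12.1",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name in expected:
    print(f"{name}: {installed[name]} (card used {expected[name]})")
```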