
swin-tiny-patch4-window7-224-finetuned-eurosat_DATA7_20240410

This model was trained from scratch on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0817
  • Accuracy: 0.9731
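
For quick use, a minimal inference sketch with the Transformers image-classification pipeline is shown below. The repo id is a placeholder (the Hub namespace is not stated in this card) and the input image path is hypothetical.

```python
# Minimal inference sketch; the repo id and image path are placeholders.
from transformers import pipeline
from PIL import Image

classifier = pipeline(
    "image-classification",
    model="<your-namespace>/swin-tiny-patch4-window7-224-finetuned-eurosat_DATA7_20240410",
)

image = Image.open("example_patch.png")  # hypothetical input image
predictions = classifier(image)
print(predictions)  # list of {"label": ..., "score": ...} dicts, best prediction first
```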

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 5e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 256
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
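
A minimal sketch of these hyperparameters expressed as TrainingArguments, assuming the run used the standard 🤗 Trainer API; output_dir is a placeholder, and the per-epoch evaluation/save strategies are assumptions inferred from the results table below.

```python
# Sketch only: reproduces the hyperparameters listed above under the assumption
# that the standard Trainer API was used. output_dir is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-finetuned-eurosat_DATA7_20240410",
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=4,   # effective train batch size: 64 * 4 = 256
    num_train_epochs=100,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    evaluation_strategy="epoch",     # assumption: the table below reports one eval per epoch
    save_strategy="epoch",
)
```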

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.98          | 1.0   | 182   | 1.0130          | 0.4827   |
| 0.843         | 2.0   | 365   | 0.8658          | 0.5978   |
| 0.742         | 3.0   | 548   | 0.7068          | 0.6848   |
| 0.69          | 4.0   | 731   | 0.6297          | 0.7254   |
| 0.6413        | 5.0   | 913   | 0.5716          | 0.7571   |
| 0.611         | 6.0   | 1096  | 0.5445          | 0.7671   |
| 0.5844        | 7.0   | 1279  | 0.5181          | 0.7802   |
| 0.5616        | 8.0   | 1462  | 0.4757          | 0.7985   |
| 0.5634        | 9.0   | 1644  | 0.5245          | 0.7896   |
| 0.5223        | 10.0  | 1827  | 0.4991          | 0.7902   |
| 0.4723        | 11.0  | 2010  | 0.4363          | 0.8216   |
| 0.4443        | 12.0  | 2193  | 0.3813          | 0.8403   |
| 0.4538        | 13.0  | 2375  | 0.3500          | 0.8574   |
| 0.4273        | 14.0  | 2558  | 0.3326          | 0.8624   |
| 0.4247        | 15.0  | 2741  | 0.3224          | 0.8684   |
| 0.4063        | 16.0  | 2924  | 0.3096          | 0.8707   |
| 0.367         | 17.0  | 3106  | 0.2713          | 0.8945   |
| 0.3605        | 18.0  | 3289  | 0.3160          | 0.8755   |
| 0.3475        | 19.0  | 3472  | 0.2559          | 0.8995   |
| 0.3262        | 20.0  | 3655  | 0.2437          | 0.9030   |
| 0.3218        | 21.0  | 3837  | 0.2343          | 0.9090   |
| 0.3125        | 22.0  | 4020  | 0.2267          | 0.9113   |
| 0.336         | 23.0  | 4203  | 0.2170          | 0.9138   |
| 0.2813        | 24.0  | 4386  | 0.2062          | 0.9199   |
| 0.2802        | 25.0  | 4568  | 0.1956          | 0.9196   |
| 0.2996        | 26.0  | 4751  | 0.1923          | 0.9244   |
| 0.2699        | 27.0  | 4934  | 0.1934          | 0.9273   |
| 0.2642        | 28.0  | 5117  | 0.1973          | 0.9242   |
| 0.2491        | 29.0  | 5299  | 0.1686          | 0.9394   |
| 0.2611        | 30.0  | 5482  | 0.1793          | 0.9326   |
| 0.2383        | 31.0  | 5665  | 0.1744          | 0.9332   |
| 0.2338        | 32.0  | 5848  | 0.1537          | 0.9448   |
| 0.2225        | 33.0  | 6030  | 0.1569          | 0.9405   |
| 0.2383        | 34.0  | 6213  | 0.1422          | 0.9480   |
| 0.2253        | 35.0  | 6396  | 0.1413          | 0.9455   |
| 0.2257        | 36.0  | 6579  | 0.1535          | 0.9442   |
| 0.2308        | 37.0  | 6761  | 0.1655          | 0.9423   |
| 0.2241        | 38.0  | 6944  | 0.1272          | 0.9530   |
| 0.2253        | 39.0  | 7127  | 0.1464          | 0.9440   |
| 0.1996        | 40.0  | 7310  | 0.1332          | 0.9527   |
| 0.225         | 41.0  | 7492  | 0.1311          | 0.9530   |
| 0.1918        | 42.0  | 7675  | 0.1546          | 0.9459   |
| 0.1937        | 43.0  | 7858  | 0.1388          | 0.9515   |
| 0.2043        | 44.0  | 8041  | 0.1185          | 0.9596   |
| 0.1802        | 45.0  | 8223  | 0.1195          | 0.9557   |
| 0.1821        | 46.0  | 8406  | 0.1152          | 0.9604   |
| 0.1712        | 47.0  | 8589  | 0.1273          | 0.9575   |
| 0.1865        | 48.0  | 8772  | 0.1209          | 0.9565   |
| 0.1706        | 49.0  | 8954  | 0.1057          | 0.9611   |
| 0.1817        | 50.0  | 9137  | 0.1114          | 0.9602   |
| 0.1753        | 51.0  | 9320  | 0.1114          | 0.9621   |
| 0.1826        | 52.0  | 9503  | 0.1055          | 0.9634   |
| 0.178         | 53.0  | 9685  | 0.1069          | 0.9644   |
| 0.1522        | 54.0  | 9868  | 0.1036          | 0.9625   |
| 0.171         | 55.0  | 10051 | 0.1068          | 0.9619   |
| 0.1656        | 56.0  | 10234 | 0.0925          | 0.9644   |
| 0.163         | 57.0  | 10416 | 0.0939          | 0.9677   |
| 0.1631        | 58.0  | 10599 | 0.1130          | 0.9592   |
| 0.169         | 59.0  | 10782 | 0.0907          | 0.9671   |
| 0.1491        | 60.0  | 10965 | 0.1055          | 0.9632   |
| 0.1572        | 61.0  | 11147 | 0.0940          | 0.9638   |
| 0.1617        | 62.0  | 11330 | 0.1008          | 0.9636   |
| 0.1584        | 63.0  | 11513 | 0.0989          | 0.9673   |
| 0.1597        | 64.0  | 11696 | 0.1026          | 0.9648   |
| 0.1466        | 65.0  | 11878 | 0.1008          | 0.9665   |
| 0.1468        | 66.0  | 12061 | 0.0947          | 0.9644   |
| 0.1562        | 67.0  | 12244 | 0.0864          | 0.9707   |
| 0.1589        | 68.0  | 12427 | 0.0980          | 0.9656   |
| 0.1505        | 69.0  | 12609 | 0.0908          | 0.9681   |
| 0.1497        | 70.0  | 12792 | 0.0879          | 0.9690   |
| 0.1362        | 71.0  | 12975 | 0.0864          | 0.9700   |
| 0.1418        | 72.0  | 13158 | 0.0949          | 0.9684   |
| 0.1345        | 73.0  | 13340 | 0.0994          | 0.9681   |
| 0.1333        | 74.0  | 13523 | 0.0859          | 0.9700   |
| 0.1414        | 75.0  | 13706 | 0.0912          | 0.9692   |
| 0.137         | 76.0  | 13889 | 0.0863          | 0.9719   |
| 0.1326        | 77.0  | 14071 | 0.0811          | 0.9707   |
| 0.1429        | 78.0  | 14254 | 0.0875          | 0.9690   |
| 0.1363        | 79.0  | 14437 | 0.0909          | 0.9690   |
| 0.1344        | 80.0  | 14620 | 0.0913          | 0.9692   |
| 0.1221        | 81.0  | 14802 | 0.0908          | 0.9706   |
| 0.1192        | 82.0  | 14985 | 0.0835          | 0.9715   |
| 0.1252        | 83.0  | 15168 | 0.0865          | 0.9711   |
| 0.1404        | 84.0  | 15351 | 0.0922          | 0.9700   |
| 0.124         | 85.0  | 15533 | 0.0845          | 0.9700   |
| 0.1278        | 86.0  | 15716 | 0.0859          | 0.9721   |
| 0.1271        | 87.0  | 15899 | 0.0835          | 0.9725   |
| 0.1254        | 88.0  | 16082 | 0.0843          | 0.9721   |
| 0.1363        | 89.0  | 16264 | 0.0852          | 0.9707   |
| 0.1144        | 90.0  | 16447 | 0.0846          | 0.9729   |
| 0.1217        | 91.0  | 16630 | 0.0822          | 0.9729   |
| 0.1185        | 92.0  | 16813 | 0.0818          | 0.9731   |
| 0.1095        | 93.0  | 16995 | 0.0825          | 0.9725   |
| 0.1181        | 94.0  | 17178 | 0.0811          | 0.9729   |
| 0.1191        | 95.0  | 17361 | 0.0839          | 0.9736   |
| 0.1107        | 96.0  | 17544 | 0.0825          | 0.9729   |
| 0.1093        | 97.0  | 17726 | 0.0825          | 0.9734   |
| 0.1187        | 98.0  | 17909 | 0.0811          | 0.9731   |
| 0.1302        | 99.0  | 18092 | 0.0823          | 0.9727   |
| 0.1146        | 99.59 | 18200 | 0.0817          | 0.9731   |
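
The Accuracy column is consistent with a standard argmax-over-logits accuracy metric evaluated each epoch. A hedged sketch of such a compute_metrics function follows, assuming the `evaluate` library was used (the card does not state how the metric was computed).

```python
# Sketch of an accuracy metric for the Trainer, assuming the `evaluate` library.
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)  # predicted class per example
    return accuracy.compute(predictions=predictions, references=labels)
```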

Framework versions

  • Transformers 4.38.1
  • Pytorch 2.1.2+cu121
  • Datasets 2.17.1
  • Tokenizers 0.15.2