# Swin-dmae-DA3-N-Colab

This model is a fine-tuned version of microsoft/swinv2-tiny-patch4-window8-256 on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 1.6604
- Accuracy: 0.7826
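
The card itself does not include usage code, so the snippet below is a minimal inference sketch. It assumes the checkpoint is published on the Hugging Face Hub as `Augusto777/Swin-dmae-DA3-N-Colab` and exposes a standard image-classification config; the `example.jpg` path is a placeholder.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Assumed repository ID; adjust if the checkpoint is hosted under a different name.
repo_id = "Augusto777/Swin-dmae-DA3-N-Colab"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("example.jpg").convert("RGB")      # placeholder input image
inputs = processor(images=image, return_tensors="pt")  # resizes/normalizes for the 256x256 backbone

with torch.no_grad():
    logits = model(**inputs).logits

predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```
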
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 120
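
These values map roughly onto a `TrainingArguments` configuration. The sketch below is a hedged reconstruction, not the original training script: the output directory, evaluation/save strategies, and best-model selection are assumptions that are not recorded in the card.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the hyperparameters listed above.
# output_dir and the evaluation/save strategies are assumptions.
training_args = TrainingArguments(
    output_dir="Swin-dmae-DA3-N-Colab",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,   # 16 x 4 = total train batch size of 64
    num_train_epochs=120,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the optimizer defaults.
    evaluation_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,
    metric_for_best_model="accuracy",
)
```
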
### Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy |
---|---|---|---|---|
1.4177 | 0.98 | 22 | 1.3455 | 0.4348 |
1.4043 | 2.0 | 45 | 1.3475 | 0.4565 |
1.3628 | 2.98 | 67 | 1.3397 | 0.4565 |
1.21 | 4.0 | 90 | 1.2593 | 0.4565 |
1.0504 | 4.98 | 112 | 1.1194 | 0.4130 |
0.9129 | 6.0 | 135 | 1.0522 | 0.4130 |
0.7811 | 6.98 | 157 | 1.1184 | 0.4348 |
0.6572 | 8.0 | 180 | 0.9951 | 0.5870 |
0.5207 | 8.98 | 202 | 0.9055 | 0.6522 |
0.6234 | 10.0 | 225 | 0.9277 | 0.6087 |
0.4721 | 10.98 | 247 | 0.8458 | 0.6739 |
0.3944 | 12.0 | 270 | 0.8837 | 0.6522 |
0.3572 | 12.98 | 292 | 0.8719 | 0.7174 |
0.2911 | 14.0 | 315 | 1.0367 | 0.6304 |
0.3224 | 14.98 | 337 | 0.9641 | 0.6087 |
0.2663 | 16.0 | 360 | 1.3670 | 0.5870 |
0.2132 | 16.98 | 382 | 1.3090 | 0.6304 |
0.266 | 18.0 | 405 | 1.1247 | 0.7174 |
0.1929 | 18.98 | 427 | 1.1458 | 0.6087 |
0.1831 | 20.0 | 450 | 1.0267 | 0.7391 |
0.2298 | 20.98 | 472 | 1.1863 | 0.6304 |
0.1825 | 22.0 | 495 | 1.0458 | 0.6957 |
0.1701 | 22.98 | 517 | 1.3520 | 0.6087 |
0.1964 | 24.0 | 540 | 1.3927 | 0.6522 |
0.1731 | 24.98 | 562 | 1.4361 | 0.6522 |
0.1565 | 26.0 | 585 | 1.0449 | 0.6957 |
0.1844 | 26.98 | 607 | 1.3166 | 0.6087 |
0.1187 | 28.0 | 630 | 1.7950 | 0.6304 |
0.129 | 28.98 | 652 | 1.2753 | 0.6957 |
0.1269 | 30.0 | 675 | 1.4244 | 0.6739 |
0.1522 | 30.98 | 697 | 1.4873 | 0.6522 |
0.1497 | 32.0 | 720 | 1.3693 | 0.6739 |
0.1215 | 32.98 | 742 | 1.8168 | 0.6739 |
0.1049 | 34.0 | 765 | 1.2749 | 0.7609 |
0.1013 | 34.98 | 787 | 1.5098 | 0.7609 |
0.1499 | 36.0 | 810 | 1.6464 | 0.6304 |
0.0823 | 36.98 | 832 | 1.7892 | 0.6957 |
0.092 | 38.0 | 855 | 1.6448 | 0.6739 |
0.1076 | 38.98 | 877 | 1.6955 | 0.6304 |
0.1163 | 40.0 | 900 | 1.5780 | 0.6739 |
0.0952 | 40.98 | 922 | 1.8121 | 0.6522 |
0.0833 | 42.0 | 945 | 1.4459 | 0.7174 |
0.1045 | 42.98 | 967 | 1.7307 | 0.6739 |
0.094 | 44.0 | 990 | 1.4970 | 0.7391 |
0.092 | 44.98 | 1012 | 1.5766 | 0.7174 |
0.0863 | 46.0 | 1035 | 1.7600 | 0.6522 |
0.101 | 46.98 | 1057 | 1.4763 | 0.6957 |
0.0995 | 48.0 | 1080 | 2.0018 | 0.6522 |
0.0893 | 48.98 | 1102 | 1.4872 | 0.7391 |
0.0965 | 50.0 | 1125 | 1.6165 | 0.7391 |
0.0595 | 50.98 | 1147 | 1.6608 | 0.7391 |
0.0606 | 52.0 | 1170 | 1.6604 | 0.7826 |
0.0794 | 52.98 | 1192 | 1.9967 | 0.7174 |
0.0919 | 54.0 | 1215 | 1.7728 | 0.6957 |
0.0666 | 54.98 | 1237 | 1.7364 | 0.7391 |
0.0842 | 56.0 | 1260 | 1.6661 | 0.7609 |
0.0781 | 56.98 | 1282 | 1.9340 | 0.7174 |
0.0565 | 58.0 | 1305 | 1.7399 | 0.7391 |
0.0939 | 58.98 | 1327 | 1.6644 | 0.7609 |
0.0666 | 60.0 | 1350 | 1.6804 | 0.6957 |
0.0577 | 60.98 | 1372 | 1.8968 | 0.6957 |
0.0534 | 62.0 | 1395 | 1.8967 | 0.7391 |
0.0592 | 62.98 | 1417 | 2.0113 | 0.7174 |
0.0732 | 64.0 | 1440 | 1.9213 | 0.6522 |
0.0768 | 64.98 | 1462 | 1.8912 | 0.7391 |
0.0415 | 66.0 | 1485 | 1.7955 | 0.7391 |
0.036 | 66.98 | 1507 | 1.6584 | 0.7391 |
0.0617 | 68.0 | 1530 | 1.9461 | 0.7391 |
0.0622 | 68.98 | 1552 | 1.7302 | 0.7826 |
0.0362 | 70.0 | 1575 | 1.7996 | 0.7609 |
0.0526 | 70.98 | 1597 | 1.6479 | 0.7391 |
0.0493 | 72.0 | 1620 | 1.7251 | 0.6739 |
0.0703 | 72.98 | 1642 | 2.0378 | 0.7391 |
0.0692 | 74.0 | 1665 | 2.0999 | 0.6522 |
0.0396 | 74.98 | 1687 | 2.0074 | 0.6739 |
0.0505 | 76.0 | 1710 | 1.7463 | 0.7174 |
0.0512 | 76.98 | 1732 | 1.6401 | 0.7609 |
0.0653 | 78.0 | 1755 | 1.7836 | 0.6957 |
0.0764 | 78.98 | 1777 | 1.7904 | 0.7609 |
0.0598 | 80.0 | 1800 | 1.8720 | 0.7609 |
0.0523 | 80.98 | 1822 | 1.7356 | 0.7174 |
0.0499 | 82.0 | 1845 | 1.9223 | 0.7609 |
0.0841 | 82.98 | 1867 | 1.9060 | 0.7609 |
0.0597 | 84.0 | 1890 | 1.9092 | 0.7174 |
0.0696 | 84.98 | 1912 | 1.9534 | 0.6957 |
0.0566 | 86.0 | 1935 | 1.9356 | 0.7609 |
0.0435 | 86.98 | 1957 | 2.0566 | 0.7391 |
0.024 | 88.0 | 1980 | 1.8717 | 0.7174 |
0.0137 | 88.98 | 2002 | 2.0280 | 0.7174 |
0.0663 | 90.0 | 2025 | 1.8829 | 0.7174 |
0.035 | 90.98 | 2047 | 1.9688 | 0.7174 |
0.0504 | 92.0 | 2070 | 2.0456 | 0.7391 |
0.025 | 92.98 | 2092 | 2.1568 | 0.6304 |
0.0405 | 94.0 | 2115 | 2.0628 | 0.6304 |
0.0247 | 94.98 | 2137 | 2.0686 | 0.6957 |
0.0429 | 96.0 | 2160 | 2.1125 | 0.7174 |
0.0408 | 96.98 | 2182 | 2.1003 | 0.7391 |
0.0385 | 98.0 | 2205 | 2.2997 | 0.6739 |
0.0364 | 98.98 | 2227 | 2.1442 | 0.7174 |
0.0415 | 100.0 | 2250 | 2.1043 | 0.7174 |
0.0175 | 100.98 | 2272 | 2.1847 | 0.6739 |
0.0281 | 102.0 | 2295 | 2.3262 | 0.6522 |
0.0268 | 102.98 | 2317 | 2.2843 | 0.6957 |
0.022 | 104.0 | 2340 | 2.3522 | 0.6957 |
0.0279 | 104.98 | 2362 | 2.4827 | 0.6739 |
0.0188 | 106.0 | 2385 | 2.5688 | 0.6522 |
0.0303 | 106.98 | 2407 | 2.4357 | 0.6522 |
0.0439 | 108.0 | 2430 | 2.4359 | 0.6522 |
0.0422 | 108.98 | 2452 | 2.4725 | 0.6739 |
0.032 | 110.0 | 2475 | 2.2899 | 0.6522 |
0.0414 | 110.98 | 2497 | 2.2685 | 0.6957 |
0.03 | 112.0 | 2520 | 2.3063 | 0.6739 |
0.0293 | 112.98 | 2542 | 2.3524 | 0.6739 |
0.0514 | 114.0 | 2565 | 2.3612 | 0.6739 |
0.0234 | 114.98 | 2587 | 2.3711 | 0.6522 |
0.0476 | 116.0 | 2610 | 2.3548 | 0.6739 |
0.0307 | 116.98 | 2632 | 2.3432 | 0.6739 |
0.028 | 117.33 | 2640 | 2.3432 | 0.6739 |
### Framework versions

- Transformers 4.36.2
- Pytorch 2.1.2+cu118
- Datasets 2.16.1
- Tokenizers 0.15.0