# swin-tiny-patch4-window7-224-finetuned-eurosat_animals
This model is a fine-tuned version of swin-tiny-patch4-window7-224 on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 0.0772
- Accuracy: 0.9817
## Model description
More information needed
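While the description is pending, the model name itself encodes the key Swin-Tiny dimensions: 4×4 input patches, 7×7 local attention windows, and 224×224 input resolution. A quick sketch of what these imply for a single input image, assuming the standard Swin configuration the name suggests:

```python
# Dimensions taken from the model name (swin-tiny-patch4-window7-224).
image_size = 224   # input resolution
patch_size = 4     # patch embedding size
window_size = 7    # local attention window

# The patch embedding turns the image into a 56x56 grid of tokens,
# which self-attention then processes in non-overlapping 8x8 windows.
patches_per_side = image_size // patch_size         # 224 / 4 = 56
windows_per_side = patches_per_side // window_size  # 56 / 7 = 8

print(patches_per_side, windows_per_side)  # 56 8
```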
## Intended uses & limitations
More information needed
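The card does not yet include a usage snippet. A minimal inference sketch using the transformers `pipeline` API (the repository id below is a placeholder; substitute wherever this checkpoint is actually hosted):

```python
from transformers import pipeline

# Placeholder repo id -- replace with the real location of this checkpoint.
classifier = pipeline(
    "image-classification",
    model="your-username/swin-tiny-patch4-window7-224-finetuned-eurosat_animals",
)

# Accepts a local file path, a URL, or a PIL image.
predictions = classifier("path/to/image.jpg")
print(predictions)  # list of {'label': ..., 'score': ...} dicts
```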
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
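The derived values in the list above follow from the base settings; a quick consistency check, assuming a single device (the total step count of 3700 is taken from the final row of the results table below):

```python
train_batch_size = 128
gradient_accumulation_steps = 4
warmup_ratio = 0.1
total_steps = 3700  # final step in the training log

# Effective batch size per optimizer update (single device assumed).
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 512

# The linear schedule warms up over the first 10% of training steps.
warmup_steps = int(warmup_ratio * total_steps)
print(warmup_steps)  # 370
```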
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy |
---|---|---|---|---|
2.1783 | 0.98 | 37 | 2.0846 | 0.1985 |
1.6593 | 1.99 | 75 | 1.2526 | 0.7228 |
0.47 | 2.99 | 113 | 0.2017 | 0.9494 |
0.2459 | 4.0 | 151 | 0.1004 | 0.9728 |
0.1891 | 4.98 | 188 | 0.0877 | 0.9785 |
0.1638 | 5.99 | 226 | 0.0666 | 0.9831 |
0.154 | 6.99 | 264 | 0.0693 | 0.9803 |
0.1542 | 8.0 | 302 | 0.0646 | 0.9822 |
0.1305 | 8.98 | 339 | 0.0663 | 0.9822 |
0.1337 | 9.99 | 377 | 0.0593 | 0.9841 |
0.129 | 10.99 | 415 | 0.0593 | 0.9836 |
0.1179 | 12.0 | 453 | 0.0598 | 0.9836 |
0.1025 | 12.98 | 490 | 0.0636 | 0.9813 |
0.0993 | 13.99 | 528 | 0.0637 | 0.9817 |
0.1043 | 14.99 | 566 | 0.0578 | 0.9827 |
0.0996 | 16.0 | 604 | 0.0644 | 0.9831 |
0.0866 | 16.98 | 641 | 0.0813 | 0.9785 |
0.0879 | 17.99 | 679 | 0.0734 | 0.9813 |
0.0812 | 18.99 | 717 | 0.0639 | 0.9855 |
0.0864 | 20.0 | 755 | 0.0619 | 0.9860 |
0.086 | 20.98 | 792 | 0.0693 | 0.9794 |
0.0781 | 21.99 | 830 | 0.0638 | 0.9831 |
0.0826 | 22.99 | 868 | 0.0681 | 0.9822 |
0.074 | 24.0 | 906 | 0.0687 | 0.9817 |
0.0828 | 24.98 | 943 | 0.0738 | 0.9836 |
0.0727 | 25.99 | 981 | 0.0655 | 0.9827 |
0.0692 | 26.99 | 1019 | 0.0713 | 0.9836 |
0.0696 | 28.0 | 1057 | 0.0729 | 0.9817 |
0.0792 | 28.98 | 1094 | 0.0707 | 0.9836 |
0.0657 | 29.99 | 1132 | 0.0647 | 0.9827 |
0.0747 | 30.99 | 1170 | 0.0769 | 0.9808 |
0.0861 | 32.0 | 1208 | 0.0665 | 0.9841 |
0.0693 | 32.98 | 1245 | 0.0617 | 0.9850 |
0.0682 | 33.99 | 1283 | 0.0636 | 0.9855 |
0.0615 | 34.99 | 1321 | 0.0685 | 0.9841 |
0.0581 | 36.0 | 1359 | 0.0702 | 0.9822 |
0.0713 | 36.98 | 1396 | 0.0675 | 0.9855 |
0.0593 | 37.99 | 1434 | 0.0697 | 0.9827 |
0.0543 | 38.99 | 1472 | 0.0701 | 0.9831 |
0.0628 | 40.0 | 1510 | 0.0720 | 0.9799 |
0.0606 | 40.98 | 1547 | 0.0794 | 0.9808 |
0.0619 | 41.99 | 1585 | 0.0720 | 0.9827 |
0.0612 | 42.99 | 1623 | 0.0768 | 0.9813 |
0.0435 | 44.0 | 1661 | 0.0748 | 0.9831 |
0.0614 | 44.98 | 1698 | 0.0738 | 0.9841 |
0.0544 | 45.99 | 1736 | 0.0750 | 0.9822 |
0.0569 | 46.99 | 1774 | 0.0802 | 0.9803 |
0.0527 | 48.0 | 1812 | 0.0772 | 0.9831 |
0.0535 | 48.98 | 1849 | 0.0724 | 0.9831 |
0.063 | 49.99 | 1887 | 0.0736 | 0.9831 |
0.0534 | 50.99 | 1925 | 0.0767 | 0.9822 |
0.0515 | 52.0 | 1963 | 0.0736 | 0.9817 |
0.0522 | 52.98 | 2000 | 0.0739 | 0.9827 |
0.0474 | 53.99 | 2038 | 0.0687 | 0.9831 |
0.0515 | 54.99 | 2076 | 0.0675 | 0.9846 |
0.0558 | 56.0 | 2114 | 0.0676 | 0.9822 |
0.0461 | 56.98 | 2151 | 0.0714 | 0.9813 |
0.0532 | 57.99 | 2189 | 0.0753 | 0.9803 |
0.0539 | 58.99 | 2227 | 0.0826 | 0.9803 |
0.0428 | 60.0 | 2265 | 0.0785 | 0.9827 |
0.0361 | 60.98 | 2302 | 0.0821 | 0.9813 |
0.0515 | 61.99 | 2340 | 0.0813 | 0.9817 |
0.047 | 62.99 | 2378 | 0.0817 | 0.9813 |
0.046 | 64.0 | 2416 | 0.0765 | 0.9827 |
0.039 | 64.98 | 2453 | 0.0794 | 0.9827 |
0.0399 | 65.99 | 2491 | 0.0805 | 0.9822 |
0.0478 | 66.99 | 2529 | 0.0758 | 0.9817 |
0.0426 | 68.0 | 2567 | 0.0726 | 0.9831 |
0.0383 | 68.98 | 2604 | 0.0785 | 0.9827 |
0.0407 | 69.99 | 2642 | 0.0795 | 0.9813 |
0.0413 | 70.99 | 2680 | 0.0774 | 0.9822 |
0.0428 | 72.0 | 2718 | 0.0765 | 0.9827 |
0.0375 | 72.98 | 2755 | 0.0730 | 0.9836 |
0.0428 | 73.99 | 2793 | 0.0715 | 0.9841 |
0.0443 | 74.99 | 2831 | 0.0744 | 0.9841 |
0.0383 | 76.0 | 2869 | 0.0748 | 0.9822 |
0.0338 | 76.98 | 2906 | 0.0818 | 0.9808 |
0.0445 | 77.99 | 2944 | 0.0759 | 0.9817 |
0.0374 | 78.99 | 2982 | 0.0757 | 0.9827 |
0.0404 | 80.0 | 3020 | 0.0793 | 0.9813 |
0.0336 | 80.98 | 3057 | 0.0750 | 0.9827 |
0.0364 | 81.99 | 3095 | 0.0816 | 0.9813 |
0.0403 | 82.99 | 3133 | 0.0795 | 0.9822 |
0.0287 | 84.0 | 3171 | 0.0818 | 0.9803 |
0.0425 | 84.98 | 3208 | 0.0819 | 0.9803 |
0.0446 | 85.99 | 3246 | 0.0775 | 0.9813 |
0.0341 | 86.99 | 3284 | 0.0772 | 0.9803 |
0.0414 | 88.0 | 3322 | 0.0757 | 0.9813 |
0.0401 | 88.98 | 3359 | 0.0751 | 0.9822 |
0.0442 | 89.99 | 3397 | 0.0775 | 0.9827 |
0.046 | 90.99 | 3435 | 0.0794 | 0.9808 |
0.0378 | 92.0 | 3473 | 0.0773 | 0.9822 |
0.0343 | 92.98 | 3510 | 0.0759 | 0.9817 |
0.0348 | 93.99 | 3548 | 0.0766 | 0.9822 |
0.0409 | 94.99 | 3586 | 0.0773 | 0.9817 |
0.0389 | 96.0 | 3624 | 0.0770 | 0.9822 |
0.0362 | 96.98 | 3661 | 0.0773 | 0.9817 |
0.0317 | 97.99 | 3699 | 0.0772 | 0.9817 |
0.0244 | 98.01 | 3700 | 0.0772 | 0.9817 |
### Framework versions
- Transformers 4.38.1
- Pytorch 2.1.2+cu121
- Datasets 2.17.1
- Tokenizers 0.15.2