# SWIN_finetuned_linear
This model is a fine-tuned version of [microsoft/swinv2-base-patch4-window12-192-22k](https://huggingface.co/microsoft/swinv2-base-patch4-window12-192-22k) on an unknown dataset. It achieves the following results on the evaluation set:
- Accuracy: 0.6934
- Loss: 2.0638
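The accuracy reported above is top-1 classification accuracy: the fraction of evaluation examples whose highest-scoring class matches the reference label. A minimal, framework-free sketch of that computation (the logits and labels below are made-up toy values):

```python
# Top-1 accuracy: fraction of examples whose highest-scoring
# class index matches the reference label. Toy data for illustration.
def top1_accuracy(logits, labels):
    preds = [max(range(len(row)), key=row.__getitem__) for row in logits]
    correct = sum(p == y for p, y in zip(preds, labels))
    return correct / len(labels)

logits = [
    [0.1, 2.3, 0.4],   # predicted class 1
    [1.9, 0.2, 0.0],   # predicted class 0
    [0.3, 0.1, 0.8],   # predicted class 2
    [0.5, 1.2, 0.9],   # predicted class 1
]
labels = [1, 0, 2, 0]  # last example is misclassified

print(top1_accuracy(logits, labels))  # → 0.75
```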
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0036
- train_batch_size: 256
- eval_batch_size: 256
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 60.0
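With `lr_scheduler_type: linear` and no warmup listed, the learning rate decays linearly from its initial value to zero over the total number of training steps. A pure-Python sketch of that decay, using the card's `learning_rate` and a step count taken from the first results table below (1313 steps/epoch × 60 epochs = 78780):

```python
def linear_lr(step, base_lr=0.0036, total_steps=78780):
    """Linearly decay base_lr to 0 over total_steps.

    base_lr matches the card's learning_rate; total_steps is
    illustrative (60 epochs x 1313 steps/epoch, per the results table).
    """
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))      # 0.0036 (start of training)
print(linear_lr(39390))  # 0.0018 (halfway, epoch 30)
print(linear_lr(78780))  # 0.0    (end of schedule)
```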
### Training results
Training Loss | Epoch | Step | Accuracy | Validation Loss |
---|---|---|---|---|
9.5189 | 1.0 | 1313 | 0.0002 | 9.4097 |
9.1635 | 2.0 | 2626 | 0.0006 | 9.0418 |
8.3432 | 3.0 | 3939 | 0.0065 | 7.8454 |
6.9913 | 4.0 | 5252 | 0.0489 | 6.3118 |
5.5048 | 5.0 | 6565 | 0.1423 | 4.9493 |
4.6895 | 6.0 | 7878 | 0.2450 | 3.9601 |
3.8881 | 7.0 | 9191 | 0.3136 | 3.4186 |
3.391 | 8.0 | 10504 | 0.3766 | 2.9798 |
3.0887 | 9.0 | 11817 | 0.4221 | 2.7054 |
2.7935 | 10.0 | 13130 | 0.4552 | 2.5013 |
2.5629 | 11.0 | 14443 | 0.4804 | 2.3581 |
2.3777 | 12.0 | 15756 | 0.4809 | 2.3543 |
2.2264 | 13.0 | 17069 | 0.5179 | 2.1632 |
2.0932 | 14.0 | 18382 | 0.5219 | 2.1362 |
1.9667 | 15.0 | 19695 | 0.5591 | 1.9567 |
1.8788 | 16.0 | 21008 | 0.5610 | 1.9347 |
1.7705 | 17.0 | 22321 | 0.5684 | 1.9483 |
1.7089 | 18.0 | 23634 | 0.5791 | 1.8928 |
1.6068 | 19.0 | 24947 | 0.5855 | 1.8435 |
1.5572 | 20.0 | 26260 | 0.5880 | 1.8408 |
1.4938 | 21.0 | 27573 | 0.6110 | 1.7413 |
1.4182 | 22.0 | 28886 | 0.6155 | 1.7196 |
1.3784 | 23.0 | 30199 | 0.6238 | 1.7105 |
1.3578 | 24.0 | 31512 | 0.6176 | 1.7759 |
1.2763 | 25.0 | 32825 | 0.6219 | 1.7365 |
1.2484 | 26.0 | 34138 | 0.6199 | 1.7483 |
1.1936 | 27.0 | 35451 | 0.6314 | 1.7003 |
1.1499 | 28.0 | 36764 | 0.6247 | 1.7399 |
1.1418 | 29.0 | 38077 | 0.6317 | 1.7091 |
1.0895 | 30.0 | 39390 | 0.6383 | 1.7166 |
1.0706 | 31.0 | 40703 | 0.6374 | 1.7384 |
1.0541 | 32.0 | 42016 | 0.6409 | 1.7336 |
1.0013 | 33.0 | 43329 | 0.6451 | 1.7185 |
0.9811 | 34.0 | 44642 | 0.6479 | 1.7246 |
0.9447 | 35.0 | 45955 | 0.6540 | 1.7245 |
0.6587 | 36.0 | 47268 | 0.7019 | 1.5849 |
0.6044 | 37.0 | 48581 | 0.7062 | 1.6146 |
0.572 | 38.0 | 49894 | 0.7081 | 1.6583 |
0.545 | 39.0 | 51207 | 0.7087 | 1.6993 |
0.5341 | 40.0 | 52520 | 0.7106 | 1.7078 |
0.5284 | 41.0 | 53833 | 0.7105 | 1.7241 |
0.5186 | 42.0 | 55146 | 0.7112 | 1.7408 |
0.506 | 43.0 | 56459 | 0.7106 | 1.7487 |
0.5043 | 44.0 | 57772 | 0.7109 | 1.7547 |
0.5094 | 45.0 | 59085 | 0.7111 | 1.7536 |
0.5547 | 46.0 | 60398 | 0.7069 | 1.7074 |
0.5391 | 47.0 | 61711 | 0.7090 | 1.7401 |
0.5253 | 48.0 | 63024 | 0.7093 | 1.7770 |
0.5066 | 49.0 | 64337 | 0.7102 | 1.8135 |
0.495 | 50.0 | 65650 | 0.7110 | 1.8452 |
0.4813 | 51.0 | 66963 | 0.7107 | 1.8846 |
0.4704 | 52.0 | 68276 | 0.7124 | 1.8989 |
0.4689 | 53.0 | 69589 | 0.7132 | 1.9311 |
0.4611 | 54.0 | 70902 | 0.7131 | 1.9354 |
0.4547 | 55.0 | 72215 | 0.7133 | 1.9741 |
0.4481 | 56.0 | 73528 | 0.7131 | 1.9899 |
0.4709 | 57.0 | 74841 | 0.7104 | 1.9412 |
0.4647 | 58.0 | 76154 | 0.7098 | 1.9707 |
0.4566 | 59.0 | 77467 | 0.7116 | 2.0151 |
0.4511 | 60.0 | 78780 | 0.7114 | 2.0363 |
0.4423 | 61.0 | 80093 | 0.7112 | 2.0710 |
0.4356 | 62.0 | 81406 | 0.7116 | 2.0611 |
0.4272 | 63.0 | 82719 | 0.7118 | 2.0891 |
0.4254 | 64.0 | 84032 | 0.7124 | 2.0879 |
0.4221 | 65.0 | 85345 | 0.7131 | 2.1167 |
0.4189 | 66.0 | 86658 | 0.7129 | 2.1363 |
0.4219 | 67.0 | 87971 | 0.7130 | 2.1355 |
0.4149 | 68.0 | 89284 | 0.7132 | 2.1466 |
0.4125 | 69.0 | 90597 | 0.7131 | 2.1478 |
0.4162 | 70.0 | 91910 | 0.7132 | 2.1484 |
0.8802 | 36.0 | 94500 | 0.6567 | 1.7588 |
0.8772 | 37.0 | 97125 | 0.6669 | 1.6901 |
0.847 | 38.0 | 99750 | 0.6683 | 1.7208 |
0.8349 | 39.0 | 102375 | 0.6680 | 1.7477 |
0.8159 | 40.0 | 105000 | 0.6641 | 1.7669 |
0.7894 | 41.0 | 107625 | 0.6698 | 1.7947 |
0.765 | 42.0 | 110250 | 0.6673 | 1.7770 |
0.7417 | 43.0 | 112875 | 0.6686 | 1.8336 |
0.7214 | 44.0 | 115500 | 0.6755 | 1.7522 |
0.7113 | 45.0 | 118125 | 0.6774 | 1.7852 |
0.6954 | 46.0 | 120750 | 0.6774 | 1.7557 |
0.6658 | 47.0 | 123375 | 0.6788 | 1.8116 |
0.6593 | 48.0 | 126000 | 0.6829 | 1.8154 |
0.6384 | 49.0 | 128625 | 0.6795 | 1.7875 |
0.6257 | 50.0 | 131250 | 0.6811 | 1.8821 |
0.5999 | 51.0 | 133875 | 0.6872 | 1.8406 |
0.5924 | 52.0 | 136500 | 0.6765 | 1.9697 |
0.5812 | 53.0 | 139125 | 0.6818 | 1.9344 |
0.5521 | 54.0 | 141750 | 0.6802 | 1.9845 |
0.556 | 55.0 | 144375 | 0.6826 | 2.0039 |
0.5412 | 56.0 | 147000 | 0.6857 | 1.9339 |
0.5204 | 57.0 | 149625 | 0.6872 | 2.0444 |
0.5051 | 58.0 | 152250 | 0.6896 | 1.9678 |
0.4977 | 59.0 | 154875 | 0.6878 | 2.0385 |
0.4932 | 60.0 | 157500 | 0.6890 | 2.0594 |
0.4689 | 61.0 | 160125 | 0.6835 | 2.1333 |
0.4652 | 62.0 | 162750 | 0.6899 | 2.1436 |
0.4515 | 63.0 | 165375 | 0.6923 | 2.1558 |
0.4542 | 64.0 | 168000 | 0.6934 | 2.0638 |
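Validation loss bottoms out many epochs before accuracy peaks (the final row pairs the best accuracy, 0.6934, with a loss of 2.0638, well above the minimum). When picking a checkpoint from a log like this, selecting by validation accuracy rather than by final epoch can be sketched as follows (a toy subset of rows from the table above, same column order):

```python
# Each row: (train_loss, epoch, step, accuracy, val_loss),
# mirroring the column order of the results table (toy subset).
rows = [
    (0.4515, 63.0, 165375, 0.6923, 2.1558),
    (0.4542, 64.0, 168000, 0.6934, 2.0638),
    (0.5051, 58.0, 152250, 0.6896, 1.9678),
]

# Best checkpoint by validation accuracy (column index 3).
best = max(rows, key=lambda r: r[3])
print(best[1], best[3])  # → 64.0 0.6934
```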
### Framework versions
- Transformers 4.33.3
- Pytorch 2.1.2
- Datasets 2.16.1
- Tokenizers 0.13.3