---
license: apache-2.0
base_model: microsoft/swin-tiny-patch4-window7-224
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: swin-tiny-patch4-window7-224-finetuned-st-wsdmhar-stacked
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.9679752066115702
---
# swin-tiny-patch4-window7-224-finetuned-st-wsdmhar-stacked

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1344
- Accuracy: 0.9680
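As a usage sketch (not part of the original training setup), the fine-tuned checkpoint can be loaded with the `transformers` image-classification pipeline. The repository id and image path below are placeholders; substitute the actual Hub repo and your own input image:

```python
# Sketch: running inference with this checkpoint via the transformers
# pipeline API. The model id and image path are placeholders.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="swin-tiny-patch4-window7-224-finetuned-st-wsdmhar-stacked",  # placeholder repo id
)

# Returns a list of {"label": ..., "score": ...} dicts, best score first.
predictions = classifier("path/to/image.png")  # placeholder input image
print(predictions)
```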
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
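A minimal sketch of these hyperparameters, keyed by their `transformers.TrainingArguments` names (shown as a plain dict, since the exact training script is not part of this card). It also makes explicit that the reported total_train_batch_size is derived rather than set directly:

```python
# Hyperparameters from the list above, keyed by the corresponding
# transformers.TrainingArguments parameter names (a sketch, not the
# original training script).
training_args = {
    "learning_rate": 5e-05,
    "per_device_train_batch_size": 32,
    "per_device_eval_batch_size": 32,
    "seed": 42,
    "gradient_accumulation_steps": 4,
    "lr_scheduler_type": "linear",
    "warmup_ratio": 0.1,
    "num_train_epochs": 100,
}

# total_train_batch_size (128) = per-device batch size
# x gradient accumulation steps (x number of devices, here assumed 1).
total_train_batch_size = (
    training_args["per_device_train_batch_size"]
    * training_args["gradient_accumulation_steps"]
)
print(total_train_batch_size)  # 128
```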
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.4973 | 1.0 | 53 | 1.3214 | 0.3743 |
| 0.7898 | 2.0 | 106 | 0.7212 | 0.6753 |
| 0.5919 | 3.0 | 159 | 0.5983 | 0.7317 |
| 0.5388 | 4.0 | 212 | 0.4451 | 0.8209 |
| 0.475 | 5.0 | 265 | 0.3542 | 0.8674 |
| 0.4174 | 6.0 | 318 | 0.3148 | 0.8771 |
| 0.3487 | 7.0 | 371 | 0.3107 | 0.8802 |
| 0.3385 | 8.0 | 424 | 0.3179 | 0.8798 |
| 0.3324 | 9.0 | 477 | 0.2846 | 0.8998 |
| 0.3347 | 10.0 | 530 | 0.2837 | 0.8871 |
| 0.2952 | 11.0 | 583 | 0.2412 | 0.9139 |
| 0.282 | 12.0 | 636 | 0.3142 | 0.8767 |
| 0.2679 | 13.0 | 689 | 0.2496 | 0.9005 |
| 0.2816 | 14.0 | 742 | 0.2014 | 0.9239 |
| 0.2989 | 15.0 | 795 | 0.2049 | 0.9218 |
| 0.2634 | 16.0 | 848 | 0.2066 | 0.9232 |
| 0.2692 | 17.0 | 901 | 0.1994 | 0.9284 |
| 0.2069 | 18.0 | 954 | 0.1958 | 0.9304 |
| 0.2373 | 19.0 | 1007 | 0.2273 | 0.9249 |
| 0.1992 | 20.0 | 1060 | 0.2094 | 0.9267 |
| 0.1997 | 21.0 | 1113 | 0.1808 | 0.9387 |
| 0.1794 | 22.0 | 1166 | 0.1833 | 0.9408 |
| 0.1736 | 23.0 | 1219 | 0.2456 | 0.9091 |
| 0.2004 | 24.0 | 1272 | 0.1918 | 0.9294 |
| 0.2039 | 25.0 | 1325 | 0.1768 | 0.9370 |
| 0.1829 | 26.0 | 1378 | 0.2090 | 0.9225 |
| 0.1566 | 27.0 | 1431 | 0.1467 | 0.9456 |
| 0.1531 | 28.0 | 1484 | 0.1604 | 0.9404 |
| 0.1553 | 29.0 | 1537 | 0.1612 | 0.9449 |
| 0.1406 | 30.0 | 1590 | 0.1644 | 0.9494 |
| 0.1396 | 31.0 | 1643 | 0.1411 | 0.9501 |
| 0.1049 | 32.0 | 1696 | 0.1616 | 0.9539 |
| 0.1411 | 33.0 | 1749 | 0.1708 | 0.9446 |
| 0.1211 | 34.0 | 1802 | 0.1392 | 0.9501 |
| 0.1113 | 35.0 | 1855 | 0.1369 | 0.9525 |
| 0.1249 | 36.0 | 1908 | 0.1320 | 0.9535 |
| 0.1274 | 37.0 | 1961 | 0.1524 | 0.9518 |
| 0.1191 | 38.0 | 2014 | 0.1438 | 0.9525 |
| 0.0949 | 39.0 | 2067 | 0.1379 | 0.9573 |
| 0.0936 | 40.0 | 2120 | 0.1463 | 0.9518 |
| 0.1008 | 41.0 | 2173 | 0.1681 | 0.9494 |
| 0.0887 | 42.0 | 2226 | 0.1463 | 0.9566 |
| 0.1113 | 43.0 | 2279 | 0.1719 | 0.9456 |
| 0.1087 | 44.0 | 2332 | 0.1343 | 0.9604 |
| 0.097 | 45.0 | 2385 | 0.1431 | 0.9576 |
| 0.1061 | 46.0 | 2438 | 0.1495 | 0.9580 |
| 0.11 | 47.0 | 2491 | 0.1555 | 0.9549 |
| 0.0806 | 48.0 | 2544 | 0.1493 | 0.9549 |
| 0.0979 | 49.0 | 2597 | 0.2320 | 0.9373 |
| 0.0751 | 50.0 | 2650 | 0.1516 | 0.9573 |
| 0.0845 | 51.0 | 2703 | 0.1277 | 0.9614 |
| 0.079 | 52.0 | 2756 | 0.1373 | 0.9601 |
| 0.0818 | 53.0 | 2809 | 0.1569 | 0.9539 |
| 0.0845 | 54.0 | 2862 | 0.1422 | 0.9604 |
| 0.0796 | 55.0 | 2915 | 0.1400 | 0.9621 |
| 0.0975 | 56.0 | 2968 | 0.1375 | 0.9573 |
| 0.0607 | 57.0 | 3021 | 0.1504 | 0.9580 |
| 0.0632 | 58.0 | 3074 | 0.1364 | 0.9607 |
| 0.0542 | 59.0 | 3127 | 0.1278 | 0.9669 |
| 0.0807 | 60.0 | 3180 | 0.1507 | 0.9518 |
| 0.0673 | 61.0 | 3233 | 0.1302 | 0.9645 |
| 0.0773 | 62.0 | 3286 | 0.1388 | 0.9638 |
| 0.0739 | 63.0 | 3339 | 0.1533 | 0.9573 |
| 0.0718 | 64.0 | 3392 | 0.1325 | 0.9594 |
| 0.0719 | 65.0 | 3445 | 0.1304 | 0.9625 |
| 0.0487 | 66.0 | 3498 | 0.1250 | 0.9645 |
| 0.0718 | 67.0 | 3551 | 0.1512 | 0.9573 |
| 0.0851 | 68.0 | 3604 | 0.1299 | 0.9607 |
| 0.0658 | 69.0 | 3657 | 0.1424 | 0.9625 |
| 0.0605 | 70.0 | 3710 | 0.1391 | 0.9625 |
| 0.0732 | 71.0 | 3763 | 0.1320 | 0.9642 |
| 0.0613 | 72.0 | 3816 | 0.1461 | 0.9607 |
| 0.056 | 73.0 | 3869 | 0.1328 | 0.9635 |
| 0.0661 | 74.0 | 3922 | 0.1319 | 0.9628 |
| 0.0581 | 75.0 | 3975 | 0.1337 | 0.9666 |
| 0.0698 | 76.0 | 4028 | 0.1383 | 0.9645 |
| 0.0544 | 77.0 | 4081 | 0.1324 | 0.9656 |
| 0.059 | 78.0 | 4134 | 0.1380 | 0.9645 |
| 0.0554 | 79.0 | 4187 | 0.1435 | 0.9638 |
| 0.0497 | 80.0 | 4240 | 0.1310 | 0.9649 |
| 0.0463 | 81.0 | 4293 | 0.1384 | 0.9604 |
| 0.0622 | 82.0 | 4346 | 0.1363 | 0.9628 |
| 0.0534 | 83.0 | 4399 | 0.1428 | 0.9635 |
| 0.0434 | 84.0 | 4452 | 0.1374 | 0.9656 |
| 0.0591 | 85.0 | 4505 | 0.1332 | 0.9663 |
| 0.0488 | 86.0 | 4558 | 0.1271 | 0.9697 |
| 0.0418 | 87.0 | 4611 | 0.1286 | 0.9669 |
| 0.0505 | 88.0 | 4664 | 0.1372 | 0.9676 |
| 0.0486 | 89.0 | 4717 | 0.1372 | 0.9676 |
| 0.0561 | 90.0 | 4770 | 0.1348 | 0.9680 |
| 0.0498 | 91.0 | 4823 | 0.1340 | 0.9669 |
| 0.0432 | 92.0 | 4876 | 0.1351 | 0.9621 |
| 0.0322 | 93.0 | 4929 | 0.1380 | 0.9659 |
| 0.0389 | 94.0 | 4982 | 0.1370 | 0.9656 |
| 0.0408 | 95.0 | 5035 | 0.1343 | 0.9683 |
| 0.0367 | 96.0 | 5088 | 0.1347 | 0.9680 |
| 0.0337 | 97.0 | 5141 | 0.1366 | 0.9669 |
| 0.0338 | 98.0 | 5194 | 0.1355 | 0.9669 |
| 0.0284 | 99.0 | 5247 | 0.1340 | 0.9676 |
| 0.0501 | 100.0 | 5300 | 0.1344 | 0.9680 |
### Framework versions

- Transformers 4.43.2
- Pytorch 2.3.1+cu118
- Datasets 2.20.0
- Tokenizers 0.19.1