---
license: apache-2.0
base_model: microsoft/swinv2-base-patch4-window16-256
tags:
- generated_from_trainer
datasets:
- stanford-dogs
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: microsoft-swinv2-base-patch4-window16-256-batch32-lr0.0005-standford-dogs
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: stanford-dogs
      type: stanford-dogs
      config: default
      split: full
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.9429057337220602
    - name: F1
      type: f1
      value: 0.9410841953165723
    - name: Precision
      type: precision
      value: 0.9431724455914652
    - name: Recall
      type: recall
      value: 0.9417046971391595
---
# microsoft-swinv2-base-patch4-window16-256-batch32-lr0.0005-standford-dogs
This model is a fine-tuned version of [microsoft/swinv2-base-patch4-window16-256](https://huggingface.co/microsoft/swinv2-base-patch4-window16-256) on the stanford-dogs dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1810
- Accuracy: 0.9429
- F1: 0.9411
- Precision: 0.9432
- Recall: 0.9417
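A minimal inference sketch using the `transformers` auto classes. The checkpoint path passed to `classify()` is an assumption: substitute the Hub repo id or the local Trainer output directory of this run.

```python
# Hedged sketch: load the fine-tuned checkpoint and classify a single image.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

def top_prediction(scores, id2label):
    """Return the label whose score is highest in a flat score sequence."""
    best = max(range(len(scores)), key=lambda i: scores[i])
    return id2label[best]

def classify(image_path: str, checkpoint: str) -> str:
    """Run the classifier on one image and return the predicted breed name."""
    processor = AutoImageProcessor.from_pretrained(checkpoint)
    model = AutoModelForImageClassification.from_pretrained(checkpoint)
    inputs = processor(images=Image.open(image_path), return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return top_prediction(logits[0].tolist(), model.config.id2label)
```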
## Model description
This model is a SwinV2 base image classifier (patch size 4, window size 16, 256×256 input resolution) fine-tuned to recognize the 120 dog breeds in the Stanford Dogs dataset.
## Intended uses & limitations
The model is intended for classifying dog images into one of the 120 Stanford Dogs breeds. It will always assign one of those breeds, so predictions on non-dog images, or on breeds outside the dataset, are unreliable. Inputs should be preprocessed to the model's expected 256×256 resolution.
## Training and evaluation data
The model was fine-tuned and evaluated on the Stanford Dogs dataset, which contains roughly 20,580 images of 120 dog breeds drawn from ImageNet.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 1000
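The listed per-device batch size and gradient accumulation combine into the reported total train batch size; a worked check (not part of the training script):

```python
# Effective (total) train batch size = per-device batch size x accumulation steps,
# assuming a single device as implied by the hyperparameters above.
train_batch_size = 32
gradient_accumulation_steps = 4
total_train_batch_size = train_batch_size * gradient_accumulation_steps
assert total_train_batch_size == 128  # matches the reported value

# Each optimizer step therefore consumes 128 images, so the 1000 training
# steps correspond to roughly 128,000 image presentations.
images_seen = total_train_batch_size * 1000
print(total_train_batch_size, images_seen)  # → 128 128000
```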
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:------:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 4.7518 | 0.0777 | 10 | 4.6391 | 0.0741 | 0.0533 | 0.0667 | 0.0705 |
| 4.5585 | 0.1553 | 20 | 4.3463 | 0.1919 | 0.1445 | 0.1900 | 0.1794 |
| 4.2377 | 0.2330 | 30 | 3.8243 | 0.3525 | 0.3100 | 0.4154 | 0.3382 |
| 3.6654 | 0.3107 | 40 | 2.9276 | 0.6409 | 0.6111 | 0.6994 | 0.6300 |
| 2.7617 | 0.3883 | 50 | 1.7703 | 0.8248 | 0.8042 | 0.8361 | 0.8182 |
| 1.9475 | 0.4660 | 60 | 1.0440 | 0.8863 | 0.8781 | 0.8924 | 0.8821 |
| 1.3629 | 0.5437 | 70 | 0.6490 | 0.9099 | 0.9031 | 0.9191 | 0.9062 |
| 1.0488 | 0.6214 | 80 | 0.4485 | 0.9150 | 0.9075 | 0.9147 | 0.9118 |
| 0.8477 | 0.6990 | 90 | 0.3744 | 0.9206 | 0.9169 | 0.9294 | 0.9190 |
| 0.7184 | 0.7767 | 100 | 0.3301 | 0.9259 | 0.9215 | 0.9283 | 0.9227 |
| 0.7149 | 0.8544 | 110 | 0.2970 | 0.9186 | 0.9152 | 0.9227 | 0.9156 |
| 0.6429 | 0.9320 | 120 | 0.2675 | 0.9286 | 0.9238 | 0.9301 | 0.9256 |
| 0.5864 | 1.0097 | 130 | 0.2609 | 0.9291 | 0.9258 | 0.9338 | 0.9272 |
| 0.5414 | 1.0874 | 140 | 0.2644 | 0.9162 | 0.9122 | 0.9212 | 0.9156 |
| 0.5323 | 1.1650 | 150 | 0.2454 | 0.9281 | 0.9225 | 0.9362 | 0.9256 |
| 0.5061 | 1.2427 | 160 | 0.2481 | 0.9269 | 0.9235 | 0.9308 | 0.9251 |
| 0.5898 | 1.3204 | 170 | 0.2306 | 0.9346 | 0.9324 | 0.9389 | 0.9331 |
| 0.5277 | 1.3981 | 180 | 0.2192 | 0.9368 | 0.9327 | 0.9384 | 0.9350 |
| 0.4824 | 1.4757 | 190 | 0.2171 | 0.9337 | 0.9297 | 0.9375 | 0.9311 |
| 0.4632 | 1.5534 | 200 | 0.2244 | 0.9346 | 0.9315 | 0.9379 | 0.9326 |
| 0.4882 | 1.6311 | 210 | 0.2237 | 0.9361 | 0.9323 | 0.9404 | 0.9345 |
| 0.4583 | 1.7087 | 220 | 0.2228 | 0.9327 | 0.9289 | 0.9373 | 0.9304 |
| 0.4692 | 1.7864 | 230 | 0.2098 | 0.9354 | 0.9316 | 0.9370 | 0.9332 |
| 0.5407 | 1.8641 | 240 | 0.2102 | 0.9356 | 0.9342 | 0.9375 | 0.9351 |
| 0.4629 | 1.9417 | 250 | 0.2045 | 0.9378 | 0.9349 | 0.9396 | 0.9367 |
| 0.4363 | 2.0194 | 260 | 0.2023 | 0.9373 | 0.9346 | 0.9398 | 0.9355 |
| 0.4328 | 2.0971 | 270 | 0.2063 | 0.9354 | 0.9320 | 0.9360 | 0.9343 |
| 0.3554 | 2.1748 | 280 | 0.1948 | 0.9439 | 0.9398 | 0.9475 | 0.9418 |
| 0.4024 | 2.2524 | 290 | 0.1985 | 0.9388 | 0.9372 | 0.9397 | 0.9377 |
| 0.4006 | 2.3301 | 300 | 0.2153 | 0.9334 | 0.9275 | 0.9420 | 0.9311 |
| 0.3935 | 2.4078 | 310 | 0.2021 | 0.9393 | 0.9346 | 0.9416 | 0.9368 |
| 0.3591 | 2.4854 | 320 | 0.2126 | 0.9346 | 0.9311 | 0.9403 | 0.9333 |
| 0.4058 | 2.5631 | 330 | 0.2020 | 0.9378 | 0.9357 | 0.9393 | 0.9358 |
| 0.396 | 2.6408 | 340 | 0.2038 | 0.9371 | 0.9339 | 0.9410 | 0.9357 |
| 0.4157 | 2.7184 | 350 | 0.2091 | 0.9332 | 0.9288 | 0.9352 | 0.9308 |
| 0.4222 | 2.7961 | 360 | 0.1933 | 0.9393 | 0.9372 | 0.9399 | 0.9378 |
| 0.3521 | 2.8738 | 370 | 0.1984 | 0.9397 | 0.9381 | 0.9430 | 0.9388 |
| 0.3925 | 2.9515 | 380 | 0.1874 | 0.9383 | 0.9347 | 0.9390 | 0.9358 |
| 0.3475 | 3.0291 | 390 | 0.1994 | 0.9383 | 0.9364 | 0.9410 | 0.9376 |
| 0.3526 | 3.1068 | 400 | 0.1941 | 0.9390 | 0.9352 | 0.9402 | 0.9373 |
| 0.351 | 3.1845 | 410 | 0.1893 | 0.9417 | 0.9403 | 0.9438 | 0.9410 |
| 0.3549 | 3.2621 | 420 | 0.1960 | 0.9390 | 0.9370 | 0.9410 | 0.9381 |
| 0.3291 | 3.3398 | 430 | 0.1948 | 0.9397 | 0.9358 | 0.9387 | 0.9374 |
| 0.3153 | 3.4175 | 440 | 0.1992 | 0.9441 | 0.9415 | 0.9453 | 0.9427 |
| 0.3116 | 3.4951 | 450 | 0.2005 | 0.9417 | 0.9389 | 0.9432 | 0.9404 |
| 0.3053 | 3.5728 | 460 | 0.1974 | 0.9412 | 0.9372 | 0.9424 | 0.9394 |
| 0.3141 | 3.6505 | 470 | 0.1941 | 0.9405 | 0.9386 | 0.9420 | 0.9395 |
| 0.3275 | 3.7282 | 480 | 0.2182 | 0.9334 | 0.9301 | 0.9374 | 0.9321 |
| 0.2997 | 3.8058 | 490 | 0.2029 | 0.9376 | 0.9343 | 0.9392 | 0.9360 |
| 0.3242 | 3.8835 | 500 | 0.1996 | 0.9380 | 0.9344 | 0.9399 | 0.9361 |
| 0.3585 | 3.9612 | 510 | 0.1935 | 0.9405 | 0.9378 | 0.9421 | 0.9389 |
| 0.2942 | 4.0388 | 520 | 0.2028 | 0.9368 | 0.9341 | 0.9428 | 0.9367 |
| 0.3233 | 4.1165 | 530 | 0.2029 | 0.9378 | 0.9353 | 0.9406 | 0.9364 |
| 0.2942 | 4.1942 | 540 | 0.1959 | 0.9385 | 0.9368 | 0.9395 | 0.9372 |
| 0.3079 | 4.2718 | 550 | 0.1941 | 0.9371 | 0.9349 | 0.9373 | 0.9354 |
| 0.2931 | 4.3495 | 560 | 0.1871 | 0.9414 | 0.9388 | 0.9410 | 0.9394 |
| 0.3058 | 4.4272 | 570 | 0.1879 | 0.9419 | 0.9403 | 0.9430 | 0.9407 |
| 0.3402 | 4.5049 | 580 | 0.1833 | 0.9434 | 0.9409 | 0.9435 | 0.9420 |
| 0.3169 | 4.5825 | 590 | 0.1882 | 0.9412 | 0.9391 | 0.9425 | 0.9402 |
| 0.3071 | 4.6602 | 600 | 0.1821 | 0.9448 | 0.9425 | 0.9460 | 0.9431 |
| 0.313 | 4.7379 | 610 | 0.1879 | 0.9429 | 0.9401 | 0.9441 | 0.9413 |
| 0.3338 | 4.8155 | 620 | 0.1843 | 0.9456 | 0.9424 | 0.9469 | 0.9439 |
| 0.2468 | 4.8932 | 630 | 0.1866 | 0.9436 | 0.9412 | 0.9441 | 0.9426 |
| 0.2567 | 4.9709 | 640 | 0.1882 | 0.9405 | 0.9387 | 0.9417 | 0.9393 |
| 0.2792 | 5.0485 | 650 | 0.1914 | 0.9429 | 0.9407 | 0.9442 | 0.9418 |
| 0.2985 | 5.1262 | 660 | 0.1880 | 0.9429 | 0.9393 | 0.9442 | 0.9411 |
| 0.2744 | 5.2039 | 670 | 0.1865 | 0.9410 | 0.9378 | 0.9420 | 0.9390 |
| 0.2662 | 5.2816 | 680 | 0.1877 | 0.9419 | 0.9400 | 0.9423 | 0.9407 |
| 0.2613 | 5.3592 | 690 | 0.1890 | 0.9393 | 0.9369 | 0.9401 | 0.9378 |
| 0.2698 | 5.4369 | 700 | 0.1849 | 0.9429 | 0.9409 | 0.9441 | 0.9417 |
| 0.2592 | 5.5146 | 710 | 0.1854 | 0.9429 | 0.9414 | 0.9439 | 0.9425 |
| 0.2819 | 5.5922 | 720 | 0.1868 | 0.9429 | 0.9414 | 0.9443 | 0.9418 |
| 0.2625 | 5.6699 | 730 | 0.1832 | 0.9434 | 0.9417 | 0.9438 | 0.9422 |
| 0.273 | 5.7476 | 740 | 0.1862 | 0.9439 | 0.9408 | 0.9445 | 0.9424 |
| 0.2718 | 5.8252 | 750 | 0.1838 | 0.9441 | 0.9417 | 0.9443 | 0.9428 |
| 0.3055 | 5.9029 | 760 | 0.1852 | 0.9422 | 0.9396 | 0.9426 | 0.9407 |
| 0.276 | 5.9806 | 770 | 0.1843 | 0.9424 | 0.9409 | 0.9434 | 0.9415 |
| 0.2614 | 6.0583 | 780 | 0.1839 | 0.9429 | 0.9403 | 0.9431 | 0.9411 |
| 0.2452 | 6.1359 | 790 | 0.1858 | 0.9407 | 0.9384 | 0.9414 | 0.9390 |
| 0.2608 | 6.2136 | 800 | 0.1851 | 0.9429 | 0.9411 | 0.9437 | 0.9417 |
| 0.2639 | 6.2913 | 810 | 0.1842 | 0.9453 | 0.9432 | 0.9463 | 0.9438 |
| 0.2696 | 6.3689 | 820 | 0.1812 | 0.9424 | 0.9406 | 0.9425 | 0.9412 |
| 0.2524 | 6.4466 | 830 | 0.1830 | 0.9427 | 0.9411 | 0.9433 | 0.9417 |
| 0.2673 | 6.5243 | 840 | 0.1823 | 0.9451 | 0.9436 | 0.9464 | 0.9442 |
| 0.2991 | 6.6019 | 850 | 0.1837 | 0.9429 | 0.9408 | 0.9431 | 0.9419 |
| 0.2704 | 6.6796 | 860 | 0.1833 | 0.9439 | 0.9424 | 0.9446 | 0.9431 |
| 0.2437 | 6.7573 | 870 | 0.1857 | 0.9424 | 0.9410 | 0.9434 | 0.9416 |
| 0.2266 | 6.8350 | 880 | 0.1846 | 0.9431 | 0.9416 | 0.9436 | 0.9423 |
| 0.2276 | 6.9126 | 890 | 0.1825 | 0.9441 | 0.9426 | 0.9448 | 0.9433 |
| 0.2249 | 6.9903 | 900 | 0.1813 | 0.9436 | 0.9419 | 0.9441 | 0.9425 |
| 0.2559 | 7.0680 | 910 | 0.1813 | 0.9444 | 0.9425 | 0.9448 | 0.9431 |
| 0.2616 | 7.1456 | 920 | 0.1813 | 0.9441 | 0.9421 | 0.9443 | 0.9428 |
| 0.2247 | 7.2233 | 930 | 0.1813 | 0.9439 | 0.9421 | 0.9442 | 0.9426 |
| 0.2471 | 7.3010 | 940 | 0.1813 | 0.9448 | 0.9430 | 0.9453 | 0.9436 |
| 0.2446 | 7.3786 | 950 | 0.1817 | 0.9444 | 0.9427 | 0.9450 | 0.9432 |
| 0.2262 | 7.4563 | 960 | 0.1819 | 0.9434 | 0.9417 | 0.9439 | 0.9423 |
| 0.2632 | 7.5340 | 970 | 0.1818 | 0.9439 | 0.9422 | 0.9444 | 0.9427 |
| 0.2258 | 7.6117 | 980 | 0.1815 | 0.9434 | 0.9416 | 0.9439 | 0.9422 |
| 0.2404 | 7.6893 | 990 | 0.1811 | 0.9429 | 0.9410 | 0.9432 | 0.9416 |
| 0.2379 | 7.7670 | 1000 | 0.1810 | 0.9429 | 0.9411 | 0.9432 | 0.9417 |
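The F1, precision, and recall columns appear to be macro-averaged over the 120 classes (an assumption; the Trainer's `compute_metrics` function is not shown here). A minimal pure-Python sketch of that averaging for F1:

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that exactly match the true label."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def macro_f1(y_true, y_pred):
    """Macro-averaged F1: per-class F1 scores averaged with equal class weight."""
    classes = sorted(set(y_true) | set(y_pred))
    f1s = []
    for c in classes:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * precision * recall / (precision + recall)
                   if precision + recall else 0.0)
    return sum(f1s) / len(f1s)
```

Macro averaging weights every breed equally regardless of how many test images it has, which is why the F1 value (0.9411) can sit slightly below the accuracy (0.9429).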
### Framework versions
- Transformers 4.40.2
- Pytorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.19.1