
# swin-tiny-patch4-window7-224-finetuned-papsmear

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 0.2644
- Accuracy: 0.9779
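The card does not ship a usage snippet, so here is a minimal, hedged sketch of running inference with the Transformers image-classification pipeline. The model id is taken from this card; the image path is a placeholder, and `transformers` must be installed with network access for the first download.

```python
# Hedged usage sketch (not part of the original card): classify one image
# with this checkpoint via the Transformers image-classification pipeline.

MODEL_ID = "mujerry/swin-tiny-patch4-window7-224-finetuned-papsmear"

def classify(image_path: str):
    """Return a list of {'label': ..., 'score': ...} dicts for one image."""
    from transformers import pipeline  # deferred import; needs `pip install transformers`
    clf = pipeline("image-classification", model=MODEL_ID)  # downloads weights on first use
    return clf(image_path)

if __name__ == "__main__":
    print(classify("pap_smear_slide.png"))  # placeholder path
```

The pipeline applies the checkpoint's own image preprocessing (224x224 resize and normalization), so no manual transforms are needed.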

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
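Two of the listed values are derived from the others, and a quick check confirms they are consistent. The 3800 total optimizer steps come from the training table below; the warmup calculation assumes the usual Trainer convention of `warmup_ratio * total_steps`, rounded up.

```python
import math

# Values copied from the hyperparameter list above.
train_batch_size = 8
gradient_accumulation_steps = 4

# Effective (total) train batch size = per-device batch * accumulation steps.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 32, matching the card

# Warmup steps implied by lr_scheduler_warmup_ratio = 0.1 over the
# 3800 optimizer steps reached in the training table (assumed convention:
# ceil(ratio * total steps), as in the Hugging Face Trainer).
warmup_steps = math.ceil(0.1 * 3800)
print(warmup_steps)  # 380
```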

## Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 1.7081        | 0.9935  | 38   | 1.6642          | 0.2868   |
| 1.4025        | 1.9869  | 76   | 1.3761          | 0.4632   |
| 1.0918        | 2.9804  | 114  | 1.0276          | 0.5515   |
| 0.8051        | 4.0     | 153  | 0.7679          | 0.6691   |
| 0.635         | 4.9935  | 191  | 0.5928          | 0.7868   |
| 0.6051        | 5.9869  | 229  | 0.6957          | 0.75     |
| 0.5539        | 6.9804  | 267  | 0.5016          | 0.7941   |
| 0.4683        | 8.0     | 306  | 0.4733          | 0.8235   |
| 0.4153        | 8.9935  | 344  | 0.4835          | 0.8529   |
| 0.3954        | 9.9869  | 382  | 0.5431          | 0.8309   |
| 0.3524        | 10.9804 | 420  | 0.4061          | 0.8235   |
| 0.3546        | 12.0    | 459  | 0.4925          | 0.8382   |
| 0.2922        | 12.9935 | 497  | 0.3637          | 0.875    |
| 0.2342        | 13.9869 | 535  | 0.3286          | 0.8971   |
| 0.2083        | 14.9804 | 573  | 0.3271          | 0.8824   |
| 0.2704        | 16.0    | 612  | 0.3700          | 0.8824   |
| 0.1871        | 16.9935 | 650  | 0.3447          | 0.8971   |
| 0.226         | 17.9869 | 688  | 0.4280          | 0.8603   |
| 0.245         | 18.9804 | 726  | 0.6445          | 0.8088   |
| 0.1545        | 20.0    | 765  | 0.4180          | 0.8603   |
| 0.0981        | 20.9935 | 803  | 0.3208          | 0.9044   |
| 0.1455        | 21.9869 | 841  | 0.4256          | 0.8603   |
| 0.2405        | 22.9804 | 879  | 0.3474          | 0.8971   |
| 0.1549        | 24.0    | 918  | 0.3940          | 0.9044   |
| 0.1721        | 24.9935 | 956  | 0.4279          | 0.8824   |
| 0.1378        | 25.9869 | 994  | 0.3871          | 0.9044   |
| 0.0924        | 26.9804 | 1032 | 0.7301          | 0.8456   |
| 0.1325        | 28.0    | 1071 | 0.3712          | 0.9044   |
| 0.1426        | 28.9935 | 1109 | 0.4400          | 0.8603   |
| 0.0866        | 29.9869 | 1147 | 0.2779          | 0.9412   |
| 0.0659        | 30.9804 | 1185 | 0.3207          | 0.9412   |
| 0.1175        | 32.0    | 1224 | 0.4339          | 0.9044   |
| 0.0455        | 32.9935 | 1262 | 0.4537          | 0.9265   |
| 0.1006        | 33.9869 | 1300 | 0.6521          | 0.875    |
| 0.033         | 34.9804 | 1338 | 0.5616          | 0.9044   |
| 0.0979        | 36.0    | 1377 | 0.3718          | 0.9191   |
| 0.1045        | 36.9935 | 1415 | 0.2529          | 0.9632   |
| 0.0815        | 37.9869 | 1453 | 0.3511          | 0.9338   |
| 0.0761        | 38.9804 | 1491 | 0.3114          | 0.9338   |
| 0.0747        | 40.0    | 1530 | 0.2837          | 0.9338   |
| 0.0545        | 40.9935 | 1568 | 0.4269          | 0.9412   |
| 0.0796        | 41.9869 | 1606 | 0.2331          | 0.9412   |
| 0.055         | 42.9804 | 1644 | 0.2900          | 0.9485   |
| 0.0706        | 44.0    | 1683 | 0.3368          | 0.9632   |
| 0.0505        | 44.9935 | 1721 | 0.3780          | 0.9485   |
| 0.0698        | 45.9869 | 1759 | 0.4822          | 0.9191   |
| 0.0275        | 46.9804 | 1797 | 0.3434          | 0.9632   |
| 0.0641        | 48.0    | 1836 | 0.3387          | 0.9706   |
| 0.0484        | 48.9935 | 1874 | 0.5350          | 0.9191   |
| 0.0388        | 49.9869 | 1912 | 0.3826          | 0.9118   |
| 0.0347        | 50.9804 | 1950 | 0.3739          | 0.9559   |
| 0.1046        | 52.0    | 1989 | 0.3075          | 0.9118   |
| 0.0298        | 52.9935 | 2027 | 0.3558          | 0.9559   |
| 0.0478        | 53.9869 | 2065 | 0.3056          | 0.9706   |
| 0.0285        | 54.9804 | 2103 | 0.2851          | 0.9632   |
| 0.0407        | 56.0    | 2142 | 0.3223          | 0.9559   |
| 0.0459        | 56.9935 | 2180 | 0.4575          | 0.9485   |
| 0.0409        | 57.9869 | 2218 | 0.2930          | 0.9632   |
| 0.0743        | 58.9804 | 2256 | 0.4032          | 0.9485   |
| 0.0346        | 60.0    | 2295 | 0.3738          | 0.9412   |
| 0.0302        | 60.9935 | 2333 | 0.3597          | 0.9485   |
| 0.0488        | 61.9869 | 2371 | 0.2595          | 0.9559   |
| 0.0562        | 62.9804 | 2409 | 0.3764          | 0.9412   |
| 0.0216        | 64.0    | 2448 | 0.2644          | 0.9779   |
| 0.0219        | 64.9935 | 2486 | 0.3092          | 0.9632   |
| 0.0272        | 65.9869 | 2524 | 0.2898          | 0.9632   |
| 0.027         | 66.9804 | 2562 | 0.2693          | 0.9632   |
| 0.0397        | 68.0    | 2601 | 0.3843          | 0.9412   |
| 0.0154        | 68.9935 | 2639 | 0.3051          | 0.9485   |
| 0.0004        | 69.9869 | 2677 | 0.3909          | 0.9412   |
| 0.0651        | 70.9804 | 2715 | 0.2977          | 0.9485   |
| 0.016         | 72.0    | 2754 | 0.2695          | 0.9632   |
| 0.0351        | 72.9935 | 2792 | 0.2720          | 0.9706   |
| 0.0206        | 73.9869 | 2830 | 0.2549          | 0.9706   |
| 0.0109        | 74.9804 | 2868 | 0.2412          | 0.9706   |
| 0.0012        | 76.0    | 2907 | 0.3494          | 0.9779   |
| 0.0418        | 76.9935 | 2945 | 0.3729          | 0.9632   |
| 0.0165        | 77.9869 | 2983 | 0.3471          | 0.9632   |
| 0.0163        | 78.9804 | 3021 | 0.2973          | 0.9706   |
| 0.0202        | 80.0    | 3060 | 0.3730          | 0.9559   |
| 0.0368        | 80.9935 | 3098 | 0.2877          | 0.9706   |
| 0.0374        | 81.9869 | 3136 | 0.4143          | 0.9632   |
| 0.0296        | 82.9804 | 3174 | 0.2895          | 0.9779   |
| 0.0405        | 84.0    | 3213 | 0.2927          | 0.9559   |
| 0.0097        | 84.9935 | 3251 | 0.3179          | 0.9632   |
| 0.0182        | 85.9869 | 3289 | 0.3047          | 0.9706   |
| 0.0207        | 86.9804 | 3327 | 0.3018          | 0.9779   |
| 0.0207        | 88.0    | 3366 | 0.3321          | 0.9632   |
| 0.003         | 88.9935 | 3404 | 0.3086          | 0.9706   |
| 0.0157        | 89.9869 | 3442 | 0.2948          | 0.9706   |
| 0.0428        | 90.9804 | 3480 | 0.3175          | 0.9706   |
| 0.0189        | 92.0    | 3519 | 0.3240          | 0.9632   |
| 0.0046        | 92.9935 | 3557 | 0.3414          | 0.9632   |
| 0.0057        | 93.9869 | 3595 | 0.3329          | 0.9632   |
| 0.0165        | 94.9804 | 3633 | 0.3240          | 0.9632   |
| 0.006         | 96.0    | 3672 | 0.3180          | 0.9706   |
| 0.0172        | 96.9935 | 3710 | 0.3103          | 0.9779   |
| 0.0109        | 97.9869 | 3748 | 0.3035          | 0.9779   |
| 0.0172        | 98.9804 | 3786 | 0.3034          | 0.9779   |
| 0.0219        | 99.3464 | 3800 | 0.3036          | 0.9779   |
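The headline metrics at the top of the card correspond to the best evaluation checkpoint rather than the final one. A small sketch over a hand-copied subset of the table rows shows how that selection works: take the highest accuracy, breaking ties by lowest validation loss (the tie-breaking rule is an assumption for illustration).

```python
# (epoch, step, val_loss, accuracy) rows copied from the training table.
# Three rows tie at 0.9779 accuracy; epoch 64 has the lowest loss.
rows = [
    (64.0,    2448, 0.2644, 0.9779),
    (76.0,    2907, 0.3494, 0.9779),
    (99.3464, 3800, 0.3036, 0.9779),
    (74.9804, 2868, 0.2412, 0.9706),
]

# Highest accuracy first, then lowest validation loss as tie-breaker.
best = max(rows, key=lambda r: (r[3], -r[2]))
print(best)  # (64.0, 2448, 0.2644, 0.9779) — the card's reported metrics
```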

## Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1