swin-small-patch4-window7-224-finetuned-isic217

This model is a fine-tuned version of microsoft/swin-small-patch4-window7-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.9417
  • Accuracy: 0.5455
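
As a quick illustration of how a checkpoint like this is typically loaded for inference, the sketch below uses the standard `transformers` image-classification classes. The Hub repo id and the image path are placeholders, not taken from this card:

```python
# Minimal inference sketch; the repo id and image path are placeholders.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "your-username/swin-small-patch4-window7-224-finetuned-isic217"  # hypothetical Hub id

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("example_image.jpg").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```

`AutoImageProcessor` applies the same resizing and normalization the Swin checkpoint expects, so no manual preprocessing is needed.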

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto `TrainingArguments` follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 8
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 30
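
A minimal sketch of how the values above map onto `transformers.TrainingArguments`; the `output_dir` is a placeholder and the model/dataset wiring is omitted:

```python
# Sketch only: reproduces the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-small-patch4-window7-224-finetuned-isic217",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=4,   # effective total train batch size of 8
    num_train_epochs=30,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 matches the Trainer's default optimizer settings.
)
```

With `per_device_train_batch_size=2` and `gradient_accumulation_steps=4`, gradients are accumulated over four steps before each optimizer update, which gives the total train batch size of 8 listed above.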

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 2.1844        | 0.9796  | 24   | 2.1103          | 0.1364   |
| 2.0018        | 2.0     | 49   | 1.8737          | 0.2727   |
| 1.6474        | 2.9796  | 73   | 1.9019          | 0.2727   |
| 1.3757        | 4.0     | 98   | 1.7487          | 0.3636   |
| 1.1526        | 4.9796  | 122  | 1.7576          | 0.4091   |
| 0.9161        | 6.0     | 147  | 1.5886          | 0.5      |
| 0.7568        | 6.9796  | 171  | 1.8935          | 0.4545   |
| 0.4024        | 8.0     | 196  | 1.6767          | 0.4545   |
| 0.814         | 8.9796  | 220  | 1.7112          | 0.3636   |
| 0.4346        | 10.0    | 245  | 1.9364          | 0.4091   |
| 0.3456        | 10.9796 | 269  | 1.9417          | 0.5455   |
| 0.228         | 12.0    | 294  | 2.1569          | 0.4091   |
| 0.1681        | 12.9796 | 318  | 2.0565          | 0.4545   |
| 0.1498        | 14.0    | 343  | 2.0701          | 0.3636   |
| 0.1599        | 14.9796 | 367  | 2.4973          | 0.5      |
| 0.3856        | 16.0    | 392  | 2.2473          | 0.4545   |
| 0.2529        | 16.9796 | 416  | 2.0918          | 0.4545   |
| 0.0557        | 18.0    | 441  | 1.9596          | 0.5455   |
| 0.0895        | 18.9796 | 465  | 2.5522          | 0.4545   |
| 0.0719        | 20.0    | 490  | 2.2938          | 0.5      |
| 0.0764        | 20.9796 | 514  | 2.6754          | 0.4545   |
| 0.1301        | 22.0    | 539  | 2.5287          | 0.4545   |
| 0.1205        | 22.9796 | 563  | 2.7532          | 0.4091   |
| 0.1013        | 24.0    | 588  | 2.6988          | 0.4545   |
| 0.0777        | 24.9796 | 612  | 2.9345          | 0.4091   |
| 0.1807        | 26.0    | 637  | 2.9981          | 0.4545   |
| 0.0298        | 26.9796 | 661  | 2.8549          | 0.4545   |
| 0.0589        | 28.0    | 686  | 2.6967          | 0.4545   |
| 0.0896        | 28.9796 | 710  | 2.6903          | 0.4545   |
| 0.0218        | 29.3878 | 720  | 2.6902          | 0.4545   |
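
The headline numbers at the top of this card (loss 1.9417, accuracy 0.5455) correspond to the epoch ~11 row. The accuracy column is the kind of metric usually produced by a `compute_metrics` hook passed to the `Trainer`; the sketch below, using the `evaluate` library, is an assumption about the setup rather than something stated in this card:

```python
# Assumed compute_metrics hook for the accuracy values reported above.
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```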

Framework versions

  • Transformers 4.42.0.dev0
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1