
swin-tiny-patch4-window7-224-finetuned-skin-cancer

This model is a fine-tuned version of microsoft/swin-tiny-patch4-window7-224 on a skin-cancer image dataset loaded with the imagefolder format. It achieves the following results on the evaluation set:

  • Loss: 0.2772
  • Accuracy: 0.8984
  • Per-class metrics (classes 0-6, rounded to four decimals):

    Class  Precision  Recall   F1      AUC-ROC
    0      0.8645     0.9238   0.8931  0.9916
    1      0.8451     0.8280   0.8365  0.9885
    2      0.8794     0.8979   0.8886  0.9917
    3      0.9718     0.8369   0.8993  0.9958
    4      0.8484     0.9338   0.8890  0.9937
    5      0.8937     0.8922   0.8929  0.9929
    6      1.0000     0.9796   0.9897  0.9997
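
Since this is a standard Swin image classifier, it can be used through the usual transformers image-classification API. The sketch below is a minimal example, not part of the original card: the checkpoint id is a placeholder for this model's actual Hub path, and the input file name is hypothetical.

```python
from transformers import AutoImageProcessor, AutoModelForImageClassification
from PIL import Image
import torch

# Placeholder checkpoint id; replace with this model's actual Hub path.
checkpoint = "swin-tiny-patch4-window7-224-finetuned-skin-cancer"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForImageClassification.from_pretrained(checkpoint)
model.eval()

image = Image.open("lesion.jpg").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label[pred])  # label names depend on the imagefolder class names
```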

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 1
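
For reference, here is how these values map onto transformers TrainingArguments. This is a sketch, not the original training script (which is not included in the card); output_dir is a placeholder, and Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the library's default optimizer setting.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-finetuned-skin-cancer",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,  # 32 * 4 = 128 effective batch, assuming one device
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=1,
    # Optimizer defaults to AdamW with betas=(0.9, 0.999), epsilon=1e-8,
    # matching the values listed above.
)
```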

Training results

Training Loss  Epoch  Step  Validation Loss  Accuracy
0.362          1.0    330   0.2772           0.8984

The per-class precision, recall, F1, and AUC-ROC at this checkpoint are identical to the evaluation metrics listed above.
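
Per-class numbers like these can be produced with scikit-learn inside a Trainer compute_metrics callback. The following is a sketch under assumptions (seven classes, softmax probabilities, one-vs-rest AUC-ROC); the actual evaluation code is not shown in the card, and average=None with multi_class="ovr" requires scikit-learn >= 1.2.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support, roc_auc_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    # Numerically stable softmax over the 7 class logits.
    exp = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs = exp / exp.sum(axis=-1, keepdims=True)
    preds = logits.argmax(axis=-1)

    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, labels=list(range(7)), zero_division=0
    )
    # One-vs-rest AUC-ROC per class (needs scikit-learn >= 1.2 for average=None).
    auc = roc_auc_score(labels, probs, multi_class="ovr", average=None)

    return {
        "accuracy": accuracy_score(labels, preds),
        "precision_per_class": dict(enumerate(precision)),
        "recall_per_class": dict(enumerate(recall)),
        "f1_per_class": dict(enumerate(f1)),
        "auc_roc_per_class": dict(enumerate(auc)),
    }
```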

Framework versions

  • Transformers 4.38.2
  • PyTorch 2.2.1
  • Datasets 2.18.0
  • Tokenizers 0.15.2