
smids_10x_deit_tiny_sgd_0001_fold1

This model is a fine-tuned version of facebook/deit-tiny-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4530
  • Accuracy: 0.8114
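
For reference, below is a minimal inference sketch using the standard transformers image-classification API. It is not part of the original card: the repository id is a placeholder for wherever this checkpoint is hosted (or a local directory), and the predicted label comes from the model's own id2label mapping.

```python
# Minimal inference sketch (not part of the original card). The repository id below is a
# placeholder; replace it with the actual Hub path of this checkpoint or a local directory.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "<namespace>/smids_10x_deit_tiny_sgd_0001_fold1"  # placeholder id, adjust as needed

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("example.png").convert("RGB")  # any RGB input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```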

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
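
The sketch below shows a TrainingArguments setup that matches the hyperparameters above, assuming the standard transformers Trainer was used. It is an approximation, not the original training script; the output directory and the evaluation/save strategies are assumptions.

```python
# Sketch of a TrainingArguments setup matching the hyperparameters listed above.
# This is an approximation, not the original training script; the output directory,
# evaluation strategy, and save strategy are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="smids_10x_deit_tiny_sgd_0001_fold1",  # placeholder output path
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: validation ran once per epoch, as the results table suggests
    save_strategy="epoch",        # assumption
    load_best_model_at_end=True,  # assumption
)
# The Trainer's default optimizer (AdamW with betas=(0.9, 0.999) and epsilon=1e-08)
# matches the optimizer settings listed above.
```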

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 1.0181        | 1.0   | 751   | 0.9693          | 0.5359   |
| 0.81          | 2.0   | 1502  | 0.8850          | 0.5993   |
| 0.7699        | 3.0   | 2253  | 0.8246          | 0.6377   |
| 0.6601        | 4.0   | 3004  | 0.7789          | 0.6578   |
| 0.653         | 5.0   | 3755  | 0.7391          | 0.6745   |
| 0.6463        | 6.0   | 4506  | 0.7047          | 0.6912   |
| 0.5744        | 7.0   | 5257  | 0.6756          | 0.7028   |
| 0.4963        | 8.0   | 6008  | 0.6490          | 0.7129   |
| 0.5329        | 9.0   | 6759  | 0.6286          | 0.7195   |
| 0.5165        | 10.0  | 7510  | 0.6094          | 0.7295   |
| 0.5717        | 11.0  | 8261  | 0.5949          | 0.7279   |
| 0.4844        | 12.0  | 9012  | 0.5809          | 0.7396   |
| 0.4587        | 13.0  | 9763  | 0.5699          | 0.7446   |
| 0.4195        | 14.0  | 10514 | 0.5589          | 0.7496   |
| 0.4521        | 15.0  | 11265 | 0.5504          | 0.7579   |
| 0.4327        | 16.0  | 12016 | 0.5411          | 0.7596   |
| 0.4611        | 17.0  | 12767 | 0.5341          | 0.7663   |
| 0.4248        | 18.0  | 13518 | 0.5294          | 0.7746   |
| 0.4694        | 19.0  | 14269 | 0.5215          | 0.7780   |
| 0.395         | 20.0  | 15020 | 0.5170          | 0.7880   |
| 0.3437        | 21.0  | 15771 | 0.5117          | 0.7880   |
| 0.4367        | 22.0  | 16522 | 0.5057          | 0.7947   |
| 0.3451        | 23.0  | 17273 | 0.5010          | 0.7930   |
| 0.4413        | 24.0  | 18024 | 0.4962          | 0.7930   |
| 0.3908        | 25.0  | 18775 | 0.4929          | 0.7930   |
| 0.4631        | 26.0  | 19526 | 0.4899          | 0.7930   |
| 0.3779        | 27.0  | 20277 | 0.4860          | 0.7930   |
| 0.4436        | 28.0  | 21028 | 0.4829          | 0.7963   |
| 0.3794        | 29.0  | 21779 | 0.4792          | 0.7997   |
| 0.3732        | 30.0  | 22530 | 0.4775          | 0.7963   |
| 0.3411        | 31.0  | 23281 | 0.4746          | 0.7980   |
| 0.4745        | 32.0  | 24032 | 0.4718          | 0.7980   |
| 0.4263        | 33.0  | 24783 | 0.4692          | 0.7997   |
| 0.3711        | 34.0  | 25534 | 0.4676          | 0.8030   |
| 0.3951        | 35.0  | 26285 | 0.4656          | 0.8047   |
| 0.4026        | 36.0  | 27036 | 0.4635          | 0.8047   |
| 0.4811        | 37.0  | 27787 | 0.4621          | 0.8063   |
| 0.3816        | 38.0  | 28538 | 0.4609          | 0.8063   |
| 0.2904        | 39.0  | 29289 | 0.4596          | 0.8047   |
| 0.4708        | 40.0  | 30040 | 0.4586          | 0.8097   |
| 0.3633        | 41.0  | 30791 | 0.4575          | 0.8080   |
| 0.367         | 42.0  | 31542 | 0.4565          | 0.8080   |
| 0.4048        | 43.0  | 32293 | 0.4557          | 0.8080   |
| 0.3531        | 44.0  | 33044 | 0.4549          | 0.8080   |
| 0.3608        | 45.0  | 33795 | 0.4542          | 0.8097   |
| 0.3794        | 46.0  | 34546 | 0.4538          | 0.8097   |
| 0.3429        | 47.0  | 35297 | 0.4534          | 0.8114   |
| 0.395         | 48.0  | 36048 | 0.4532          | 0.8114   |
| 0.3682        | 49.0  | 36799 | 0.4531          | 0.8114   |
| 0.3927        | 50.0  | 37550 | 0.4530          | 0.8114   |

Framework versions

  • Transformers 4.32.1
  • Pytorch 2.1.0+cu121
  • Datasets 2.12.0
  • Tokenizers 0.13.2