
hushem_5x_deit_small_sgd_001_fold5

This model is a fine-tuned version of facebook/deit-small-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the results):

  • Loss: 1.0219
  • Accuracy: 0.5854
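
As a minimal inference sketch, the checkpoint can be loaded with the standard transformers image-classification classes. The repository id below is a placeholder, not the actual Hub path of this model; replace it with the real namespace before use.

```python
# Inference sketch -- the repo id is a placeholder; substitute the real Hub path.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "your-username/hushem_5x_deit_small_sgd_001_fold5"  # placeholder

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("example.jpg").convert("RGB")  # any input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```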

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
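
The card only states that training data was loaded with the datasets `imagefolder` builder; a sketch of that loading pattern is shown below. The directory path and split layout are placeholders, not the authors' actual data organization.

```python
# Sketch of the `imagefolder` loading pattern referenced above; the directory
# path is a placeholder and the class names depend on the folder structure.
from datasets import load_dataset

dataset = load_dataset("imagefolder", data_dir="path/to/hushem_fold5")  # placeholder path
print(dataset["train"].features["label"].names)  # class names inferred from folder names
```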

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after the list):

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
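
A sketch of how these hyperparameters map onto transformers.TrainingArguments is given below. This is an assumption about the training setup, not the authors' actual script; the output directory and evaluation/save strategies are guesses consistent with the per-epoch validation metrics reported in the next section.

```python
# Configuration sketch -- mirrors the hyperparameters listed above; output_dir,
# evaluation_strategy, and save_strategy are assumptions, not the original script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hushem_5x_deit_small_sgd_001_fold5",  # assumed output path
    learning_rate=0.001,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumed: the results table reports per-epoch metrics
    save_strategy="epoch",
)
```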

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.5389 | 1.0 | 28 | 1.4343 | 0.2439 |
| 1.3921 | 2.0 | 56 | 1.3847 | 0.2683 |
| 1.3749 | 3.0 | 84 | 1.3585 | 0.3659 |
| 1.3126 | 4.0 | 112 | 1.3409 | 0.3659 |
| 1.3117 | 5.0 | 140 | 1.3251 | 0.3659 |
| 1.3121 | 6.0 | 168 | 1.3119 | 0.3415 |
| 1.2628 | 7.0 | 196 | 1.2980 | 0.3415 |
| 1.2308 | 8.0 | 224 | 1.2843 | 0.3659 |
| 1.2428 | 9.0 | 252 | 1.2711 | 0.4146 |
| 1.1961 | 10.0 | 280 | 1.2591 | 0.4146 |
| 1.1795 | 11.0 | 308 | 1.2486 | 0.3902 |
| 1.1594 | 12.0 | 336 | 1.2381 | 0.3902 |
| 1.1371 | 13.0 | 364 | 1.2260 | 0.3902 |
| 1.1217 | 14.0 | 392 | 1.2140 | 0.4390 |
| 1.0975 | 15.0 | 420 | 1.2018 | 0.4634 |
| 1.1139 | 16.0 | 448 | 1.1910 | 0.4878 |
| 1.0797 | 17.0 | 476 | 1.1802 | 0.4878 |
| 1.0813 | 18.0 | 504 | 1.1682 | 0.4878 |
| 1.0619 | 19.0 | 532 | 1.1572 | 0.4878 |
| 1.0398 | 20.0 | 560 | 1.1467 | 0.5122 |
| 1.0215 | 21.0 | 588 | 1.1362 | 0.5122 |
| 1.0014 | 22.0 | 616 | 1.1280 | 0.5366 |
| 1.0047 | 23.0 | 644 | 1.1216 | 0.5610 |
| 0.9823 | 24.0 | 672 | 1.1144 | 0.5610 |
| 0.9814 | 25.0 | 700 | 1.1058 | 0.5610 |
| 0.9822 | 26.0 | 728 | 1.0976 | 0.5610 |
| 0.9448 | 27.0 | 756 | 1.0916 | 0.5366 |
| 0.9805 | 28.0 | 784 | 1.0839 | 0.5366 |
| 0.9187 | 29.0 | 812 | 1.0780 | 0.5366 |
| 0.9659 | 30.0 | 840 | 1.0725 | 0.5366 |
| 0.9135 | 31.0 | 868 | 1.0663 | 0.5610 |
| 0.889 | 32.0 | 896 | 1.0628 | 0.5610 |
| 0.9089 | 33.0 | 924 | 1.0587 | 0.5610 |
| 0.9062 | 34.0 | 952 | 1.0524 | 0.5610 |
| 0.9029 | 35.0 | 980 | 1.0479 | 0.5610 |
| 0.8924 | 36.0 | 1008 | 1.0439 | 0.5854 |
| 0.8694 | 37.0 | 1036 | 1.0402 | 0.5854 |
| 0.8578 | 38.0 | 1064 | 1.0365 | 0.5610 |
| 0.8992 | 39.0 | 1092 | 1.0340 | 0.5854 |
| 0.8586 | 40.0 | 1120 | 1.0317 | 0.5854 |
| 0.8737 | 41.0 | 1148 | 1.0296 | 0.5854 |
| 0.8517 | 42.0 | 1176 | 1.0278 | 0.5854 |
| 0.8537 | 43.0 | 1204 | 1.0257 | 0.5854 |
| 0.8642 | 44.0 | 1232 | 1.0243 | 0.5854 |
| 0.871 | 45.0 | 1260 | 1.0234 | 0.5854 |
| 0.8594 | 46.0 | 1288 | 1.0226 | 0.5854 |
| 0.8675 | 47.0 | 1316 | 1.0221 | 0.5854 |
| 0.874 | 48.0 | 1344 | 1.0219 | 0.5854 |
| 0.8459 | 49.0 | 1372 | 1.0219 | 0.5854 |
| 0.8538 | 50.0 | 1400 | 1.0219 | 0.5854 |
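
The accuracy column is consistent with a standard per-epoch evaluation loop. As a hedged sketch (an assumption, not the authors' code), such a metric is typically wired into Trainer through a compute_metrics callback:

```python
# Metric sketch -- accuracy computed from logits, as is typical for
# image-classification fine-tuning; this exact wiring is an assumption.
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```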

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0