
hushem_1x_deit_tiny_sgd_lr001_fold2

This model is a fine-tuned version of facebook/deit-tiny-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.4182
  • Accuracy: 0.3556
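The reported accuracies are all very close to multiples of 1/45, which would be consistent with a 45-image evaluation split; this is an inference from the rounded numbers, not something the card states. A quick arithmetic check:

```python
# Hypothesis (assumption, not stated in the card): the evaluation split has
# 45 images, so every reported accuracy should be some n/45 rounded to 4 d.p.
reported = [0.1333, 0.1556, 0.2, 0.2444, 0.2667, 0.2889, 0.3111, 0.3333, 0.3556]

for acc in reported:
    n = round(acc * 45)              # implied number of correct predictions
    assert abs(n / 45 - acc) < 5e-5  # matches to 4 decimal places

# Final accuracy 0.3556 would correspond to 16/45 correct predictions.
print(round(16 / 45, 4))
```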

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
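With these settings, the linear schedule with a 0.1 warmup ratio ramps the learning rate from 0 to 0.001 over the first 10% of optimizer steps, then decays it linearly back to 0. A minimal sketch of that schedule, assuming 300 total steps (matching the final step count in the results table; the function name is illustrative, not from any library):

```python
def linear_schedule_lr(step, base_lr=0.001, total_steps=300, warmup_ratio=0.1):
    """Sketch of a linear warmup + linear decay schedule under the
    hyperparameters above. 300 total steps and the exact shape are
    assumptions based on the results table, not stated in the card."""
    warmup_steps = int(total_steps * warmup_ratio)  # 30 steps of warmup
    if step < warmup_steps:
        # Linear ramp from 0 up to base_lr.
        return base_lr * step / warmup_steps
    # Linear decay from base_lr down to 0 at the final step.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

# Peak learning rate is reached at the end of warmup (step 30).
print(linear_schedule_lr(30))  # 0.001
```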

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 6    | 1.6125          | 0.1333   |
| 1.5852        | 2.0   | 12   | 1.5871          | 0.1333   |
| 1.5852        | 3.0   | 18   | 1.5653          | 0.1556   |
| 1.5028        | 4.0   | 24   | 1.5474          | 0.1556   |
| 1.4795        | 5.0   | 30   | 1.5322          | 0.1556   |
| 1.4795        | 6.0   | 36   | 1.5188          | 0.1556   |
| 1.4252        | 7.0   | 42   | 1.5071          | 0.1556   |
| 1.4252        | 8.0   | 48   | 1.4989          | 0.1556   |
| 1.3707        | 9.0   | 54   | 1.4901          | 0.1778   |
| 1.365         | 10.0  | 60   | 1.4824          | 0.2      |
| 1.365         | 11.0  | 66   | 1.4748          | 0.2      |
| 1.3235        | 12.0  | 72   | 1.4694          | 0.2444   |
| 1.3235        | 13.0  | 78   | 1.4635          | 0.2444   |
| 1.3233        | 14.0  | 84   | 1.4596          | 0.2444   |
| 1.2774        | 15.0  | 90   | 1.4554          | 0.2444   |
| 1.2774        | 16.0  | 96   | 1.4518          | 0.2444   |
| 1.2584        | 17.0  | 102  | 1.4482          | 0.2667   |
| 1.2584        | 18.0  | 108  | 1.4450          | 0.2667   |
| 1.2788        | 19.0  | 114  | 1.4423          | 0.2667   |
| 1.2388        | 20.0  | 120  | 1.4398          | 0.2667   |
| 1.2388        | 21.0  | 126  | 1.4370          | 0.2889   |
| 1.2317        | 22.0  | 132  | 1.4351          | 0.2667   |
| 1.2317        | 23.0  | 138  | 1.4327          | 0.2889   |
| 1.2286        | 24.0  | 144  | 1.4312          | 0.2889   |
| 1.2033        | 25.0  | 150  | 1.4298          | 0.2889   |
| 1.2033        | 26.0  | 156  | 1.4283          | 0.3111   |
| 1.1965        | 27.0  | 162  | 1.4267          | 0.3111   |
| 1.1965        | 28.0  | 168  | 1.4258          | 0.3111   |
| 1.1963        | 29.0  | 174  | 1.4246          | 0.3111   |
| 1.1946        | 30.0  | 180  | 1.4236          | 0.3111   |
| 1.1946        | 31.0  | 186  | 1.4227          | 0.3333   |
| 1.1805        | 32.0  | 192  | 1.4218          | 0.3556   |
| 1.1805        | 33.0  | 198  | 1.4211          | 0.3556   |
| 1.1439        | 34.0  | 204  | 1.4203          | 0.3556   |
| 1.1699        | 35.0  | 210  | 1.4197          | 0.3556   |
| 1.1699        | 36.0  | 216  | 1.4193          | 0.3556   |
| 1.156         | 37.0  | 222  | 1.4190          | 0.3556   |
| 1.156         | 38.0  | 228  | 1.4187          | 0.3556   |
| 1.1475        | 39.0  | 234  | 1.4185          | 0.3556   |
| 1.1517        | 40.0  | 240  | 1.4183          | 0.3556   |
| 1.1517        | 41.0  | 246  | 1.4182          | 0.3556   |
| 1.1468        | 42.0  | 252  | 1.4182          | 0.3556   |
| 1.1468        | 43.0  | 258  | 1.4182          | 0.3556   |
| 1.1597        | 44.0  | 264  | 1.4182          | 0.3556   |
| 1.1542        | 45.0  | 270  | 1.4182          | 0.3556   |
| 1.1542        | 46.0  | 276  | 1.4182          | 0.3556   |
| 1.1604        | 47.0  | 282  | 1.4182          | 0.3556   |
| 1.1604        | 48.0  | 288  | 1.4182          | 0.3556   |
| 1.1576        | 49.0  | 294  | 1.4182          | 0.3556   |
| 1.143         | 50.0  | 300  | 1.4182          | 0.3556   |

Framework versions

  • Transformers 4.35.0
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.6
  • Tokenizers 0.14.1
Model size

  • 5.53M params (Safetensors, F32 tensors)