
hushem_1x_deit_tiny_rms_lr00001_fold1

This model is a fine-tuned version of facebook/deit-tiny-patch16-224 on an image dataset loaded with the Hugging Face imagefolder loader. It achieves the following results on the evaluation set (an inference sketch follows the list):

  • Loss: 1.2449
  • Accuracy: 0.6889
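
The card does not include a usage example, so the sketch below shows one way to run inference with this checkpoint through the Transformers auto classes; the Hub repo id and the image path are placeholders, not values taken from the card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repo id: substitute the actual Hub path (or a local directory)
# holding this fine-tuned checkpoint.
model_id = "<hub-user>/hushem_1x_deit_tiny_rms_lr00001_fold1"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

# Any RGB image; the processor resizes and normalizes it to the 224x224 input DeiT expects.
image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_class_id])
```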

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
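
For reference, these settings roughly correspond to a Trainer configuration like the minimal sketch below; output_dir and the evaluation/save strategies are assumptions rather than values taken from the card, and the listed Adam betas and epsilon match the Trainer defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

# Minimal sketch of the training configuration, assuming per-epoch evaluation
# (the results table reports one validation point per epoch). Numeric values
# mirror the hyperparameter list above; output_dir is an assumption.
training_args = TrainingArguments(
    output_dir="hushem_1x_deit_tiny_rms_lr00001_fold1",
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumed from the per-epoch results table
    save_strategy="epoch",        # assumed
)
```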

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 6    | 1.2771          | 0.4      |
| 1.263         | 2.0   | 12   | 1.0784          | 0.5556   |
| 1.263         | 3.0   | 18   | 0.9616          | 0.5556   |
| 0.5461        | 4.0   | 24   | 1.0339          | 0.6889   |
| 0.2446        | 5.0   | 30   | 0.9939          | 0.6667   |
| 0.2446        | 6.0   | 36   | 1.2442          | 0.4889   |
| 0.0817        | 7.0   | 42   | 0.7980          | 0.6222   |
| 0.0817        | 8.0   | 48   | 0.8675          | 0.6444   |
| 0.0302        | 9.0   | 54   | 0.8969          | 0.6889   |
| 0.009         | 10.0  | 60   | 0.9399          | 0.6222   |
| 0.009         | 11.0  | 66   | 1.0591          | 0.7111   |
| 0.0037        | 12.0  | 72   | 1.0283          | 0.6667   |
| 0.0037        | 13.0  | 78   | 1.0855          | 0.6667   |
| 0.0025        | 14.0  | 84   | 1.1121          | 0.6667   |
| 0.0019        | 15.0  | 90   | 1.1082          | 0.6667   |
| 0.0019        | 16.0  | 96   | 1.1158          | 0.6667   |
| 0.0015        | 17.0  | 102  | 1.1382          | 0.6667   |
| 0.0015        | 18.0  | 108  | 1.1574          | 0.6667   |
| 0.0013        | 19.0  | 114  | 1.1739          | 0.6667   |
| 0.0011        | 20.0  | 120  | 1.1736          | 0.6667   |
| 0.0011        | 21.0  | 126  | 1.1594          | 0.6889   |
| 0.001         | 22.0  | 132  | 1.1738          | 0.6889   |
| 0.001         | 23.0  | 138  | 1.1962          | 0.6667   |
| 0.0009        | 24.0  | 144  | 1.1951          | 0.6889   |
| 0.0008        | 25.0  | 150  | 1.2004          | 0.6889   |
| 0.0008        | 26.0  | 156  | 1.1996          | 0.6889   |
| 0.0008        | 27.0  | 162  | 1.2076          | 0.6889   |
| 0.0008        | 28.0  | 168  | 1.2144          | 0.6889   |
| 0.0007        | 29.0  | 174  | 1.2117          | 0.6889   |
| 0.0007        | 30.0  | 180  | 1.2204          | 0.6889   |
| 0.0007        | 31.0  | 186  | 1.2217          | 0.6889   |
| 0.0006        | 32.0  | 192  | 1.2270          | 0.6889   |
| 0.0006        | 33.0  | 198  | 1.2321          | 0.6889   |
| 0.0006        | 34.0  | 204  | 1.2307          | 0.6889   |
| 0.0006        | 35.0  | 210  | 1.2313          | 0.6889   |
| 0.0006        | 36.0  | 216  | 1.2374          | 0.6889   |
| 0.0006        | 37.0  | 222  | 1.2446          | 0.6889   |
| 0.0006        | 38.0  | 228  | 1.2471          | 0.6889   |
| 0.0005        | 39.0  | 234  | 1.2452          | 0.6889   |
| 0.0006        | 40.0  | 240  | 1.2458          | 0.6889   |
| 0.0006        | 41.0  | 246  | 1.2454          | 0.6889   |
| 0.0005        | 42.0  | 252  | 1.2449          | 0.6889   |
| 0.0005        | 43.0  | 258  | 1.2449          | 0.6889   |
| 0.0005        | 44.0  | 264  | 1.2449          | 0.6889   |
| 0.0005        | 45.0  | 270  | 1.2449          | 0.6889   |
| 0.0005        | 46.0  | 276  | 1.2449          | 0.6889   |
| 0.0005        | 47.0  | 282  | 1.2449          | 0.6889   |
| 0.0005        | 48.0  | 288  | 1.2449          | 0.6889   |
| 0.0005        | 49.0  | 294  | 1.2449          | 0.6889   |
| 0.0005        | 50.0  | 300  | 1.2449          | 0.6889   |

Framework versions

  • Transformers 4.35.0
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.6
  • Tokenizers 0.14.1