
hushem_1x_deit_tiny_rms_lr001_fold2

This model is a fine-tuned version of facebook/deit-tiny-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.2600
  • Accuracy: 0.3556
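
The checkpoint can be loaded like any other `transformers` image-classification model. The snippet below is a minimal usage sketch, not taken from the original training code; the repository id and image path are placeholders to replace with the actual values.

```python
# Minimal inference sketch (illustrative; the repo id below is a placeholder).
from transformers import pipeline
from PIL import Image

classifier = pipeline(
    "image-classification",
    model="your-namespace/hushem_1x_deit_tiny_rms_lr001_fold2",  # hypothetical repo id
)

image = Image.open("example.jpg")  # any RGB image from the target domain
print(classifier(image, top_k=4))  # label/score pairs for the fine-tuned classes
```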

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
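
For orientation, the sketch below shows how the hyperparameters above would map onto `transformers.TrainingArguments`. It is a reconstruction for illustration only, not the exact training script; dataset loading, the image processor, and metric computation are omitted, and `evaluation_strategy="epoch"` is an assumption based on the per-epoch validation metrics reported below.

```python
# Hedged reconstruction of the training configuration (assumptions noted inline).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hushem_1x_deit_tiny_rms_lr001_fold2",
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: the card reports validation metrics once per epoch
)
```

The Adam settings listed above (betas=(0.9, 0.999), epsilon=1e-08) match the Trainer defaults, so they do not need to be passed explicitly.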

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 6    | 2.0874          | 0.2444   |
| 4.6196        | 2.0   | 12   | 2.3422          | 0.2444   |
| 4.6196        | 3.0   | 18   | 1.7914          | 0.2444   |
| 1.8086        | 4.0   | 24   | 1.6082          | 0.2667   |
| 1.5901        | 5.0   | 30   | 1.5144          | 0.2444   |
| 1.5901        | 6.0   | 36   | 1.6190          | 0.2444   |
| 1.5211        | 7.0   | 42   | 1.5231          | 0.2444   |
| 1.5211        | 8.0   | 48   | 1.5027          | 0.2444   |
| 1.4477        | 9.0   | 54   | 1.4266          | 0.2444   |
| 1.4394        | 10.0  | 60   | 1.4345          | 0.2444   |
| 1.4394        | 11.0  | 66   | 1.3152          | 0.4444   |
| 1.3604        | 12.0  | 72   | 1.3376          | 0.2444   |
| 1.3604        | 13.0  | 78   | 1.3260          | 0.2667   |
| 1.3864        | 14.0  | 84   | 1.5120          | 0.2444   |
| 1.3555        | 15.0  | 90   | 1.2685          | 0.3556   |
| 1.3555        | 16.0  | 96   | 1.1751          | 0.4444   |
| 1.3501        | 17.0  | 102  | 1.2648          | 0.4444   |
| 1.3501        | 18.0  | 108  | 1.3992          | 0.3778   |
| 1.2496        | 19.0  | 114  | 1.4208          | 0.2889   |
| 1.2587        | 20.0  | 120  | 1.1782          | 0.4444   |
| 1.2587        | 21.0  | 126  | 1.2882          | 0.4444   |
| 1.2321        | 22.0  | 132  | 1.3142          | 0.4444   |
| 1.2321        | 23.0  | 138  | 1.1784          | 0.4222   |
| 1.1985        | 24.0  | 144  | 1.2247          | 0.3778   |
| 1.234         | 25.0  | 150  | 1.2329          | 0.3778   |
| 1.234         | 26.0  | 156  | 1.2482          | 0.3556   |
| 1.1913        | 27.0  | 162  | 1.4153          | 0.3111   |
| 1.1913        | 28.0  | 168  | 1.2994          | 0.3333   |
| 1.1911        | 29.0  | 174  | 1.1400          | 0.4667   |
| 1.1955        | 30.0  | 180  | 1.2156          | 0.3778   |
| 1.1955        | 31.0  | 186  | 1.2232          | 0.4      |
| 1.1751        | 32.0  | 192  | 1.3853          | 0.2889   |
| 1.1751        | 33.0  | 198  | 1.2309          | 0.3333   |
| 1.1328        | 34.0  | 204  | 1.2338          | 0.3333   |
| 1.195         | 35.0  | 210  | 1.2383          | 0.3333   |
| 1.195         | 36.0  | 216  | 1.2991          | 0.3778   |
| 1.1661        | 37.0  | 222  | 1.3228          | 0.3556   |
| 1.1661        | 38.0  | 228  | 1.2550          | 0.3778   |
| 1.0748        | 39.0  | 234  | 1.2591          | 0.3556   |
| 1.1122        | 40.0  | 240  | 1.2234          | 0.3778   |
| 1.1122        | 41.0  | 246  | 1.2608          | 0.3556   |
| 1.102         | 42.0  | 252  | 1.2600          | 0.3556   |
| 1.102         | 43.0  | 258  | 1.2600          | 0.3556   |
| 1.0792        | 44.0  | 264  | 1.2600          | 0.3556   |
| 1.1126        | 45.0  | 270  | 1.2600          | 0.3556   |
| 1.1126        | 46.0  | 276  | 1.2600          | 0.3556   |
| 1.0704        | 47.0  | 282  | 1.2600          | 0.3556   |
| 1.0704        | 48.0  | 288  | 1.2600          | 0.3556   |
| 1.1302        | 49.0  | 294  | 1.2600          | 0.3556   |
| 1.0797        | 50.0  | 300  | 1.2600          | 0.3556   |

Framework versions

  • Transformers 4.35.0
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.6
  • Tokenizers 0.14.1