
hushem_1x_deit_tiny_sgd_lr00001_fold4

This model is a fine-tuned version of facebook/deit-tiny-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.5565
  • Accuracy: 0.1905
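The checkpoint can be loaded for inference through the standard transformers image-classification API. A minimal sketch, using the base facebook/deit-tiny-patch16-224 checkpoint as a stand-in because this fine-tuned model's own repository id is not stated above; in practice you would pass the fine-tuned repo id instead, and the blank test image is an illustrative placeholder:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Stand-in checkpoint: substitute the fine-tuned repository id in practice.
checkpoint = "facebook/deit-tiny-patch16-224"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForImageClassification.from_pretrained(checkpoint)
model.eval()

# A blank 224x224 RGB image stands in for a real input.
image = Image.new("RGB", (224, 224))
inputs = processor(images=image, return_tensors="pt")  # pixel_values: (1, 3, 224, 224)

with torch.no_grad():
    logits = model(**inputs).logits  # one row of class logits per input image

predicted_label = model.config.id2label[logits.argmax(-1).item()]
```

With the fine-tuned checkpoint, `id2label` would map to this task's classes rather than the base model's ImageNet labels.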

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
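The optimizer and learning-rate schedule these settings imply (Adam at 1e-05, linear warmup over the first 10% of steps, then linear decay to zero) can be sketched in plain PyTorch. The tiny `nn.Linear` stand-in model and the 6-steps-per-epoch figure (300 total steps over 50 epochs, matching the step column of the results table) are illustrative assumptions, not part of the original training script:

```python
import torch

# Hypothetical tiny module standing in for the DeiT backbone.
model = torch.nn.Linear(4, 4)

# Hyperparameters from the card; steps_per_epoch inferred from the results table.
lr = 1e-5
num_epochs = 50
steps_per_epoch = 6
total_steps = num_epochs * steps_per_epoch        # 300
warmup_steps = int(0.1 * total_steps)             # 10% warmup -> 30 steps

optimizer = torch.optim.Adam(
    model.parameters(), lr=lr, betas=(0.9, 0.999), eps=1e-8
)

def linear_schedule(step: int) -> float:
    # Linear warmup from 0 to the base lr, then linear decay back to 0.
    if step < warmup_steps:
        return step / max(1, warmup_steps)
    return max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=linear_schedule)
```

Calling `scheduler.step()` once per optimizer step reproduces the warmup/decay shape; the learning rate peaks at 1e-05 exactly at step 30.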

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 6 | 1.5632 | 0.1905 |
| 1.6067 | 2.0 | 12 | 1.5628 | 0.1905 |
| 1.6067 | 3.0 | 18 | 1.5625 | 0.1905 |
| 1.6234 | 4.0 | 24 | 1.5621 | 0.1905 |
| 1.6412 | 5.0 | 30 | 1.5618 | 0.1905 |
| 1.6412 | 6.0 | 36 | 1.5615 | 0.1905 |
| 1.6304 | 7.0 | 42 | 1.5612 | 0.1905 |
| 1.6304 | 8.0 | 48 | 1.5609 | 0.1905 |
| 1.6339 | 9.0 | 54 | 1.5606 | 0.1905 |
| 1.6208 | 10.0 | 60 | 1.5604 | 0.1905 |
| 1.6208 | 11.0 | 66 | 1.5601 | 0.1905 |
| 1.599 | 12.0 | 72 | 1.5598 | 0.1905 |
| 1.599 | 13.0 | 78 | 1.5596 | 0.1905 |
| 1.6454 | 14.0 | 84 | 1.5594 | 0.1905 |
| 1.5993 | 15.0 | 90 | 1.5591 | 0.1905 |
| 1.5993 | 16.0 | 96 | 1.5589 | 0.1905 |
| 1.6104 | 17.0 | 102 | 1.5587 | 0.1905 |
| 1.6104 | 18.0 | 108 | 1.5585 | 0.1905 |
| 1.5995 | 19.0 | 114 | 1.5584 | 0.1905 |
| 1.6359 | 20.0 | 120 | 1.5582 | 0.1905 |
| 1.6359 | 21.0 | 126 | 1.5580 | 0.1905 |
| 1.5868 | 22.0 | 132 | 1.5579 | 0.1905 |
| 1.5868 | 23.0 | 138 | 1.5577 | 0.1905 |
| 1.67 | 24.0 | 144 | 1.5576 | 0.1905 |
| 1.5662 | 25.0 | 150 | 1.5575 | 0.1905 |
| 1.5662 | 26.0 | 156 | 1.5573 | 0.1905 |
| 1.6118 | 27.0 | 162 | 1.5572 | 0.1905 |
| 1.6118 | 28.0 | 168 | 1.5571 | 0.1905 |
| 1.6163 | 29.0 | 174 | 1.5570 | 0.1905 |
| 1.6392 | 30.0 | 180 | 1.5569 | 0.1905 |
| 1.6392 | 31.0 | 186 | 1.5568 | 0.1905 |
| 1.6602 | 32.0 | 192 | 1.5568 | 0.1905 |
| 1.6602 | 33.0 | 198 | 1.5567 | 0.1905 |
| 1.5354 | 34.0 | 204 | 1.5567 | 0.1905 |
| 1.6205 | 35.0 | 210 | 1.5566 | 0.1905 |
| 1.6205 | 36.0 | 216 | 1.5566 | 0.1905 |
| 1.6201 | 37.0 | 222 | 1.5565 | 0.1905 |
| 1.6201 | 38.0 | 228 | 1.5565 | 0.1905 |
| 1.5923 | 39.0 | 234 | 1.5565 | 0.1905 |
| 1.6521 | 40.0 | 240 | 1.5565 | 0.1905 |
| 1.6521 | 41.0 | 246 | 1.5565 | 0.1905 |
| 1.6177 | 42.0 | 252 | 1.5565 | 0.1905 |
| 1.6177 | 43.0 | 258 | 1.5565 | 0.1905 |
| 1.6437 | 44.0 | 264 | 1.5565 | 0.1905 |
| 1.5948 | 45.0 | 270 | 1.5565 | 0.1905 |
| 1.5948 | 46.0 | 276 | 1.5565 | 0.1905 |
| 1.6236 | 47.0 | 282 | 1.5565 | 0.1905 |
| 1.6236 | 48.0 | 288 | 1.5565 | 0.1905 |
| 1.6168 | 49.0 | 294 | 1.5565 | 0.1905 |
| 1.6032 | 50.0 | 300 | 1.5565 | 0.1905 |

Framework versions

  • Transformers 4.35.0
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.6
  • Tokenizers 0.14.1

Model size: 5.53M params (F32, Safetensors)