
hushem_1x_deit_base_sgd_0001_fold3

This model is a fine-tuned version of facebook/deit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.3860
  • Accuracy: 0.3721

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
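The linear schedule with a 0.1 warmup ratio can be sketched as a plain function of the training step. This is a minimal illustration, not the exact Transformers implementation; the step counts (6 steps per epoch × 50 epochs = 300 total) are taken from the results table below.

```python
def linear_warmup_lr(step, base_lr=1e-5, total_steps=300, warmup_ratio=0.1):
    """Linear warmup to base_lr, then linear decay to 0.

    Mirrors lr_scheduler_type=linear with lr_scheduler_warmup_ratio=0.1
    from the hyperparameters above (sketch only).
    """
    warmup_steps = int(total_steps * warmup_ratio)  # 30 steps here
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)

print(linear_warmup_lr(30))   # peak lr at the end of warmup: 1e-05
print(linear_warmup_lr(300))  # decayed to 0.0 at the final step
```

With only 300 optimizer steps in total, the learning rate spends a tenth of training ramping up and the rest decaying back to zero.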

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 6    | 1.3866          | 0.3721   |
| 1.4431        | 2.0   | 12   | 1.3866          | 0.3721   |
| 1.4431        | 3.0   | 18   | 1.3866          | 0.3721   |
| 1.4376        | 4.0   | 24   | 1.3865          | 0.3721   |
| 1.4767        | 5.0   | 30   | 1.3865          | 0.3721   |
| 1.4767        | 6.0   | 36   | 1.3865          | 0.3721   |
| 1.4576        | 7.0   | 42   | 1.3864          | 0.3721   |
| 1.4576        | 8.0   | 48   | 1.3864          | 0.3721   |
| 1.4525        | 9.0   | 54   | 1.3864          | 0.3721   |
| 1.436         | 10.0  | 60   | 1.3864          | 0.3721   |
| 1.436         | 11.0  | 66   | 1.3863          | 0.3721   |
| 1.4515        | 12.0  | 72   | 1.3863          | 0.3721   |
| 1.4515        | 13.0  | 78   | 1.3863          | 0.3721   |
| 1.4484        | 14.0  | 84   | 1.3863          | 0.3721   |
| 1.4554        | 15.0  | 90   | 1.3862          | 0.3721   |
| 1.4554        | 16.0  | 96   | 1.3862          | 0.3721   |
| 1.4506        | 17.0  | 102  | 1.3862          | 0.3721   |
| 1.4506        | 18.0  | 108  | 1.3862          | 0.3721   |
| 1.4516        | 19.0  | 114  | 1.3862          | 0.3721   |
| 1.4451        | 20.0  | 120  | 1.3861          | 0.3721   |
| 1.4451        | 21.0  | 126  | 1.3861          | 0.3721   |
| 1.4507        | 22.0  | 132  | 1.3861          | 0.3721   |
| 1.4507        | 23.0  | 138  | 1.3861          | 0.3721   |
| 1.4465        | 24.0  | 144  | 1.3861          | 0.3721   |
| 1.4556        | 25.0  | 150  | 1.3861          | 0.3721   |
| 1.4556        | 26.0  | 156  | 1.3861          | 0.3721   |
| 1.4543        | 27.0  | 162  | 1.3861          | 0.3721   |
| 1.4543        | 28.0  | 168  | 1.3860          | 0.3721   |
| 1.4297        | 29.0  | 174  | 1.3860          | 0.3721   |
| 1.4518        | 30.0  | 180  | 1.3860          | 0.3721   |
| 1.4518        | 31.0  | 186  | 1.3860          | 0.3721   |
| 1.4478        | 32.0  | 192  | 1.3860          | 0.3721   |
| 1.4478        | 33.0  | 198  | 1.3860          | 0.3721   |
| 1.4514        | 34.0  | 204  | 1.3860          | 0.3721   |
| 1.4485        | 35.0  | 210  | 1.3860          | 0.3721   |
| 1.4485        | 36.0  | 216  | 1.3860          | 0.3721   |
| 1.4495        | 37.0  | 222  | 1.3860          | 0.3721   |
| 1.4495        | 38.0  | 228  | 1.3860          | 0.3721   |
| 1.4541        | 39.0  | 234  | 1.3860          | 0.3721   |
| 1.4436        | 40.0  | 240  | 1.3860          | 0.3721   |
| 1.4436        | 41.0  | 246  | 1.3860          | 0.3721   |
| 1.4487        | 42.0  | 252  | 1.3860          | 0.3721   |
| 1.4487        | 43.0  | 258  | 1.3860          | 0.3721   |
| 1.4524        | 44.0  | 264  | 1.3860          | 0.3721   |
| 1.437         | 45.0  | 270  | 1.3860          | 0.3721   |
| 1.437         | 46.0  | 276  | 1.3860          | 0.3721   |
| 1.4494        | 47.0  | 282  | 1.3860          | 0.3721   |
| 1.4494        | 48.0  | 288  | 1.3860          | 0.3721   |
| 1.4556        | 49.0  | 294  | 1.3860          | 0.3721   |
| 1.4483        | 50.0  | 300  | 1.3860          | 0.3721   |
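Note that the validation loss plateaus at about 1.386, which is ln 4 ≈ 1.3863. Assuming the evaluation set has four classes (as the HuSHeM dataset name suggests; this is an assumption, the card does not state the class count), that is exactly the cross-entropy of a uniform prediction, so the model appears not to have learned beyond chance on this fold. A quick check:

```python
import math

# Cross-entropy of a uniform distribution over k classes is ln(k).
# A flat validation loss near 1.386 suggests (near-)uniform outputs
# over 4 classes (assumption: HuSHeM has 4 classes).
uniform_loss = math.log(4)
print(round(uniform_loss, 4))  # 1.3863
```

The flat accuracy of 0.3721 at every epoch points the same way: the model is most likely predicting a single class throughout.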

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.7
  • Tokenizers 0.15.0