hushem_1x_deit_tiny_sgd_lr00001_fold3

This model is a fine-tuned version of facebook/deit-tiny-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.5383
  • Accuracy: 0.3023
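As a quick sanity check on the backbone named above, the token layout of deit-tiny-patch16-224 follows from its name: a 224×224 input split into 16×16 patches, plus DeiT's class and distillation tokens. A minimal sketch (pure arithmetic, no library calls):

```python
# Token layout implied by the backbone name deit-tiny-patch16-224.
image_size = 224
patch_size = 16

patches_per_side = image_size // patch_size  # 14 patches per side
num_patches = patches_per_side ** 2          # 196 patches total
seq_len = num_patches + 2                    # + class token + distillation token

print(num_patches, seq_len)
```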

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
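The linear schedule with 10% warmup can be sketched in plain Python. Assuming 300 total optimizer steps (50 epochs × 6 steps per epoch, matching the results table below), the learning rate ramps up over the first 30 steps and then decays linearly to zero. This is a simplified stand-in for the Transformers scheduler, not its exact implementation:

```python
def lr_at_step(step, total_steps=300, base_lr=1e-05, warmup_ratio=0.1):
    """Linear warmup followed by linear decay to zero (simplified sketch)."""
    warmup_steps = int(total_steps * warmup_ratio)  # 30 steps here
    if step < warmup_steps:
        return base_lr * step / warmup_steps        # ramping up
    # decaying linearly from base_lr at the end of warmup to 0 at the last step
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)

print(lr_at_step(15))   # mid-warmup: about half of base_lr
print(lr_at_step(30))   # peak learning rate
print(lr_at_step(300))  # end of training
```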

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 6    | 1.5437          | 0.3023   |
| 1.6283        | 2.0   | 12   | 1.5434          | 0.3023   |
| 1.6283        | 3.0   | 18   | 1.5431          | 0.3023   |
| 1.63          | 4.0   | 24   | 1.5428          | 0.3023   |
| 1.6367        | 5.0   | 30   | 1.5426          | 0.3023   |
| 1.6367        | 6.0   | 36   | 1.5423          | 0.3023   |
| 1.6273        | 7.0   | 42   | 1.5421          | 0.3023   |
| 1.6273        | 8.0   | 48   | 1.5419          | 0.3023   |
| 1.6489        | 9.0   | 54   | 1.5417          | 0.3023   |
| 1.5924        | 10.0  | 60   | 1.5414          | 0.3023   |
| 1.5924        | 11.0  | 66   | 1.5412          | 0.3023   |
| 1.6227        | 12.0  | 72   | 1.5411          | 0.3023   |
| 1.6227        | 13.0  | 78   | 1.5409          | 0.3023   |
| 1.6367        | 14.0  | 84   | 1.5407          | 0.3023   |
| 1.622         | 15.0  | 90   | 1.5405          | 0.3023   |
| 1.622         | 16.0  | 96   | 1.5403          | 0.3023   |
| 1.621         | 17.0  | 102  | 1.5401          | 0.3023   |
| 1.621         | 18.0  | 108  | 1.5400          | 0.3023   |
| 1.6386        | 19.0  | 114  | 1.5398          | 0.3023   |
| 1.6207        | 20.0  | 120  | 1.5397          | 0.3023   |
| 1.6207        | 21.0  | 126  | 1.5395          | 0.3023   |
| 1.6152        | 22.0  | 132  | 1.5394          | 0.3023   |
| 1.6152        | 23.0  | 138  | 1.5393          | 0.3023   |
| 1.6503        | 24.0  | 144  | 1.5392          | 0.3023   |
| 1.6219        | 25.0  | 150  | 1.5390          | 0.3023   |
| 1.6219        | 26.0  | 156  | 1.5389          | 0.3023   |
| 1.6152        | 27.0  | 162  | 1.5389          | 0.3023   |
| 1.6152        | 28.0  | 168  | 1.5388          | 0.3023   |
| 1.6192        | 29.0  | 174  | 1.5387          | 0.3023   |
| 1.6111        | 30.0  | 180  | 1.5386          | 0.3023   |
| 1.6111        | 31.0  | 186  | 1.5386          | 0.3023   |
| 1.6114        | 32.0  | 192  | 1.5385          | 0.3023   |
| 1.6114        | 33.0  | 198  | 1.5384          | 0.3023   |
| 1.6361        | 34.0  | 204  | 1.5384          | 0.3023   |
| 1.6146        | 35.0  | 210  | 1.5384          | 0.3023   |
| 1.6146        | 36.0  | 216  | 1.5383          | 0.3023   |
| 1.6254        | 37.0  | 222  | 1.5383          | 0.3023   |
| 1.6254        | 38.0  | 228  | 1.5383          | 0.3023   |
| 1.6124        | 39.0  | 234  | 1.5383          | 0.3023   |
| 1.6367        | 40.0  | 240  | 1.5383          | 0.3023   |
| 1.6367        | 41.0  | 246  | 1.5383          | 0.3023   |
| 1.6229        | 42.0  | 252  | 1.5383          | 0.3023   |
| 1.6229        | 43.0  | 258  | 1.5383          | 0.3023   |
| 1.6506        | 44.0  | 264  | 1.5383          | 0.3023   |
| 1.6148        | 45.0  | 270  | 1.5383          | 0.3023   |
| 1.6148        | 46.0  | 276  | 1.5383          | 0.3023   |
| 1.6242        | 47.0  | 282  | 1.5383          | 0.3023   |
| 1.6242        | 48.0  | 288  | 1.5383          | 0.3023   |
| 1.6087        | 49.0  | 294  | 1.5383          | 0.3023   |
| 1.6097        | 50.0  | 300  | 1.5383          | 0.3023   |

Framework versions

  • Transformers 4.35.0
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.6
  • Tokenizers 0.14.1
