hushem_1x_beit_base_sgd_00001_fold1

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.5985
  • Accuracy: 0.2667
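The reported accuracy of 0.2667 corresponds to the fraction 4/15; one split consistent with it is 4 correct predictions out of a 15-image evaluation set (the counts are inferred from the fraction, not stated in the card). A minimal sketch of the metric computation:

```python
# Accuracy as reported above; 4 correct out of 15 is one split consistent
# with 0.2667 (sample counts inferred from the fraction, not stated here).
correct, total = 4, 15
accuracy = round(correct / total, 4)
print(accuracy)  # 0.2667
```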

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
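The schedule above can be sketched in plain Python. This is a minimal sketch assuming the standard linear-warmup-then-linear-decay shape (as in `transformers.get_linear_schedule_with_warmup`); the step counts are inferred from the training log below (6 steps per epoch × 50 epochs = 300 total steps, so warmup_ratio 0.1 gives 30 warmup steps):

```python
# Sketch of a linear schedule with warmup under the hyperparameters above.
# total_steps and warmup_steps are inferred, not stated in the card.
def lr_at_step(step, base_lr=1e-05, total_steps=300, warmup_ratio=0.1):
    warmup_steps = int(total_steps * warmup_ratio)  # 30 steps
    if step < warmup_steps:
        # linear warmup from 0 up to base_lr
        return base_lr * step / warmup_steps
    # linear decay from base_lr down to 0 at the final step
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(lr_at_step(30))   # peak learning rate: 1e-05
print(lr_at_step(300))  # end of training: 0.0
```

The learning rate peaks at 1e-05 at step 30 and reaches zero at step 300, which matches the very slow movement of the validation loss in the table below.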

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 6    | 1.6073          | 0.2667   |
| 1.576         | 2.0   | 12   | 1.6069          | 0.2667   |
| 1.576         | 3.0   | 18   | 1.6064          | 0.2667   |
| 1.541         | 4.0   | 24   | 1.6060          | 0.2667   |
| 1.5934        | 5.0   | 30   | 1.6056          | 0.2667   |
| 1.5934        | 6.0   | 36   | 1.6052          | 0.2667   |
| 1.5677        | 7.0   | 42   | 1.6048          | 0.2667   |
| 1.5677        | 8.0   | 48   | 1.6044          | 0.2667   |
| 1.5641        | 9.0   | 54   | 1.6041          | 0.2667   |
| 1.5398        | 10.0  | 60   | 1.6037          | 0.2667   |
| 1.5398        | 11.0  | 66   | 1.6034          | 0.2667   |
| 1.568         | 12.0  | 72   | 1.6030          | 0.2667   |
| 1.568         | 13.0  | 78   | 1.6027          | 0.2667   |
| 1.5479        | 14.0  | 84   | 1.6024          | 0.2667   |
| 1.5637        | 15.0  | 90   | 1.6021          | 0.2667   |
| 1.5637        | 16.0  | 96   | 1.6019          | 0.2667   |
| 1.5778        | 17.0  | 102  | 1.6016          | 0.2667   |
| 1.5778        | 18.0  | 108  | 1.6013          | 0.2667   |
| 1.5608        | 19.0  | 114  | 1.6011          | 0.2667   |
| 1.5898        | 20.0  | 120  | 1.6009          | 0.2667   |
| 1.5898        | 21.0  | 126  | 1.6006          | 0.2667   |
| 1.5472        | 22.0  | 132  | 1.6004          | 0.2667   |
| 1.5472        | 23.0  | 138  | 1.6002          | 0.2667   |
| 1.5773        | 24.0  | 144  | 1.6000          | 0.2667   |
| 1.5601        | 25.0  | 150  | 1.5998          | 0.2667   |
| 1.5601        | 26.0  | 156  | 1.5997          | 0.2667   |
| 1.5627        | 27.0  | 162  | 1.5995          | 0.2667   |
| 1.5627        | 28.0  | 168  | 1.5994          | 0.2667   |
| 1.5472        | 29.0  | 174  | 1.5993          | 0.2667   |
| 1.5831        | 30.0  | 180  | 1.5991          | 0.2667   |
| 1.5831        | 31.0  | 186  | 1.5990          | 0.2667   |
| 1.5527        | 32.0  | 192  | 1.5989          | 0.2667   |
| 1.5527        | 33.0  | 198  | 1.5988          | 0.2667   |
| 1.535         | 34.0  | 204  | 1.5988          | 0.2667   |
| 1.5751        | 35.0  | 210  | 1.5987          | 0.2667   |
| 1.5751        | 36.0  | 216  | 1.5986          | 0.2667   |
| 1.5377        | 37.0  | 222  | 1.5986          | 0.2667   |
| 1.5377        | 38.0  | 228  | 1.5985          | 0.2667   |
| 1.5661        | 39.0  | 234  | 1.5985          | 0.2667   |
| 1.5536        | 40.0  | 240  | 1.5985          | 0.2667   |
| 1.5536        | 41.0  | 246  | 1.5985          | 0.2667   |
| 1.5592        | 42.0  | 252  | 1.5985          | 0.2667   |
| 1.5592        | 43.0  | 258  | 1.5985          | 0.2667   |
| 1.5823        | 44.0  | 264  | 1.5985          | 0.2667   |
| 1.5597        | 45.0  | 270  | 1.5985          | 0.2667   |
| 1.5597        | 46.0  | 276  | 1.5985          | 0.2667   |
| 1.5427        | 47.0  | 282  | 1.5985          | 0.2667   |
| 1.5427        | 48.0  | 288  | 1.5985          | 0.2667   |
| 1.5619        | 49.0  | 294  | 1.5985          | 0.2667   |
| 1.5725        | 50.0  | 300  | 1.5985          | 0.2667   |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0
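
The environment above can be reproduced with pinned installs (a sketch; the cu118 build of PyTorch is normally pulled from the dedicated PyTorch wheel index rather than PyPI):

```shell
# Versions taken from the list above.
pip install transformers==4.35.2 datasets==2.15.0 tokenizers==0.15.0
# CUDA 11.8 build of PyTorch 2.1.0 from the standard PyTorch wheel index.
pip install torch==2.1.0 --index-url https://download.pytorch.org/whl/cu118
```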
Model size

  • 85.8M params (F32, Safetensors)
