
hushem_5x_beit_base_sgd_00001_fold5

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.6137
  • Accuracy: 0.2439
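
The checkpoint can be loaded with the standard `transformers` image-classification pipeline. A minimal usage sketch, assuming the model is published on the Hub under a placeholder repo id (replace it with the actual path of this checkpoint):

```python
from transformers import pipeline
from PIL import Image

# Placeholder repo id; substitute the actual Hub path of this fine-tuned checkpoint.
classifier = pipeline(
    "image-classification",
    model="your-username/hushem_5x_beit_base_sgd_00001_fold5",
)

image = Image.open("example.jpg")  # any input image from the target domain
predictions = classifier(image)
print(predictions)  # list of {"label": ..., "score": ...} dicts, highest score first
```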

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
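
As a rough illustration, these values map onto a `transformers` `TrainingArguments` configuration along the following lines. This is a sketch under assumptions: the output directory and evaluation strategy are not stated in the card, and the Adam settings listed above match the Trainer's default optimizer configuration.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hushem_5x_beit_base_sgd_00001_fold5",  # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumed: the card reports one validation result per epoch
)
# The optimizer values listed above (betas=(0.9, 0.999), epsilon=1e-08) correspond to the
# Trainer's default Adam-style optimizer, so no explicit optimizer override is shown here.
```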

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.5748 | 1.0 | 28 | 1.6349 | 0.2439 |
| 1.5498 | 2.0 | 56 | 1.6339 | 0.2439 |
| 1.5458 | 3.0 | 84 | 1.6329 | 0.2439 |
| 1.5997 | 4.0 | 112 | 1.6319 | 0.2439 |
| 1.5518 | 5.0 | 140 | 1.6310 | 0.2439 |
| 1.6078 | 6.0 | 168 | 1.6301 | 0.2439 |
| 1.6054 | 7.0 | 196 | 1.6292 | 0.2439 |
| 1.5635 | 8.0 | 224 | 1.6284 | 0.2439 |
| 1.5412 | 9.0 | 252 | 1.6276 | 0.2439 |
| 1.5684 | 10.0 | 280 | 1.6268 | 0.2439 |
| 1.5211 | 11.0 | 308 | 1.6261 | 0.2439 |
| 1.5857 | 12.0 | 336 | 1.6254 | 0.2439 |
| 1.5804 | 13.0 | 364 | 1.6248 | 0.2439 |
| 1.5778 | 14.0 | 392 | 1.6241 | 0.2439 |
| 1.5905 | 15.0 | 420 | 1.6235 | 0.2439 |
| 1.5552 | 16.0 | 448 | 1.6228 | 0.2439 |
| 1.5712 | 17.0 | 476 | 1.6222 | 0.2439 |
| 1.5113 | 18.0 | 504 | 1.6216 | 0.2439 |
| 1.5441 | 19.0 | 532 | 1.6210 | 0.2439 |
| 1.547 | 20.0 | 560 | 1.6205 | 0.2439 |
| 1.5712 | 21.0 | 588 | 1.6200 | 0.2439 |
| 1.595 | 22.0 | 616 | 1.6195 | 0.2439 |
| 1.6001 | 23.0 | 644 | 1.6190 | 0.2439 |
| 1.6008 | 24.0 | 672 | 1.6185 | 0.2439 |
| 1.5469 | 25.0 | 700 | 1.6181 | 0.2439 |
| 1.567 | 26.0 | 728 | 1.6177 | 0.2439 |
| 1.618 | 27.0 | 756 | 1.6173 | 0.2439 |
| 1.4849 | 28.0 | 784 | 1.6170 | 0.2439 |
| 1.5706 | 29.0 | 812 | 1.6166 | 0.2439 |
| 1.5269 | 30.0 | 840 | 1.6163 | 0.2439 |
| 1.588 | 31.0 | 868 | 1.6160 | 0.2439 |
| 1.5207 | 32.0 | 896 | 1.6157 | 0.2439 |
| 1.5395 | 33.0 | 924 | 1.6155 | 0.2439 |
| 1.5482 | 34.0 | 952 | 1.6152 | 0.2439 |
| 1.6004 | 35.0 | 980 | 1.6150 | 0.2439 |
| 1.5389 | 36.0 | 1008 | 1.6148 | 0.2439 |
| 1.5566 | 37.0 | 1036 | 1.6146 | 0.2439 |
| 1.54 | 38.0 | 1064 | 1.6145 | 0.2439 |
| 1.5715 | 39.0 | 1092 | 1.6143 | 0.2439 |
| 1.5148 | 40.0 | 1120 | 1.6142 | 0.2439 |
| 1.5688 | 41.0 | 1148 | 1.6141 | 0.2439 |
| 1.5803 | 42.0 | 1176 | 1.6140 | 0.2439 |
| 1.5477 | 43.0 | 1204 | 1.6139 | 0.2439 |
| 1.5623 | 44.0 | 1232 | 1.6138 | 0.2439 |
| 1.5648 | 45.0 | 1260 | 1.6137 | 0.2439 |
| 1.5331 | 46.0 | 1288 | 1.6137 | 0.2439 |
| 1.5791 | 47.0 | 1316 | 1.6137 | 0.2439 |
| 1.5282 | 48.0 | 1344 | 1.6137 | 0.2439 |
| 1.5715 | 49.0 | 1372 | 1.6137 | 0.2439 |
| 1.5955 | 50.0 | 1400 | 1.6137 | 0.2439 |
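
The accuracy column is top-1 accuracy on the evaluation split. How the metric was wired is not shown in the card; a minimal `compute_metrics` sketch of the assumed setup for the `Trainer`:

```python
import numpy as np

def compute_metrics(eval_pred):
    # eval_pred is a (logits, labels) pair produced by the Trainer's evaluation loop.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return {"accuracy": float((predictions == labels).mean())}
```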

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0
