
hushem_1x_beit_base_sgd_001_fold1

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.3898
  • Accuracy: 0.4222
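
A minimal inference sketch, assuming the checkpoint is available on the Hugging Face Hub or locally; the repo id below is a placeholder and the image filename is only an example:

```python
from transformers import pipeline

# Placeholder repo id -- substitute the actual Hub path or a local checkpoint directory.
classifier = pipeline(
    "image-classification",
    model="your-username/hushem_1x_beit_base_sgd_001_fold1",
)

# Run inference on a single image; returns the top predicted labels with scores.
predictions = classifier("example.jpg")
print(predictions)
```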

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
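
A minimal sketch of how these settings map onto `transformers.TrainingArguments`; this is an assumption about the training setup (including per-epoch evaluation), not the exact script used:

```python
from transformers import TrainingArguments

# Hyperparameters as reported above; the output directory name is a placeholder.
training_args = TrainingArguments(
    output_dir="hushem_1x_beit_base_sgd_001_fold1",
    learning_rate=0.001,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: evaluation once per epoch, matching the results table
)
```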

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 6    | 1.5634          | 0.2889   |
| 1.5431        | 2.0   | 12   | 1.5296          | 0.2667   |
| 1.5431        | 3.0   | 18   | 1.5055          | 0.2889   |
| 1.4519        | 4.0   | 24   | 1.4860          | 0.2889   |
| 1.4616        | 5.0   | 30   | 1.4682          | 0.3111   |
| 1.4616        | 6.0   | 36   | 1.4542          | 0.3333   |
| 1.4129        | 7.0   | 42   | 1.4431          | 0.3333   |
| 1.4129        | 8.0   | 48   | 1.4346          | 0.3333   |
| 1.3729        | 9.0   | 54   | 1.4281          | 0.3333   |
| 1.3344        | 10.0  | 60   | 1.4232          | 0.3333   |
| 1.3344        | 11.0  | 66   | 1.4155          | 0.3333   |
| 1.3342        | 12.0  | 72   | 1.4113          | 0.3556   |
| 1.3342        | 13.0  | 78   | 1.4084          | 0.3556   |
| 1.3011        | 14.0  | 84   | 1.4054          | 0.3556   |
| 1.2926        | 15.0  | 90   | 1.4035          | 0.3556   |
| 1.2926        | 16.0  | 96   | 1.4007          | 0.3556   |
| 1.2856        | 17.0  | 102  | 1.3978          | 0.3778   |
| 1.2856        | 18.0  | 108  | 1.3952          | 0.3778   |
| 1.2766        | 19.0  | 114  | 1.3942          | 0.3778   |
| 1.2702        | 20.0  | 120  | 1.3946          | 0.3778   |
| 1.2702        | 21.0  | 126  | 1.3932          | 0.4222   |
| 1.2243        | 22.0  | 132  | 1.3923          | 0.4222   |
| 1.2243        | 23.0  | 138  | 1.3923          | 0.4      |
| 1.2345        | 24.0  | 144  | 1.3921          | 0.3778   |
| 1.2029        | 25.0  | 150  | 1.3905          | 0.3778   |
| 1.2029        | 26.0  | 156  | 1.3901          | 0.3778   |
| 1.2078        | 27.0  | 162  | 1.3891          | 0.4      |
| 1.2078        | 28.0  | 168  | 1.3898          | 0.4      |
| 1.189         | 29.0  | 174  | 1.3895          | 0.4      |
| 1.1978        | 30.0  | 180  | 1.3896          | 0.4      |
| 1.1978        | 31.0  | 186  | 1.3902          | 0.4      |
| 1.1767        | 32.0  | 192  | 1.3905          | 0.4222   |
| 1.1767        | 33.0  | 198  | 1.3902          | 0.4222   |
| 1.1816        | 34.0  | 204  | 1.3893          | 0.4222   |
| 1.1968        | 35.0  | 210  | 1.3893          | 0.4222   |
| 1.1968        | 36.0  | 216  | 1.3899          | 0.4222   |
| 1.1681        | 37.0  | 222  | 1.3900          | 0.4222   |
| 1.1681        | 38.0  | 228  | 1.3901          | 0.4222   |
| 1.1768        | 39.0  | 234  | 1.3901          | 0.4222   |
| 1.1628        | 40.0  | 240  | 1.3901          | 0.4222   |
| 1.1628        | 41.0  | 246  | 1.3899          | 0.4222   |
| 1.1656        | 42.0  | 252  | 1.3898          | 0.4222   |
| 1.1656        | 43.0  | 258  | 1.3898          | 0.4222   |
| 1.1713        | 44.0  | 264  | 1.3898          | 0.4222   |
| 1.1727        | 45.0  | 270  | 1.3898          | 0.4222   |
| 1.1727        | 46.0  | 276  | 1.3898          | 0.4222   |
| 1.1615        | 47.0  | 282  | 1.3898          | 0.4222   |
| 1.1615        | 48.0  | 288  | 1.3898          | 0.4222   |
| 1.1754        | 49.0  | 294  | 1.3898          | 0.4222   |
| 1.1668        | 50.0  | 300  | 1.3898          | 0.4222   |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0