hushem_1x_beit_base_rms_00001_fold2

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9358
  • Accuracy: 0.8444
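
As a minimal usage sketch, the checkpoint can be loaded with the standard Transformers image-classification pipeline. The repository path below is a placeholder (this card does not state the Hub namespace), and any image from the target domain can be passed in:

```python
# Minimal inference sketch, assuming the checkpoint is published on the Hub.
# "your-username/..." is a placeholder path, not the actual repository id.
from transformers import pipeline
from PIL import Image

classifier = pipeline(
    "image-classification",
    model="your-username/hushem_1x_beit_base_rms_00001_fold2",  # hypothetical repo id
)

image = Image.open("example.jpg")  # any image from the evaluation domain
print(classifier(image))           # list of {"label": ..., "score": ...} dicts
```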

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an illustrative TrainingArguments sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
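
These values map directly onto Hugging Face TrainingArguments. The sketch below only mirrors the listed hyperparameters; the output directory and the per-epoch evaluation strategy are assumptions, since the original training script is not part of this card, and the Adam betas/epsilon above are the library defaults.

```python
# Sketch of TrainingArguments mirroring the hyperparameters listed above.
# output_dir and evaluation_strategy are assumptions, not recorded in the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hushem_1x_beit_base_rms_00001_fold2",  # assumed output path
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumed: the results table reports one evaluation per epoch
)
```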

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 6    | 1.2856          | 0.4889   |
| 1.4398        | 2.0   | 12   | 0.9696          | 0.6222   |
| 1.4398        | 3.0   | 18   | 0.7405          | 0.7111   |
| 0.463         | 4.0   | 24   | 0.8561          | 0.7333   |
| 0.1243        | 5.0   | 30   | 0.6572          | 0.8222   |
| 0.1243        | 6.0   | 36   | 0.6983          | 0.8444   |
| 0.0205        | 7.0   | 42   | 0.7294          | 0.8222   |
| 0.0205        | 8.0   | 48   | 0.6504          | 0.8      |
| 0.0064        | 9.0   | 54   | 0.6828          | 0.8222   |
| 0.0142        | 10.0  | 60   | 0.6539          | 0.8222   |
| 0.0142        | 11.0  | 66   | 0.7615          | 0.8444   |
| 0.0032        | 12.0  | 72   | 0.8146          | 0.8444   |
| 0.0032        | 13.0  | 78   | 0.8154          | 0.8444   |
| 0.0019        | 14.0  | 84   | 0.7947          | 0.8444   |
| 0.0028        | 15.0  | 90   | 0.7939          | 0.8444   |
| 0.0028        | 16.0  | 96   | 0.8240          | 0.8444   |
| 0.0013        | 17.0  | 102  | 0.8242          | 0.8222   |
| 0.0013        | 18.0  | 108  | 0.8443          | 0.8444   |
| 0.0014        | 19.0  | 114  | 0.8393          | 0.8444   |
| 0.0012        | 20.0  | 120  | 0.9165          | 0.8222   |
| 0.0012        | 21.0  | 126  | 0.8985          | 0.8222   |
| 0.0008        | 22.0  | 132  | 0.9053          | 0.8222   |
| 0.0008        | 23.0  | 138  | 0.9182          | 0.8222   |
| 0.0007        | 24.0  | 144  | 0.9131          | 0.8222   |
| 0.0007        | 25.0  | 150  | 0.9205          | 0.8222   |
| 0.0007        | 26.0  | 156  | 0.9165          | 0.8222   |
| 0.0004        | 27.0  | 162  | 0.9119          | 0.8222   |
| 0.0004        | 28.0  | 168  | 0.9185          | 0.8222   |
| 0.0005        | 29.0  | 174  | 0.9203          | 0.8222   |
| 0.0004        | 30.0  | 180  | 0.9232          | 0.8222   |
| 0.0004        | 31.0  | 186  | 0.9207          | 0.8444   |
| 0.0009        | 32.0  | 192  | 0.9256          | 0.8444   |
| 0.0009        | 33.0  | 198  | 0.9230          | 0.8444   |
| 0.0082        | 34.0  | 204  | 0.9200          | 0.8444   |
| 0.0007        | 35.0  | 210  | 0.9385          | 0.8444   |
| 0.0007        | 36.0  | 216  | 0.9350          | 0.8444   |
| 0.0005        | 37.0  | 222  | 0.9367          | 0.8444   |
| 0.0005        | 38.0  | 228  | 0.9290          | 0.8444   |
| 0.0044        | 39.0  | 234  | 0.9294          | 0.8444   |
| 0.0005        | 40.0  | 240  | 0.9330          | 0.8444   |
| 0.0005        | 41.0  | 246  | 0.9359          | 0.8444   |
| 0.0006        | 42.0  | 252  | 0.9358          | 0.8444   |
| 0.0006        | 43.0  | 258  | 0.9358          | 0.8444   |
| 0.0005        | 44.0  | 264  | 0.9358          | 0.8444   |
| 0.0007        | 45.0  | 270  | 0.9358          | 0.8444   |
| 0.0007        | 46.0  | 276  | 0.9358          | 0.8444   |
| 0.0007        | 47.0  | 282  | 0.9358          | 0.8444   |
| 0.0006        | 48.0  | 288  | 0.9358          | 0.8444   |
| 0.0006        | 49.0  | 294  | 0.9358          | 0.8444   |
| 0.0004        | 50.0  | 300  | 0.9358          | 0.8444   |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0
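
A quick way to check that a local environment matches these versions (a small sketch; the expected strings in the comments come from the list above):

```python
# Environment check against the framework versions listed above.
import datasets
import tokenizers
import torch
import transformers

print(transformers.__version__)  # expected: 4.35.2
print(torch.__version__)         # expected: 2.1.0+cu118
print(datasets.__version__)      # expected: 2.15.0
print(tokenizers.__version__)    # expected: 0.15.0
```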