
hushem_5x_beit_base_adamax_001_fold3

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 2.7065
  • Accuracy: 0.6977
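
The snippet below is a minimal inference sketch, not part of the original card. It assumes the checkpoint is published on the Hub; the repo id is a placeholder to be replaced with the actual namespace and path, and the standard transformers image-classification API is used.

```python
# Hypothetical repo id -- substitute the real namespace/path of this checkpoint.
from transformers import AutoImageProcessor, AutoModelForImageClassification
from PIL import Image
import torch

model_id = "your-namespace/hushem_5x_beit_base_adamax_001_fold3"  # placeholder
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("example.jpg")  # replace with an image from your own data
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```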

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an illustrative TrainingArguments sketch follows the list):

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
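
The sketch below shows one way to express the hyperparameters above as transformers TrainingArguments. It is an illustration, not the original training script; the output directory and the per-epoch evaluation/save strategies are assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hushem_5x_beit_base_adamax_001_fold3",  # assumed output dir
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    # Default optimizer uses Adam-style betas=(0.9, 0.999) and epsilon=1e-08,
    # matching the values listed above.
    evaluation_strategy="epoch",  # assumption: metrics in the table are per epoch
    save_strategy="epoch",        # assumption
)
```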

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.4764 | 1.0 | 28 | 1.3308 | 0.4651 |
| 1.3399 | 2.0 | 56 | 1.2727 | 0.5116 |
| 1.2309 | 3.0 | 84 | 1.0539 | 0.5349 |
| 1.0517 | 4.0 | 112 | 1.0384 | 0.6279 |
| 0.9736 | 5.0 | 140 | 1.1313 | 0.5349 |
| 1.0108 | 6.0 | 168 | 0.6569 | 0.8140 |
| 0.9287 | 7.0 | 196 | 0.7779 | 0.7674 |
| 0.9063 | 8.0 | 224 | 0.8802 | 0.5581 |
| 0.7994 | 9.0 | 252 | 1.1244 | 0.5581 |
| 0.8319 | 10.0 | 280 | 0.7284 | 0.7209 |
| 0.8096 | 11.0 | 308 | 0.7775 | 0.7209 |
| 0.8274 | 12.0 | 336 | 0.7683 | 0.6744 |
| 0.798 | 13.0 | 364 | 0.8219 | 0.6512 |
| 0.6756 | 14.0 | 392 | 0.5656 | 0.7442 |
| 0.9098 | 15.0 | 420 | 0.6922 | 0.6279 |
| 0.6261 | 16.0 | 448 | 1.0949 | 0.5814 |
| 0.6243 | 17.0 | 476 | 0.7154 | 0.7209 |
| 0.7247 | 18.0 | 504 | 0.6429 | 0.7674 |
| 0.6106 | 19.0 | 532 | 0.7927 | 0.6744 |
| 0.4831 | 20.0 | 560 | 0.6060 | 0.7674 |
| 0.5359 | 21.0 | 588 | 1.2593 | 0.5349 |
| 0.429 | 22.0 | 616 | 1.0232 | 0.6744 |
| 0.4536 | 23.0 | 644 | 1.2564 | 0.6744 |
| 0.2969 | 24.0 | 672 | 1.2153 | 0.6279 |
| 0.3018 | 25.0 | 700 | 1.3650 | 0.5814 |
| 0.2695 | 26.0 | 728 | 1.6759 | 0.6279 |
| 0.2235 | 27.0 | 756 | 1.8158 | 0.5814 |
| 0.2674 | 28.0 | 784 | 1.7475 | 0.6977 |
| 0.1711 | 29.0 | 812 | 1.5630 | 0.7209 |
| 0.1241 | 30.0 | 840 | 1.5976 | 0.7442 |
| 0.1378 | 31.0 | 868 | 1.8498 | 0.7209 |
| 0.1016 | 32.0 | 896 | 2.3022 | 0.6279 |
| 0.1245 | 33.0 | 924 | 2.0178 | 0.6047 |
| 0.1029 | 34.0 | 952 | 2.0725 | 0.6744 |
| 0.0329 | 35.0 | 980 | 1.6046 | 0.7674 |
| 0.1038 | 36.0 | 1008 | 2.3364 | 0.6047 |
| 0.055 | 37.0 | 1036 | 3.1044 | 0.5581 |
| 0.0031 | 38.0 | 1064 | 2.6896 | 0.6512 |
| 0.0537 | 39.0 | 1092 | 3.2350 | 0.6047 |
| 0.0484 | 40.0 | 1120 | 3.5002 | 0.5814 |
| 0.0311 | 41.0 | 1148 | 3.0948 | 0.6512 |
| 0.0491 | 42.0 | 1176 | 2.8268 | 0.6977 |
| 0.0023 | 43.0 | 1204 | 2.5306 | 0.6977 |
| 0.0192 | 44.0 | 1232 | 2.3977 | 0.6977 |
| 0.0339 | 45.0 | 1260 | 2.5488 | 0.6977 |
| 0.0369 | 46.0 | 1288 | 2.5878 | 0.7209 |
| 0.049 | 47.0 | 1316 | 2.7159 | 0.6977 |
| 0.0044 | 48.0 | 1344 | 2.7074 | 0.6977 |
| 0.0183 | 49.0 | 1372 | 2.7065 | 0.6977 |
| 0.0409 | 50.0 | 1400 | 2.7065 | 0.6977 |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0
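
The check below is a small sketch, added here, for verifying that a target environment matches the versions listed above; it assumes the same four libraries are installed.

```python
# Compare installed library versions against those reported in this card.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.35.2",
    "torch": "2.1.0+cu118",
    "datasets": "2.15.0",
    "tokenizers": "0.15.0",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, version in expected.items():
    status = "OK" if installed[name] == version else f"differs ({installed[name]})"
    print(f"{name}: expected {version}, {status}")
```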