
smids_1x_beit_base_adamax_001_fold4

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.7646
  • Accuracy: 0.775

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
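
The training script itself is not published, so the following is only a sketch of how the listed hyperparameters would map onto transformers.TrainingArguments; the output_dir and the per-epoch evaluation strategy are assumptions (the latter inferred from the per-epoch rows in the results table below). The listed betas and epsilon match the Trainer's optimizer defaults, so no explicit optimizer arguments are needed here.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the listed hyperparameters; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="smids_1x_beit_base_adamax_001_fold4",
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumed: the card reports validation metrics each epoch
)
```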

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.9222 | 1.0 | 75 | 0.8216 | 0.5567 |
| 0.8457 | 2.0 | 150 | 0.8398 | 0.57 |
| 0.8147 | 3.0 | 225 | 0.7493 | 0.6333 |
| 0.7701 | 4.0 | 300 | 0.7606 | 0.6117 |
| 0.8026 | 5.0 | 375 | 0.8189 | 0.565 |
| 0.6963 | 6.0 | 450 | 0.6808 | 0.665 |
| 0.7638 | 7.0 | 525 | 0.6641 | 0.7017 |
| 0.6601 | 8.0 | 600 | 0.6495 | 0.6833 |
| 0.6719 | 9.0 | 675 | 0.7134 | 0.66 |
| 0.5461 | 10.0 | 750 | 0.5791 | 0.7483 |
| 0.547 | 11.0 | 825 | 0.5859 | 0.7633 |
| 0.4912 | 12.0 | 900 | 0.5937 | 0.735 |
| 0.5352 | 13.0 | 975 | 0.5233 | 0.7667 |
| 0.4434 | 14.0 | 1050 | 0.5543 | 0.7617 |
| 0.4927 | 15.0 | 1125 | 0.7581 | 0.6767 |
| 0.4312 | 16.0 | 1200 | 0.5587 | 0.7667 |
| 0.3899 | 17.0 | 1275 | 0.6422 | 0.7633 |
| 0.3786 | 18.0 | 1350 | 0.6068 | 0.7783 |
| 0.4006 | 19.0 | 1425 | 0.6778 | 0.7617 |
| 0.3094 | 20.0 | 1500 | 0.6494 | 0.775 |
| 0.3319 | 21.0 | 1575 | 0.6363 | 0.765 |
| 0.2928 | 22.0 | 1650 | 0.7276 | 0.7817 |
| 0.2846 | 23.0 | 1725 | 0.8156 | 0.7733 |
| 0.1736 | 24.0 | 1800 | 0.7838 | 0.785 |
| 0.2416 | 25.0 | 1875 | 0.8283 | 0.775 |
| 0.1805 | 26.0 | 1950 | 0.8042 | 0.7867 |
| 0.1895 | 27.0 | 2025 | 1.0411 | 0.7933 |
| 0.0832 | 28.0 | 2100 | 1.0766 | 0.7983 |
| 0.099 | 29.0 | 2175 | 1.1178 | 0.7683 |
| 0.0916 | 30.0 | 2250 | 1.3040 | 0.775 |
| 0.128 | 31.0 | 2325 | 1.2237 | 0.7983 |
| 0.0775 | 32.0 | 2400 | 1.1999 | 0.79 |
| 0.0706 | 33.0 | 2475 | 1.4034 | 0.78 |
| 0.0546 | 34.0 | 2550 | 1.4009 | 0.785 |
| 0.0453 | 35.0 | 2625 | 1.2357 | 0.7917 |
| 0.0136 | 36.0 | 2700 | 1.4685 | 0.79 |
| 0.0534 | 37.0 | 2775 | 1.8215 | 0.7717 |
| 0.0751 | 38.0 | 2850 | 1.6150 | 0.7833 |
| 0.0013 | 39.0 | 2925 | 1.7207 | 0.7917 |
| 0.0466 | 40.0 | 3000 | 1.4737 | 0.785 |
| 0.0122 | 41.0 | 3075 | 1.5635 | 0.7783 |
| 0.0071 | 42.0 | 3150 | 1.6935 | 0.7783 |
| 0.0119 | 43.0 | 3225 | 1.6935 | 0.7833 |
| 0.0065 | 44.0 | 3300 | 1.7015 | 0.7883 |
| 0.0254 | 45.0 | 3375 | 1.7329 | 0.7867 |
| 0.0205 | 46.0 | 3450 | 1.6886 | 0.785 |
| 0.0082 | 47.0 | 3525 | 1.7094 | 0.7833 |
| 0.0134 | 48.0 | 3600 | 1.7793 | 0.78 |
| 0.005 | 49.0 | 3675 | 1.7866 | 0.7767 |
| 0.0132 | 50.0 | 3750 | 1.7646 | 0.775 |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0