
smids_1x_beit_base_adamax_001_fold5

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 2.1833
  • Accuracy: 0.7633
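
A minimal inference sketch is shown below. It assumes the fine-tuned checkpoint is available under the model name used here (the exact Hub repo id or local path may differ) and that a local image file exists at the placeholder path; both are assumptions, not part of the original card.

```python
# Minimal inference sketch (hypothetical model id/path and image path; adjust to your setup).
from transformers import pipeline
from PIL import Image

classifier = pipeline(
    "image-classification",
    model="smids_1x_beit_base_adamax_001_fold5",  # assumed local path or Hub repo id
)

image = Image.open("example.png")  # placeholder image path
print(classifier(image))  # list of {"label": ..., "score": ...} predictions
```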

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch reproducing them follows the list):

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
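
For reference, the hyperparameters above map onto the Hugging Face TrainingArguments as sketched below. The output_dir is a hypothetical placeholder, and the optimizer line is left at the Trainer default (Adam with betas=(0.9, 0.999), epsilon=1e-08, as reported above); this is a sketch of the configuration, not the original training script.

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="smids_1x_beit_base_adamax_001_fold5",  # hypothetical output directory
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
)
```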

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.9804        | 1.0   | 75   | 0.8561          | 0.5383   |
| 0.8823        | 2.0   | 150  | 0.7905          | 0.5767   |
| 0.8002        | 3.0   | 225  | 0.7961          | 0.5633   |
| 0.8142        | 4.0   | 300  | 0.8679          | 0.6133   |
| 0.6765        | 5.0   | 375  | 0.6964          | 0.6817   |
| 0.652         | 6.0   | 450  | 0.6686          | 0.7      |
| 0.6785        | 7.0   | 525  | 0.6625          | 0.7067   |
| 0.5659        | 8.0   | 600  | 0.6154          | 0.7217   |
| 0.6383        | 9.0   | 675  | 0.6262          | 0.7117   |
| 0.5991        | 10.0  | 750  | 0.5856          | 0.7633   |
| 0.4627        | 11.0  | 825  | 0.5901          | 0.7633   |
| 0.5021        | 12.0  | 900  | 0.5968          | 0.7433   |
| 0.5421        | 13.0  | 975  | 0.5857          | 0.74     |
| 0.3951        | 14.0  | 1050 | 0.5723          | 0.7733   |
| 0.4943        | 15.0  | 1125 | 0.6046          | 0.7533   |
| 0.4076        | 16.0  | 1200 | 0.6196          | 0.7567   |
| 0.379         | 17.0  | 1275 | 0.5906          | 0.7817   |
| 0.3759        | 18.0  | 1350 | 0.5998          | 0.775    |
| 0.3383        | 19.0  | 1425 | 0.6508          | 0.7567   |
| 0.2622        | 20.0  | 1500 | 0.6675          | 0.775    |
| 0.316         | 21.0  | 1575 | 0.7118          | 0.785    |
| 0.2478        | 22.0  | 1650 | 0.7508          | 0.78     |
| 0.2696        | 23.0  | 1725 | 0.7052          | 0.7733   |
| 0.1441        | 24.0  | 1800 | 0.8658          | 0.7783   |
| 0.1966        | 25.0  | 1875 | 0.9393          | 0.7417   |
| 0.1228        | 26.0  | 1950 | 1.0783          | 0.7567   |
| 0.2151        | 27.0  | 2025 | 1.0051          | 0.7533   |
| 0.1799        | 28.0  | 2100 | 1.0898          | 0.755    |
| 0.1053        | 29.0  | 2175 | 1.0567          | 0.7533   |
| 0.122         | 30.0  | 2250 | 1.1544          | 0.7583   |
| 0.1375        | 31.0  | 2325 | 1.3014          | 0.7617   |
| 0.0659        | 32.0  | 2400 | 1.6359          | 0.765    |
| 0.0997        | 33.0  | 2475 | 1.4213          | 0.7717   |
| 0.0852        | 34.0  | 2550 | 1.6657          | 0.7467   |
| 0.0752        | 35.0  | 2625 | 1.5943          | 0.7733   |
| 0.0405        | 36.0  | 2700 | 1.5865          | 0.7583   |
| 0.0174        | 37.0  | 2775 | 1.8002          | 0.7533   |
| 0.0364        | 38.0  | 2850 | 1.6078          | 0.7583   |
| 0.0269        | 39.0  | 2925 | 2.0543          | 0.7667   |
| 0.0034        | 40.0  | 3000 | 2.1698          | 0.7517   |
| 0.0428        | 41.0  | 3075 | 1.8011          | 0.74     |
| 0.0355        | 42.0  | 3150 | 2.1588          | 0.7567   |
| 0.0068        | 43.0  | 3225 | 2.0789          | 0.7617   |
| 0.013         | 44.0  | 3300 | 2.0235          | 0.76     |
| 0.0102        | 45.0  | 3375 | 1.9567          | 0.7567   |
| 0.0216        | 46.0  | 3450 | 1.9788          | 0.765    |
| 0.0016        | 47.0  | 3525 | 2.1056          | 0.765    |
| 0.0046        | 48.0  | 3600 | 2.1156          | 0.7633   |
| 0.0115        | 49.0  | 3675 | 2.2014          | 0.7617   |
| 0.0156        | 50.0  | 3750 | 2.1833          | 0.7633   |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0
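
To reproduce results with the versions listed above, a quick environment check (a convenience sketch, not part of the original training code) could look like this:

```python
# Verify installed versions against those listed in this card.
import transformers, torch, datasets, tokenizers

print("Transformers:", transformers.__version__)  # expect 4.35.2
print("PyTorch:", torch.__version__)              # expect 2.1.0+cu118
print("Datasets:", datasets.__version__)          # expect 2.15.0
print("Tokenizers:", tokenizers.__version__)      # expect 0.15.0
```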