
smids_1x_beit_base_sgd_001_fold2

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3600
  • Accuracy: 0.8636

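The checkpoint can be loaded with the standard Transformers image-classification classes. The snippet below is a minimal usage sketch, not taken from the original training script; it uses the repository id hkivancoral/smids_1x_beit_base_sgd_001_fold2 and assumes a local image file named example.png.

```python
# Minimal inference sketch (input path is hypothetical; not from the original card).
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "hkivancoral/smids_1x_beit_base_sgd_001_fold2"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("example.png").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```
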
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after the list for how they map onto TrainingArguments):

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50

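As a rough illustration only, the sketch below expresses these values as transformers.TrainingArguments; the output directory and the per-epoch evaluation/save strategies are assumptions, not values reported in this card.

```python
# Sketch: the listed hyperparameters as TrainingArguments (output_dir and the
# evaluation/save strategies are assumptions; the rest mirrors the bullets above).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="smids_1x_beit_base_sgd_001_fold2",  # assumed output directory
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: matches the per-epoch validation table below
    save_strategy="epoch",        # assumption
)
```
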
Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.0974        | 1.0   | 75   | 1.0410          | 0.4309   |
| 1.0056        | 2.0   | 150  | 0.9317          | 0.5424   |
| 0.8916        | 3.0   | 225  | 0.8446          | 0.5890   |
| 0.8349        | 4.0   | 300  | 0.7646          | 0.6473   |
| 0.7414        | 5.0   | 375  | 0.6971          | 0.7022   |
| 0.6784        | 6.0   | 450  | 0.6337          | 0.7571   |
| 0.7121        | 7.0   | 525  | 0.5878          | 0.7754   |
| 0.6558        | 8.0   | 600  | 0.5609          | 0.7787   |
| 0.6317        | 9.0   | 675  | 0.5312          | 0.8087   |
| 0.6518        | 10.0  | 750  | 0.5083          | 0.8136   |
| 0.5234        | 11.0  | 825  | 0.4912          | 0.8203   |
| 0.5342        | 12.0  | 900  | 0.4745          | 0.8236   |
| 0.5263        | 13.0  | 975  | 0.4621          | 0.8186   |
| 0.4728        | 14.0  | 1050 | 0.4552          | 0.8220   |
| 0.4696        | 15.0  | 1125 | 0.4418          | 0.8336   |
| 0.4875        | 16.0  | 1200 | 0.4387          | 0.8286   |
| 0.4719        | 17.0  | 1275 | 0.4281          | 0.8303   |
| 0.4659        | 18.0  | 1350 | 0.4174          | 0.8386   |
| 0.4608        | 19.0  | 1425 | 0.4192          | 0.8319   |
| 0.4678        | 20.0  | 1500 | 0.4085          | 0.8486   |
| 0.4982        | 21.0  | 1575 | 0.4035          | 0.8519   |
| 0.4136        | 22.0  | 1650 | 0.3939          | 0.8552   |
| 0.4205        | 23.0  | 1725 | 0.3934          | 0.8502   |
| 0.45          | 24.0  | 1800 | 0.3901          | 0.8519   |
| 0.4234        | 25.0  | 1875 | 0.3886          | 0.8536   |
| 0.3928        | 26.0  | 1950 | 0.3881          | 0.8486   |
| 0.4665        | 27.0  | 2025 | 0.3799          | 0.8636   |
| 0.416         | 28.0  | 2100 | 0.3843          | 0.8519   |
| 0.386         | 29.0  | 2175 | 0.3779          | 0.8619   |
| 0.3668        | 30.0  | 2250 | 0.3747          | 0.8552   |
| 0.3858        | 31.0  | 2325 | 0.3781          | 0.8602   |
| 0.3907        | 32.0  | 2400 | 0.3740          | 0.8602   |
| 0.4156        | 33.0  | 2475 | 0.3701          | 0.8619   |
| 0.4094        | 34.0  | 2550 | 0.3679          | 0.8619   |
| 0.3888        | 35.0  | 2625 | 0.3683          | 0.8586   |
| 0.3956        | 36.0  | 2700 | 0.3659          | 0.8636   |
| 0.3691        | 37.0  | 2775 | 0.3660          | 0.8636   |
| 0.4229        | 38.0  | 2850 | 0.3645          | 0.8669   |
| 0.308         | 39.0  | 2925 | 0.3651          | 0.8636   |
| 0.382         | 40.0  | 3000 | 0.3644          | 0.8602   |
| 0.4135        | 41.0  | 3075 | 0.3618          | 0.8652   |
| 0.3791        | 42.0  | 3150 | 0.3629          | 0.8636   |
| 0.3729        | 43.0  | 3225 | 0.3622          | 0.8586   |
| 0.3719        | 44.0  | 3300 | 0.3628          | 0.8669   |
| 0.3571        | 45.0  | 3375 | 0.3604          | 0.8636   |
| 0.3721        | 46.0  | 3450 | 0.3598          | 0.8652   |
| 0.381         | 47.0  | 3525 | 0.3604          | 0.8636   |
| 0.3882        | 48.0  | 3600 | 0.3603          | 0.8636   |
| 0.3411        | 49.0  | 3675 | 0.3601          | 0.8636   |
| 0.3299        | 50.0  | 3750 | 0.3600          | 0.8636   |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0
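
A quick way to check that a local environment matches these versions (a convenience sketch, not part of the original card):

```python
# Print installed versions for comparison with the list above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # card reports 4.35.2
print("PyTorch:", torch.__version__)              # card reports 2.1.0+cu118
print("Datasets:", datasets.__version__)          # card reports 2.15.0
print("Tokenizers:", tokenizers.__version__)      # card reports 0.15.0
```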