
smids_5x_beit_base_sgd_00001_fold4

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1429
  • Accuracy: 0.3883
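
Since the card does not yet document usage, the snippet below is a minimal inference sketch. It assumes the checkpoint is available on the Hub under hkivancoral/smids_5x_beit_base_sgd_00001_fold4 and that a local image file (here the hypothetical example.jpg) exists; adjust both to your setup.

```python
# Minimal inference sketch for this fine-tuned BEiT image classifier.
# The repo id and input path are assumptions for illustration.
from transformers import AutoImageProcessor, AutoModelForImageClassification
from PIL import Image
import torch

model_id = "hkivancoral/smids_5x_beit_base_sgd_00001_fold4"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("example.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```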

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
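
For reference, the listed hyperparameters roughly correspond to the 🤗 Trainer configuration sketched below. This is not the original training script: the output directory, number of labels, dataset wiring, and evaluation/save strategies are assumptions added for illustration.

```python
# Sketch of a TrainingArguments setup matching the listed hyperparameters.
from transformers import (
    AutoModelForImageClassification,
    TrainingArguments,
    Trainer,
)

model = AutoModelForImageClassification.from_pretrained(
    "microsoft/beit-base-patch16-224",
    num_labels=3,                 # assumption: set to your number of classes
    ignore_mismatched_sizes=True,
)

training_args = TrainingArguments(
    output_dir="smids_5x_beit_base_sgd_00001_fold4",  # assumed output dir
    learning_rate=1e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,               # these three match the listed optimizer
    adam_beta2=0.999,             # settings and are also the library defaults
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: card reports per-epoch validation
    save_strategy="epoch",
)

# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_ds, eval_dataset=eval_ds, ...)
```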

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 1.2481        | 1.0   | 375   | 1.3387          | 0.3367   |
| 1.2853        | 2.0   | 750   | 1.3297          | 0.34     |
| 1.2245        | 3.0   | 1125  | 1.3209          | 0.34     |
| 1.2269        | 4.0   | 1500  | 1.3124          | 0.34     |
| 1.1881        | 5.0   | 1875  | 1.3044          | 0.34     |
| 1.1962        | 6.0   | 2250  | 1.2965          | 0.3417   |
| 1.1766        | 7.0   | 2625  | 1.2890          | 0.345    |
| 1.1422        | 8.0   | 3000  | 1.2816          | 0.3467   |
| 1.1265        | 9.0   | 3375  | 1.2745          | 0.3483   |
| 1.1532        | 10.0  | 3750  | 1.2676          | 0.3517   |
| 1.1456        | 11.0  | 4125  | 1.2609          | 0.3533   |
| 1.1221        | 12.0  | 4500  | 1.2545          | 0.355    |
| 1.1397        | 13.0  | 4875  | 1.2483          | 0.355    |
| 1.1323        | 14.0  | 5250  | 1.2422          | 0.3583   |
| 1.1113        | 15.0  | 5625  | 1.2362          | 0.3617   |
| 1.1197        | 16.0  | 6000  | 1.2307          | 0.3633   |
| 1.1175        | 17.0  | 6375  | 1.2251          | 0.3667   |
| 1.1137        | 18.0  | 6750  | 1.2199          | 0.3683   |
| 1.1317        | 19.0  | 7125  | 1.2149          | 0.3667   |
| 1.0985        | 20.0  | 7500  | 1.2099          | 0.37     |
| 1.1037        | 21.0  | 7875  | 1.2054          | 0.37     |
| 1.1051        | 22.0  | 8250  | 1.2008          | 0.3717   |
| 1.1012        | 23.0  | 8625  | 1.1965          | 0.3717   |
| 1.0418        | 24.0  | 9000  | 1.1925          | 0.375    |
| 1.0922        | 25.0  | 9375  | 1.1886          | 0.3767   |
| 1.0809        | 26.0  | 9750  | 1.1848          | 0.3767   |
| 1.096         | 27.0  | 10125 | 1.1812          | 0.3767   |
| 1.0328        | 28.0  | 10500 | 1.1778          | 0.375    |
| 1.0501        | 29.0  | 10875 | 1.1745          | 0.3767   |
| 1.065         | 30.0  | 11250 | 1.1714          | 0.3767   |
| 1.0717        | 31.0  | 11625 | 1.1685          | 0.3783   |
| 1.104         | 32.0  | 12000 | 1.1658          | 0.3767   |
| 1.0567        | 33.0  | 12375 | 1.1632          | 0.3783   |
| 1.0632        | 34.0  | 12750 | 1.1607          | 0.3783   |
| 1.0635        | 35.0  | 13125 | 1.1585          | 0.3783   |
| 1.0477        | 36.0  | 13500 | 1.1563          | 0.3833   |
| 1.0721        | 37.0  | 13875 | 1.1544          | 0.385    |
| 1.0594        | 38.0  | 14250 | 1.1526          | 0.385    |
| 1.0484        | 39.0  | 14625 | 1.1510          | 0.3867   |
| 1.0408        | 40.0  | 15000 | 1.1495          | 0.3883   |
| 1.0421        | 41.0  | 15375 | 1.1482          | 0.39     |
| 1.0561        | 42.0  | 15750 | 1.1470          | 0.3883   |
| 1.0338        | 43.0  | 16125 | 1.1460          | 0.3883   |
| 1.0224        | 44.0  | 16500 | 1.1451          | 0.3883   |
| 1.0269        | 45.0  | 16875 | 1.1444          | 0.3883   |
| 1.0608        | 46.0  | 17250 | 1.1438          | 0.3883   |
| 1.0652        | 47.0  | 17625 | 1.1434          | 0.3883   |
| 1.0189        | 48.0  | 18000 | 1.1431          | 0.3883   |
| 1.0225        | 49.0  | 18375 | 1.1429          | 0.3883   |
| 1.0356        | 50.0  | 18750 | 1.1429          | 0.3883   |

Framework versions

  • Transformers 4.32.1
  • Pytorch 2.1.0+cu121
  • Datasets 2.12.0
  • Tokenizers 0.13.2
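
To check that your environment matches the versions listed above, a quick sanity-check snippet (not part of the original training code):

```python
# Print installed versions to compare against the Framework versions list.
import transformers, torch, datasets, tokenizers

print("Transformers:", transformers.__version__)
print("PyTorch:", torch.__version__)
print("Datasets:", datasets.__version__)
print("Tokenizers:", tokenizers.__version__)
```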