
smids_5x_beit_base_adamax_001_fold5

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6041
  • Accuracy: 0.7617
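
For quick testing, the checkpoint can be loaded with the transformers image-classification pipeline. The snippet below is a minimal sketch; the image path is a placeholder to replace with a file from your own data.

```python
from transformers import pipeline
from PIL import Image

# Load the fine-tuned checkpoint from the Hugging Face Hub.
classifier = pipeline(
    "image-classification",
    model="hkivancoral/smids_5x_beit_base_adamax_001_fold5",
)

# "example.png" is a placeholder path; use any image from your dataset.
image = Image.open("example.png")
print(classifier(image))
```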

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
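
For reference, the sketch below shows how these settings map onto transformers TrainingArguments. The output directory and the per-epoch evaluation strategy are assumptions; dataset loading, model creation, and the Trainer call are omitted.

```python
from transformers import TrainingArguments

# A sketch mirroring the hyperparameters listed above (Transformers 4.32.x API).
# output_dir is a placeholder and evaluation_strategy="epoch" is an assumption
# based on the per-epoch validation results reported below.
training_args = TrainingArguments(
    output_dir="smids_5x_beit_base_adamax_001_fold5",
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",
)
```

The listed betas=(0.9, 0.999) and epsilon=1e-08 match the optimizer defaults, so they are not set explicitly here.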

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.8603        | 1.0   | 375   | 0.8648          | 0.5183   |
| 0.8445        | 2.0   | 750   | 0.8098          | 0.5417   |
| 0.7944        | 3.0   | 1125  | 0.7826          | 0.5917   |
| 0.7602        | 4.0   | 1500  | 0.8095          | 0.6133   |
| 0.7358        | 5.0   | 1875  | 0.7702          | 0.62     |
| 0.7338        | 6.0   | 2250  | 0.7325          | 0.6383   |
| 0.7068        | 7.0   | 2625  | 0.7570          | 0.6267   |
| 0.7788        | 8.0   | 3000  | 0.7318          | 0.6183   |
| 0.7701        | 9.0   | 3375  | 0.7391          | 0.65     |
| 0.7025        | 10.0  | 3750  | 0.7251          | 0.6617   |
| 0.7076        | 11.0  | 4125  | 0.7171          | 0.6433   |
| 0.6226        | 12.0  | 4500  | 0.7139          | 0.6333   |
| 0.6825        | 13.0  | 4875  | 0.7299          | 0.63     |
| 0.6882        | 14.0  | 5250  | 0.7324          | 0.6517   |
| 0.7468        | 15.0  | 5625  | 0.6842          | 0.7      |
| 0.6568        | 16.0  | 6000  | 0.7213          | 0.6533   |
| 0.6593        | 17.0  | 6375  | 0.6880          | 0.6583   |
| 0.68          | 18.0  | 6750  | 0.6884          | 0.6733   |
| 0.6767        | 19.0  | 7125  | 0.7231          | 0.665    |
| 0.6609        | 20.0  | 7500  | 0.6577          | 0.6983   |
| 0.6233        | 21.0  | 7875  | 0.7352          | 0.6417   |
| 0.6128        | 22.0  | 8250  | 0.6662          | 0.695    |
| 0.6939        | 23.0  | 8625  | 0.7254          | 0.71     |
| 0.6892        | 24.0  | 9000  | 0.7067          | 0.695    |
| 0.5723        | 25.0  | 9375  | 0.6348          | 0.72     |
| 0.6474        | 26.0  | 9750  | 0.6506          | 0.7083   |
| 0.6695        | 27.0  | 10125 | 0.6672          | 0.6883   |
| 0.7033        | 28.0  | 10500 | 0.6914          | 0.6833   |
| 0.6792        | 29.0  | 10875 | 0.6764          | 0.685    |
| 0.5904        | 30.0  | 11250 | 0.6857          | 0.6883   |
| 0.5913        | 31.0  | 11625 | 0.6709          | 0.6933   |
| 0.5784        | 32.0  | 12000 | 0.7184          | 0.69     |
| 0.6212        | 33.0  | 12375 | 0.6393          | 0.7233   |
| 0.6674        | 34.0  | 12750 | 0.6697          | 0.71     |
| 0.5844        | 35.0  | 13125 | 0.6220          | 0.7283   |
| 0.5892        | 36.0  | 13500 | 0.6265          | 0.7217   |
| 0.572         | 37.0  | 13875 | 0.6315          | 0.7117   |
| 0.5345        | 38.0  | 14250 | 0.6267          | 0.7417   |
| 0.5582        | 39.0  | 14625 | 0.5952          | 0.7433   |
| 0.5947        | 40.0  | 15000 | 0.6182          | 0.715    |
| 0.5681        | 41.0  | 15375 | 0.6009          | 0.7533   |
| 0.5885        | 42.0  | 15750 | 0.6107          | 0.7367   |
| 0.5772        | 43.0  | 16125 | 0.5746          | 0.75     |
| 0.4378        | 44.0  | 16500 | 0.5833          | 0.755    |
| 0.5286        | 45.0  | 16875 | 0.6256          | 0.7417   |
| 0.538         | 46.0  | 17250 | 0.6036          | 0.7483   |
| 0.5732        | 47.0  | 17625 | 0.6044          | 0.76     |
| 0.4485        | 48.0  | 18000 | 0.5966          | 0.7533   |
| 0.4959        | 49.0  | 18375 | 0.6043          | 0.7583   |
| 0.4683        | 50.0  | 18750 | 0.6041          | 0.7617   |

Framework versions

  • Transformers 4.32.1
  • Pytorch 2.1.0+cu121
  • Datasets 2.12.0
  • Tokenizers 0.13.2