smids_1x_beit_base_rms_0001_fold5

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6915
  • Accuracy: 0.72
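
The card does not include a usage snippet, so the following is a minimal inference sketch. The Hub repo id is a placeholder (substitute the actual owner/name under which this checkpoint is published), and "example.jpg" stands in for any RGB input image; neither comes from this card.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "<owner>/smids_1x_beit_base_rms_0001_fold5"  # placeholder repo id, not from this card

# Load the fine-tuned BEiT classifier and its matching image processor
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("example.jpg").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```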

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
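
The original training script is not included in the card; the sketch below only shows how the hyperparameters listed above map onto transformers.TrainingArguments. The output_dir and any surrounding Trainer setup are assumptions, not the authors' exact code.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="smids_1x_beit_base_rms_0001_fold5",  # hypothetical output path
    learning_rate=1e-3,                # learning_rate: 0.001
    per_device_train_batch_size=32,    # train_batch_size: 32
    per_device_eval_batch_size=32,     # eval_batch_size: 32
    seed=42,
    adam_beta1=0.9,                    # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                 # epsilon: 1e-08
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
)
```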

Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy
------------- | ----- | ---- | --------------- | --------
1.2962 | 1.0 | 75 | 0.9009 | 0.4967
0.8616 | 2.0 | 150 | 0.8829 | 0.5333
0.8905 | 3.0 | 225 | 0.8472 | 0.5367
0.8302 | 4.0 | 300 | 0.9953 | 0.5067
0.8678 | 5.0 | 375 | 0.8690 | 0.525
0.8529 | 6.0 | 450 | 0.8769 | 0.5283
0.8841 | 7.0 | 525 | 0.8786 | 0.53
0.8327 | 8.0 | 600 | 0.8584 | 0.5367
0.8106 | 9.0 | 675 | 0.8478 | 0.5817
0.8163 | 10.0 | 750 | 0.8420 | 0.54
0.8203 | 11.0 | 825 | 0.8233 | 0.615
0.849 | 12.0 | 900 | 0.8207 | 0.56
0.7448 | 13.0 | 975 | 0.9969 | 0.48
0.8104 | 14.0 | 1050 | 0.8107 | 0.5717
0.8455 | 15.0 | 1125 | 0.8387 | 0.56
0.7497 | 16.0 | 1200 | 0.7795 | 0.5983
0.7595 | 17.0 | 1275 | 0.7579 | 0.63
0.7118 | 18.0 | 1350 | 0.7723 | 0.63
0.7898 | 19.0 | 1425 | 0.7567 | 0.635
0.7627 | 20.0 | 1500 | 0.7797 | 0.6367
0.8345 | 21.0 | 1575 | 0.7467 | 0.6217
0.745 | 22.0 | 1650 | 0.7264 | 0.655
0.7402 | 23.0 | 1725 | 0.7241 | 0.6633
0.6239 | 24.0 | 1800 | 0.7183 | 0.665
0.6855 | 25.0 | 1875 | 0.7858 | 0.6333
0.7229 | 26.0 | 1950 | 0.7404 | 0.6333
0.7229 | 27.0 | 2025 | 0.7258 | 0.68
0.7197 | 28.0 | 2100 | 0.6990 | 0.6917
0.7057 | 29.0 | 2175 | 0.7035 | 0.68
0.7315 | 30.0 | 2250 | 0.7188 | 0.6683
0.6562 | 31.0 | 2325 | 0.7484 | 0.6283
0.6918 | 32.0 | 2400 | 0.6817 | 0.6917
0.6871 | 33.0 | 2475 | 0.7362 | 0.6717
0.6724 | 34.0 | 2550 | 0.6752 | 0.7
0.6677 | 35.0 | 2625 | 0.6742 | 0.6933
0.6138 | 36.0 | 2700 | 0.6850 | 0.6867
0.582 | 37.0 | 2775 | 0.6804 | 0.6817
0.6731 | 38.0 | 2850 | 0.6827 | 0.6917
0.5577 | 39.0 | 2925 | 0.7025 | 0.6833
0.5702 | 40.0 | 3000 | 0.6473 | 0.7117
0.578 | 41.0 | 3075 | 0.6455 | 0.72
0.6074 | 42.0 | 3150 | 0.6478 | 0.715
0.6019 | 43.0 | 3225 | 0.6442 | 0.7
0.5836 | 44.0 | 3300 | 0.6632 | 0.6983
0.5466 | 45.0 | 3375 | 0.6605 | 0.68
0.4891 | 46.0 | 3450 | 0.6690 | 0.715
0.5107 | 47.0 | 3525 | 0.6729 | 0.7167
0.3981 | 48.0 | 3600 | 0.7111 | 0.7017
0.434 | 49.0 | 3675 | 0.6850 | 0.7183
0.3741 | 50.0 | 3750 | 0.6915 | 0.72

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0