
hushem_5x_beit_base_sgd_00001_fold1

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.5922
  • Accuracy: 0.2667
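
The checkpoint can be loaded for image classification with the standard Transformers auto classes. The snippet below is a minimal sketch: the repository id is a placeholder and should be replaced with the actual Hub path of this model, and the input image path is only an example.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repository id; substitute the real Hub path of this checkpoint.
model_id = "your-username/hushem_5x_beit_base_sgd_00001_fold1"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("example.jpg")  # example input image

# Preprocess and run a forward pass without gradients.
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to its class label.
predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```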

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
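
For reference, the hyperparameters above can be expressed as a Transformers `TrainingArguments` configuration. This is a sketch rather than the exact training script: `output_dir` and `evaluation_strategy` are assumptions, and the Adam betas/epsilon listed above match the `Trainer` optimizer defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="beit-base-hushem-fold1",  # placeholder output directory
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumed: the results table reports one evaluation per epoch
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults,
    # matching the optimizer settings listed above.
)
```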

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.4867 | 1.0 | 27 | 1.6071 | 0.2667 |
| 1.5392 | 2.0 | 54 | 1.6064 | 0.2667 |
| 1.5844 | 3.0 | 81 | 1.6056 | 0.2667 |
| 1.5797 | 4.0 | 108 | 1.6050 | 0.2667 |
| 1.5108 | 5.0 | 135 | 1.6044 | 0.2667 |
| 1.5236 | 6.0 | 162 | 1.6037 | 0.2667 |
| 1.5199 | 7.0 | 189 | 1.6031 | 0.2667 |
| 1.544 | 8.0 | 216 | 1.6026 | 0.2667 |
| 1.5317 | 9.0 | 243 | 1.6020 | 0.2667 |
| 1.537 | 10.0 | 270 | 1.6014 | 0.2667 |
| 1.5415 | 11.0 | 297 | 1.6010 | 0.2667 |
| 1.5478 | 12.0 | 324 | 1.6004 | 0.2667 |
| 1.4666 | 13.0 | 351 | 1.6000 | 0.2667 |
| 1.5352 | 14.0 | 378 | 1.5995 | 0.2667 |
| 1.478 | 15.0 | 405 | 1.5990 | 0.2667 |
| 1.5333 | 16.0 | 432 | 1.5986 | 0.2667 |
| 1.5245 | 17.0 | 459 | 1.5982 | 0.2667 |
| 1.5379 | 18.0 | 486 | 1.5978 | 0.2667 |
| 1.52 | 19.0 | 513 | 1.5975 | 0.2667 |
| 1.5508 | 20.0 | 540 | 1.5971 | 0.2667 |
| 1.5421 | 21.0 | 567 | 1.5967 | 0.2667 |
| 1.4919 | 22.0 | 594 | 1.5963 | 0.2667 |
| 1.483 | 23.0 | 621 | 1.5960 | 0.2667 |
| 1.5087 | 24.0 | 648 | 1.5957 | 0.2667 |
| 1.5236 | 25.0 | 675 | 1.5954 | 0.2667 |
| 1.5228 | 26.0 | 702 | 1.5951 | 0.2667 |
| 1.5439 | 27.0 | 729 | 1.5949 | 0.2667 |
| 1.5272 | 28.0 | 756 | 1.5946 | 0.2667 |
| 1.5029 | 29.0 | 783 | 1.5943 | 0.2667 |
| 1.5695 | 30.0 | 810 | 1.5941 | 0.2667 |
| 1.5057 | 31.0 | 837 | 1.5939 | 0.2667 |
| 1.5092 | 32.0 | 864 | 1.5937 | 0.2667 |
| 1.575 | 33.0 | 891 | 1.5935 | 0.2667 |
| 1.5175 | 34.0 | 918 | 1.5934 | 0.2667 |
| 1.4801 | 35.0 | 945 | 1.5932 | 0.2667 |
| 1.4771 | 36.0 | 972 | 1.5930 | 0.2667 |
| 1.5042 | 37.0 | 999 | 1.5929 | 0.2667 |
| 1.5372 | 38.0 | 1026 | 1.5928 | 0.2667 |
| 1.5158 | 39.0 | 1053 | 1.5927 | 0.2667 |
| 1.4902 | 40.0 | 1080 | 1.5926 | 0.2667 |
| 1.4904 | 41.0 | 1107 | 1.5925 | 0.2667 |
| 1.4817 | 42.0 | 1134 | 1.5924 | 0.2667 |
| 1.5064 | 43.0 | 1161 | 1.5923 | 0.2667 |
| 1.4625 | 44.0 | 1188 | 1.5923 | 0.2667 |
| 1.5064 | 45.0 | 1215 | 1.5923 | 0.2667 |
| 1.4956 | 46.0 | 1242 | 1.5922 | 0.2667 |
| 1.502 | 47.0 | 1269 | 1.5922 | 0.2667 |
| 1.495 | 48.0 | 1296 | 1.5922 | 0.2667 |
| 1.4896 | 49.0 | 1323 | 1.5922 | 0.2667 |
| 1.5118 | 50.0 | 1350 | 1.5922 | 0.2667 |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0