
hushem_5x_beit_base_sgd_00001_fold3

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the results):

  • Loss: 1.5681
  • Accuracy: 0.2558
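
The snippet below is a minimal inference sketch, not taken from the original card. It assumes the checkpoint is available on the Hub as hkivancoral/hushem_5x_beit_base_sgd_00001_fold3 and that the class labels were stored in the model config during fine-tuning; the input image path is a placeholder.

```python
# Minimal inference sketch (assumed usage, not from the original card).
from transformers import AutoImageProcessor, AutoModelForImageClassification
from PIL import Image
import torch

model_id = "hkivancoral/hushem_5x_beit_base_sgd_00001_fold3"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])  # labels as saved in the config
```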

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding Trainer setup follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
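
The block below is a hedged sketch of a Trainer configuration matching the hyperparameters listed above; it is not the author's actual training script. Dataset loading, image preprocessing, and the model instantiation are omitted, and names such as train_ds, eval_ds, and the output directory are placeholders.

```python
# Sketch of a TrainingArguments/Trainer setup reproducing the listed hyperparameters
# (assumed reconstruction, not the original script).
from transformers import TrainingArguments, Trainer

training_args = TrainingArguments(
    output_dir="hushem_5x_beit_base_sgd_00001_fold3",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",
    logging_strategy="epoch",
)

# Trainer's default optimizer is AdamW with betas=(0.9, 0.999) and eps=1e-8,
# which matches the Adam settings listed above.
# trainer = Trainer(
#     model=model,             # fine-tuned from microsoft/beit-base-patch16-224
#     args=training_args,
#     train_dataset=train_ds,  # placeholder dataset objects
#     eval_dataset=eval_ds,
# )
# trainer.train()
```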

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.5838 | 1.0 | 28 | 1.5858 | 0.2558 |
| 1.5323 | 2.0 | 56 | 1.5850 | 0.2558 |
| 1.5483 | 3.0 | 84 | 1.5842 | 0.2558 |
| 1.4864 | 4.0 | 112 | 1.5834 | 0.2558 |
| 1.5286 | 5.0 | 140 | 1.5827 | 0.2558 |
| 1.5129 | 6.0 | 168 | 1.5819 | 0.2558 |
| 1.6083 | 7.0 | 196 | 1.5812 | 0.2558 |
| 1.5405 | 8.0 | 224 | 1.5806 | 0.2558 |
| 1.5045 | 9.0 | 252 | 1.5799 | 0.2558 |
| 1.4827 | 10.0 | 280 | 1.5793 | 0.2558 |
| 1.5466 | 11.0 | 308 | 1.5787 | 0.2558 |
| 1.502 | 12.0 | 336 | 1.5780 | 0.2558 |
| 1.5701 | 13.0 | 364 | 1.5775 | 0.2558 |
| 1.5522 | 14.0 | 392 | 1.5769 | 0.2558 |
| 1.6273 | 15.0 | 420 | 1.5763 | 0.2558 |
| 1.5496 | 16.0 | 448 | 1.5758 | 0.2558 |
| 1.5263 | 17.0 | 476 | 1.5753 | 0.2558 |
| 1.5326 | 18.0 | 504 | 1.5748 | 0.2558 |
| 1.5229 | 19.0 | 532 | 1.5744 | 0.2558 |
| 1.6308 | 20.0 | 560 | 1.5739 | 0.2558 |
| 1.5402 | 21.0 | 588 | 1.5734 | 0.2558 |
| 1.5767 | 22.0 | 616 | 1.5730 | 0.2558 |
| 1.546 | 23.0 | 644 | 1.5726 | 0.2558 |
| 1.4997 | 24.0 | 672 | 1.5722 | 0.2558 |
| 1.5699 | 25.0 | 700 | 1.5719 | 0.2558 |
| 1.5518 | 26.0 | 728 | 1.5715 | 0.2558 |
| 1.5078 | 27.0 | 756 | 1.5712 | 0.2558 |
| 1.509 | 28.0 | 784 | 1.5709 | 0.2558 |
| 1.5496 | 29.0 | 812 | 1.5706 | 0.2558 |
| 1.5569 | 30.0 | 840 | 1.5704 | 0.2558 |
| 1.5113 | 31.0 | 868 | 1.5701 | 0.2558 |
| 1.5157 | 32.0 | 896 | 1.5699 | 0.2558 |
| 1.5362 | 33.0 | 924 | 1.5696 | 0.2558 |
| 1.4946 | 34.0 | 952 | 1.5694 | 0.2558 |
| 1.6128 | 35.0 | 980 | 1.5692 | 0.2558 |
| 1.4515 | 36.0 | 1008 | 1.5691 | 0.2558 |
| 1.4956 | 37.0 | 1036 | 1.5689 | 0.2558 |
| 1.5189 | 38.0 | 1064 | 1.5688 | 0.2558 |
| 1.571 | 39.0 | 1092 | 1.5687 | 0.2558 |
| 1.549 | 40.0 | 1120 | 1.5685 | 0.2558 |
| 1.524 | 41.0 | 1148 | 1.5684 | 0.2558 |
| 1.5138 | 42.0 | 1176 | 1.5684 | 0.2558 |
| 1.4952 | 43.0 | 1204 | 1.5683 | 0.2558 |
| 1.5406 | 44.0 | 1232 | 1.5682 | 0.2558 |
| 1.6126 | 45.0 | 1260 | 1.5682 | 0.2558 |
| 1.5484 | 46.0 | 1288 | 1.5682 | 0.2558 |
| 1.5268 | 47.0 | 1316 | 1.5681 | 0.2558 |
| 1.4882 | 48.0 | 1344 | 1.5681 | 0.2558 |
| 1.5345 | 49.0 | 1372 | 1.5681 | 0.2558 |
| 1.5815 | 50.0 | 1400 | 1.5681 | 0.2558 |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0
