
hushem_1x_beit_base_rms_0001_fold4

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 3.2977
  • Accuracy: 0.5
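
The card does not include a usage snippet, so the sketch below shows one plausible way to run inference with the standard Transformers image-classification classes. The repository id and image path are placeholders, not values taken from this card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repository id; substitute the actual Hub path of this checkpoint.
repo_id = "your-namespace/hushem_1x_beit_base_rms_0001_fold4"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("example.jpg").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

print(model.config.id2label[logits.argmax(-1).item()])
```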

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
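
The training script itself is not published. The sketch below is one way these hyperparameters could map onto the Hugging Face Trainer API; the data directory, output directory, and split names are assumptions for illustration, and the Adam betas/epsilon above match the Trainer defaults.

```python
import torch
from datasets import load_dataset
from transformers import (
    AutoImageProcessor,
    AutoModelForImageClassification,
    Trainer,
    TrainingArguments,
)

# Hypothetical local path to the fold-4 image folders; the actual split is not published.
dataset = load_dataset("imagefolder", data_dir="path/to/hushem_fold4")

processor = AutoImageProcessor.from_pretrained("microsoft/beit-base-patch16-224")
model = AutoModelForImageClassification.from_pretrained(
    "microsoft/beit-base-patch16-224",
    num_labels=dataset["train"].features["label"].num_classes,
    ignore_mismatched_sizes=True,  # replace the 1000-class ImageNet head
)

def transforms(examples):
    # Turn PIL images into the pixel_values tensor BEiT expects.
    examples["pixel_values"] = [
        processor(img.convert("RGB"), return_tensors="pt")["pixel_values"][0]
        for img in examples["image"]
    ]
    del examples["image"]
    return examples

dataset = dataset.with_transform(transforms)

def collate_fn(examples):
    return {
        "pixel_values": torch.stack([e["pixel_values"] for e in examples]),
        "labels": torch.tensor([e["label"] for e in examples]),
    }

# Hyperparameters listed above.
args = TrainingArguments(
    output_dir="hushem_1x_beit_base_rms_0001_fold4",
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: one evaluation per epoch, as in the results table
    remove_unused_columns=False,  # keep the "image" column for the on-the-fly transform
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],  # assumption: a "validation" split exists
    data_collator=collate_fn,
)
trainer.train()
```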

Training results

Training Loss   Epoch   Step   Validation Loss   Accuracy
No log          1.0     6      1.3978            0.2619
1.9888          2.0     12     1.3961            0.2381
1.9888          3.0     18     1.3839            0.2619
1.4109          4.0     24     1.3035            0.3095
1.3832          5.0     30     1.2707            0.5714
1.3832          6.0     36     1.2845            0.3571
1.4922          7.0     42     1.4385            0.2857
1.4922          8.0     48     1.2908            0.2619
1.2776          9.0     54     1.3088            0.5238
1.269           10.0    60     1.2412            0.3333
1.269           11.0    66     1.1676            0.5238
1.2132          12.0    72     1.1566            0.4286
1.2132          13.0    78     1.0746            0.5714
1.115           14.0    84     1.2329            0.4286
1.1413          15.0    90     1.1499            0.4048
1.1413          16.0    96     1.0494            0.5476
0.9563          17.0    102    0.9577            0.5238
0.9563          18.0    108    1.2486            0.4048
0.9343          19.0    114    1.2396            0.5238
0.8964          20.0    120    1.5448            0.3810
0.8964          21.0    126    1.6028            0.4762
0.826           22.0    132    1.0756            0.5714
0.826           23.0    138    1.4576            0.4048
0.6612          24.0    144    1.5635            0.4286
0.7361          25.0    150    1.2476            0.5952
0.7361          26.0    156    1.6591            0.4048
0.5674          27.0    162    1.5837            0.5238
0.5674          28.0    168    2.8490            0.4286
0.5637          29.0    174    1.9394            0.5714
0.4528          30.0    180    2.5319            0.4762
0.4528          31.0    186    1.8994            0.5714
0.455           32.0    192    2.3813            0.5476
0.455           33.0    198    2.3989            0.5
0.4317          34.0    204    2.5912            0.5
0.3921          35.0    210    2.8985            0.4762
0.3921          36.0    216    2.9682            0.5
0.3189          37.0    222    3.2291            0.5
0.3189          38.0    228    3.0818            0.5476
0.3067          39.0    234    3.1819            0.5238
0.2523          40.0    240    3.2200            0.4524
0.2523          41.0    246    3.2572            0.5
0.2633          42.0    252    3.2977            0.5
0.2633          43.0    258    3.2977            0.5
0.2304          44.0    264    3.2977            0.5
0.2585          45.0    270    3.2977            0.5
0.2585          46.0    276    3.2977            0.5
0.2417          47.0    282    3.2977            0.5
0.2417          48.0    288    3.2977            0.5
0.2307          49.0    294    3.2977            0.5
0.2495          50.0    300    3.2977            0.5

Framework versions

  • Transformers 4.35.2
  • PyTorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0