
hushem_5x_beit_base_sgd_001_fold1

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the metrics):

  • Loss: 1.5176
  • Accuracy: 0.4
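
The checkpoint can be loaded for image classification with the standard transformers auto classes. The sketch below is illustrative only: the Hub repo id and the image path are placeholders, not values taken from this card.

```python
# Minimal inference sketch; repo id and image path are placeholders.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "your-username/hushem_5x_beit_base_sgd_001_fold1"  # hypothetical Hub repo id

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```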

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
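
The card only states that training used the imagefolder loader. Assuming the usual layout of images grouped into class-named subfolders, the data could be loaded as follows; the directory path is a placeholder, not part of this card.

```python
# Sketch of loading class-labelled images with the datasets "imagefolder" loader.
# The data_dir path is an assumption, not stated in the card.
from datasets import load_dataset

dataset = load_dataset("imagefolder", data_dir="path/to/hushem_fold1")
print(dataset)                                   # DatasetDict with the detected splits
print(dataset["train"].features["label"].names)  # class names inferred from folder names
```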

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
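
As a reproduction sketch, the listed values map onto Hugging Face TrainingArguments roughly as follows. The output directory and the evaluation/save strategies are assumptions (the per-epoch validation rows in the results table suggest per-epoch evaluation), and the Adam settings above are the Trainer defaults rather than anything set explicitly here.

```python
# Sketch mapping the listed hyperparameters onto TrainingArguments.
# evaluation_strategy/save_strategy are assumptions, not stated in the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hushem_5x_beit_base_sgd_001_fold1",
    learning_rate=0.001,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption, consistent with the per-epoch results table
    save_strategy="epoch",        # assumption
)
```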

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.4454 | 1.0 | 27 | 1.5496 | 0.2667 |
| 1.3864 | 2.0 | 54 | 1.5125 | 0.2667 |
| 1.3701 | 3.0 | 81 | 1.4961 | 0.2667 |
| 1.3167 | 4.0 | 108 | 1.5066 | 0.2889 |
| 1.2134 | 5.0 | 135 | 1.5082 | 0.3333 |
| 1.1476 | 6.0 | 162 | 1.5110 | 0.3333 |
| 1.1471 | 7.0 | 189 | 1.5248 | 0.3333 |
| 1.1363 | 8.0 | 216 | 1.5434 | 0.3556 |
| 1.0955 | 9.0 | 243 | 1.5579 | 0.3778 |
| 1.0745 | 10.0 | 270 | 1.5619 | 0.3333 |
| 1.0401 | 11.0 | 297 | 1.5614 | 0.3333 |
| 1.0047 | 12.0 | 324 | 1.5711 | 0.3333 |
| 1.0055 | 13.0 | 351 | 1.5627 | 0.3333 |
| 0.9832 | 14.0 | 378 | 1.5529 | 0.3333 |
| 0.9676 | 15.0 | 405 | 1.5726 | 0.3333 |
| 0.9641 | 16.0 | 432 | 1.5677 | 0.3333 |
| 0.9328 | 17.0 | 459 | 1.5601 | 0.3333 |
| 0.9518 | 18.0 | 486 | 1.5678 | 0.3333 |
| 0.9109 | 19.0 | 513 | 1.5762 | 0.3333 |
| 0.9218 | 20.0 | 540 | 1.5662 | 0.3333 |
| 0.8731 | 21.0 | 567 | 1.5698 | 0.3333 |
| 0.8636 | 22.0 | 594 | 1.5667 | 0.3333 |
| 0.8235 | 23.0 | 621 | 1.5658 | 0.3333 |
| 0.8569 | 24.0 | 648 | 1.5702 | 0.3333 |
| 0.8347 | 25.0 | 675 | 1.5568 | 0.3333 |
| 0.8597 | 26.0 | 702 | 1.5638 | 0.3333 |
| 0.8371 | 27.0 | 729 | 1.5541 | 0.3333 |
| 0.8073 | 28.0 | 756 | 1.5468 | 0.3556 |
| 0.8391 | 29.0 | 783 | 1.5399 | 0.3556 |
| 0.8305 | 30.0 | 810 | 1.5379 | 0.3556 |
| 0.7771 | 31.0 | 837 | 1.5433 | 0.3778 |
| 0.8158 | 32.0 | 864 | 1.5408 | 0.3556 |
| 0.8402 | 33.0 | 891 | 1.5426 | 0.3778 |
| 0.7881 | 34.0 | 918 | 1.5356 | 0.3778 |
| 0.798 | 35.0 | 945 | 1.5324 | 0.4 |
| 0.75 | 36.0 | 972 | 1.5330 | 0.4 |
| 0.7699 | 37.0 | 999 | 1.5355 | 0.3778 |
| 0.7585 | 38.0 | 1026 | 1.5345 | 0.4 |
| 0.7272 | 39.0 | 1053 | 1.5315 | 0.4 |
| 0.7453 | 40.0 | 1080 | 1.5287 | 0.4 |
| 0.7465 | 41.0 | 1107 | 1.5241 | 0.4 |
| 0.7238 | 42.0 | 1134 | 1.5231 | 0.4 |
| 0.7663 | 43.0 | 1161 | 1.5207 | 0.4 |
| 0.7014 | 44.0 | 1188 | 1.5176 | 0.4 |
| 0.7481 | 45.0 | 1215 | 1.5184 | 0.4 |
| 0.7298 | 46.0 | 1242 | 1.5191 | 0.4 |
| 0.7342 | 47.0 | 1269 | 1.5176 | 0.4 |
| 0.706 | 48.0 | 1296 | 1.5175 | 0.4 |
| 0.7649 | 49.0 | 1323 | 1.5176 | 0.4 |
| 0.7295 | 50.0 | 1350 | 1.5176 | 0.4 |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0
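
To reproduce the environment, the same versions can be pinned in a requirements file; note that the +cu118 PyTorch build listed above comes from the PyTorch CUDA wheel index rather than plain PyPI.

```
transformers==4.35.2
torch==2.1.0        # the card used the +cu118 CUDA build
datasets==2.15.0
tokenizers==0.15.0
```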