
beit-base-patch16-224-ve-U13-b-80b

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7122
  • Accuracy: 0.8478

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 80
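The relationship between the batch-size settings and the warmup schedule above can be sketched in plain Python. This is an illustrative reconstruction, not code from the training run: the steps-per-epoch value (~6.5) is inferred from the training-results table below, and `linear_schedule_lr` is a hypothetical helper mimicking a linear scheduler with warmup.

```python
# Effective batch size: each optimizer update accumulates gradients over
# gradient_accumulation_steps micro-batches of train_batch_size samples.
train_batch_size = 32
gradient_accumulation_steps = 4
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 128, matching total_train_batch_size above

# With lr_scheduler_type = "linear" and lr_scheduler_warmup_ratio = 0.1,
# the learning rate ramps linearly from 0 to the peak (5e-05) over the
# first 10% of optimizer steps, then decays linearly back to 0.
def linear_schedule_lr(step, total_steps, peak_lr=5e-5, warmup_ratio=0.1):
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return peak_lr * step / max(1, warmup_steps)
    return peak_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# ~6.5 optimizer steps per epoch (inferred from the results table) * 80 epochs
total_steps = 520
print(linear_schedule_lr(0, total_steps))    # 0.0 (start of warmup)
print(linear_schedule_lr(52, total_steps))   # 5e-05 (peak, end of warmup)
print(linear_schedule_lr(520, total_steps))  # 0.0 (fully decayed)
```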

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 0.92  | 6    | 1.3182          | 0.4565   |
| 1.6182        | 2.0   | 13   | 1.3056          | 0.4565   |
| 1.6182        | 2.92  | 19   | 1.2884          | 0.4565   |
| 1.592         | 4.0   | 26   | 1.2807          | 0.4565   |
| 1.4756        | 4.92  | 32   | 1.2991          | 0.4565   |
| 1.4756        | 6.0   | 39   | 1.2451          | 0.5      |
| 1.352         | 6.92  | 45   | 1.1845          | 0.5217   |
| 1.2143        | 8.0   | 52   | 1.0315          | 0.6087   |
| 1.2143        | 8.92  | 58   | 0.9289          | 0.5435   |
| 1.0327        | 10.0  | 65   | 0.8925          | 0.5435   |
| 0.8878        | 10.92 | 71   | 0.8633          | 0.5652   |
| 0.8878        | 12.0  | 78   | 0.7566          | 0.6304   |
| 0.7712        | 12.92 | 84   | 0.7669          | 0.7609   |
| 0.6808        | 14.0  | 91   | 0.7635          | 0.7609   |
| 0.6808        | 14.92 | 97   | 0.8653          | 0.6304   |
| 0.5844        | 16.0  | 104  | 0.7193          | 0.7174   |
| 0.4332        | 16.92 | 110  | 0.6186          | 0.7826   |
| 0.4332        | 18.0  | 117  | 1.0295          | 0.6739   |
| 0.3607        | 18.92 | 123  | 0.8007          | 0.7609   |
| 0.3134        | 20.0  | 130  | 0.6790          | 0.7826   |
| 0.3134        | 20.92 | 136  | 0.8013          | 0.7391   |
| 0.2988        | 22.0  | 143  | 0.7481          | 0.7609   |
| 0.2988        | 22.92 | 149  | 0.9280          | 0.6739   |
| 0.2487        | 24.0  | 156  | 0.6542          | 0.7391   |
| 0.1912        | 24.92 | 162  | 0.7134          | 0.7609   |
| 0.1912        | 26.0  | 169  | 0.8421          | 0.7609   |
| 0.1946        | 26.92 | 175  | 0.7284          | 0.7391   |
| 0.1685        | 28.0  | 182  | 0.7507          | 0.8261   |
| 0.1685        | 28.92 | 188  | 0.7610          | 0.8043   |
| 0.1646        | 30.0  | 195  | 0.8013          | 0.7826   |
| 0.166         | 30.92 | 201  | 0.8803          | 0.7826   |
| 0.166         | 32.0  | 208  | 0.7895          | 0.7391   |
| 0.1372        | 32.92 | 214  | 0.7760          | 0.7174   |
| 0.1424        | 34.0  | 221  | 0.9390          | 0.7174   |
| 0.1424        | 34.92 | 227  | 0.7839          | 0.8043   |
| 0.1399        | 36.0  | 234  | 0.9422          | 0.7609   |
| 0.1238        | 36.92 | 240  | 0.8710          | 0.7174   |
| 0.1238        | 38.0  | 247  | 0.8684          | 0.7826   |
| 0.123         | 38.92 | 253  | 0.8194          | 0.7609   |
| 0.1381        | 40.0  | 260  | 0.9698          | 0.7391   |
| 0.1381        | 40.92 | 266  | 0.8545          | 0.7609   |
| 0.1081        | 42.0  | 273  | 0.9925          | 0.6739   |
| 0.1081        | 42.92 | 279  | 0.9320          | 0.8043   |
| 0.0929        | 44.0  | 286  | 1.0242          | 0.7609   |
| 0.0898        | 44.92 | 292  | 0.9411          | 0.7609   |
| 0.0898        | 46.0  | 299  | 0.8995          | 0.7609   |
| 0.12          | 46.92 | 305  | 0.7741          | 0.7826   |
| 0.1126        | 48.0  | 312  | 0.7122          | 0.8478   |
| 0.1126        | 48.92 | 318  | 0.9099          | 0.7826   |
| 0.1088        | 50.0  | 325  | 1.1148          | 0.6957   |
| 0.0851        | 50.92 | 331  | 0.9297          | 0.8043   |
| 0.0851        | 52.0  | 338  | 0.8801          | 0.8043   |
| 0.1001        | 52.92 | 344  | 0.8428          | 0.8261   |
| 0.0718        | 54.0  | 351  | 0.9721          | 0.7826   |
| 0.0718        | 54.92 | 357  | 0.8771          | 0.8043   |
| 0.0842        | 56.0  | 364  | 0.9982          | 0.7826   |
| 0.1069        | 56.92 | 370  | 1.1083          | 0.7391   |
| 0.1069        | 58.0  | 377  | 0.9072          | 0.7826   |
| 0.0803        | 58.92 | 383  | 0.7979          | 0.8261   |
| 0.0752        | 60.0  | 390  | 0.7489          | 0.8478   |
| 0.0752        | 60.92 | 396  | 0.8023          | 0.8261   |
| 0.0646        | 62.0  | 403  | 0.8027          | 0.8261   |
| 0.0646        | 62.92 | 409  | 0.8275          | 0.7826   |
| 0.0829        | 64.0  | 416  | 0.8587          | 0.8043   |
| 0.0616        | 64.92 | 422  | 0.8870          | 0.8043   |
| 0.0616        | 66.0  | 429  | 0.8928          | 0.8043   |
| 0.0693        | 66.92 | 435  | 0.9289          | 0.7826   |
| 0.0657        | 68.0  | 442  | 0.9604          | 0.7609   |
| 0.0657        | 68.92 | 448  | 0.9560          | 0.7826   |
| 0.0588        | 70.0  | 455  | 0.9544          | 0.7609   |
| 0.0578        | 70.92 | 461  | 0.9419          | 0.7826   |
| 0.0578        | 72.0  | 468  | 0.9474          | 0.7826   |
| 0.0638        | 72.92 | 474  | 0.9540          | 0.7826   |
| 0.0592        | 73.85 | 480  | 0.9549          | 0.7826   |

Framework versions

  • Transformers 4.36.2
  • Pytorch 2.1.2+cu118
  • Datasets 2.16.1
  • Tokenizers 0.15.0