---
license: apache-2.0
base_model: microsoft/beit-base-patch16-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: beit-base-patch16-224-75-fold5
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.9534883720930233
---

beit-base-patch16-224-75-fold5

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2664
  • Accuracy: 0.9535
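
As a quick-start illustration not present in the original card, the sketch below shows how a checkpoint like this one could be loaded for inference with the transformers image-classification pipeline. The repository id is an assumption inferred from the card name, and the image path is a placeholder.

```python
# Minimal inference sketch. The repo id is an assumption based on this
# card's name; point it at wherever the checkpoint is actually hosted.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="BilalMuftuoglu/beit-base-patch16-224-75-fold5",  # assumed repo id
)

# "example.jpg" is a placeholder; a local path, URL, or PIL image all work.
predictions = classifier("example.jpg")
print(predictions)  # e.g. [{'label': ..., 'score': ...}, ...]
```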

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
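
For context, "imagefolder" is not a specific published dataset but the generic image loader in the datasets library, which infers class labels from a one-subdirectory-per-class layout. A minimal loading sketch follows; the data_dir path is a placeholder, since the card does not document the actual data.

```python
# The "imagefolder" builder infers labels from the directory layout,
# e.g. data/train/<class_name>/<image>.jpg. The path is a placeholder.
from datasets import load_dataset

dataset = load_dataset("imagefolder", data_dir="path/to/data")
print(dataset["train"].features)  # includes an inferred ClassLabel column
```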

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
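
The sketch below restates these values as a transformers TrainingArguments configuration. Only the values listed above are grounded in the card; the output directory is an assumed placeholder, and the Adam betas and epsilon match the library defaults, so they need no explicit override.

```python
# Training configuration matching the hyperparameters listed above.
# output_dir is an assumed placeholder; all other values come from the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="beit-base-patch16-224-75-fold5",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 x 4 = 128 total train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the default optimizer
    # (adam_beta1 / adam_beta2 / adam_epsilon), so it is left unset here.
)
```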

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 2    | 0.6862          | 0.5116   |
| No log        | 2.0   | 4    | 0.5913          | 0.7209   |
| No log        | 3.0   | 6    | 0.7204          | 0.6977   |
| No log        | 4.0   | 8    | 0.5995          | 0.6977   |
| 0.6162        | 5.0   | 10   | 0.4235          | 0.8140   |
| 0.6162        | 6.0   | 12   | 0.3975          | 0.8140   |
| 0.6162        | 7.0   | 14   | 0.6029          | 0.7674   |
| 0.6162        | 8.0   | 16   | 0.4670          | 0.8140   |
| 0.6162        | 9.0   | 18   | 0.3448          | 0.8372   |
| 0.4312        | 10.0  | 20   | 0.4464          | 0.8372   |
| 0.4312        | 11.0  | 22   | 0.3396          | 0.8605   |
| 0.4312        | 12.0  | 24   | 0.4007          | 0.8372   |
| 0.4312        | 13.0  | 26   | 0.3398          | 0.8140   |
| 0.4312        | 14.0  | 28   | 0.4276          | 0.8605   |
| 0.3453        | 15.0  | 30   | 0.4336          | 0.8605   |
| 0.3453        | 16.0  | 32   | 0.3777          | 0.8140   |
| 0.3453        | 17.0  | 34   | 0.5910          | 0.8140   |
| 0.3453        | 18.0  | 36   | 0.6095          | 0.8140   |
| 0.3453        | 19.0  | 38   | 0.3570          | 0.8140   |
| 0.3288        | 20.0  | 40   | 0.5202          | 0.8140   |
| 0.3288        | 21.0  | 42   | 0.5604          | 0.8140   |
| 0.3288        | 22.0  | 44   | 0.2949          | 0.8372   |
| 0.3288        | 23.0  | 46   | 0.3442          | 0.8837   |
| 0.3288        | 24.0  | 48   | 0.2820          | 0.8372   |
| 0.2571        | 25.0  | 50   | 0.3240          | 0.8605   |
| 0.2571        | 26.0  | 52   | 0.2909          | 0.8837   |
| 0.2571        | 27.0  | 54   | 0.2429          | 0.8837   |
| 0.2571        | 28.0  | 56   | 0.2280          | 0.9302   |
| 0.2571        | 29.0  | 58   | 0.3984          | 0.8605   |
| 0.2012        | 30.0  | 60   | 0.2905          | 0.8605   |
| 0.2012        | 31.0  | 62   | 0.2509          | 0.9070   |
| 0.2012        | 32.0  | 64   | 0.2888          | 0.8605   |
| 0.2012        | 33.0  | 66   | 0.2689          | 0.8605   |
| 0.2012        | 34.0  | 68   | 0.2417          | 0.8837   |
| 0.1814        | 35.0  | 70   | 0.2418          | 0.9070   |
| 0.1814        | 36.0  | 72   | 0.2491          | 0.9070   |
| 0.1814        | 37.0  | 74   | 0.2998          | 0.9070   |
| 0.1814        | 38.0  | 76   | 0.2744          | 0.9302   |
| 0.1814        | 39.0  | 78   | 0.2664          | 0.9535   |
| 0.1555        | 40.0  | 80   | 0.2160          | 0.9302   |
| 0.1555        | 41.0  | 82   | 0.3875          | 0.9070   |
| 0.1555        | 42.0  | 84   | 0.4608          | 0.9070   |
| 0.1555        | 43.0  | 86   | 0.2978          | 0.9302   |
| 0.1555        | 44.0  | 88   | 0.4461          | 0.8837   |
| 0.1459        | 45.0  | 90   | 0.3603          | 0.9070   |
| 0.1459        | 46.0  | 92   | 0.2973          | 0.9302   |
| 0.1459        | 47.0  | 94   | 0.3385          | 0.8837   |
| 0.1459        | 48.0  | 96   | 0.3239          | 0.8837   |
| 0.1459        | 49.0  | 98   | 0.4315          | 0.8837   |
| 0.1372        | 50.0  | 100  | 0.3519          | 0.8837   |
| 0.1372        | 51.0  | 102  | 0.4148          | 0.8837   |
| 0.1372        | 52.0  | 104  | 0.4687          | 0.8837   |
| 0.1372        | 53.0  | 106  | 0.3287          | 0.8837   |
| 0.1372        | 54.0  | 108  | 0.3194          | 0.9070   |
| 0.1049        | 55.0  | 110  | 0.3703          | 0.8837   |
| 0.1049        | 56.0  | 112  | 0.3522          | 0.9070   |
| 0.1049        | 57.0  | 114  | 0.2572          | 0.9070   |
| 0.1049        | 58.0  | 116  | 0.2523          | 0.9070   |
| 0.1049        | 59.0  | 118  | 0.3136          | 0.9070   |
| 0.1143        | 60.0  | 120  | 0.3638          | 0.9070   |
| 0.1143        | 61.0  | 122  | 0.2916          | 0.9535   |
| 0.1143        | 62.0  | 124  | 0.2521          | 0.9302   |
| 0.1143        | 63.0  | 126  | 0.2735          | 0.9302   |
| 0.1143        | 64.0  | 128  | 0.3112          | 0.9302   |
| 0.0885        | 65.0  | 130  | 0.3246          | 0.9302   |
| 0.0885        | 66.0  | 132  | 0.3264          | 0.9070   |
| 0.0885        | 67.0  | 134  | 0.3351          | 0.9302   |
| 0.0885        | 68.0  | 136  | 0.3455          | 0.9302   |
| 0.0885        | 69.0  | 138  | 0.3579          | 0.9302   |
| 0.1064        | 70.0  | 140  | 0.3926          | 0.9302   |
| 0.1064        | 71.0  | 142  | 0.4370          | 0.9070   |
| 0.1064        | 72.0  | 144  | 0.4149          | 0.9302   |
| 0.1064        | 73.0  | 146  | 0.3315          | 0.9535   |
| 0.1064        | 74.0  | 148  | 0.2704          | 0.9302   |
| 0.1047        | 75.0  | 150  | 0.2600          | 0.9302   |
| 0.1047        | 76.0  | 152  | 0.3215          | 0.9535   |
| 0.1047        | 77.0  | 154  | 0.4110          | 0.9302   |
| 0.1047        | 78.0  | 156  | 0.4414          | 0.8837   |
| 0.1047        | 79.0  | 158  | 0.3589          | 0.9302   |
| 0.0937        | 80.0  | 160  | 0.3085          | 0.9535   |
| 0.0937        | 81.0  | 162  | 0.2889          | 0.9535   |
| 0.0937        | 82.0  | 164  | 0.2787          | 0.9535   |
| 0.0937        | 83.0  | 166  | 0.3251          | 0.9535   |
| 0.0937        | 84.0  | 168  | 0.4483          | 0.9070   |
| 0.0748        | 85.0  | 170  | 0.5490          | 0.8605   |
| 0.0748        | 86.0  | 172  | 0.5422          | 0.8605   |
| 0.0748        | 87.0  | 174  | 0.5282          | 0.8837   |
| 0.0748        | 88.0  | 176  | 0.5733          | 0.8605   |
| 0.0748        | 89.0  | 178  | 0.5978          | 0.8605   |
| 0.0834        | 90.0  | 180  | 0.5763          | 0.8605   |
| 0.0834        | 91.0  | 182  | 0.5270          | 0.8605   |
| 0.0834        | 92.0  | 184  | 0.4946          | 0.8837   |
| 0.0834        | 93.0  | 186  | 0.4881          | 0.9070   |
| 0.0834        | 94.0  | 188  | 0.5115          | 0.8605   |
| 0.1016        | 95.0  | 190  | 0.5445          | 0.8605   |
| 0.1016        | 96.0  | 192  | 0.5537          | 0.8605   |
| 0.1016        | 97.0  | 194  | 0.5451          | 0.8605   |
| 0.1016        | 98.0  | 196  | 0.5323          | 0.8605   |
| 0.1016        | 99.0  | 198  | 0.5190          | 0.8837   |
| 0.0657        | 100.0 | 200  | 0.5155          | 0.8837   |

Framework versions

  • Transformers 4.40.2
  • PyTorch 2.2.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1