---
license: apache-2.0
base_model: microsoft/beit-base-patch16-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: beit-base-patch16-224-75-fold2
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.9534883720930233
---

# beit-base-patch16-224-75-fold2

This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set (matching the epoch-20 row of the training log below); a usage sketch follows the list:

- Loss: 0.2685
- Accuracy: 0.9535
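
The card ships no usage example, so here is a minimal inference sketch, assuming the fine-tuned weights and image processor were pushed to the Hub under the repo id implied by the model name (the hub id and image path below are assumptions):

```python
# Minimal sketch: image classification with the transformers pipeline.
# The repo id is inferred from the model name and may differ; the image
# path is a placeholder.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="BilalMuftuoglu/beit-base-patch16-224-75-fold2",  # assumed hub id
)
print(classifier("path/to/image.jpg"))  # returns labels with scores
```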

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto `TrainingArguments` follows the list):

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
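
For reference, a hedged sketch of how these values map onto `TrainingArguments` in Transformers; the output directory is an assumption, and the Adam betas/epsilon listed above are the library defaults:

```python
from transformers import TrainingArguments

# Sketch of the reported configuration; output_dir is assumed.
training_args = TrainingArguments(
    output_dir="beit-base-patch16-224-75-fold2",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 x 4 = 128 effective train batch
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults
)
```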

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 2    | 0.7091          | 0.5349   |
| No log        | 2.0   | 4    | 0.6502          | 0.7209   |
| No log        | 3.0   | 6    | 0.9193          | 0.6977   |
| No log        | 4.0   | 8    | 0.7499          | 0.7442   |
| 0.6436        | 5.0   | 10   | 0.4527          | 0.8140   |
| 0.6436        | 6.0   | 12   | 0.4169          | 0.8372   |
| 0.6436        | 7.0   | 14   | 0.5773          | 0.7442   |
| 0.6436        | 8.0   | 16   | 0.4076          | 0.8605   |
| 0.6436        | 9.0   | 18   | 0.3939          | 0.8605   |
| 0.3863        | 10.0  | 20   | 0.4017          | 0.8605   |
| 0.3863        | 11.0  | 22   | 0.4918          | 0.8140   |
| 0.3863        | 12.0  | 24   | 0.2688          | 0.8372   |
| 0.3863        | 13.0  | 26   | 0.3884          | 0.8140   |
| 0.3863        | 14.0  | 28   | 0.3679          | 0.8140   |
| 0.2925        | 15.0  | 30   | 0.2802          | 0.8837   |
| 0.2925        | 16.0  | 32   | 0.2436          | 0.9070   |
| 0.2925        | 17.0  | 34   | 0.2337          | 0.9302   |
| 0.2925        | 18.0  | 36   | 0.3711          | 0.8140   |
| 0.2925        | 19.0  | 38   | 0.2372          | 0.9302   |
| 0.2289        | 20.0  | 40   | 0.2685          | 0.9535   |
| 0.2289        | 21.0  | 42   | 0.2610          | 0.9070   |
| 0.2289        | 22.0  | 44   | 0.3328          | 0.8372   |
| 0.2289        | 23.0  | 46   | 0.3479          | 0.8372   |
| 0.2289        | 24.0  | 48   | 0.2855          | 0.8837   |
| 0.219         | 25.0  | 50   | 0.2962          | 0.9070   |
| 0.219         | 26.0  | 52   | 0.4038          | 0.9070   |
| 0.219         | 27.0  | 54   | 0.3149          | 0.9070   |
| 0.219         | 28.0  | 56   | 0.3212          | 0.9070   |
| 0.219         | 29.0  | 58   | 0.4895          | 0.8605   |
| 0.1933        | 30.0  | 60   | 0.4335          | 0.8837   |
| 0.1933        | 31.0  | 62   | 0.3521          | 0.8372   |
| 0.1933        | 32.0  | 64   | 0.2960          | 0.8837   |
| 0.1933        | 33.0  | 66   | 0.4037          | 0.8372   |
| 0.1933        | 34.0  | 68   | 0.2913          | 0.8837   |
| 0.1892        | 35.0  | 70   | 0.3043          | 0.8837   |
| 0.1892        | 36.0  | 72   | 0.3602          | 0.9302   |
| 0.1892        | 37.0  | 74   | 0.3315          | 0.9302   |
| 0.1892        | 38.0  | 76   | 0.2674          | 0.9302   |
| 0.1892        | 39.0  | 78   | 0.2970          | 0.9535   |
| 0.15          | 40.0  | 80   | 0.2661          | 0.9535   |
| 0.15          | 41.0  | 82   | 0.2551          | 0.8837   |
| 0.15          | 42.0  | 84   | 0.2467          | 0.9302   |
| 0.15          | 43.0  | 86   | 0.3008          | 0.9535   |
| 0.15          | 44.0  | 88   | 0.3265          | 0.9302   |
| 0.1238        | 45.0  | 90   | 0.2668          | 0.9302   |
| 0.1238        | 46.0  | 92   | 0.2574          | 0.9302   |
| 0.1238        | 47.0  | 94   | 0.2498          | 0.9535   |
| 0.1238        | 48.0  | 96   | 0.3319          | 0.8837   |
| 0.1238        | 49.0  | 98   | 0.2358          | 0.9302   |
| 0.1063        | 50.0  | 100  | 0.2015          | 0.9302   |
| 0.1063        | 51.0  | 102  | 0.2171          | 0.9302   |
| 0.1063        | 52.0  | 104  | 0.3119          | 0.9302   |
| 0.1063        | 53.0  | 106  | 0.2674          | 0.9070   |
| 0.1063        | 54.0  | 108  | 0.3076          | 0.8837   |
| 0.1112        | 55.0  | 110  | 0.3182          | 0.8837   |
| 0.1112        | 56.0  | 112  | 0.3371          | 0.9070   |
| 0.1112        | 57.0  | 114  | 0.3540          | 0.9070   |
| 0.1112        | 58.0  | 116  | 0.4058          | 0.9070   |
| 0.1112        | 59.0  | 118  | 0.4013          | 0.9070   |
| 0.1128        | 60.0  | 120  | 0.3309          | 0.9302   |
| 0.1128        | 61.0  | 122  | 0.3272          | 0.9302   |
| 0.1128        | 62.0  | 124  | 0.4012          | 0.9070   |
| 0.1128        | 63.0  | 126  | 0.5794          | 0.8605   |
| 0.1128        | 64.0  | 128  | 0.3881          | 0.9070   |
| 0.1168        | 65.0  | 130  | 0.2990          | 0.9070   |
| 0.1168        | 66.0  | 132  | 0.3018          | 0.8837   |
| 0.1168        | 67.0  | 134  | 0.2561          | 0.9302   |
| 0.1168        | 68.0  | 136  | 0.2921          | 0.9302   |
| 0.1168        | 69.0  | 138  | 0.3258          | 0.9070   |
| 0.0846        | 70.0  | 140  | 0.2925          | 0.9302   |
| 0.0846        | 71.0  | 142  | 0.3073          | 0.9302   |
| 0.0846        | 72.0  | 144  | 0.3318          | 0.9302   |
| 0.0846        | 73.0  | 146  | 0.3427          | 0.9302   |
| 0.0846        | 74.0  | 148  | 0.3588          | 0.9070   |
| 0.0845        | 75.0  | 150  | 0.3939          | 0.9070   |
| 0.0845        | 76.0  | 152  | 0.3774          | 0.9070   |
| 0.0845        | 77.0  | 154  | 0.3746          | 0.9070   |
| 0.0845        | 78.0  | 156  | 0.4073          | 0.8837   |
| 0.0845        | 79.0  | 158  | 0.3886          | 0.9070   |
| 0.0885        | 80.0  | 160  | 0.3765          | 0.9070   |
| 0.0885        | 81.0  | 162  | 0.3977          | 0.9070   |
| 0.0885        | 82.0  | 164  | 0.3864          | 0.9070   |
| 0.0885        | 83.0  | 166  | 0.3809          | 0.9070   |
| 0.0885        | 84.0  | 168  | 0.4492          | 0.8605   |
| 0.0859        | 85.0  | 170  | 0.5479          | 0.8605   |
| 0.0859        | 86.0  | 172  | 0.5372          | 0.8605   |
| 0.0859        | 87.0  | 174  | 0.4512          | 0.8605   |
| 0.0859        | 88.0  | 176  | 0.3930          | 0.9070   |
| 0.0859        | 89.0  | 178  | 0.3842          | 0.9302   |
| 0.0764        | 90.0  | 180  | 0.3808          | 0.9302   |
| 0.0764        | 91.0  | 182  | 0.3787          | 0.9302   |
| 0.0764        | 92.0  | 184  | 0.3833          | 0.9070   |
| 0.0764        | 93.0  | 186  | 0.3912          | 0.9070   |
| 0.0764        | 94.0  | 188  | 0.3888          | 0.8837   |
| 0.0727        | 95.0  | 190  | 0.3817          | 0.8837   |
| 0.0727        | 96.0  | 192  | 0.3708          | 0.9070   |
| 0.0727        | 97.0  | 194  | 0.3640          | 0.9070   |
| 0.0727        | 98.0  | 196  | 0.3613          | 0.9302   |
| 0.0727        | 99.0  | 198  | 0.3607          | 0.9302   |
| 0.069         | 100.0 | 200  | 0.3605          | 0.9302   |

### Framework versions

- Transformers 4.40.2
- Pytorch 2.2.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
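
As a convenience, a small hedged snippet (not from the original card) to confirm a local environment matches these pins:

```python
# Print installed versions to compare against the pins listed above.
import datasets
import tokenizers
import torch
import transformers

print(transformers.__version__)  # expected 4.40.2
print(torch.__version__)         # expected 2.2.1+cu121
print(datasets.__version__)      # expected 2.19.1
print(tokenizers.__version__)    # expected 0.19.1
```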