---
license: apache-2.0
base_model: microsoft/beit-base-patch16-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: beit-base-patch16-224-85-fold5
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.9318181818181818
---

beit-base-patch16-224-85-fold5

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set (a quick inference sketch follows the results):

  • Loss: 0.2499
  • Accuracy: 0.9318
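
For quick use, the sketch below runs the checkpoint through the transformers image-classification pipeline. This is a minimal sketch: the repo id BilalMuftuoglu/beit-base-patch16-224-85-fold5 is inferred from this card's title, and the input image path is a placeholder.

```python
from PIL import Image
from transformers import pipeline

# Repo id inferred from the card title; adjust if the checkpoint lives elsewhere.
classifier = pipeline(
    "image-classification",
    model="BilalMuftuoglu/beit-base-patch16-224-85-fold5",
)

# "example.jpg" is a placeholder input image.
predictions = classifier(Image.open("example.jpg"))
print(predictions)  # list of {"label": ..., "score": ...} dicts
```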

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
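
While the images themselves are undocumented, the metadata above points to the generic imagefolder loader from the datasets library. Below is a minimal loading sketch, assuming a local class-per-subdirectory layout; the data_dir path is a placeholder.

```python
from datasets import load_dataset

# "path/to/images" is a placeholder; imagefolder infers labels
# from the names of the subdirectories that hold the images.
dataset = load_dataset("imagefolder", data_dir="path/to/images")
print(dataset["train"].features)
```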

Training procedure

Training hyperparameters

The following hyperparameters were used during training; a sketch of the equivalent TrainingArguments follows the list:

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
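
A minimal sketch of how these values map onto transformers.TrainingArguments. The output_dir is a placeholder; the Adam betas and epsilon listed above are the library defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="beit-base-patch16-224-85-fold5",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 * 4 = 128 total train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```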

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 2 | 1.0896 | 0.2955 |
| No log | 2.0 | 4 | 0.6456 | 0.7273 |
| No log | 3.0 | 6 | 1.0355 | 0.7045 |
| No log | 4.0 | 8 | 0.9124 | 0.7045 |
| 0.7607 | 5.0 | 10 | 0.5809 | 0.7955 |
| 0.7607 | 6.0 | 12 | 0.6812 | 0.75 |
| 0.7607 | 7.0 | 14 | 0.6529 | 0.75 |
| 0.7607 | 8.0 | 16 | 0.7174 | 0.7273 |
| 0.7607 | 9.0 | 18 | 0.6619 | 0.6136 |
| 0.4221 | 10.0 | 20 | 0.8063 | 0.75 |
| 0.4221 | 11.0 | 22 | 0.6372 | 0.75 |
| 0.4221 | 12.0 | 24 | 0.5886 | 0.75 |
| 0.4221 | 13.0 | 26 | 0.6359 | 0.6364 |
| 0.4221 | 14.0 | 28 | 0.5585 | 0.75 |
| 0.3287 | 15.0 | 30 | 0.4541 | 0.7273 |
| 0.3287 | 16.0 | 32 | 0.7624 | 0.5682 |
| 0.3287 | 17.0 | 34 | 0.6806 | 0.75 |
| 0.3287 | 18.0 | 36 | 0.7708 | 0.7273 |
| 0.3287 | 19.0 | 38 | 0.4170 | 0.7273 |
| 0.3663 | 20.0 | 40 | 0.4282 | 0.7727 |
| 0.3663 | 21.0 | 42 | 0.5613 | 0.75 |
| 0.3663 | 22.0 | 44 | 0.4025 | 0.8409 |
| 0.3663 | 23.0 | 46 | 0.4109 | 0.7955 |
| 0.3663 | 24.0 | 48 | 0.4373 | 0.8409 |
| 0.2344 | 25.0 | 50 | 0.3211 | 0.8864 |
| 0.2344 | 26.0 | 52 | 0.5561 | 0.75 |
| 0.2344 | 27.0 | 54 | 0.3149 | 0.8636 |
| 0.2344 | 28.0 | 56 | 0.3166 | 0.7955 |
| 0.2344 | 29.0 | 58 | 0.4164 | 0.8636 |
| 0.2051 | 30.0 | 60 | 0.4345 | 0.8636 |
| 0.2051 | 31.0 | 62 | 0.3180 | 0.8636 |
| 0.2051 | 32.0 | 64 | 0.3673 | 0.8409 |
| 0.2051 | 33.0 | 66 | 0.4313 | 0.8409 |
| 0.2051 | 34.0 | 68 | 0.4359 | 0.8409 |
| 0.1694 | 35.0 | 70 | 0.3700 | 0.8182 |
| 0.1694 | 36.0 | 72 | 0.5843 | 0.7955 |
| 0.1694 | 37.0 | 74 | 0.4064 | 0.8636 |
| 0.1694 | 38.0 | 76 | 0.3992 | 0.8182 |
| 0.1694 | 39.0 | 78 | 0.3153 | 0.8636 |
| 0.1566 | 40.0 | 80 | 0.5581 | 0.8182 |
| 0.1566 | 41.0 | 82 | 0.2921 | 0.8636 |
| 0.1566 | 42.0 | 84 | 0.3217 | 0.8864 |
| 0.1566 | 43.0 | 86 | 0.3255 | 0.8864 |
| 0.1566 | 44.0 | 88 | 0.7238 | 0.75 |
| 0.1389 | 45.0 | 90 | 0.4053 | 0.8864 |
| 0.1389 | 46.0 | 92 | 0.2499 | 0.9318 |
| 0.1389 | 47.0 | 94 | 0.2584 | 0.8864 |
| 0.1389 | 48.0 | 96 | 0.4432 | 0.8409 |
| 0.1389 | 49.0 | 98 | 0.6965 | 0.7955 |
| 0.1311 | 50.0 | 100 | 0.3910 | 0.8409 |
| 0.1311 | 51.0 | 102 | 0.3017 | 0.8636 |
| 0.1311 | 52.0 | 104 | 0.3050 | 0.8636 |
| 0.1311 | 53.0 | 106 | 0.2193 | 0.8636 |
| 0.1311 | 54.0 | 108 | 0.2369 | 0.8409 |
| 0.1386 | 55.0 | 110 | 0.3143 | 0.8864 |
| 0.1386 | 56.0 | 112 | 0.2932 | 0.8864 |
| 0.1386 | 57.0 | 114 | 0.2725 | 0.9091 |
| 0.1386 | 58.0 | 116 | 0.5664 | 0.8409 |
| 0.1386 | 59.0 | 118 | 0.5875 | 0.8182 |
| 0.1194 | 60.0 | 120 | 0.4623 | 0.8636 |
| 0.1194 | 61.0 | 122 | 0.4716 | 0.8182 |
| 0.1194 | 62.0 | 124 | 0.5028 | 0.8182 |
| 0.1194 | 63.0 | 126 | 0.4558 | 0.8182 |
| 0.1194 | 64.0 | 128 | 0.4798 | 0.8182 |
| 0.1122 | 65.0 | 130 | 0.3827 | 0.8409 |
| 0.1122 | 66.0 | 132 | 0.3653 | 0.8409 |
| 0.1122 | 67.0 | 134 | 0.3972 | 0.8409 |
| 0.1122 | 68.0 | 136 | 0.5705 | 0.7727 |
| 0.1122 | 69.0 | 138 | 0.5935 | 0.7727 |
| 0.1041 | 70.0 | 140 | 0.3905 | 0.8636 |
| 0.1041 | 71.0 | 142 | 0.2791 | 0.8409 |
| 0.1041 | 72.0 | 144 | 0.2845 | 0.9091 |
| 0.1041 | 73.0 | 146 | 0.2401 | 0.8636 |
| 0.1041 | 74.0 | 148 | 0.2260 | 0.8864 |
| 0.0982 | 75.0 | 150 | 0.2454 | 0.8864 |
| 0.0982 | 76.0 | 152 | 0.3773 | 0.8864 |
| 0.0982 | 77.0 | 154 | 0.6185 | 0.8182 |
| 0.0982 | 78.0 | 156 | 0.7238 | 0.7727 |
| 0.0982 | 79.0 | 158 | 0.5469 | 0.8409 |
| 0.1065 | 80.0 | 160 | 0.4318 | 0.8636 |
| 0.1065 | 81.0 | 162 | 0.3348 | 0.8864 |
| 0.1065 | 82.0 | 164 | 0.3041 | 0.8636 |
| 0.1065 | 83.0 | 166 | 0.3350 | 0.8864 |
| 0.1065 | 84.0 | 168 | 0.3464 | 0.8864 |
| 0.0829 | 85.0 | 170 | 0.3375 | 0.8864 |
| 0.0829 | 86.0 | 172 | 0.3309 | 0.8864 |
| 0.0829 | 87.0 | 174 | 0.3325 | 0.8864 |
| 0.0829 | 88.0 | 176 | 0.3441 | 0.8864 |
| 0.0829 | 89.0 | 178 | 0.3456 | 0.8636 |
| 0.0902 | 90.0 | 180 | 0.3244 | 0.8636 |
| 0.0902 | 91.0 | 182 | 0.3126 | 0.8636 |
| 0.0902 | 92.0 | 184 | 0.3117 | 0.8636 |
| 0.0902 | 93.0 | 186 | 0.2877 | 0.8636 |
| 0.0902 | 94.0 | 188 | 0.2643 | 0.8636 |
| 0.0838 | 95.0 | 190 | 0.2525 | 0.8864 |
| 0.0838 | 96.0 | 192 | 0.2462 | 0.9091 |
| 0.0838 | 97.0 | 194 | 0.2417 | 0.9091 |
| 0.0838 | 98.0 | 196 | 0.2402 | 0.9091 |
| 0.0838 | 99.0 | 198 | 0.2409 | 0.9091 |
| 0.0747 | 100.0 | 200 | 0.2426 | 0.9091 |
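
The best validation accuracy, 0.9318 with loss 0.2499, was reached at epoch 46 (step 92) and matches the evaluation results reported at the top of this card. Since the card was generated by the Trainer, a typical compute_metrics hook that would produce the Accuracy column is sketched below; this is an assumption about the setup, not the author's exact script.

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # The Trainer passes (logits, labels); take the argmax class per example.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```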

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.2.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1