---
license: apache-2.0
base_model: microsoft/beit-base-patch16-224
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: beit-base-patch16-224-hasta-85-fold2
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.7272727272727273
---

# beit-base-patch16-224-hasta-85-fold2

This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 1.0030
- Accuracy: 0.7273
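
The card does not include a usage snippet, so here is a minimal inference sketch. It assumes the checkpoint is published on the Hugging Face Hub as `BilalMuftuoglu/beit-base-patch16-224-hasta-85-fold2` (repo id inferred from this card's uploader and model name) and that `image.jpg` is a local image file; both are illustrative assumptions.

```python
from transformers import pipeline

# Image-classification pipeline; the model id below is an assumption
# based on the uploader and model name shown in this card.
classifier = pipeline(
    "image-classification",
    model="BilalMuftuoglu/beit-base-patch16-224-hasta-85-fold2",
)

# Returns a list of {"label": ..., "score": ...} dicts, highest score first.
predictions = classifier("image.jpg")  # "image.jpg" is a placeholder path
print(predictions)
```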

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
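
The training script itself is not part of this card, but the hyperparameters above map directly onto `transformers.TrainingArguments`. The sketch below is a reconstruction under that assumption; anything not in the list above (such as `output_dir` or the eval/save strategy) is a placeholder.

```python
from transformers import TrainingArguments

# A hedged reconstruction of the listed hyperparameters; values not given
# in this card are placeholders, not the author's actual settings.
training_args = TrainingArguments(
    output_dir="beit-base-patch16-224-hasta-85-fold2",  # placeholder
    learning_rate=5e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 x 4 = 128 total train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults.
)
```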

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 1 | 1.3943 | 0.0 |
| No log | 2.0 | 2 | 1.1771 | 0.3636 |
| No log | 3.0 | 3 | 1.0030 | 0.7273 |
| No log | 4.0 | 4 | 1.1175 | 0.7273 |
| No log | 5.0 | 5 | 1.3271 | 0.7273 |
| No log | 6.0 | 6 | 1.3905 | 0.7273 |
| No log | 7.0 | 7 | 1.2948 | 0.7273 |
| No log | 8.0 | 8 | 1.0699 | 0.7273 |
| No log | 9.0 | 9 | 0.9284 | 0.7273 |
| 0.3023 | 10.0 | 10 | 0.9573 | 0.7273 |
| 0.3023 | 11.0 | 11 | 1.1350 | 0.7273 |
| 0.3023 | 12.0 | 12 | 1.2566 | 0.7273 |
| 0.3023 | 13.0 | 13 | 1.2979 | 0.7273 |
| 0.3023 | 14.0 | 14 | 1.1942 | 0.7273 |
| 0.3023 | 15.0 | 15 | 1.1980 | 0.7273 |
| 0.3023 | 16.0 | 16 | 1.2017 | 0.7273 |
| 0.3023 | 17.0 | 17 | 1.4194 | 0.7273 |
| 0.3023 | 18.0 | 18 | 1.5204 | 0.7273 |
| 0.3023 | 19.0 | 19 | 1.3899 | 0.7273 |
| 0.1701 | 20.0 | 20 | 1.2407 | 0.7273 |
| 0.1701 | 21.0 | 21 | 1.3356 | 0.7273 |
| 0.1701 | 22.0 | 22 | 1.5076 | 0.7273 |
| 0.1701 | 23.0 | 23 | 1.4260 | 0.7273 |
| 0.1701 | 24.0 | 24 | 1.1877 | 0.7273 |
| 0.1701 | 25.0 | 25 | 1.0433 | 0.7273 |
| 0.1701 | 26.0 | 26 | 1.0261 | 0.7273 |
| 0.1701 | 27.0 | 27 | 1.0869 | 0.7273 |
| 0.1701 | 28.0 | 28 | 1.1074 | 0.7273 |
| 0.1701 | 29.0 | 29 | 1.0858 | 0.7273 |
| 0.1058 | 30.0 | 30 | 1.0020 | 0.7273 |
| 0.1058 | 31.0 | 31 | 0.9881 | 0.7273 |
| 0.1058 | 32.0 | 32 | 1.0530 | 0.7273 |
| 0.1058 | 33.0 | 33 | 1.3736 | 0.7273 |
| 0.1058 | 34.0 | 34 | 1.4768 | 0.7273 |
| 0.1058 | 35.0 | 35 | 1.4372 | 0.7273 |
| 0.1058 | 36.0 | 36 | 1.4594 | 0.7273 |
| 0.1058 | 37.0 | 37 | 1.4529 | 0.7273 |
| 0.1058 | 38.0 | 38 | 1.6027 | 0.7273 |
| 0.1058 | 39.0 | 39 | 1.7376 | 0.7273 |
| 0.065 | 40.0 | 40 | 1.8993 | 0.7273 |
| 0.065 | 41.0 | 41 | 1.9927 | 0.7273 |
| 0.065 | 42.0 | 42 | 1.8867 | 0.7273 |
| 0.065 | 43.0 | 43 | 1.6363 | 0.7273 |
| 0.065 | 44.0 | 44 | 1.5642 | 0.7273 |
| 0.065 | 45.0 | 45 | 1.5278 | 0.7273 |
| 0.065 | 46.0 | 46 | 1.5097 | 0.7273 |
| 0.065 | 47.0 | 47 | 1.5586 | 0.7273 |
| 0.065 | 48.0 | 48 | 1.5659 | 0.7273 |
| 0.065 | 49.0 | 49 | 1.5743 | 0.7273 |
| 0.061 | 50.0 | 50 | 1.5951 | 0.7273 |
| 0.061 | 51.0 | 51 | 1.6097 | 0.7273 |
| 0.061 | 52.0 | 52 | 1.6781 | 0.7273 |
| 0.061 | 53.0 | 53 | 1.7168 | 0.7273 |
| 0.061 | 54.0 | 54 | 1.6331 | 0.7273 |
| 0.061 | 55.0 | 55 | 1.5711 | 0.7273 |
| 0.061 | 56.0 | 56 | 1.6043 | 0.7273 |
| 0.061 | 57.0 | 57 | 1.6590 | 0.7273 |
| 0.061 | 58.0 | 58 | 1.6879 | 0.7273 |
| 0.061 | 59.0 | 59 | 1.6452 | 0.7273 |
| 0.0642 | 60.0 | 60 | 1.6099 | 0.7273 |
| 0.0642 | 61.0 | 61 | 1.5536 | 0.7273 |
| 0.0642 | 62.0 | 62 | 1.5496 | 0.7273 |
| 0.0642 | 63.0 | 63 | 1.5528 | 0.7273 |
| 0.0642 | 64.0 | 64 | 1.6351 | 0.7273 |
| 0.0642 | 65.0 | 65 | 1.7556 | 0.7273 |
| 0.0642 | 66.0 | 66 | 1.8993 | 0.7273 |
| 0.0642 | 67.0 | 67 | 2.0309 | 0.7273 |
| 0.0642 | 68.0 | 68 | 2.1548 | 0.7273 |
| 0.0642 | 69.0 | 69 | 2.2087 | 0.7273 |
| 0.0411 | 70.0 | 70 | 2.2062 | 0.7273 |
| 0.0411 | 71.0 | 71 | 2.1605 | 0.7273 |
| 0.0411 | 72.0 | 72 | 2.1347 | 0.7273 |
| 0.0411 | 73.0 | 73 | 2.0662 | 0.7273 |
| 0.0411 | 74.0 | 74 | 2.0683 | 0.7273 |
| 0.0411 | 75.0 | 75 | 2.0466 | 0.7273 |
| 0.0411 | 76.0 | 76 | 1.9756 | 0.7273 |
| 0.0411 | 77.0 | 77 | 1.8928 | 0.7273 |
| 0.0411 | 78.0 | 78 | 1.8972 | 0.7273 |
| 0.0411 | 79.0 | 79 | 1.9408 | 0.7273 |
| 0.0421 | 80.0 | 80 | 1.9690 | 0.7273 |
| 0.0421 | 81.0 | 81 | 2.0466 | 0.7273 |
| 0.0421 | 82.0 | 82 | 2.1174 | 0.7273 |
| 0.0421 | 83.0 | 83 | 2.1825 | 0.7273 |
| 0.0421 | 84.0 | 84 | 2.2527 | 0.7273 |
| 0.0421 | 85.0 | 85 | 2.2933 | 0.7273 |
| 0.0421 | 86.0 | 86 | 2.3311 | 0.7273 |
| 0.0421 | 87.0 | 87 | 2.3468 | 0.7273 |
| 0.0421 | 88.0 | 88 | 2.3222 | 0.7273 |
| 0.0421 | 89.0 | 89 | 2.2764 | 0.7273 |
| 0.0304 | 90.0 | 90 | 2.2190 | 0.7273 |
| 0.0304 | 91.0 | 91 | 2.1855 | 0.7273 |
| 0.0304 | 92.0 | 92 | 2.1677 | 0.7273 |
| 0.0304 | 93.0 | 93 | 2.1493 | 0.7273 |
| 0.0304 | 94.0 | 94 | 2.1259 | 0.7273 |
| 0.0304 | 95.0 | 95 | 2.1151 | 0.7273 |
| 0.0304 | 96.0 | 96 | 2.1179 | 0.7273 |
| 0.0304 | 97.0 | 97 | 2.1250 | 0.7273 |
| 0.0304 | 98.0 | 98 | 2.1302 | 0.7273 |
| 0.0304 | 99.0 | 99 | 2.1330 | 0.7273 |
| 0.0305 | 100.0 | 100 | 2.1360 | 0.7273 |

### Framework versions

- Transformers 4.41.0
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
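
When reproducing the setup, the installed versions can be checked against the list above at runtime; this is a convenience sketch, not part of the original training environment.

```python
import datasets
import tokenizers
import torch
import transformers

# Expected versions per this card.
print(transformers.__version__)  # 4.41.0
print(torch.__version__)         # 2.3.0+cu121
print(datasets.__version__)      # 2.19.1
print(tokenizers.__version__)    # 0.19.1
```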