---
license: apache-2.0
base_model: microsoft/beit-base-patch16-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: beit-base-patch16-224-hasta-85-fold3
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.7272727272727273
---

# beit-base-patch16-224-hasta-85-fold3

This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set (a brief inference sketch follows the list):

- Loss: 0.8911
- Accuracy: 0.7273
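
A minimal inference sketch, assuming the checkpoint is published under the Hub repo id `BilalMuftuoglu/beit-base-patch16-224-hasta-85-fold3` (inferred from the model name; the image path is a placeholder):

```python
from transformers import pipeline

# Minimal sketch: the Hub repo id below is an assumption based on the
# model name; replace it with the actual repository if it differs.
classifier = pipeline(
    "image-classification",
    model="BilalMuftuoglu/beit-base-patch16-224-hasta-85-fold3",
)

# "example.jpg" is a placeholder path to a local image file.
print(classifier("example.jpg"))
```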

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
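
The `imagefolder` dataset type in the metadata suggests the data was loaded with the generic image-folder loader from the datasets library. A minimal sketch under that assumption, with a placeholder `data_dir`:

```python
from datasets import load_dataset

# Sketch assuming the generic "imagefolder" loader implied by the card's
# metadata; "path/to/images" is a placeholder directory whose subfolder
# names become the class labels.
dataset = load_dataset("imagefolder", data_dir="path/to/images")
print(dataset["train"].features)
```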

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
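
A hedged sketch of the settings above expressed as transformers `TrainingArguments`; the `output_dir` is an assumption, and the Adam betas/epsilon are passed explicitly even though they match the library defaults:

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above; output_dir is an assumption.
training_args = TrainingArguments(
    output_dir="beit-base-patch16-224-hasta-85-fold3",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 x 4 = 128 total train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    adam_beta1=0.9,    # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```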

### Training results

The evaluation metrics reported at the top of the card correspond to epoch 3 in the table below, where the validation loss is at its minimum (0.8911); validation loss generally climbs afterwards while accuracy stays flat at 0.7273, which suggests overfitting beyond the early epochs.

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 1 | 1.3301 | 0.1818 |
| No log | 2.0 | 2 | 1.1012 | 0.3636 |
| No log | 3.0 | 3 | 0.8911 | 0.7273 |
| No log | 4.0 | 4 | 0.9555 | 0.7273 |
| No log | 5.0 | 5 | 1.2582 | 0.7273 |
| No log | 6.0 | 6 | 1.5576 | 0.7273 |
| No log | 7.0 | 7 | 1.6687 | 0.7273 |
| No log | 8.0 | 8 | 1.5015 | 0.7273 |
| No log | 9.0 | 9 | 1.2729 | 0.7273 |
| 0.3584 | 10.0 | 10 | 1.2127 | 0.7273 |
| 0.3584 | 11.0 | 11 | 1.3132 | 0.7273 |
| 0.3584 | 12.0 | 12 | 1.3350 | 0.7273 |
| 0.3584 | 13.0 | 13 | 1.2579 | 0.7273 |
| 0.3584 | 14.0 | 14 | 1.3559 | 0.7273 |
| 0.3584 | 15.0 | 15 | 1.4231 | 0.7273 |
| 0.3584 | 16.0 | 16 | 1.5141 | 0.7273 |
| 0.3584 | 17.0 | 17 | 1.4200 | 0.7273 |
| 0.3584 | 18.0 | 18 | 1.2498 | 0.7273 |
| 0.3584 | 19.0 | 19 | 1.1456 | 0.7273 |
| 0.1919 | 20.0 | 20 | 1.1055 | 0.7273 |
| 0.1919 | 21.0 | 21 | 1.1937 | 0.7273 |
| 0.1919 | 22.0 | 22 | 1.2768 | 0.7273 |
| 0.1919 | 23.0 | 23 | 1.3224 | 0.7273 |
| 0.1919 | 24.0 | 24 | 1.3629 | 0.7273 |
| 0.1919 | 25.0 | 25 | 1.3238 | 0.7273 |
| 0.1919 | 26.0 | 26 | 1.2280 | 0.7273 |
| 0.1919 | 27.0 | 27 | 1.2446 | 0.7273 |
| 0.1919 | 28.0 | 28 | 1.2530 | 0.7273 |
| 0.1919 | 29.0 | 29 | 1.2468 | 0.7273 |
| 0.1447 | 30.0 | 30 | 1.1535 | 0.7273 |
| 0.1447 | 31.0 | 31 | 1.1125 | 0.7273 |
| 0.1447 | 32.0 | 32 | 1.2051 | 0.7273 |
| 0.1447 | 33.0 | 33 | 1.5902 | 0.7273 |
| 0.1447 | 34.0 | 34 | 1.8445 | 0.7273 |
| 0.1447 | 35.0 | 35 | 1.7222 | 0.7273 |
| 0.1447 | 36.0 | 36 | 1.5080 | 0.7273 |
| 0.1447 | 37.0 | 37 | 1.3542 | 0.7273 |
| 0.1447 | 38.0 | 38 | 1.3106 | 0.7273 |
| 0.1447 | 39.0 | 39 | 1.4533 | 0.7273 |
| 0.1053 | 40.0 | 40 | 1.6427 | 0.7273 |
| 0.1053 | 41.0 | 41 | 1.7518 | 0.7273 |
| 0.1053 | 42.0 | 42 | 1.7775 | 0.7273 |
| 0.1053 | 43.0 | 43 | 1.6831 | 0.7273 |
| 0.1053 | 44.0 | 44 | 1.6968 | 0.7273 |
| 0.1053 | 45.0 | 45 | 1.8236 | 0.7273 |
| 0.1053 | 46.0 | 46 | 1.8845 | 0.7273 |
| 0.1053 | 47.0 | 47 | 1.8785 | 0.7273 |
| 0.1053 | 48.0 | 48 | 1.8805 | 0.7273 |
| 0.1053 | 49.0 | 49 | 1.9625 | 0.7273 |
| 0.0771 | 50.0 | 50 | 1.9860 | 0.7273 |
| 0.0771 | 51.0 | 51 | 1.9708 | 0.7273 |
| 0.0771 | 52.0 | 52 | 1.9149 | 0.7273 |
| 0.0771 | 53.0 | 53 | 1.9064 | 0.7273 |
| 0.0771 | 54.0 | 54 | 1.8804 | 0.7273 |
| 0.0771 | 55.0 | 55 | 1.8467 | 0.7273 |
| 0.0771 | 56.0 | 56 | 1.8508 | 0.7273 |
| 0.0771 | 57.0 | 57 | 1.8675 | 0.7273 |
| 0.0771 | 58.0 | 58 | 1.8886 | 0.7273 |
| 0.0771 | 59.0 | 59 | 1.8860 | 0.7273 |
| 0.0528 | 60.0 | 60 | 1.8777 | 0.7273 |
| 0.0528 | 61.0 | 61 | 1.9119 | 0.7273 |
| 0.0528 | 62.0 | 62 | 1.9860 | 0.7273 |
| 0.0528 | 63.0 | 63 | 2.1003 | 0.7273 |
| 0.0528 | 64.0 | 64 | 2.1561 | 0.7273 |
| 0.0528 | 65.0 | 65 | 2.1454 | 0.7273 |
| 0.0528 | 66.0 | 66 | 2.0685 | 0.7273 |
| 0.0528 | 67.0 | 67 | 1.9261 | 0.7273 |
| 0.0528 | 68.0 | 68 | 1.6839 | 0.7273 |
| 0.0528 | 69.0 | 69 | 1.4306 | 0.7273 |
| 0.043 | 70.0 | 70 | 1.3800 | 0.7273 |
| 0.043 | 71.0 | 71 | 1.4814 | 0.7273 |
| 0.043 | 72.0 | 72 | 1.6014 | 0.7273 |
| 0.043 | 73.0 | 73 | 1.7792 | 0.7273 |
| 0.043 | 74.0 | 74 | 1.9423 | 0.7273 |
| 0.043 | 75.0 | 75 | 2.0590 | 0.7273 |
| 0.043 | 76.0 | 76 | 2.1119 | 0.7273 |
| 0.043 | 77.0 | 77 | 2.1116 | 0.7273 |
| 0.043 | 78.0 | 78 | 2.0979 | 0.7273 |
| 0.043 | 79.0 | 79 | 2.1457 | 0.7273 |
| 0.0429 | 80.0 | 80 | 2.2222 | 0.7273 |
| 0.0429 | 81.0 | 81 | 2.2803 | 0.7273 |
| 0.0429 | 82.0 | 82 | 2.3327 | 0.7273 |
| 0.0429 | 83.0 | 83 | 2.3643 | 0.7273 |
| 0.0429 | 84.0 | 84 | 2.3774 | 0.7273 |
| 0.0429 | 85.0 | 85 | 2.3838 | 0.7273 |
| 0.0429 | 86.0 | 86 | 2.4072 | 0.7273 |
| 0.0429 | 87.0 | 87 | 2.4189 | 0.7273 |
| 0.0429 | 88.0 | 88 | 2.4028 | 0.7273 |
| 0.0429 | 89.0 | 89 | 2.3847 | 0.7273 |
| 0.0475 | 90.0 | 90 | 2.3793 | 0.7273 |
| 0.0475 | 91.0 | 91 | 2.3831 | 0.7273 |
| 0.0475 | 92.0 | 92 | 2.3882 | 0.7273 |
| 0.0475 | 93.0 | 93 | 2.3964 | 0.7273 |
| 0.0475 | 94.0 | 94 | 2.4126 | 0.7273 |
| 0.0475 | 95.0 | 95 | 2.4309 | 0.7273 |
| 0.0475 | 96.0 | 96 | 2.4486 | 0.7273 |
| 0.0475 | 97.0 | 97 | 2.4628 | 0.7273 |
| 0.0475 | 98.0 | 98 | 2.4723 | 0.7273 |
| 0.0475 | 99.0 | 99 | 2.4775 | 0.7273 |
| 0.0337 | 100.0 | 100 | 2.4788 | 0.7273 |
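
The per-epoch accuracy above is the kind of metric a `generated_from_trainer` script usually computes with the evaluate library; a minimal sketch of such a `compute_metrics` callback (whether the original script used exactly this is an assumption):

```python
import numpy as np
import evaluate

# Sketch: accuracy via the evaluate library; its use in the original
# training script is an assumption.
accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```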

### Framework versions

- Transformers 4.41.0
- PyTorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1