---
license: apache-2.0
base_model: microsoft/beit-base-patch16-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: beit-base-patch16-224-hasta-85-fold4
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.7272727272727273
---

# beit-base-patch16-224-hasta-85-fold4

This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:

- Loss: 0.9258
- Accuracy: 0.7273
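
As a quick usage sanity check, the checkpoint can be loaded with the standard 🤗 Transformers `image-classification` pipeline. A minimal sketch, assuming the repository id `BilalMuftuoglu/beit-base-patch16-224-hasta-85-fold4` (inferred from the model name above) and a placeholder image path:

```python
from transformers import pipeline

# Repo id is inferred from the model name; adjust if the checkpoint lives elsewhere.
classifier = pipeline(
    "image-classification",
    model="BilalMuftuoglu/beit-base-patch16-224-hasta-85-fold4",
)

# "sample.jpg" is a placeholder; pass any image path, URL, or PIL.Image.
for prediction in classifier("sample.jpg"):
    print(f"{prediction['label']}: {prediction['score']:.4f}")
```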

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
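
The card does not describe the data, but the `imagefolder` entry in the metadata implies a directory-per-class layout loaded through 🤗 Datasets. A minimal sketch, assuming a hypothetical local `data/` directory in which each subfolder name is a class label:

```python
from datasets import load_dataset

# "data/" is a placeholder path; labels are inferred from subfolder names,
# e.g. data/class_a/img1.jpg, data/class_b/img2.jpg.
dataset = load_dataset("imagefolder", data_dir="data/")
print(dataset["train"].features["label"].names)
```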

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128 (train_batch_size × gradient_accumulation_steps)
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
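
For orientation, these settings map onto 🤗 Transformers `TrainingArguments` roughly as below. This is a minimal sketch, not the author's script: `output_dir` is a placeholder, and the Trainer's default Adam-style optimizer (betas=(0.9, 0.999), epsilon=1e-08) is assumed rather than configured explicitly.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="beit-base-patch16-224-hasta-85-fold4",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # effective batch size: 32 * 4 = 128
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```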

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 1    | 1.0761          | 0.6364   |
| No log        | 2.0   | 2    | 0.9258          | 0.7273   |
| No log        | 3.0   | 3    | 0.8310          | 0.7273   |
| No log        | 4.0   | 4    | 0.9402          | 0.7273   |
| No log        | 5.0   | 5    | 1.1381          | 0.7273   |
| No log        | 6.0   | 6    | 1.2812          | 0.7273   |
| No log        | 7.0   | 7    | 1.2679          | 0.7273   |
| No log        | 8.0   | 8    | 1.1704          | 0.7273   |
| No log        | 9.0   | 9    | 1.1909          | 0.7273   |
| 0.3269        | 10.0  | 10   | 1.2981          | 0.7273   |
| 0.3269        | 11.0  | 11   | 1.2565          | 0.7273   |
| 0.3269        | 12.0  | 12   | 1.1475          | 0.7273   |
| 0.3269        | 13.0  | 13   | 1.0585          | 0.7273   |
| 0.3269        | 14.0  | 14   | 1.0294          | 0.7273   |
| 0.3269        | 15.0  | 15   | 1.0649          | 0.7273   |
| 0.3269        | 16.0  | 16   | 1.1712          | 0.7273   |
| 0.3269        | 17.0  | 17   | 1.2090          | 0.7273   |
| 0.3269        | 18.0  | 18   | 1.1579          | 0.7273   |
| 0.3269        | 19.0  | 19   | 1.0943          | 0.7273   |
| 0.1921        | 20.0  | 20   | 1.1877          | 0.7273   |
| 0.1921        | 21.0  | 21   | 1.3909          | 0.7273   |
| 0.1921        | 22.0  | 22   | 1.4301          | 0.7273   |
| 0.1921        | 23.0  | 23   | 1.4210          | 0.7273   |
| 0.1921        | 24.0  | 24   | 1.3994          | 0.7273   |
| 0.1921        | 25.0  | 25   | 1.3649          | 0.7273   |
| 0.1921        | 26.0  | 26   | 1.3244          | 0.7273   |
| 0.1921        | 27.0  | 27   | 1.2861          | 0.7273   |
| 0.1921        | 28.0  | 28   | 1.1634          | 0.7273   |
| 0.1921        | 29.0  | 29   | 0.9854          | 0.7273   |
| 0.1374        | 30.0  | 30   | 1.0608          | 0.7273   |
| 0.1374        | 31.0  | 31   | 1.3092          | 0.7273   |
| 0.1374        | 32.0  | 32   | 1.4679          | 0.7273   |
| 0.1374        | 33.0  | 33   | 1.4397          | 0.7273   |
| 0.1374        | 34.0  | 34   | 1.2949          | 0.7273   |
| 0.1374        | 35.0  | 35   | 1.2340          | 0.7273   |
| 0.1374        | 36.0  | 36   | 1.2524          | 0.7273   |
| 0.1374        | 37.0  | 37   | 1.2108          | 0.7273   |
| 0.1374        | 38.0  | 38   | 1.1878          | 0.7273   |
| 0.1374        | 39.0  | 39   | 1.1400          | 0.7273   |
| 0.0886        | 40.0  | 40   | 1.1186          | 0.7273   |
| 0.0886        | 41.0  | 41   | 1.3145          | 0.7273   |
| 0.0886        | 42.0  | 42   | 1.4749          | 0.7273   |
| 0.0886        | 43.0  | 43   | 1.5773          | 0.7273   |
| 0.0886        | 44.0  | 44   | 1.6792          | 0.7273   |
| 0.0886        | 45.0  | 45   | 1.7716          | 0.7273   |
| 0.0886        | 46.0  | 46   | 1.8943          | 0.7273   |
| 0.0886        | 47.0  | 47   | 1.8541          | 0.7273   |
| 0.0886        | 48.0  | 48   | 1.6656          | 0.7273   |
| 0.0886        | 49.0  | 49   | 1.4897          | 0.7273   |
| 0.0509        | 50.0  | 50   | 1.2921          | 0.7273   |
| 0.0509        | 51.0  | 51   | 1.2021          | 0.7273   |
| 0.0509        | 52.0  | 52   | 1.2643          | 0.7273   |
| 0.0509        | 53.0  | 53   | 1.4622          | 0.7273   |
| 0.0509        | 54.0  | 54   | 1.5043          | 0.7273   |
| 0.0509        | 55.0  | 55   | 1.5063          | 0.7273   |
| 0.0509        | 56.0  | 56   | 1.4604          | 0.7273   |
| 0.0509        | 57.0  | 57   | 1.3414          | 0.7273   |
| 0.0509        | 58.0  | 58   | 1.1789          | 0.7273   |
| 0.0509        | 59.0  | 59   | 1.1715          | 0.7273   |
| 0.0471        | 60.0  | 60   | 1.2550          | 0.7273   |
| 0.0471        | 61.0  | 61   | 1.3513          | 0.7273   |
| 0.0471        | 62.0  | 62   | 1.4922          | 0.7273   |
| 0.0471        | 63.0  | 63   | 1.6911          | 0.7273   |
| 0.0471        | 64.0  | 64   | 1.7747          | 0.7273   |
| 0.0471        | 65.0  | 65   | 1.7659          | 0.7273   |
| 0.0471        | 66.0  | 66   | 1.6730          | 0.7273   |
| 0.0471        | 67.0  | 67   | 1.5296          | 0.7273   |
| 0.0471        | 68.0  | 68   | 1.4973          | 0.7273   |
| 0.0471        | 69.0  | 69   | 1.4650          | 0.7273   |
| 0.0212        | 70.0  | 70   | 1.4970          | 0.7273   |
| 0.0212        | 71.0  | 71   | 1.5022          | 0.7273   |
| 0.0212        | 72.0  | 72   | 1.5275          | 0.7273   |
| 0.0212        | 73.0  | 73   | 1.5780          | 0.7273   |
| 0.0212        | 74.0  | 74   | 1.7149          | 0.7273   |
| 0.0212        | 75.0  | 75   | 1.8056          | 0.7273   |
| 0.0212        | 76.0  | 76   | 1.8394          | 0.7273   |
| 0.0212        | 77.0  | 77   | 1.8526          | 0.7273   |
| 0.0212        | 78.0  | 78   | 1.7944          | 0.7273   |
| 0.0212        | 79.0  | 79   | 1.7440          | 0.7273   |
| 0.0313        | 80.0  | 80   | 1.6994          | 0.7273   |
| 0.0313        | 81.0  | 81   | 1.6076          | 0.7273   |
| 0.0313        | 82.0  | 82   | 1.5753          | 0.7273   |
| 0.0313        | 83.0  | 83   | 1.5831          | 0.7273   |
| 0.0313        | 84.0  | 84   | 1.5471          | 0.7273   |
| 0.0313        | 85.0  | 85   | 1.5600          | 0.7273   |
| 0.0313        | 86.0  | 86   | 1.5832          | 0.7273   |
| 0.0313        | 87.0  | 87   | 1.5819          | 0.7273   |
| 0.0313        | 88.0  | 88   | 1.6053          | 0.7273   |
| 0.0313        | 89.0  | 89   | 1.6329          | 0.7273   |
| 0.0205        | 90.0  | 90   | 1.6751          | 0.7273   |
| 0.0205        | 91.0  | 91   | 1.6957          | 0.7273   |
| 0.0205        | 92.0  | 92   | 1.7326          | 0.7273   |
| 0.0205        | 93.0  | 93   | 1.7475          | 0.7273   |
| 0.0205        | 94.0  | 94   | 1.7503          | 0.7273   |
| 0.0205        | 95.0  | 95   | 1.7443          | 0.7273   |
| 0.0205        | 96.0  | 96   | 1.7483          | 0.7273   |
| 0.0205        | 97.0  | 97   | 1.7523          | 0.7273   |
| 0.0205        | 98.0  | 98   | 1.7516          | 0.7273   |
| 0.0205        | 99.0  | 99   | 1.7483          | 0.7273   |
| 0.0334        | 100.0 | 100  | 1.7462          | 0.7273   |

### Framework versions

- Transformers 4.41.0
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1