---
license: apache-2.0
base_model: microsoft/beit-base-patch16-224
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: beit-base-patch16-224-75-fold4
results:
- task:
name: Image Classification
type: image-classification
dataset:
name: imagefolder
type: imagefolder
config: default
split: train
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.9534883720930233
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# beit-base-patch16-224-75-fold4
This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2509
- Accuracy: 0.9535

These values match the epoch-72 checkpoint in the per-epoch results table below.
## Model description
This checkpoint fine-tunes BEiT ([microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224)), a Vision Transformer pre-trained with BERT-style masked image modeling, for image classification at 224x224 resolution with 16x16 patches. The fold suffix in the model name suggests it is one fold of a cross-validation run; the task and label set have not been documented.
## Intended uses & limitations
More information needed
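
While the intended uses are not documented, a minimal inference sketch is given below. It assumes the checkpoint is hosted on the Hub under `BilalMuftuoglu/beit-base-patch16-224-75-fold4` (inferred from the card name) and uses a placeholder image path:

```python
from PIL import Image
from transformers import pipeline

# Hub id inferred from the card name; adjust if the checkpoint lives elsewhere.
classifier = pipeline(
    "image-classification",
    model="BilalMuftuoglu/beit-base-patch16-224-75-fold4",
)

# "example.jpg" is a placeholder path to any RGB image.
image = Image.open("example.jpg")
for prediction in classifier(image):
    print(prediction["label"], round(prediction["score"], 4))
```

The predicted labels come from the fine-tuning dataset's folder names, which are not included in this card.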
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
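
As a sketch of how these settings map onto `transformers.TrainingArguments` (the output directory is a placeholder and the original training script is not part of this card):

```python
from transformers import TrainingArguments

# Values mirror the hyperparameters listed above; output_dir is assumed.
training_args = TrainingArguments(
    output_dir="beit-base-patch16-224-75-fold4",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 32 * 4 = 128
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    # Adam settings below are the Trainer defaults and match the values listed above.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```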
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 2 | 0.5130 | 0.7907 |
| No log | 2.0 | 4 | 0.4861 | 0.7907 |
| No log | 3.0 | 6 | 0.4775 | 0.7907 |
| No log | 4.0 | 8 | 0.4419 | 0.7907 |
| 0.4909 | 5.0 | 10 | 0.3672 | 0.8605 |
| 0.4909 | 6.0 | 12 | 0.3301 | 0.8837 |
| 0.4909 | 7.0 | 14 | 0.3131 | 0.8837 |
| 0.4909 | 8.0 | 16 | 0.4535 | 0.8605 |
| 0.4909 | 9.0 | 18 | 0.3088 | 0.8372 |
| 0.3473 | 10.0 | 20 | 0.4453 | 0.8837 |
| 0.3473 | 11.0 | 22 | 0.4234 | 0.8605 |
| 0.3473 | 12.0 | 24 | 0.3601 | 0.8837 |
| 0.3473 | 13.0 | 26 | 0.3658 | 0.9070 |
| 0.3473 | 14.0 | 28 | 0.3081 | 0.8837 |
| 0.2903 | 15.0 | 30 | 0.4128 | 0.8837 |
| 0.2903 | 16.0 | 32 | 0.2555 | 0.8605 |
| 0.2903 | 17.0 | 34 | 0.3341 | 0.8837 |
| 0.2903 | 18.0 | 36 | 0.2427 | 0.8837 |
| 0.2903 | 19.0 | 38 | 0.4325 | 0.8372 |
| 0.2673 | 20.0 | 40 | 0.2637 | 0.9070 |
| 0.2673 | 21.0 | 42 | 0.2919 | 0.8837 |
| 0.2673 | 22.0 | 44 | 0.3139 | 0.8837 |
| 0.2673 | 23.0 | 46 | 0.2411 | 0.8837 |
| 0.2673 | 24.0 | 48 | 0.4645 | 0.9070 |
| 0.2103 | 25.0 | 50 | 0.5084 | 0.8605 |
| 0.2103 | 26.0 | 52 | 0.2308 | 0.9070 |
| 0.2103 | 27.0 | 54 | 0.3450 | 0.8605 |
| 0.2103 | 28.0 | 56 | 0.3444 | 0.8605 |
| 0.2103 | 29.0 | 58 | 0.2546 | 0.9070 |
| 0.1673 | 30.0 | 60 | 0.9117 | 0.8140 |
| 0.1673 | 31.0 | 62 | 0.8437 | 0.8140 |
| 0.1673 | 32.0 | 64 | 0.6758 | 0.8372 |
| 0.1673 | 33.0 | 66 | 0.8019 | 0.8140 |
| 0.1673 | 34.0 | 68 | 0.3364 | 0.8837 |
| 0.1677 | 35.0 | 70 | 0.2928 | 0.8837 |
| 0.1677 | 36.0 | 72 | 0.2547 | 0.9070 |
| 0.1677 | 37.0 | 74 | 0.2969 | 0.8837 |
| 0.1677 | 38.0 | 76 | 0.5706 | 0.8837 |
| 0.1677 | 39.0 | 78 | 0.7006 | 0.8837 |
| 0.1407 | 40.0 | 80 | 0.4321 | 0.8837 |
| 0.1407 | 41.0 | 82 | 0.4366 | 0.8837 |
| 0.1407 | 42.0 | 84 | 0.3956 | 0.8837 |
| 0.1407 | 43.0 | 86 | 0.2290 | 0.8372 |
| 0.1407 | 44.0 | 88 | 0.3665 | 0.8837 |
| 0.1474 | 45.0 | 90 | 0.4465 | 0.8605 |
| 0.1474 | 46.0 | 92 | 0.7279 | 0.8605 |
| 0.1474 | 47.0 | 94 | 0.5259 | 0.8605 |
| 0.1474 | 48.0 | 96 | 0.5832 | 0.8837 |
| 0.1474 | 49.0 | 98 | 0.7328 | 0.8837 |
| 0.1344 | 50.0 | 100 | 0.3890 | 0.8837 |
| 0.1344 | 51.0 | 102 | 0.2642 | 0.8837 |
| 0.1344 | 52.0 | 104 | 0.3710 | 0.9070 |
| 0.1344 | 53.0 | 106 | 0.4773 | 0.9070 |
| 0.1344 | 54.0 | 108 | 0.3628 | 0.9302 |
| 0.1166 | 55.0 | 110 | 0.4389 | 0.9070 |
| 0.1166 | 56.0 | 112 | 0.4813 | 0.9070 |
| 0.1166 | 57.0 | 114 | 0.5328 | 0.9070 |
| 0.1166 | 58.0 | 116 | 0.5342 | 0.9070 |
| 0.1166 | 59.0 | 118 | 0.4892 | 0.9070 |
| 0.097 | 60.0 | 120 | 0.5857 | 0.9070 |
| 0.097 | 61.0 | 122 | 0.6681 | 0.9070 |
| 0.097 | 62.0 | 124 | 0.5947 | 0.9070 |
| 0.097 | 63.0 | 126 | 0.4749 | 0.9070 |
| 0.097 | 64.0 | 128 | 0.6091 | 0.8837 |
| 0.1076 | 65.0 | 130 | 0.9725 | 0.8605 |
| 0.1076 | 66.0 | 132 | 1.1372 | 0.8140 |
| 0.1076 | 67.0 | 134 | 0.7109 | 0.8605 |
| 0.1076 | 68.0 | 136 | 0.3549 | 0.9302 |
| 0.1076 | 69.0 | 138 | 0.2709 | 0.9302 |
| 0.0914 | 70.0 | 140 | 0.3316 | 0.9302 |
| 0.0914 | 71.0 | 142 | 0.3176 | 0.9302 |
| 0.0914 | 72.0 | 144 | 0.2509 | 0.9535 |
| 0.0914 | 73.0 | 146 | 0.2256 | 0.9070 |
| 0.0914 | 74.0 | 148 | 0.2570 | 0.9070 |
| 0.0815 | 75.0 | 150 | 0.3081 | 0.9535 |
| 0.0815 | 76.0 | 152 | 0.4199 | 0.9302 |
| 0.0815 | 77.0 | 154 | 0.4324 | 0.9302 |
| 0.0815 | 78.0 | 156 | 0.3928 | 0.9302 |
| 0.0815 | 79.0 | 158 | 0.3700 | 0.9302 |
| 0.0878 | 80.0 | 160 | 0.3812 | 0.9302 |
| 0.0878 | 81.0 | 162 | 0.4300 | 0.9302 |
| 0.0878 | 82.0 | 164 | 0.4289 | 0.9302 |
| 0.0878 | 83.0 | 166 | 0.4125 | 0.9302 |
| 0.0878 | 84.0 | 168 | 0.4351 | 0.9302 |
| 0.0725 | 85.0 | 170 | 0.5046 | 0.9302 |
| 0.0725 | 86.0 | 172 | 0.5692 | 0.9070 |
| 0.0725 | 87.0 | 174 | 0.5486 | 0.9070 |
| 0.0725 | 88.0 | 176 | 0.5310 | 0.9302 |
| 0.0725 | 89.0 | 178 | 0.4662 | 0.9302 |
| 0.0944 | 90.0 | 180 | 0.4070 | 0.9302 |
| 0.0944 | 91.0 | 182 | 0.3768 | 0.9302 |
| 0.0944 | 92.0 | 184 | 0.3884 | 0.9302 |
| 0.0944 | 93.0 | 186 | 0.3851 | 0.9302 |
| 0.0944 | 94.0 | 188 | 0.3759 | 0.9302 |
| 0.0739 | 95.0 | 190 | 0.3608 | 0.9302 |
| 0.0739 | 96.0 | 192 | 0.3456 | 0.9302 |
| 0.0739 | 97.0 | 194 | 0.3360 | 0.9302 |
| 0.0739 | 98.0 | 196 | 0.3312 | 0.9302 |
| 0.0739 | 99.0 | 198 | 0.3321 | 0.9302 |
| 0.0612 | 100.0 | 200 | 0.3331 | 0.9302 |
### Framework versions
- Transformers 4.40.2
- Pytorch 2.2.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1