---
license: apache-2.0
base_model: microsoft/beit-base-patch16-224
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: beit-base-patch16-224-85-fold2
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.9318181818181818
---

# beit-base-patch16-224-85-fold2

This model is a fine-tuned version of [microsoft/beit-base-patch16-224](https://huggingface.co/microsoft/beit-base-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2763
- Accuracy: 0.9318

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 2 | 0.6057 | 0.7273 |
| No log | 2.0 | 4 | 0.6639 | 0.7045 |
| No log | 3.0 | 6 | 0.7324 | 0.7045 |
| No log | 4.0 | 8 | 0.5213 | 0.7273 |
| 0.5701 | 5.0 | 10 | 0.4717 | 0.8182 |
| 0.5701 | 6.0 | 12 | 0.5339 | 0.7045 |
| 0.5701 | 7.0 | 14 | 0.4959 | 0.7273 |
| 0.5701 | 8.0 | 16 | 0.4086 | 0.8409 |
| 0.5701 | 9.0 | 18 | 0.4039 | 0.8182 |
| 0.4248 | 10.0 | 20 | 0.4106 | 0.8182 |
| 0.4248 | 11.0 | 22 | 0.4108 | 0.8409 |
| 0.4248 | 12.0 | 24 | 0.4607 | 0.7727 |
| 0.4248 | 13.0 | 26 | 0.4446 | 0.7727 |
| 0.4248 | 14.0 | 28 | 0.3912 | 0.8409 |
| 0.3579 | 15.0 | 30 | 0.5183 | 0.7727 |
| 0.3579 | 16.0 | 32 | 0.2991 | 0.8864 |
| 0.3579 | 17.0 | 34 | 0.3587 | 0.8182 |
| 0.3579 | 18.0 | 36 | 0.3110 | 0.8182 |
| 0.3579 | 19.0 | 38 | 0.3084 | 0.8636 |
| 0.2838 | 20.0 | 40 | 0.3079 | 0.8864 |
| 0.2838 | 21.0 | 42 | 0.3033 | 0.8409 |
| 0.2838 | 22.0 | 44 | 0.3126 | 0.8409 |
| 0.2838 | 23.0 | 46 | 0.3171 | 0.8864 |
| 0.2838 | 24.0 | 48 | 0.2689 | 0.8636 |
| 0.2705 | 25.0 | 50 | 0.3175 | 0.8409 |
| 0.2705 | 26.0 | 52 | 0.3464 | 0.8409 |
| 0.2705 | 27.0 | 54 | 0.3092 | 0.8636 |
| 0.2705 | 28.0 | 56 | 0.3178 | 0.8636 |
| 0.2705 | 29.0 | 58 | 0.4107 | 0.7955 |
| 0.1887 | 30.0 | 60 | 0.4151 | 0.8182 |
| 0.1887 | 31.0 | 62 | 0.5450 | 0.7955 |
| 0.1887 | 32.0 | 64 | 0.2892 | 0.8409 |
| 0.1887 | 33.0 | 66 | 0.4078 | 0.8409 |
| 0.1887 | 34.0 | 68 | 0.2821 | 0.8636 |
| 0.1692 | 35.0 | 70 | 0.2708 | 0.8636 |
| 0.1692 | 36.0 | 72 | 0.2692 | 0.8864 |
| 0.1692 | 37.0 | 74 | 0.2806 | 0.8864 |
| 0.1692 | 38.0 | 76 | 0.4613 | 0.8182 |
| 0.1692 | 39.0 | 78 | 0.2887 | 0.9091 |
| 0.1623 | 40.0 | 80 | 0.4046 | 0.8409 |
| 0.1623 | 41.0 | 82 | 0.4542 | 0.8409 |
| 0.1623 | 42.0 | 84 | 0.3010 | 0.8636 |
| 0.1623 | 43.0 | 86 | 0.2954 | 0.8636 |
| 0.1623 | 44.0 | 88 | 0.2838 | 0.8864 |
| 0.1522 | 45.0 | 90 | 0.2675 | 0.8864 |
| 0.1522 | 46.0 | 92 | 0.2517 | 0.9091 |
| 0.1522 | 47.0 | 94 | 0.2687 | 0.9091 |
| 0.1522 | 48.0 | 96 | 0.2551 | 0.9091 |
| 0.1522 | 49.0 | 98 | 0.2661 | 0.8864 |
| 0.1379 | 50.0 | 100 | 0.3507 | 0.8182 |
| 0.1379 | 51.0 | 102 | 0.2629 | 0.8864 |
| 0.1379 | 52.0 | 104 | 0.2697 | 0.8864 |
| 0.1379 | 53.0 | 106 | 0.3081 | 0.8636 |
| 0.1379 | 54.0 | 108 | 0.3851 | 0.8409 |
| 0.1283 | 55.0 | 110 | 0.3104 | 0.8636 |
| 0.1283 | 56.0 | 112 | 0.3624 | 0.8864 |
| 0.1283 | 57.0 | 114 | 0.3199 | 0.8864 |
| 0.1283 | 58.0 | 116 | 0.4964 | 0.8182 |
| 0.1283 | 59.0 | 118 | 0.3356 | 0.8864 |
| 0.1335 | 60.0 | 120 | 0.2314 | 0.9091 |
| 0.1335 | 61.0 | 122 | 0.2334 | 0.9091 |
| 0.1335 | 62.0 | 124 | 0.3961 | 0.8636 |
| 0.1335 | 63.0 | 126 | 0.3453 | 0.8636 |
| 0.1335 | 64.0 | 128 | 0.2806 | 0.8636 |
| 0.1353 | 65.0 | 130 | 0.3372 | 0.8636 |
| 0.1353 | 66.0 | 132 | 0.2675 | 0.8864 |
| 0.1353 | 67.0 | 134 | 0.3482 | 0.8864 |
| 0.1353 | 68.0 | 136 | 0.3725 | 0.8636 |
| 0.1353 | 69.0 | 138 | 0.3769 | 0.8636 |
| 0.099 | 70.0 | 140 | 0.5170 | 0.8409 |
| 0.099 | 71.0 | 142 | 0.4710 | 0.8636 |
| 0.099 | 72.0 | 144 | 0.3266 | 0.9091 |
| 0.099 | 73.0 | 146 | 0.3390 | 0.8636 |
| 0.099 | 74.0 | 148 | 0.3051 | 0.8636 |
| 0.1179 | 75.0 | 150 | 0.3030 | 0.9091 |
| 0.1179 | 76.0 | 152 | 0.3208 | 0.9091 |
| 0.1179 | 77.0 | 154 | 0.2954 | 0.9091 |
| 0.1179 | 78.0 | 156 | 0.2777 | 0.9091 |
| 0.1179 | 79.0 | 158 | 0.2763 | 0.9318 |
| 0.1077 | 80.0 | 160 | 0.3059 | 0.9091 |
| 0.1077 | 81.0 | 162 | 0.3445 | 0.8864 |
| 0.1077 | 82.0 | 164 | 0.3239 | 0.9091 |
| 0.1077 | 83.0 | 166 | 0.3175 | 0.9091 |
| 0.1077 | 84.0 | 168 | 0.3214 | 0.9091 |
| 0.0907 | 85.0 | 170 | 0.3313 | 0.9091 |
| 0.0907 | 86.0 | 172 | 0.3492 | 0.9091 |
| 0.0907 | 87.0 | 174 | 0.3644 | 0.9091 |
| 0.0907 | 88.0 | 176 | 0.3637 | 0.9091 |
| 0.0907 | 89.0 | 178 | 0.3750 | 0.9091 |
| 0.0972 | 90.0 | 180 | 0.3845 | 0.9091 |
| 0.0972 | 91.0 | 182 | 0.3749 | 0.9091 |
| 0.0972 | 92.0 | 184 | 0.3721 | 0.8864 |
| 0.0972 | 93.0 | 186 | 0.3680 | 0.8864 |
| 0.0972 | 94.0 | 188 | 0.3634 | 0.8864 |
| 0.0733 | 95.0 | 190 | 0.3565 | 0.9091 |
| 0.0733 | 96.0 | 192 | 0.3519 | 0.9091 |
| 0.0733 | 97.0 | 194 | 0.3529 | 0.9091 |
| 0.0733 | 98.0 | 196 | 0.3536 | 0.9091 |
| 0.0733 | 99.0 | 198 | 0.3561 | 0.9091 |
| 0.079 | 100.0 | 200 | 0.3565 | 0.9091 |

### Framework versions

- Transformers 4.40.2
- Pytorch 2.2.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
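
### Reproducing the training configuration

A minimal sketch of how the hyperparameters above map onto `transformers.TrainingArguments`. This is reconstructed from the list in this card, not taken from the original training script; `output_dir` is a placeholder, and the effective batch size of 128 assumes a single device:

```python
from transformers import TrainingArguments

# Reconstructed from the hyperparameter list above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="beit-base-patch16-224-85-fold2",
    learning_rate=5e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 32 * 4 = 128
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```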
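
## How to use

A minimal inference sketch for this checkpoint. The repo id and the input image `example.jpg` below are placeholders; substitute the actual Hub path (or local directory) of this model and an image from the same domain as the training data:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repo id; replace with the actual Hub path or local checkpoint directory.
model_id = "beit-base-patch16-224-85-fold2"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

# Placeholder input image.
image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label[pred])
```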