
beit-base-patch16-224-75-fold3

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3326
  • Accuracy: 0.9535

Model description

More information needed

Intended uses & limitations

More information needed
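
In the absence of documented usage guidance, here is a minimal inference sketch. The repo id below is a placeholder (the actual namespace is not stated in this card) and the example image path is an assumption.

```python
from PIL import Image
from transformers import pipeline

# Placeholder repo id -- substitute the actual namespace/checkpoint path.
classifier = pipeline(
    "image-classification",
    model="<namespace>/beit-base-patch16-224-75-fold3",
)

image = Image.open("example.jpg").convert("RGB")  # assumed example image
print(classifier(image))  # [{"label": ..., "score": ...}, ...]
```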

Training and evaluation data

More information needed
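
The underlying data is not described here; the card only indicates the generic Hugging Face `imagefolder` loader. Below is a minimal sketch of how such a dataset is typically prepared, with the directory path and train/eval split as illustrative assumptions.

```python
from datasets import load_dataset

# "imagefolder" infers labels from a directory layout such as
#   data/<class_name>/*.jpg
# The data_dir and the 75/25 split below are assumptions, not documented values.
dataset = load_dataset("imagefolder", data_dir="data", split="train")
splits = dataset.train_test_split(test_size=0.25, seed=42)
train_ds, eval_ds = splits["train"], splits["test"]
print(train_ds.features["label"].names)  # class names inferred from folder names
```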

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
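
A hedged sketch of the equivalent TrainingArguments is shown below; the output directory and the evaluation strategy are assumptions, not values taken from the original run.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above. output_dir and evaluation_strategy
# are assumptions; betas=(0.9, 0.999) and epsilon=1e-08 are the AdamW defaults.
training_args = TrainingArguments(
    output_dir="beit-base-patch16-224-75-fold3",  # assumed output path
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 x 4 = total train batch size of 128
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    evaluation_strategy="epoch",  # assumption: evaluate once per epoch
)
```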

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 2    | 0.5223          | 0.8140   |
| No log        | 2.0   | 4    | 0.5171          | 0.7907   |
| No log        | 3.0   | 6    | 0.5562          | 0.7907   |
| No log        | 4.0   | 8    | 0.5032          | 0.7907   |
| 0.5026        | 5.0   | 10   | 0.4636          | 0.7907   |
| 0.5026        | 6.0   | 12   | 0.4143          | 0.8372   |
| 0.5026        | 7.0   | 14   | 0.4338          | 0.8372   |
| 0.5026        | 8.0   | 16   | 0.4471          | 0.8372   |
| 0.5026        | 9.0   | 18   | 0.3882          | 0.8140   |
| 0.3578        | 10.0  | 20   | 0.3487          | 0.8372   |
| 0.3578        | 11.0  | 22   | 0.3975          | 0.8372   |
| 0.3578        | 12.0  | 24   | 0.3497          | 0.8372   |
| 0.3578        | 13.0  | 26   | 0.4257          | 0.8372   |
| 0.3578        | 14.0  | 28   | 0.2964          | 0.8837   |
| 0.322         | 15.0  | 30   | 0.4666          | 0.8372   |
| 0.322         | 16.0  | 32   | 0.3815          | 0.7907   |
| 0.322         | 17.0  | 34   | 0.2859          | 0.8837   |
| 0.322         | 18.0  | 36   | 0.5995          | 0.8140   |
| 0.322         | 19.0  | 38   | 0.4421          | 0.8372   |
| 0.2983        | 20.0  | 40   | 0.2857          | 0.9070   |
| 0.2983        | 21.0  | 42   | 0.3963          | 0.8605   |
| 0.2983        | 22.0  | 44   | 0.2775          | 0.9302   |
| 0.2983        | 23.0  | 46   | 0.4763          | 0.7442   |
| 0.2983        | 24.0  | 48   | 0.2789          | 0.8605   |
| 0.2545        | 25.0  | 50   | 0.3596          | 0.8372   |
| 0.2545        | 26.0  | 52   | 0.2447          | 0.9302   |
| 0.2545        | 27.0  | 54   | 0.4725          | 0.8140   |
| 0.2545        | 28.0  | 56   | 0.3389          | 0.8837   |
| 0.2545        | 29.0  | 58   | 0.3472          | 0.8140   |
| 0.1914        | 30.0  | 60   | 0.3326          | 0.9535   |
| 0.1914        | 31.0  | 62   | 0.2856          | 0.9302   |
| 0.1914        | 32.0  | 64   | 0.2857          | 0.8605   |
| 0.1914        | 33.0  | 66   | 0.2460          | 0.9302   |
| 0.1914        | 34.0  | 68   | 0.5097          | 0.8140   |
| 0.1782        | 35.0  | 70   | 0.2503          | 0.9535   |
| 0.1782        | 36.0  | 72   | 0.2338          | 0.9302   |
| 0.1782        | 37.0  | 74   | 0.4161          | 0.8837   |
| 0.1782        | 38.0  | 76   | 0.3522          | 0.9070   |
| 0.1782        | 39.0  | 78   | 0.2566          | 0.8372   |
| 0.1435        | 40.0  | 80   | 0.2658          | 0.8605   |
| 0.1435        | 41.0  | 82   | 0.2197          | 0.9302   |
| 0.1435        | 42.0  | 84   | 0.2200          | 0.8837   |
| 0.1435        | 43.0  | 86   | 0.2167          | 0.8837   |
| 0.1435        | 44.0  | 88   | 0.3905          | 0.9070   |
| 0.1131        | 45.0  | 90   | 0.5501          | 0.8837   |
| 0.1131        | 46.0  | 92   | 0.5251          | 0.9070   |
| 0.1131        | 47.0  | 94   | 0.4207          | 0.9070   |
| 0.1131        | 48.0  | 96   | 0.3181          | 0.8605   |
| 0.1131        | 49.0  | 98   | 0.4059          | 0.8140   |
| 0.1397        | 50.0  | 100  | 0.3061          | 0.8605   |
| 0.1397        | 51.0  | 102  | 0.2825          | 0.9302   |
| 0.1397        | 52.0  | 104  | 0.2886          | 0.9302   |
| 0.1397        | 53.0  | 106  | 0.3140          | 0.9070   |
| 0.1397        | 54.0  | 108  | 0.3363          | 0.9070   |
| 0.1048        | 55.0  | 110  | 0.3388          | 0.8837   |
| 0.1048        | 56.0  | 112  | 0.3391          | 0.9070   |
| 0.1048        | 57.0  | 114  | 0.3927          | 0.8605   |
| 0.1048        | 58.0  | 116  | 0.3799          | 0.8837   |
| 0.1048        | 59.0  | 118  | 0.5491          | 0.8140   |
| 0.1013        | 60.0  | 120  | 0.6218          | 0.8140   |
| 0.1013        | 61.0  | 122  | 0.4467          | 0.8372   |
| 0.1013        | 62.0  | 124  | 0.3842          | 0.9302   |
| 0.1013        | 63.0  | 126  | 0.3517          | 0.9302   |
| 0.1013        | 64.0  | 128  | 0.3083          | 0.9070   |
| 0.0987        | 65.0  | 130  | 0.3125          | 0.9535   |
| 0.0987        | 66.0  | 132  | 0.3968          | 0.9070   |
| 0.0987        | 67.0  | 134  | 0.3974          | 0.9070   |
| 0.0987        | 68.0  | 136  | 0.3304          | 0.9302   |
| 0.0987        | 69.0  | 138  | 0.3508          | 0.9070   |
| 0.0932        | 70.0  | 140  | 0.3760          | 0.9070   |
| 0.0932        | 71.0  | 142  | 0.3828          | 0.9535   |
| 0.0932        | 72.0  | 144  | 0.3291          | 0.9302   |
| 0.0932        | 73.0  | 146  | 0.3109          | 0.8837   |
| 0.0932        | 74.0  | 148  | 0.3078          | 0.9302   |
| 0.0917        | 75.0  | 150  | 0.4478          | 0.9070   |
| 0.0917        | 76.0  | 152  | 0.6594          | 0.8605   |
| 0.0917        | 77.0  | 154  | 0.6036          | 0.8605   |
| 0.0917        | 78.0  | 156  | 0.3920          | 0.9070   |
| 0.0917        | 79.0  | 158  | 0.3263          | 0.8605   |
| 0.0917        | 80.0  | 160  | 0.3551          | 0.8372   |
| 0.0917        | 81.0  | 162  | 0.3475          | 0.8372   |
| 0.0917        | 82.0  | 164  | 0.3338          | 0.8837   |
| 0.0917        | 83.0  | 166  | 0.3354          | 0.9070   |
| 0.0917        | 84.0  | 168  | 0.3611          | 0.9070   |
| 0.0901        | 85.0  | 170  | 0.3558          | 0.9070   |
| 0.0901        | 86.0  | 172  | 0.3412          | 0.8837   |
| 0.0901        | 87.0  | 174  | 0.3352          | 0.9302   |
| 0.0901        | 88.0  | 176  | 0.3627          | 0.9070   |
| 0.0901        | 89.0  | 178  | 0.3637          | 0.9070   |
| 0.0764        | 90.0  | 180  | 0.3369          | 0.9070   |
| 0.0764        | 91.0  | 182  | 0.3246          | 0.9302   |
| 0.0764        | 92.0  | 184  | 0.3208          | 0.9302   |
| 0.0764        | 93.0  | 186  | 0.3229          | 0.9302   |
| 0.0764        | 94.0  | 188  | 0.3326          | 0.9302   |
| 0.0664        | 95.0  | 190  | 0.3340          | 0.9302   |
| 0.0664        | 96.0  | 192  | 0.3296          | 0.9302   |
| 0.0664        | 97.0  | 194  | 0.3268          | 0.9302   |
| 0.0664        | 98.0  | 196  | 0.3260          | 0.9535   |
| 0.0664        | 99.0  | 198  | 0.3288          | 0.9535   |
| 0.057         | 100.0 | 200  | 0.3298          | 0.9535   |

Framework versions

  • Transformers 4.40.2
  • PyTorch 2.2.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1