
hushem_1x_beit_base_rms_0001_fold2

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 3.2742
  • Accuracy: 0.5333
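
As a hedged sketch (not part of the original card), the checkpoint can be loaded with the standard transformers image-classification API; the repository id and the example image path below are assumptions based on this card's title.

```python
# Minimal inference sketch for a BEiT classifier fine-tuned from
# microsoft/beit-base-patch16-224. The model id is assumed from the card title.
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "hushem_1x_beit_base_rms_0001_fold2"  # assumed repo id or local path

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("example.jpg")  # placeholder: any RGB image from the target domain
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```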

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
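
For reference, the hyperparameters above roughly correspond to a Trainer configuration like the one below. This is a sketch, not the original training script: the output directory and the evaluation/saving strategies are assumptions inferred from the per-epoch results table.

```python
# Approximate TrainingArguments matching the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hushem_1x_beit_base_rms_0001_fold2",  # assumed output directory
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    # These match the reported Adam settings (also the Trainer defaults):
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumed: the table reports validation loss per epoch
    save_strategy="epoch",        # assumption
)
```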

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 6    | 1.4059          | 0.2444   |
| 2.1162        | 2.0   | 12   | 1.4011          | 0.2444   |
| 2.1162        | 3.0   | 18   | 1.4001          | 0.2444   |
| 1.4079        | 4.0   | 24   | 1.4023          | 0.2444   |
| 1.3851        | 5.0   | 30   | 1.3440          | 0.4      |
| 1.3851        | 6.0   | 36   | 1.6621          | 0.2444   |
| 1.3464        | 7.0   | 42   | 1.3490          | 0.2889   |
| 1.3464        | 8.0   | 48   | 1.3162          | 0.2667   |
| 1.2763        | 9.0   | 54   | 1.5389          | 0.2444   |
| 1.2353        | 10.0  | 60   | 1.1918          | 0.5111   |
| 1.2353        | 11.0  | 66   | 1.2702          | 0.3111   |
| 1.1503        | 12.0  | 72   | 1.1819          | 0.4667   |
| 1.1503        | 13.0  | 78   | 1.1946          | 0.4      |
| 1.1428        | 14.0  | 84   | 1.2858          | 0.4222   |
| 0.9448        | 15.0  | 90   | 1.2191          | 0.5333   |
| 0.9448        | 16.0  | 96   | 1.0792          | 0.4667   |
| 0.8793        | 17.0  | 102  | 1.0942          | 0.5333   |
| 0.8793        | 18.0  | 108  | 1.0695          | 0.5333   |
| 0.7925        | 19.0  | 114  | 1.5298          | 0.4889   |
| 0.7637        | 20.0  | 120  | 1.4292          | 0.4889   |
| 0.7637        | 21.0  | 126  | 1.1665          | 0.4889   |
| 0.6936        | 22.0  | 132  | 1.2681          | 0.4444   |
| 0.6936        | 23.0  | 138  | 1.4911          | 0.4667   |
| 0.6862        | 24.0  | 144  | 1.6737          | 0.4889   |
| 0.6196        | 25.0  | 150  | 1.3333          | 0.5111   |
| 0.6196        | 26.0  | 156  | 2.1751          | 0.4889   |
| 0.5849        | 27.0  | 162  | 1.6904          | 0.4444   |
| 0.5849        | 28.0  | 168  | 2.4209          | 0.5333   |
| 0.5413        | 29.0  | 174  | 1.3664          | 0.4444   |
| 0.4937        | 30.0  | 180  | 2.0398          | 0.5111   |
| 0.4937        | 31.0  | 186  | 1.5682          | 0.5111   |
| 0.4704        | 32.0  | 192  | 2.0516          | 0.5556   |
| 0.4704        | 33.0  | 198  | 2.7441          | 0.5778   |
| 0.4133        | 34.0  | 204  | 2.2801          | 0.5111   |
| 0.354         | 35.0  | 210  | 2.5861          | 0.5333   |
| 0.354         | 36.0  | 216  | 2.6593          | 0.5333   |
| 0.3074        | 37.0  | 222  | 2.7263          | 0.5333   |
| 0.3074        | 38.0  | 228  | 2.8622          | 0.4889   |
| 0.2263        | 39.0  | 234  | 3.1445          | 0.5556   |
| 0.2656        | 40.0  | 240  | 3.2265          | 0.5333   |
| 0.2656        | 41.0  | 246  | 3.2774          | 0.5333   |
| 0.2527        | 42.0  | 252  | 3.2742          | 0.5333   |
| 0.2527        | 43.0  | 258  | 3.2742          | 0.5333   |
| 0.2167        | 44.0  | 264  | 3.2742          | 0.5333   |
| 0.2791        | 45.0  | 270  | 3.2742          | 0.5333   |
| 0.2791        | 46.0  | 276  | 3.2742          | 0.5333   |
| 0.2024        | 47.0  | 282  | 3.2742          | 0.5333   |
| 0.2024        | 48.0  | 288  | 3.2742          | 0.5333   |
| 0.2259        | 49.0  | 294  | 3.2742          | 0.5333   |
| 0.2149        | 50.0  | 300  | 3.2742          | 0.5333   |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0
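
A quick way to confirm a local environment matches the versions above is to print them at runtime; this is an optional sketch, and exact version pinning is not required by the card.

```python
# Print installed versions to compare against the training environment above.
import transformers, torch, datasets, tokenizers

print(transformers.__version__)  # 4.35.2 in the reported training environment
print(torch.__version__)         # 2.1.0+cu118
print(datasets.__version__)      # 2.15.0
print(tokenizers.__version__)    # 0.15.0
```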