hushem_1x_beit_base_sgd_00001_fold2

This model is a fine-tuned version of microsoft/beit-base-patch16-224 on an imagefolder-format image classification dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the metrics):

  • Loss: 1.5467
  • Accuracy: 0.2667
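
Because the checkpoint is a standard BEiT image classifier, it can be loaded with the usual transformers image-classification classes. The snippet below is a minimal inference sketch, not part of the original card; MODEL_ID and the image path are placeholders, since the exact Hub namespace is not stated here.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder: replace with the actual Hub id,
# e.g. "<namespace>/hushem_1x_beit_base_sgd_00001_fold2".
MODEL_ID = "hushem_1x_beit_base_sgd_00001_fold2"

processor = AutoImageProcessor.from_pretrained(MODEL_ID)
model = AutoModelForImageClassification.from_pretrained(MODEL_ID)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # any RGB input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```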

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
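
These values map directly onto transformers TrainingArguments, and the stated Adam settings (betas (0.9, 0.999), epsilon 1e-08) match the Trainer defaults. The sketch below shows one way such a run could be reproduced; the data directory layout, split names, and output_dir are assumptions, since the card does not document the dataset or training script.

```python
import torch
from datasets import load_dataset
from transformers import (
    AutoImageProcessor,
    AutoModelForImageClassification,
    Trainer,
    TrainingArguments,
)

BASE = "microsoft/beit-base-patch16-224"

# Assumption: images live under data/{train,validation}/<class>/<image>,
# the layout expected by the imagefolder loader; the real split is undocumented.
ds = load_dataset("imagefolder", data_dir="data")
labels = ds["train"].features["label"].names

processor = AutoImageProcessor.from_pretrained(BASE)

def transform(batch):
    # Convert PIL images to normalized pixel tensors on the fly.
    out = processor([img.convert("RGB") for img in batch["image"]], return_tensors="pt")
    out["labels"] = batch["label"]
    return out

ds = ds.with_transform(transform)

def collate_fn(examples):
    return {
        "pixel_values": torch.stack([ex["pixel_values"] for ex in examples]),
        "labels": torch.tensor([ex["labels"] for ex in examples]),
    }

model = AutoModelForImageClassification.from_pretrained(
    BASE,
    num_labels=len(labels),
    ignore_mismatched_sizes=True,  # replace the 1000-class ImageNet head
)

args = TrainingArguments(
    output_dir="hushem_beit_finetune",  # assumed name
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    evaluation_strategy="epoch",
    remove_unused_columns=False,  # keep the raw "image" column for the transform
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=ds["train"],
    eval_dataset=ds["validation"],
    data_collator=collate_fn,
)
trainer.train()
```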

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 6    | 1.5555          | 0.2667   |
| 1.6026        | 2.0   | 12   | 1.5551          | 0.2667   |
| 1.6026        | 3.0   | 18   | 1.5546          | 0.2667   |
| 1.5488        | 4.0   | 24   | 1.5542          | 0.2667   |
| 1.6016        | 5.0   | 30   | 1.5538          | 0.2667   |
| 1.6016        | 6.0   | 36   | 1.5534          | 0.2667   |
| 1.5779        | 7.0   | 42   | 1.5530          | 0.2667   |
| 1.5779        | 8.0   | 48   | 1.5527          | 0.2667   |
| 1.588         | 9.0   | 54   | 1.5523          | 0.2667   |
| 1.5533        | 10.0  | 60   | 1.5519          | 0.2667   |
| 1.5533        | 11.0  | 66   | 1.5516          | 0.2667   |
| 1.5856        | 12.0  | 72   | 1.5513          | 0.2667   |
| 1.5856        | 13.0  | 78   | 1.5510          | 0.2667   |
| 1.5657        | 14.0  | 84   | 1.5507          | 0.2667   |
| 1.5825        | 15.0  | 90   | 1.5503          | 0.2667   |
| 1.5825        | 16.0  | 96   | 1.5501          | 0.2667   |
| 1.5958        | 17.0  | 102  | 1.5498          | 0.2667   |
| 1.5958        | 18.0  | 108  | 1.5495          | 0.2667   |
| 1.578         | 19.0  | 114  | 1.5493          | 0.2667   |
| 1.5925        | 20.0  | 120  | 1.5491          | 0.2667   |
| 1.5925        | 21.0  | 126  | 1.5489          | 0.2667   |
| 1.5804        | 22.0  | 132  | 1.5486          | 0.2667   |
| 1.5804        | 23.0  | 138  | 1.5484          | 0.2667   |
| 1.5969        | 24.0  | 144  | 1.5482          | 0.2667   |
| 1.5643        | 25.0  | 150  | 1.5481          | 0.2667   |
| 1.5643        | 26.0  | 156  | 1.5479          | 0.2667   |
| 1.5656        | 27.0  | 162  | 1.5478          | 0.2667   |
| 1.5656        | 28.0  | 168  | 1.5476          | 0.2667   |
| 1.5441        | 29.0  | 174  | 1.5475          | 0.2667   |
| 1.587         | 30.0  | 180  | 1.5474          | 0.2667   |
| 1.587         | 31.0  | 186  | 1.5473          | 0.2667   |
| 1.5666        | 32.0  | 192  | 1.5472          | 0.2667   |
| 1.5666        | 33.0  | 198  | 1.5471          | 0.2667   |
| 1.5492        | 34.0  | 204  | 1.5470          | 0.2667   |
| 1.5567        | 35.0  | 210  | 1.5469          | 0.2667   |
| 1.5567        | 36.0  | 216  | 1.5469          | 0.2667   |
| 1.5593        | 37.0  | 222  | 1.5468          | 0.2667   |
| 1.5593        | 38.0  | 228  | 1.5468          | 0.2667   |
| 1.5776        | 39.0  | 234  | 1.5468          | 0.2667   |
| 1.5552        | 40.0  | 240  | 1.5467          | 0.2667   |
| 1.5552        | 41.0  | 246  | 1.5467          | 0.2667   |
| 1.5605        | 42.0  | 252  | 1.5467          | 0.2667   |
| 1.5605        | 43.0  | 258  | 1.5467          | 0.2667   |
| 1.6075        | 44.0  | 264  | 1.5467          | 0.2667   |
| 1.5667        | 45.0  | 270  | 1.5467          | 0.2667   |
| 1.5667        | 46.0  | 276  | 1.5467          | 0.2667   |
| 1.5665        | 47.0  | 282  | 1.5467          | 0.2667   |
| 1.5665        | 48.0  | 288  | 1.5467          | 0.2667   |
| 1.5544        | 49.0  | 294  | 1.5467          | 0.2667   |
| 1.5829        | 50.0  | 300  | 1.5467          | 0.2667   |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0