
Prahas10/roof-shingles

This model is a fine-tuned version of google/vit-base-patch16-384 on an unknown dataset. At the final training epoch it achieves the following results:

  • Train Loss: 0.1015
  • Validation Loss: 0.3231
  • Train Accuracy: 0.9083
  • Epoch: 29
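
Since the card does not yet include usage instructions, here is a minimal inference sketch. It assumes the repository ships TensorFlow weights and an image-processor config (plausible given the framework versions below); the input file name is a hypothetical placeholder.

```python
# Minimal inference sketch, assuming the repo contains TF weights and an
# image-processor config. "shingle.jpg" is a hypothetical input image.
from PIL import Image
import tensorflow as tf
from transformers import AutoImageProcessor, TFViTForImageClassification

processor = AutoImageProcessor.from_pretrained("Prahas10/roof-shingles")
model = TFViTForImageClassification.from_pretrained("Prahas10/roof-shingles")

image = Image.open("shingle.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="tf")  # resizes to 384x384

logits = model(**inputs).logits
predicted_class = int(tf.argmax(logits, axis=-1)[0])
print(model.config.id2label[predicted_class])
```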

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reconstruction sketch follows the list):

  • optimizer: AdamWeightDecay
      • learning_rate: keras.optimizers.schedules.PolynomialDecay with initial_learning_rate 4e-05, decay_steps 138270, end_learning_rate 0.0, power 1.0, cycle False
      • beta_1: 0.9
      • beta_2: 0.999
      • epsilon: 1e-08
      • amsgrad: False
      • decay: 0.0
      • weight_decay_rate: 0.0001
  • training_precision: float32
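
The dumped config matches what transformers' TF helper create_optimizer produces (an AdamWeightDecay optimizer driven by a PolynomialDecay schedule). The sketch below reconstructs it under the assumption that no warmup was used, since none appears in the config; it is not the original training script.

```python
# Reconstruction sketch of the optimizer from the config above.
# Assumption: num_warmup_steps=0, because no warmup appears in the dump.
from transformers import create_optimizer

optimizer, lr_schedule = create_optimizer(
    init_lr=4e-5,            # initial_learning_rate
    num_train_steps=138270,  # decay_steps of the PolynomialDecay schedule
    num_warmup_steps=0,      # assumed: no warmup listed in the config
    weight_decay_rate=1e-4,  # weight_decay_rate
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    power=1.0,               # linear decay to end_learning_rate=0.0
)
```

The returned optimizer can then be passed to model.compile() as usual for Keras training.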

Training results

| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 3.8367     | 2.9703          | 0.4403         | 0     |
| 1.3092     | 1.6169          | 0.7093         | 1     |
| 0.4529     | 1.4414          | 0.7112         | 2     |
| 0.2229     | 0.8445          | 0.8368         | 3     |
| 0.1451     | 0.7074          | 0.8556         | 4     |
| 0.1053     | 0.8585          | 0.7992         | 5     |
| 0.1175     | 1.0721          | 0.7389         | 6     |
| 0.1388     | 0.5802          | 0.8542         | 7     |
| 0.0647     | 0.3764          | 0.9083         | 8     |
| 0.1049     | 1.0484          | 0.7366         | 9     |
| 0.0740     | 0.6191          | 0.8321         | 10    |
| 0.0816     | 0.6273          | 0.8283         | 11    |
| 0.0981     | 0.2901          | 0.9172         | 12    |
| 0.0614     | 0.5081          | 0.8523         | 13    |
| 0.0548     | 0.4983          | 0.8612         | 14    |
| 0.0652     | 0.8008          | 0.7850         | 15    |
| 0.0857     | 0.5845          | 0.8415         | 16    |
| 0.0847     | 0.6887          | 0.8184         | 17    |
| 0.0645     | 0.6104          | 0.8405         | 18    |
| 0.0891     | 0.4770          | 0.8532         | 19    |
| 0.0532     | 0.5074          | 0.8500         | 20    |
| 0.0483     | 0.8208          | 0.7850         | 21    |
| 0.0498     | 0.2679          | 0.9083         | 22    |
| 0.0406     | 0.3261          | 0.9036         | 23    |
| 0.0578     | 0.6373          | 0.8340         | 24    |
| 0.1010     | 0.5037          | 0.8481         | 25    |
| 0.0583     | 0.2993          | 0.8984         | 26    |
| 0.0398     | 0.1538          | 0.9492         | 27    |
| 0.0492     | 0.4397          | 0.8641         | 28    |
| 0.1015     | 0.3231          | 0.9083         | 29    |

Framework versions

  • Transformers 4.41.1
  • TensorFlow 2.15.0
  • Datasets 2.19.1
  • Tokenizers 0.19.1
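
To reproduce the training environment, the pinned versions above can be installed directly; this install command is inferred from the list, not part of the original card.

```
pip install transformers==4.41.1 tensorflow==2.15.0 datasets==2.19.1 tokenizers==0.19.1
```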