
Prahas10/roof-test

This model is a fine-tuned version of google/vit-base-patch32-384 on an unknown dataset. It achieves the following results at the final training epoch:

  • Train Loss: 0.0637
  • Validation Loss: 0.1264
  • Train Accuracy: 0.9474
  • Epoch: 28
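
As a quick usage sanity check, the checkpoint can be loaded through the Transformers image-classification pipeline. This is a minimal sketch, not part of the original training setup: the input path is a placeholder, and the predicted label names depend on the unknown fine-tuning dataset.

```python
from transformers import pipeline

# Minimal sketch: run the fine-tuned ViT checkpoint through the
# image-classification pipeline. framework="tf" matches the TensorFlow
# weights listed under "Framework versions" below.
classifier = pipeline(
    "image-classification",
    model="Prahas10/roof-test",
    framework="tf",
)

# "example_roof.jpg" is a hypothetical local image path.
predictions = classifier("example_roof.jpg")
print(predictions)  # [{"label": ..., "score": ...}, ...] from the dataset's label set
```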

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 4e-05, 'decay_steps': 3990, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.0001}
  • training_precision: float32
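
For reference, an equivalent optimizer can be reconstructed from this configuration using the Keras PolynomialDecay schedule and the Transformers AdamWeightDecay class. This is a sketch of a matching setup, not the original training script:

```python
import tensorflow as tf
from transformers import AdamWeightDecay

# Linear (power=1.0) decay from 4e-05 to 0.0 over 3990 steps, as in the config above.
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=4e-05,
    decay_steps=3990,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

# AdamWeightDecay with the beta/epsilon/weight-decay values from the config above.
optimizer = AdamWeightDecay(
    learning_rate=lr_schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    weight_decay_rate=0.0001,
)
```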

Training results

| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 2.6939     | 2.4863          | 0.2807         | 0     |
| 2.1820     | 2.2454          | 0.4912         | 1     |
| 1.8026     | 1.8798          | 0.4912         | 2     |
| 1.4641     | 1.6673          | 0.5439         | 3     |
| 1.1288     | 1.3594          | 0.6842         | 4     |
| 0.9426     | 1.0517          | 0.8070         | 5     |
| 0.6577     | 0.8531          | 0.8421         | 6     |
| 0.5025     | 0.6971          | 0.8772         | 7     |
| 0.3976     | 0.5785          | 0.8596         | 8     |
| 0.3052     | 0.5568          | 0.9123         | 9     |
| 0.2562     | 0.5137          | 0.8947         | 10    |
| 0.3250     | 0.4415          | 0.9298         | 11    |
| 0.2773     | 0.8003          | 0.7368         | 12    |
| 0.2694     | 0.4544          | 0.8421         | 13    |
| 0.2180     | 0.5179          | 0.8947         | 14    |
| 0.1515     | 0.3450          | 0.9825         | 15    |
| 0.1386     | 0.2818          | 0.9825         | 16    |
| 0.1058     | 0.1962          | 0.9649         | 17    |
| 0.0724     | 0.2456          | 0.9825         | 18    |
| 0.0604     | 0.2432          | 0.9649         | 19    |
| 0.0718     | 0.2548          | 1.0            | 20    |
| 0.0507     | 0.2760          | 0.9474         | 21    |
| 0.0453     | 0.1565          | 0.9825         | 22    |
| 0.0274     | 0.1377          | 0.9825         | 23    |
| 0.0396     | 0.1906          | 0.9649         | 24    |
| 0.0360     | 0.1217          | 0.9825         | 25    |
| 0.0307     | 0.2234          | 0.9474         | 26    |
| 0.0427     | 0.2861          | 0.9298         | 27    |
| 0.0637     | 0.1264          | 0.9474         | 28    |

Framework versions

  • Transformers 4.38.2
  • TensorFlow 2.15.0
  • Datasets 2.16.1
  • Tokenizers 0.15.2