
NInjaQuarrior/vit-base-patch16-224-in21k-disaster2

This model is a fine-tuned version of google/vit-base-patch16-224-in21k on an unknown dataset. It achieves the following results on the training and evaluation sets:

  • Train Loss: 0.1002
  • Train Caccuracy: 0.9936
  • Train FN (false negatives): 14
  • Train FP (false positives): 17
  • Train TN (true negatives): 4647
  • Train TP (true positives): 2318
  • Validation Loss: 0.1452
  • Validation Caccuracy: 0.9757
  • Validation FN: 9
  • Validation FP: 12
  • Validation TN: 812
  • Validation TP: 403
  • Epoch: 2
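
A minimal inference sketch for this checkpoint, assuming the repository ships TensorFlow weights and an image-processor config (the image path is a hypothetical placeholder):

```python
import tensorflow as tf
from PIL import Image
from transformers import TFViTForImageClassification, ViTImageProcessor

model_id = "NInjaQuarrior/vit-base-patch16-224-in21k-disaster2"

# The processor resizes and normalizes images the way ViT expects.
processor = ViTImageProcessor.from_pretrained(model_id)
model = TFViTForImageClassification.from_pretrained(model_id)

# "example.jpg" is a placeholder for any RGB image to classify.
image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="tf")

logits = model(**inputs).logits
predicted_class = int(tf.argmax(logits, axis=-1)[0])
print(model.config.id2label[predicted_class])
```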

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: AdamWeightDecay (beta_1 0.9, beta_2 0.999, epsilon 1e-08, amsgrad False, weight_decay_rate 0.01, decay 0.0) with a PolynomialDecay learning-rate schedule (initial_learning_rate 3e-05, decay_steps 219, end_learning_rate 0.0, power 1.0, cycle False), wrapped in a dynamic loss-scaling optimizer for mixed precision (initial_scale 32768.0, dynamic_growth_steps 2000)
  • training_precision: mixed_float16
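
The optimizer configuration above can be reproduced approximately with the TensorFlow utilities in transformers; this is a minimal sketch, assuming no warmup steps (the card does not record a warmup value) and that Keras adds the loss-scaling wrapper when mixed precision is enabled:

```python
import tensorflow as tf
from transformers import create_optimizer

# training_precision: mixed_float16. Under this policy, Keras wraps the
# optimizer in a dynamic LossScaleOptimizer at compile time, which matches
# the initial_scale / dynamic_growth_steps entries listed above.
tf.keras.mixed_precision.set_global_policy("mixed_float16")

# AdamWeightDecay with a linear (PolynomialDecay, power=1.0) schedule from
# 3e-05 down to 0.0 over 219 steps and weight_decay_rate=0.01 (values from
# the card). num_warmup_steps=0 is an assumption, not stated in the card.
optimizer, lr_schedule = create_optimizer(
    init_lr=3e-05,
    num_train_steps=219,
    num_warmup_steps=0,
    weight_decay_rate=0.01,
)
```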

Training results

| Train Loss | Train Caccuracy | Train FN | Train FP | Train TN | Train TP | Validation Loss | Validation Caccuracy | Validation FN | Validation FP | Validation TN | Validation TP | Epoch |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.5143 | 0.9112 | 729 | 38 | 4626 | 1603 | 0.2132 | 0.9684 | 14 | 14 | 810 | 398 | 0 |
| 0.1474 | 0.9850 | 36 | 41 | 4623 | 2296 | 0.1596 | 0.9709 | 12 | 14 | 810 | 400 | 1 |
| 0.1002 | 0.9936 | 14 | 17 | 4647 | 2318 | 0.1452 | 0.9757 | 9 | 12 | 812 | 403 | 2 |

Framework versions

  • Transformers 4.35.2
  • TensorFlow 2.10.0
  • Datasets 2.15.0
  • Tokenizers 0.15.0