
V4_Image_classification__points_durs__google_vit-base-patch16-224-in21k

This model is a fine-tuned version of google/vit-base-patch16-224-in21k on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2221
  • Accuracy: 0.9560

Model description

This checkpoint fine-tunes google/vit-base-patch16-224-in21k, a base-sized Vision Transformer (16×16 patches, 224×224 input resolution) pretrained on ImageNet-21k, for an image-classification task. No further details about the fine-tuning target, label set, or data are documented.

Intended uses & limitations

More information needed
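
In the absence of documented usage guidance, the sketch below shows one way a checkpoint like this can be loaded for inference with the transformers image-classification pipeline. The repository id (the "your-namespace" prefix) and the image path are placeholders, not values taken from this card.

```python
from transformers import pipeline

# Placeholder repo id: replace "your-namespace" with the namespace under
# which this checkpoint is actually published.
model_id = "your-namespace/V4_Image_classification__points_durs__google_vit-base-patch16-224-in21k"

# The image-classification pipeline loads the image processor and the
# fine-tuned ViT classification head from the checkpoint.
classifier = pipeline("image-classification", model=model_id)

# "example.jpg" is a placeholder path to a local image.
predictions = classifier("example.jpg")
for pred in predictions:
    print(f"{pred['label']}: {pred['score']:.4f}")
```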

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 64
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
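
For readers who want to reproduce this setup with the Hugging Face Trainer, the sketch below maps the hyperparameters above onto TrainingArguments. It is an assumption about how the run was configured, not code shipped with this card; the output directory is a placeholder, and the model, dataset, and Trainer wiring are omitted.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above. The effective
# train batch size is 16 (per device) x 4 (gradient accumulation) = 64.
training_args = TrainingArguments(
    output_dir="vit-points-durs",    # placeholder output directory
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",     # assumption: the results table reports one eval per epoch
    logging_strategy="epoch",
)
```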

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6743        | 1.0   | 13   | 0.6315          | 0.7566   |
| 0.6051        | 2.0   | 26   | 0.4384          | 0.9150   |
| 0.4588        | 3.0   | 39   | 0.2402          | 0.9326   |
| 0.1818        | 4.0   | 52   | 0.1702          | 0.9384   |
| 0.1102        | 5.0   | 65   | 0.1409          | 0.9413   |
| 0.0733        | 6.0   | 78   | 0.1516          | 0.9501   |
| 0.0423        | 7.0   | 91   | 0.1613          | 0.9560   |
| 0.0286        | 8.0   | 104  | 0.1843          | 0.9501   |
| 0.0192        | 9.0   | 117  | 0.1672          | 0.9560   |
| 0.0159        | 10.0  | 130  | 0.1703          | 0.9589   |
| 0.0173        | 11.0  | 143  | 0.1729          | 0.9560   |
| 0.0143        | 12.0  | 156  | 0.1786          | 0.9560   |
| 0.0105        | 13.0  | 169  | 0.1821          | 0.9560   |
| 0.0091        | 14.0  | 182  | 0.1827          | 0.9589   |
| 0.0096        | 15.0  | 195  | 0.1859          | 0.9560   |
| 0.0081        | 16.0  | 208  | 0.1989          | 0.9560   |
| 0.0075        | 17.0  | 221  | 0.2012          | 0.9560   |
| 0.0347        | 18.0  | 234  | 0.2507          | 0.9384   |
| 0.0232        | 19.0  | 247  | 0.2271          | 0.9413   |
| 0.0065        | 20.0  | 260  | 0.1950          | 0.9589   |
| 0.0102        | 21.0  | 273  | 0.2378          | 0.9472   |
| 0.0064        | 22.0  | 286  | 0.2265          | 0.9501   |
| 0.0058        | 23.0  | 299  | 0.2033          | 0.9560   |
| 0.0055        | 24.0  | 312  | 0.2402          | 0.9501   |
| 0.005         | 25.0  | 325  | 0.2500          | 0.9443   |
| 0.0054        | 26.0  | 338  | 0.2450          | 0.9472   |
| 0.0048        | 27.0  | 351  | 0.2431          | 0.9501   |
| 0.0047        | 28.0  | 364  | 0.2439          | 0.9472   |
| 0.0046        | 29.0  | 377  | 0.2445          | 0.9472   |
| 0.0044        | 30.0  | 390  | 0.2434          | 0.9472   |
| 0.0042        | 31.0  | 403  | 0.2441          | 0.9472   |
| 0.0042        | 32.0  | 416  | 0.2426          | 0.9472   |
| 0.0042        | 33.0  | 429  | 0.2414          | 0.9472   |
| 0.004         | 34.0  | 442  | 0.2383          | 0.9472   |
| 0.004         | 35.0  | 455  | 0.2349          | 0.9472   |
| 0.0039        | 36.0  | 468  | 0.2340          | 0.9472   |
| 0.0038        | 37.0  | 481  | 0.2325          | 0.9472   |
| 0.0037        | 38.0  | 494  | 0.2311          | 0.9501   |
| 0.0038        | 39.0  | 507  | 0.2280          | 0.9501   |
| 0.0037        | 40.0  | 520  | 0.2263          | 0.9531   |
| 0.0036        | 41.0  | 533  | 0.2248          | 0.9531   |
| 0.0036        | 42.0  | 546  | 0.2242          | 0.9531   |
| 0.0036        | 43.0  | 559  | 0.2236          | 0.9531   |
| 0.0035        | 44.0  | 572  | 0.2231          | 0.9560   |
| 0.0035        | 45.0  | 585  | 0.2224          | 0.9560   |
| 0.0035        | 46.0  | 598  | 0.2223          | 0.9560   |
| 0.0035        | 47.0  | 611  | 0.2220          | 0.9560   |
| 0.0035        | 48.0  | 624  | 0.2221          | 0.9560   |
| 0.0034        | 49.0  | 637  | 0.2221          | 0.9560   |
| 0.0035        | 50.0  | 650  | 0.2221          | 0.9560   |
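
The accuracy column above is consistent with the standard Trainer evaluation loop. As a hedged sketch of how such a metric is typically wired in with the evaluate library (an assumption about the setup, not code taken from this training run):

```python
import numpy as np
import evaluate

# Assumption: accuracy computed with the `evaluate` library, as in the
# standard transformers image-classification examples.
accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair produced by Trainer.evaluate().
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```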

Framework versions

  • Transformers 4.30.0
  • Pytorch 2.1.1
  • Datasets 2.15.0
  • Tokenizers 0.13.3