
vit-gabor-detection-v3

This model is a fine-tuned version of google/vit-base-patch16-224-in21k on an image dataset loaded with the imagefolder format. It achieves the following results on the evaluation set:

  • Loss: 0.4139
  • Accuracy: 1.0
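
The snippet below is a minimal inference sketch for this checkpoint. It assumes the fine-tuned weights are available locally or on the Hub under an identifier such as "vit-gabor-detection-v3" (adjust to the actual repository name); the input file name is only a placeholder.

```python
from transformers import pipeline

# Hypothetical local path / repo id for this fine-tuned checkpoint.
classifier = pipeline("image-classification", model="vit-gabor-detection-v3")

# "example.png" is a placeholder input image (e.g. a Gabor patch to classify).
predictions = classifier("example.png")
print(predictions)  # list of {"label": ..., "score": ...} dicts
```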

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 200
  • eval_batch_size: 200
  • seed: 1337
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 120.0
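
As a rough guide, these values map onto transformers.TrainingArguments as in the sketch below. This is an assumption about how the run was configured (a single device, so train_batch_size equals per_device_train_batch_size), not the original training script.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="vit-gabor-detection-v3",
    learning_rate=2e-5,
    per_device_train_batch_size=200,  # assumes a single-device run
    per_device_eval_batch_size=200,
    seed=1337,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=120,
)
# training_args would then be passed to transformers.Trainer together with the
# model, image processor, and the imagefolder train/eval splits.
```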

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 1 | 0.6629 | 0.5 |
| No log | 2.0 | 2 | 0.6564 | 0.5 |
| No log | 3.0 | 3 | 0.6496 | 0.5 |
| No log | 4.0 | 4 | 0.6428 | 0.5 |
| No log | 5.0 | 5 | 0.6362 | 0.5 |
| No log | 6.0 | 6 | 0.6296 | 0.5 |
| No log | 7.0 | 7 | 0.6232 | 0.5 |
| No log | 8.0 | 8 | 0.6172 | 0.5 |
| No log | 9.0 | 9 | 0.6113 | 1.0 |
| 0.509 | 10.0 | 10 | 0.6058 | 1.0 |
| 0.509 | 11.0 | 11 | 0.6005 | 1.0 |
| 0.509 | 12.0 | 12 | 0.5950 | 1.0 |
| 0.509 | 13.0 | 13 | 0.5892 | 1.0 |
| 0.509 | 14.0 | 14 | 0.5832 | 1.0 |
| 0.509 | 15.0 | 15 | 0.5765 | 1.0 |
| 0.509 | 16.0 | 16 | 0.5699 | 1.0 |
| 0.509 | 17.0 | 17 | 0.5630 | 1.0 |
| 0.509 | 18.0 | 18 | 0.5562 | 1.0 |
| 0.509 | 19.0 | 19 | 0.5494 | 1.0 |
| 0.248 | 20.0 | 20 | 0.5426 | 1.0 |
| 0.248 | 21.0 | 21 | 0.5360 | 1.0 |
| 0.248 | 22.0 | 22 | 0.5295 | 1.0 |
| 0.248 | 23.0 | 23 | 0.5231 | 1.0 |
| 0.248 | 24.0 | 24 | 0.5175 | 1.0 |
| 0.248 | 25.0 | 25 | 0.5126 | 1.0 |
| 0.248 | 26.0 | 26 | 0.5079 | 1.0 |
| 0.248 | 27.0 | 27 | 0.5034 | 1.0 |
| 0.248 | 28.0 | 28 | 0.4991 | 1.0 |
| 0.248 | 29.0 | 29 | 0.4949 | 1.0 |
| 0.119 | 30.0 | 30 | 0.4908 | 1.0 |
| 0.119 | 31.0 | 31 | 0.4868 | 1.0 |
| 0.119 | 32.0 | 32 | 0.4833 | 1.0 |
| 0.119 | 33.0 | 33 | 0.4803 | 1.0 |
| 0.119 | 34.0 | 34 | 0.4777 | 1.0 |
| 0.119 | 35.0 | 35 | 0.4751 | 1.0 |
| 0.119 | 36.0 | 36 | 0.4727 | 1.0 |
| 0.119 | 37.0 | 37 | 0.4704 | 1.0 |
| 0.119 | 38.0 | 38 | 0.4681 | 1.0 |
| 0.119 | 39.0 | 39 | 0.4658 | 1.0 |
| 0.0692 | 40.0 | 40 | 0.4635 | 1.0 |
| 0.0692 | 41.0 | 41 | 0.4612 | 1.0 |
| 0.0692 | 42.0 | 42 | 0.4588 | 1.0 |
| 0.0692 | 43.0 | 43 | 0.4564 | 1.0 |
| 0.0692 | 44.0 | 44 | 0.4542 | 1.0 |
| 0.0692 | 45.0 | 45 | 0.4522 | 1.0 |
| 0.0692 | 46.0 | 46 | 0.4504 | 1.0 |
| 0.0692 | 47.0 | 47 | 0.4488 | 1.0 |
| 0.0692 | 48.0 | 48 | 0.4474 | 1.0 |
| 0.0692 | 49.0 | 49 | 0.4463 | 1.0 |
| 0.0487 | 50.0 | 50 | 0.4453 | 1.0 |
| 0.0487 | 51.0 | 51 | 0.4444 | 1.0 |
| 0.0487 | 52.0 | 52 | 0.4435 | 1.0 |
| 0.0487 | 53.0 | 53 | 0.4427 | 1.0 |
| 0.0487 | 54.0 | 54 | 0.4419 | 1.0 |
| 0.0487 | 55.0 | 55 | 0.4410 | 1.0 |
| 0.0487 | 56.0 | 56 | 0.4402 | 1.0 |
| 0.0487 | 57.0 | 57 | 0.4394 | 1.0 |
| 0.0487 | 58.0 | 58 | 0.4385 | 1.0 |
| 0.0487 | 59.0 | 59 | 0.4375 | 1.0 |
| 0.0374 | 60.0 | 60 | 0.4366 | 1.0 |
| 0.0374 | 61.0 | 61 | 0.4356 | 1.0 |
| 0.0374 | 62.0 | 62 | 0.4347 | 1.0 |
| 0.0374 | 63.0 | 63 | 0.4338 | 1.0 |
| 0.0374 | 64.0 | 64 | 0.4328 | 1.0 |
| 0.0374 | 65.0 | 65 | 0.4319 | 1.0 |
| 0.0374 | 66.0 | 66 | 0.4311 | 1.0 |
| 0.0374 | 67.0 | 67 | 0.4302 | 1.0 |
| 0.0374 | 68.0 | 68 | 0.4294 | 1.0 |
| 0.0374 | 69.0 | 69 | 0.4286 | 1.0 |
| 0.0321 | 70.0 | 70 | 0.4278 | 1.0 |
| 0.0321 | 71.0 | 71 | 0.4271 | 1.0 |
| 0.0321 | 72.0 | 72 | 0.4264 | 1.0 |
| 0.0321 | 73.0 | 73 | 0.4257 | 1.0 |
| 0.0321 | 74.0 | 74 | 0.4251 | 1.0 |
| 0.0321 | 75.0 | 75 | 0.4245 | 1.0 |
| 0.0321 | 76.0 | 76 | 0.4239 | 1.0 |
| 0.0321 | 77.0 | 77 | 0.4233 | 1.0 |
| 0.0321 | 78.0 | 78 | 0.4228 | 1.0 |
| 0.0321 | 79.0 | 79 | 0.4223 | 1.0 |
| 0.0285 | 80.0 | 80 | 0.4219 | 1.0 |
| 0.0285 | 81.0 | 81 | 0.4215 | 1.0 |
| 0.0285 | 82.0 | 82 | 0.4211 | 1.0 |
| 0.0285 | 83.0 | 83 | 0.4206 | 1.0 |
| 0.0285 | 84.0 | 84 | 0.4201 | 1.0 |
| 0.0285 | 85.0 | 85 | 0.4197 | 1.0 |
| 0.0285 | 86.0 | 86 | 0.4192 | 1.0 |
| 0.0285 | 87.0 | 87 | 0.4189 | 1.0 |
| 0.0285 | 88.0 | 88 | 0.4185 | 1.0 |
| 0.0285 | 89.0 | 89 | 0.4182 | 1.0 |
| 0.0268 | 90.0 | 90 | 0.4179 | 1.0 |
| 0.0268 | 91.0 | 91 | 0.4176 | 1.0 |
| 0.0268 | 92.0 | 92 | 0.4173 | 1.0 |
| 0.0268 | 93.0 | 93 | 0.4170 | 1.0 |
| 0.0268 | 94.0 | 94 | 0.4168 | 1.0 |
| 0.0268 | 95.0 | 95 | 0.4165 | 1.0 |
| 0.0268 | 96.0 | 96 | 0.4163 | 1.0 |
| 0.0268 | 97.0 | 97 | 0.4161 | 1.0 |
| 0.0268 | 98.0 | 98 | 0.4159 | 1.0 |
| 0.0268 | 99.0 | 99 | 0.4157 | 1.0 |
| 0.0249 | 100.0 | 100 | 0.4155 | 1.0 |
| 0.0249 | 101.0 | 101 | 0.4154 | 1.0 |
| 0.0249 | 102.0 | 102 | 0.4152 | 1.0 |
| 0.0249 | 103.0 | 103 | 0.4151 | 1.0 |
| 0.0249 | 104.0 | 104 | 0.4150 | 1.0 |
| 0.0249 | 105.0 | 105 | 0.4148 | 1.0 |
| 0.0249 | 106.0 | 106 | 0.4147 | 1.0 |
| 0.0249 | 107.0 | 107 | 0.4146 | 1.0 |
| 0.0249 | 108.0 | 108 | 0.4145 | 1.0 |
| 0.0249 | 109.0 | 109 | 0.4144 | 1.0 |
| 0.0242 | 110.0 | 110 | 0.4144 | 1.0 |
| 0.0242 | 111.0 | 111 | 0.4143 | 1.0 |
| 0.0242 | 112.0 | 112 | 0.4142 | 1.0 |
| 0.0242 | 113.0 | 113 | 0.4141 | 1.0 |
| 0.0242 | 114.0 | 114 | 0.4141 | 1.0 |
| 0.0242 | 115.0 | 115 | 0.4140 | 1.0 |
| 0.0242 | 116.0 | 116 | 0.4140 | 1.0 |
| 0.0242 | 117.0 | 117 | 0.4139 | 1.0 |
| 0.0242 | 118.0 | 118 | 0.4139 | 1.0 |
| 0.0242 | 119.0 | 119 | 0.4139 | 1.0 |
| 0.0292 | 120.0 | 120 | 0.4139 | 1.0 |

Framework versions

  • Transformers 4.38.0.dev0
  • Pytorch 2.1.0+cu121
  • Datasets 2.4.0
  • Tokenizers 0.15.0
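
To check a local environment against these versions, a small sketch follows (assuming the packages are already installed; the Transformers version is a development build, so an exact match may require an install from source):

```python
import transformers, torch, datasets, tokenizers

# Card reports: Transformers 4.38.0.dev0, PyTorch 2.1.0+cu121, Datasets 2.4.0, Tokenizers 0.15.0
for name, module in [("transformers", transformers), ("torch", torch),
                     ("datasets", datasets), ("tokenizers", tokenizers)]:
    print(f"{name}: {module.__version__}")
```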