---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
  - image-classification
  - vision
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: vit-gabor-detection-v2
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 1.0
---

# vit-gabor-detection-v2

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 0.0186
- Accuracy: 1.0
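
For quick experimentation, here is a minimal inference sketch using the `transformers` pipeline API. The repo id `nicolasdupuisroy/vit-gabor-detection-v2` and the image filename are assumptions; point them at wherever the checkpoint and test images actually live.

```python
from transformers import pipeline

# Repo id is an assumption; replace with the actual checkpoint location
# (a Hub repo id or a local directory containing the saved model).
classifier = pipeline(
    "image-classification",
    model="nicolasdupuisroy/vit-gabor-detection-v2",
)

# Returns a list of {"label": ..., "score": ...} dicts for the image.
print(classifier("example_gabor_patch.png"))
```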

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
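
The metadata indicates training on a local `imagefolder` dataset, which the `datasets` library builds from a directory with one subfolder per class. A hedged loading sketch; the directory path and class names are placeholders:

```python
from datasets import load_dataset

# Assumed layout (hypothetical class names):
#   data/train/gabor/*.png
#   data/train/no_gabor/*.png
dataset = load_dataset("imagefolder", data_dir="data")
print(dataset["train"].features["label"].names)
```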

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 200
- eval_batch_size: 200
- seed: 1337
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 120.0
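
These settings map directly onto `transformers.TrainingArguments` (assuming a single device, so `train_batch_size` maps to `per_device_train_batch_size`); the Adam betas and epsilon listed above are the library defaults, so they need no explicit arguments. A minimal sketch; the output directory and per-epoch evaluation strategy are assumptions:

```python
from transformers import TrainingArguments

# Values mirror the hyperparameter list above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="vit-gabor-detection-v2",
    learning_rate=2e-5,
    per_device_train_batch_size=200,
    per_device_eval_batch_size=200,
    seed=1337,
    lr_scheduler_type="linear",
    num_train_epochs=120.0,
    evaluation_strategy="epoch",  # assumption: matches the per-epoch table below
)
```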

### Training results

Training loss appears to be logged every 10 steps: "No log" marks evaluations before the first logging step, and each logged value repeats until the next log.

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 1 | 0.5751 | 1.0 |
| No log | 2.0 | 2 | 0.5081 | 1.0 |
| No log | 3.0 | 3 | 0.4654 | 1.0 |
| No log | 4.0 | 4 | 0.4014 | 1.0 |
| No log | 5.0 | 5 | 0.3692 | 1.0 |
| No log | 6.0 | 6 | 0.3327 | 1.0 |
| No log | 7.0 | 7 | 0.2937 | 1.0 |
| No log | 8.0 | 8 | 0.2775 | 1.0 |
| No log | 9.0 | 9 | 0.2335 | 1.0 |
| 0.4432 | 10.0 | 10 | 0.2092 | 1.0 |
| 0.4432 | 11.0 | 11 | 0.2007 | 1.0 |
| 0.4432 | 12.0 | 12 | 0.1674 | 1.0 |
| 0.4432 | 13.0 | 13 | 0.1546 | 1.0 |
| 0.4432 | 14.0 | 14 | 0.1393 | 1.0 |
| 0.4432 | 15.0 | 15 | 0.1297 | 1.0 |
| 0.4432 | 16.0 | 16 | 0.1219 | 1.0 |
| 0.4432 | 17.0 | 17 | 0.1090 | 1.0 |
| 0.4432 | 18.0 | 18 | 0.1012 | 1.0 |
| 0.4432 | 19.0 | 19 | 0.0981 | 1.0 |
| 0.1696 | 20.0 | 20 | 0.0874 | 1.0 |
| 0.1696 | 21.0 | 21 | 0.0812 | 1.0 |
| 0.1696 | 22.0 | 22 | 0.0750 | 1.0 |
| 0.1696 | 23.0 | 23 | 0.0754 | 1.0 |
| 0.1696 | 24.0 | 24 | 0.0693 | 1.0 |
| 0.1696 | 25.0 | 25 | 0.0642 | 1.0 |
| 0.1696 | 26.0 | 26 | 0.0610 | 1.0 |
| 0.1696 | 27.0 | 27 | 0.0586 | 1.0 |
| 0.1696 | 28.0 | 28 | 0.0569 | 1.0 |
| 0.1696 | 29.0 | 29 | 0.0532 | 1.0 |
| 0.0792 | 30.0 | 30 | 0.0506 | 1.0 |
| 0.0792 | 31.0 | 31 | 0.0495 | 1.0 |
| 0.0792 | 32.0 | 32 | 0.0476 | 1.0 |
| 0.0792 | 33.0 | 33 | 0.0457 | 1.0 |
| 0.0792 | 34.0 | 34 | 0.0442 | 1.0 |
| 0.0792 | 35.0 | 35 | 0.0419 | 1.0 |
| 0.0792 | 36.0 | 36 | 0.0404 | 1.0 |
| 0.0792 | 37.0 | 37 | 0.0396 | 1.0 |
| 0.0792 | 38.0 | 38 | 0.0384 | 1.0 |
| 0.0792 | 39.0 | 39 | 0.0377 | 1.0 |
| 0.049 | 40.0 | 40 | 0.0366 | 1.0 |
| 0.049 | 41.0 | 41 | 0.0370 | 1.0 |
| 0.049 | 42.0 | 42 | 0.0339 | 1.0 |
| 0.049 | 43.0 | 43 | 0.0330 | 1.0 |
| 0.049 | 44.0 | 44 | 0.0344 | 1.0 |
| 0.049 | 45.0 | 45 | 0.0324 | 1.0 |
| 0.049 | 46.0 | 46 | 0.0323 | 1.0 |
| 0.049 | 47.0 | 47 | 0.0311 | 1.0 |
| 0.049 | 48.0 | 48 | 0.0308 | 1.0 |
| 0.049 | 49.0 | 49 | 0.0294 | 1.0 |
| 0.0359 | 50.0 | 50 | 0.0297 | 1.0 |
| 0.0359 | 51.0 | 51 | 0.0289 | 1.0 |
| 0.0359 | 52.0 | 52 | 0.0285 | 1.0 |
| 0.0359 | 53.0 | 53 | 0.0280 | 1.0 |
| 0.0359 | 54.0 | 54 | 0.0270 | 1.0 |
| 0.0359 | 55.0 | 55 | 0.0265 | 1.0 |
| 0.0359 | 56.0 | 56 | 0.0266 | 1.0 |
| 0.0359 | 57.0 | 57 | 0.0261 | 1.0 |
| 0.0359 | 58.0 | 58 | 0.0268 | 1.0 |
| 0.0359 | 59.0 | 59 | 0.0255 | 1.0 |
| 0.0293 | 60.0 | 60 | 0.0255 | 1.0 |
| 0.0293 | 61.0 | 61 | 0.0246 | 1.0 |
| 0.0293 | 62.0 | 62 | 0.0256 | 1.0 |
| 0.0293 | 63.0 | 63 | 0.0247 | 1.0 |
| 0.0293 | 64.0 | 64 | 0.0241 | 1.0 |
| 0.0293 | 65.0 | 65 | 0.0241 | 1.0 |
| 0.0293 | 66.0 | 66 | 0.0234 | 1.0 |
| 0.0293 | 67.0 | 67 | 0.0236 | 1.0 |
| 0.0293 | 68.0 | 68 | 0.0228 | 1.0 |
| 0.0293 | 69.0 | 69 | 0.0233 | 1.0 |
| 0.0256 | 70.0 | 70 | 0.0227 | 1.0 |
| 0.0256 | 71.0 | 71 | 0.0227 | 1.0 |
| 0.0256 | 72.0 | 72 | 0.0230 | 1.0 |
| 0.0256 | 73.0 | 73 | 0.0222 | 1.0 |
| 0.0256 | 74.0 | 74 | 0.0220 | 1.0 |
| 0.0256 | 75.0 | 75 | 0.0221 | 1.0 |
| 0.0256 | 76.0 | 76 | 0.0219 | 1.0 |
| 0.0256 | 77.0 | 77 | 0.0215 | 1.0 |
| 0.0256 | 78.0 | 78 | 0.0210 | 1.0 |
| 0.0256 | 79.0 | 79 | 0.0209 | 1.0 |
| 0.0234 | 80.0 | 80 | 0.0212 | 1.0 |
| 0.0234 | 81.0 | 81 | 0.0212 | 1.0 |
| 0.0234 | 82.0 | 82 | 0.0206 | 1.0 |
| 0.0234 | 83.0 | 83 | 0.0210 | 1.0 |
| 0.0234 | 84.0 | 84 | 0.0204 | 1.0 |
| 0.0234 | 85.0 | 85 | 0.0205 | 1.0 |
| 0.0234 | 86.0 | 86 | 0.0204 | 1.0 |
| 0.0234 | 87.0 | 87 | 0.0203 | 1.0 |
| 0.0234 | 88.0 | 88 | 0.0200 | 1.0 |
| 0.0234 | 89.0 | 89 | 0.0203 | 1.0 |
| 0.0218 | 90.0 | 90 | 0.0196 | 1.0 |
| 0.0218 | 91.0 | 91 | 0.0199 | 1.0 |
| 0.0218 | 92.0 | 92 | 0.0198 | 1.0 |
| 0.0218 | 93.0 | 93 | 0.0196 | 1.0 |
| 0.0218 | 94.0 | 94 | 0.0195 | 1.0 |
| 0.0218 | 95.0 | 95 | 0.0198 | 1.0 |
| 0.0218 | 96.0 | 96 | 0.0197 | 1.0 |
| 0.0218 | 97.0 | 97 | 0.0193 | 1.0 |
| 0.0218 | 98.0 | 98 | 0.0195 | 1.0 |
| 0.0218 | 99.0 | 99 | 0.0194 | 1.0 |
| 0.0208 | 100.0 | 100 | 0.0192 | 1.0 |
| 0.0208 | 101.0 | 101 | 0.0190 | 1.0 |
| 0.0208 | 102.0 | 102 | 0.0188 | 1.0 |
| 0.0208 | 103.0 | 103 | 0.0191 | 1.0 |
| 0.0208 | 104.0 | 104 | 0.0193 | 1.0 |
| 0.0208 | 105.0 | 105 | 0.0193 | 1.0 |
| 0.0208 | 106.0 | 106 | 0.0190 | 1.0 |
| 0.0208 | 107.0 | 107 | 0.0191 | 1.0 |
| 0.0208 | 108.0 | 108 | 0.0186 | 1.0 |
| 0.0208 | 109.0 | 109 | 0.0188 | 1.0 |
| 0.0202 | 110.0 | 110 | 0.0187 | 1.0 |
| 0.0202 | 111.0 | 111 | 0.0191 | 1.0 |
| 0.0202 | 112.0 | 112 | 0.0188 | 1.0 |
| 0.0202 | 113.0 | 113 | 0.0185 | 1.0 |
| 0.0202 | 114.0 | 114 | 0.0188 | 1.0 |
| 0.0202 | 115.0 | 115 | 0.0183 | 1.0 |
| 0.0202 | 116.0 | 116 | 0.0187 | 1.0 |
| 0.0202 | 117.0 | 117 | 0.0185 | 1.0 |
| 0.0202 | 118.0 | 118 | 0.0184 | 1.0 |
| 0.0202 | 119.0 | 119 | 0.0188 | 1.0 |
| 0.0197 | 120.0 | 120 | 0.0185 | 1.0 |

### Framework versions

- Transformers 4.37.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0