---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
  - image-classification
  - vision
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: vit-letter-identification-v3
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.5846153846153846
---

# vit-letter-identification-v3

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 2.5403
- Accuracy: 0.5846

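As a quick usage sketch, the checkpoint can be loaded with the standard `transformers` image-classification pipeline. The model id below is assumed from this repository's name, and `letter.png` is a hypothetical input image:

```python
from PIL import Image
from transformers import pipeline

# Load the fine-tuned checkpoint; the repo id is assumed from this model card.
classifier = pipeline(
    "image-classification",
    model="nicolasdupuisroy/vit-letter-identification-v3",
)

# "letter.png" is a placeholder for any image of a single letter.
image = Image.open("letter.png").convert("RGB")
for prediction in classifier(image, top_k=3):
    print(f"{prediction['label']}: {prediction['score']:.3f}")
```
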
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 80
- eval_batch_size: 80
- seed: 1337
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 120.0

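A minimal `TrainingArguments` sketch matching the values above. Assuming a single device, `train_batch_size` maps to `per_device_train_batch_size`; the `output_dir` and the per-epoch evaluation strategy are assumptions, not stated in this card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-letter-identification-v3",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=80,
    per_device_eval_batch_size=80,
    seed=1337,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=120.0,
    evaluation_strategy="epoch",  # assumed: the results table reports one eval per epoch
)
```
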
### Training results

| Training Loss | Epoch | Step | Accuracy | Validation Loss |
|:-------------:|:-----:|:----:|:--------:|:---------------:|
| No log        | 1.0   | 7    | 0.0154   | 3.9449          |
| 3.9333        | 2.0   | 14   | 0.0231   | 3.9367          |
| 3.8939        | 3.0   | 21   | 0.0308   | 3.9280          |
| 3.8939        | 4.0   | 28   | 0.0462   | 3.9167          |
| 3.8562        | 5.0   | 35   | 0.0692   | 3.9033          |
| 3.8008        | 6.0   | 42   | 0.0769   | 3.8874          |
| 3.8008        | 7.0   | 49   | 0.1077   | 3.8670          |
| 3.7555        | 8.0   | 56   | 0.1      | 3.8495          |
| 3.6917        | 9.0   | 63   | 0.1154   | 3.8305          |
| 3.6372        | 10.0  | 70   | 0.1385   | 3.8138          |
| 3.6372        | 11.0  | 77   | 0.1231   | 3.7966          |
| 3.5846        | 12.0  | 84   | 0.1538   | 3.7767          |
| 3.5047        | 13.0  | 91   | 0.2308   | 3.7516          |
| 3.5047        | 14.0  | 98   | 0.2385   | 3.7279          |
| 3.4547        | 15.0  | 105  | 0.2385   | 3.7031          |
| 3.3796        | 16.0  | 112  | 0.2692   | 3.6725          |
| 3.3796        | 17.0  | 119  | 0.2769   | 3.6462          |
| 3.3283        | 18.0  | 126  | 0.2923   | 3.6226          |
| 3.2728        | 19.0  | 133  | 0.2846   | 3.6022          |
| 3.2229        | 20.0  | 140  | 0.2769   | 3.5930          |
| 3.2229        | 21.0  | 147  | 0.3308   | 3.5748          |
| 3.1514        | 22.0  | 154  | 0.3385   | 3.5404          |
| 3.1179        | 23.0  | 161  | 0.3385   | 3.5146          |
| 3.1179        | 24.0  | 168  | 0.3462   | 3.4916          |
| 3.0559        | 25.0  | 175  | 0.3385   | 3.4733          |
| 3.0051        | 26.0  | 182  | 0.3615   | 3.4540          |
| 3.0051        | 27.0  | 189  | 0.3692   | 3.4499          |
| 2.9775        | 28.0  | 196  | 0.3769   | 3.4355          |
| 2.9277        | 29.0  | 203  | 0.3846   | 3.4166          |
| 2.9066        | 30.0  | 210  | 0.4      | 3.4007          |
| 2.9066        | 31.0  | 217  | 0.3692   | 3.3826          |
| 2.8464        | 32.0  | 224  | 0.4077   | 3.3698          |
| 2.8044        | 33.0  | 231  | 0.4077   | 3.3509          |
| 2.8044        | 34.0  | 238  | 0.3769   | 3.3243          |
| 2.7699        | 35.0  | 245  | 0.3923   | 3.3201          |
| 2.7251        | 36.0  | 252  | 0.4      | 3.3013          |
| 2.7251        | 37.0  | 259  | 0.4231   | 3.2936          |
| 2.6915        | 38.0  | 266  | 0.4538   | 3.2827          |
| 2.6527        | 39.0  | 273  | 0.4615   | 3.2627          |
| 2.6541        | 40.0  | 280  | 0.4615   | 3.2581          |
| 2.6541        | 41.0  | 287  | 0.4231   | 3.2342          |
| 2.5968        | 42.0  | 294  | 0.4385   | 3.2211          |
| 2.573         | 43.0  | 301  | 0.4077   | 3.2122          |
| 2.573         | 44.0  | 308  | 0.4615   | 3.2259          |
| 2.554         | 45.0  | 315  | 0.4308   | 3.2271          |
| 2.5222        | 46.0  | 322  | 0.4462   | 3.2208          |
| 2.5222        | 47.0  | 329  | 0.4462   | 3.2139          |
| 2.5085        | 48.0  | 336  | 0.4538   | 3.2040          |
| 2.4593        | 49.0  | 343  | 0.4923   | 3.2053          |
| 2.4585        | 50.0  | 350  | 0.4769   | 3.1822          |
| 2.4585        | 51.0  | 357  | 0.4692   | 3.1697          |
| 2.4228        | 52.0  | 364  | 0.4692   | 3.1589          |
| 2.3954        | 53.0  | 371  | 0.4769   | 3.1375          |
| 2.3954        | 54.0  | 378  | 0.4538   | 3.1092          |
| 2.3641        | 55.0  | 385  | 0.4769   | 3.0999          |
| 2.3651        | 56.0  | 392  | 0.4615   | 3.0860          |
| 2.3651        | 57.0  | 399  | 0.4615   | 3.0813          |
| 2.3182        | 58.0  | 406  | 0.4923   | 3.0692          |
| 2.3029        | 59.0  | 413  | 0.4846   | 3.0610          |
| 2.2988        | 60.0  | 420  | 0.4615   | 3.0627          |
| 2.2988        | 61.0  | 427  | 0.4692   | 3.0520          |
| 2.2865        | 62.0  | 434  | 0.4538   | 3.0395          |
| 2.2623        | 63.0  | 441  | 0.4615   | 3.0357          |
| 2.2623        | 64.0  | 448  | 0.4615   | 3.0333          |
| 2.2252        | 65.0  | 455  | 0.4769   | 3.0229          |
| 2.2339        | 66.0  | 462  | 0.4769   | 3.0203          |
| 2.2339        | 67.0  | 469  | 0.4923   | 3.0076          |
| 2.2017        | 68.0  | 476  | 0.4846   | 2.9876          |
| 2.1972        | 69.0  | 483  | 0.4923   | 2.9716          |
| 2.1964        | 70.0  | 490  | 0.5      | 2.9632          |
| 2.1964        | 71.0  | 497  | 0.4923   | 2.9597          |
| 2.1775        | 72.0  | 504  | 0.5      | 2.9581          |
| 2.1619        | 73.0  | 511  | 0.5077   | 2.9516          |
| 2.1619        | 74.0  | 518  | 0.5154   | 2.9356          |
| 2.1633        | 75.0  | 525  | 0.5077   | 2.9286          |
| 2.1207        | 76.0  | 532  | 0.5154   | 2.9266          |
| 2.1207        | 77.0  | 539  | 0.5231   | 2.9205          |
| 2.1353        | 78.0  | 546  | 0.5154   | 2.9131          |
| 2.1075        | 79.0  | 553  | 0.5231   | 2.9075          |
| 2.1025        | 80.0  | 560  | 0.5231   | 2.9073          |
| 2.1025        | 81.0  | 567  | 0.5154   | 2.9174          |
| 2.1031        | 82.0  | 574  | 0.5308   | 2.9131          |
| 2.0932        | 83.0  | 581  | 0.5308   | 2.9092          |
| 2.0932        | 84.0  | 588  | 0.5308   | 2.8978          |
| 2.0861        | 85.0  | 595  | 0.5308   | 2.8871          |
| 2.0478        | 86.0  | 602  | 0.5385   | 2.8829          |
| 2.0478        | 87.0  | 609  | 0.5462   | 2.8804          |
| 2.0815        | 88.0  | 616  | 0.5462   | 2.8725          |
| 2.0756        | 89.0  | 623  | 0.5462   | 2.8694          |
| 2.065         | 90.0  | 630  | 0.5462   | 2.8665          |
| 2.065         | 91.0  | 637  | 0.5462   | 2.8615          |
| 2.0572        | 92.0  | 644  | 0.5462   | 2.8599          |
| 2.0358        | 93.0  | 651  | 0.5462   | 2.8620          |
| 2.0358        | 94.0  | 658  | 0.5462   | 2.8629          |
| 2.0663        | 95.0  | 665  | 0.5538   | 2.8625          |
| 2.0353        | 96.0  | 672  | 0.5538   | 2.8628          |
| 2.0353        | 97.0  | 679  | 0.5538   | 2.8629          |
| 2.0506        | 98.0  | 686  | 0.5538   | 2.8622          |
| 2.0494        | 99.0  | 693  | 0.5538   | 2.8622          |
| 2.0566        | 100.0 | 700  | 0.5538   | 2.8622          |

### Framework versions

- Transformers 4.37.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.4.0
- Tokenizers 0.15.0
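
A small Python sketch to check a local environment against these pins (purely illustrative; note that 4.37.0.dev0 was an unreleased development build of Transformers, so an exact match requires a source install):

```python
# Compare installed versions with the ones reported in this card.
import datasets
import tokenizers
import torch
import transformers

expected = {
    transformers: "4.37.0.dev0",
    torch: "2.1.0+cu121",
    datasets: "2.4.0",
    tokenizers: "0.15.0",
}
for module, version in expected.items():
    status = "OK" if module.__version__ == version else f"mismatch ({module.__version__})"
    print(f"{module.__name__}: expected {version} -> {status}")
```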