
vit-letter-identification-v2

This model is a fine-tuned version of google/vit-base-patch16-224-in21k on a local dataset loaded with the Hugging Face `imagefolder` loader. It achieves the following results on the evaluation set:

  • Loss: 1.1135
  • Accuracy: 0.8627

Model description

More information needed; judging by the model name and base checkpoint, this appears to be a Vision Transformer (ViT) fine-tuned to identify individual letters in images.

Intended uses & limitations

More information needed
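
In the absence of author documentation, the snippet below is a minimal inference sketch. The hub repo id and the input image path are placeholders, not confirmed by this card:

```python
from transformers import pipeline

# NOTE: "<user>/vit-letter-identification-v2" is a hypothetical repo id;
# this card does not state the full hub path of the published model.
classifier = pipeline("image-classification", model="<user>/vit-letter-identification-v2")

# "letter.png" is a hypothetical input: an image of a single letter.
predictions = classifier("letter.png")
print(predictions)  # list of {'label': ..., 'score': ...} dicts, best first
```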

Training and evaluation data

More information needed
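
The card only names the `imagefolder` loader, which builds a classification dataset from a directory tree. A sketch of how such a dataset is typically loaded, with a hypothetical directory layout:

```python
from datasets import load_dataset

# Hypothetical layout: one subdirectory per letter class, e.g.
#   data/train/A/img1.png, data/train/B/img2.png, ...
dataset = load_dataset("imagefolder", data_dir="data")

# Class labels are inferred from the subdirectory names.
print(dataset["train"].features["label"].names)
```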

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reconstructing them follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 100
  • eval_batch_size: 102
  • seed: 1337
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 120.0
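
A sketch reconstructing the reported hyperparameters, assuming single-device training. `output_dir` and the strategy arguments are assumptions; unlisted options (warmup, weight decay, ...) are left at their defaults, and the reported Adam betas and epsilon match the library defaults:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-letter-identification-v2",  # assumption
    learning_rate=2e-5,
    per_device_train_batch_size=100,
    per_device_eval_batch_size=102,
    seed=1337,
    lr_scheduler_type="linear",
    num_train_epochs=120.0,
    evaluation_strategy="epoch",  # assumption: the table shows one eval per epoch
    logging_strategy="epoch",
)
```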

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 6 | 3.2331 | 0.0882 |
| 3.2363 | 2.0 | 12 | 3.2025 | 0.1373 |
| 3.2363 | 3.0 | 18 | 3.1761 | 0.1863 |
| 3.1622 | 4.0 | 24 | 3.1238 | 0.2255 |
| 3.0918 | 5.0 | 30 | 3.0789 | 0.3137 |
| 3.0918 | 6.0 | 36 | 3.0280 | 0.3235 |
| 3.0081 | 7.0 | 42 | 2.9878 | 0.3431 |
| 3.0081 | 8.0 | 48 | 2.9316 | 0.3824 |
| 2.9118 | 9.0 | 54 | 2.8864 | 0.4314 |
| 2.8231 | 10.0 | 60 | 2.8314 | 0.4510 |
| 2.8231 | 11.0 | 66 | 2.7817 | 0.5196 |
| 2.7149 | 12.0 | 72 | 2.7278 | 0.5196 |
| 2.7149 | 13.0 | 78 | 2.6796 | 0.5588 |
| 2.6202 | 14.0 | 84 | 2.6203 | 0.5882 |
| 2.5243 | 15.0 | 90 | 2.5674 | 0.5882 |
| 2.5243 | 16.0 | 96 | 2.5170 | 0.6078 |
| 2.4279 | 17.0 | 102 | 2.4672 | 0.6176 |
| 2.4279 | 18.0 | 108 | 2.4285 | 0.5980 |
| 2.3404 | 19.0 | 114 | 2.3784 | 0.6569 |
| 2.2633 | 20.0 | 120 | 2.3348 | 0.6471 |
| 2.2633 | 21.0 | 126 | 2.2872 | 0.6667 |
| 2.1838 | 22.0 | 132 | 2.2539 | 0.6569 |
| 2.1838 | 23.0 | 138 | 2.2232 | 0.6765 |
| 2.1022 | 24.0 | 144 | 2.1867 | 0.6471 |
| 2.0364 | 25.0 | 150 | 2.1489 | 0.6863 |
| 2.0364 | 26.0 | 156 | 2.1099 | 0.7255 |
| 1.96 | 27.0 | 162 | 2.0767 | 0.7157 |
| 1.96 | 28.0 | 168 | 2.0417 | 0.7157 |
| 1.9235 | 29.0 | 174 | 2.0162 | 0.7353 |
| 1.8484 | 30.0 | 180 | 1.9787 | 0.7451 |
| 1.8484 | 31.0 | 186 | 1.9548 | 0.7451 |
| 1.7971 | 32.0 | 192 | 1.9329 | 0.7549 |
| 1.7971 | 33.0 | 198 | 1.9052 | 0.7647 |
| 1.7409 | 34.0 | 204 | 1.8827 | 0.7549 |
| 1.7006 | 35.0 | 210 | 1.8589 | 0.7745 |
| 1.7006 | 36.0 | 216 | 1.8294 | 0.7843 |
| 1.6426 | 37.0 | 222 | 1.8098 | 0.7843 |
| 1.6426 | 38.0 | 228 | 1.7809 | 0.7647 |
| 1.6102 | 39.0 | 234 | 1.7643 | 0.7843 |
| 1.5704 | 40.0 | 240 | 1.7399 | 0.8039 |
| 1.5704 | 41.0 | 246 | 1.7193 | 0.8137 |
| 1.5264 | 42.0 | 252 | 1.6980 | 0.8333 |
| 1.5264 | 43.0 | 258 | 1.6840 | 0.8039 |
| 1.4821 | 44.0 | 264 | 1.6644 | 0.8235 |
| 1.4506 | 45.0 | 270 | 1.6467 | 0.8235 |
| 1.4506 | 46.0 | 276 | 1.6333 | 0.8235 |
| 1.4358 | 47.0 | 282 | 1.6095 | 0.8235 |
| 1.4358 | 48.0 | 288 | 1.5906 | 0.8235 |
| 1.3695 | 49.0 | 294 | 1.5720 | 0.8431 |
| 1.367 | 50.0 | 300 | 1.5610 | 0.8333 |
| 1.367 | 51.0 | 306 | 1.5440 | 0.8529 |
| 1.3299 | 52.0 | 312 | 1.5359 | 0.8333 |
| 1.3299 | 53.0 | 318 | 1.5129 | 0.8333 |
| 1.2765 | 54.0 | 324 | 1.5057 | 0.8235 |
| 1.2785 | 55.0 | 330 | 1.4867 | 0.8235 |
| 1.2785 | 56.0 | 336 | 1.4751 | 0.8333 |
| 1.2355 | 57.0 | 342 | 1.4553 | 0.8235 |
| 1.2355 | 58.0 | 348 | 1.4491 | 0.8235 |
| 1.2418 | 59.0 | 354 | 1.4289 | 0.8431 |
| 1.2058 | 60.0 | 360 | 1.4185 | 0.8235 |
| 1.2058 | 61.0 | 366 | 1.4104 | 0.8333 |
| 1.164 | 62.0 | 372 | 1.3968 | 0.8333 |
| 1.164 | 63.0 | 378 | 1.3846 | 0.8431 |
| 1.1529 | 64.0 | 384 | 1.3697 | 0.8431 |
| 1.1408 | 65.0 | 390 | 1.3633 | 0.8431 |
| 1.1408 | 66.0 | 396 | 1.3505 | 0.8431 |
| 1.1102 | 67.0 | 402 | 1.3371 | 0.8529 |
| 1.1102 | 68.0 | 408 | 1.3282 | 0.8529 |
| 1.0906 | 69.0 | 414 | 1.3240 | 0.8431 |
| 1.0759 | 70.0 | 420 | 1.3163 | 0.8431 |
| 1.0759 | 71.0 | 426 | 1.3044 | 0.8529 |
| 1.0651 | 72.0 | 432 | 1.2924 | 0.8431 |
| 1.0651 | 73.0 | 438 | 1.2867 | 0.8529 |
| 1.0501 | 74.0 | 444 | 1.2749 | 0.8529 |
| 1.0238 | 75.0 | 450 | 1.2688 | 0.8431 |
| 1.0238 | 76.0 | 456 | 1.2568 | 0.8529 |
| 1.0046 | 77.0 | 462 | 1.2502 | 0.8529 |
| 1.0046 | 78.0 | 468 | 1.2460 | 0.8529 |
| 0.9946 | 79.0 | 474 | 1.2455 | 0.8431 |
| 0.9998 | 80.0 | 480 | 1.2343 | 0.8529 |
| 0.9998 | 81.0 | 486 | 1.2286 | 0.8529 |
| 0.9709 | 82.0 | 492 | 1.2195 | 0.8431 |
| 0.9709 | 83.0 | 498 | 1.2126 | 0.8529 |
| 0.963 | 84.0 | 504 | 1.2102 | 0.8431 |
| 0.9499 | 85.0 | 510 | 1.2024 | 0.8431 |
| 0.9499 | 86.0 | 516 | 1.1980 | 0.8529 |
| 0.937 | 87.0 | 522 | 1.1912 | 0.8529 |
| 0.937 | 88.0 | 528 | 1.1883 | 0.8431 |
| 0.9389 | 89.0 | 534 | 1.1845 | 0.8529 |
| 0.9181 | 90.0 | 540 | 1.1811 | 0.8529 |
| 0.9181 | 91.0 | 546 | 1.1777 | 0.8431 |
| 0.9219 | 92.0 | 552 | 1.1743 | 0.8627 |
| 0.9219 | 93.0 | 558 | 1.1675 | 0.8627 |
| 0.9067 | 94.0 | 564 | 1.1598 | 0.8627 |
| 0.9009 | 95.0 | 570 | 1.1601 | 0.8627 |
| 0.9009 | 96.0 | 576 | 1.1564 | 0.8529 |
| 0.8914 | 97.0 | 582 | 1.1505 | 0.8529 |
| 0.8914 | 98.0 | 588 | 1.1487 | 0.8529 |
| 0.8739 | 99.0 | 594 | 1.1480 | 0.8627 |
| 0.8742 | 100.0 | 600 | 1.1413 | 0.8529 |
| 0.8742 | 101.0 | 606 | 1.1368 | 0.8627 |
| 0.8679 | 102.0 | 612 | 1.1361 | 0.8627 |
| 0.8679 | 103.0 | 618 | 1.1317 | 0.8627 |
| 0.8516 | 104.0 | 624 | 1.1296 | 0.8529 |
| 0.876 | 105.0 | 630 | 1.1288 | 0.8627 |
| 0.876 | 106.0 | 636 | 1.1264 | 0.8627 |
| 0.8591 | 107.0 | 642 | 1.1238 | 0.8627 |
| 0.8591 | 108.0 | 648 | 1.1227 | 0.8627 |
| 0.8586 | 109.0 | 654 | 1.1208 | 0.8627 |
| 0.8415 | 110.0 | 660 | 1.1194 | 0.8627 |
| 0.8415 | 111.0 | 666 | 1.1185 | 0.8627 |
| 0.8465 | 112.0 | 672 | 1.1178 | 0.8529 |
| 0.8465 | 113.0 | 678 | 1.1184 | 0.8529 |
| 0.8503 | 114.0 | 684 | 1.1183 | 0.8431 |
| 0.8332 | 115.0 | 690 | 1.1174 | 0.8431 |
| 0.8332 | 116.0 | 696 | 1.1165 | 0.8431 |
| 0.8476 | 117.0 | 702 | 1.1153 | 0.8529 |
| 0.8476 | 118.0 | 708 | 1.1142 | 0.8529 |
| 0.8382 | 119.0 | 714 | 1.1137 | 0.8627 |
| 0.8527 | 120.0 | 720 | 1.1135 | 0.8627 |
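
With six optimizer steps per epoch at a train batch size of 100, the training split appears to contain roughly 600 images, and the eval batch size of 102 matches the apparent size of the evaluation set. The Accuracy column is plausibly top-1 accuracy over that split, computed via the usual `compute_metrics` hook; a sketch under that assumption:

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

# Assumption: "Accuracy" in the table above is top-1 accuracy over the
# evaluation split, produced by a hook like this passed to the Trainer.
def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```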

Framework versions

  • Transformers 4.37.0.dev0
  • Pytorch 2.1.0+cu121
  • Datasets 2.15.0
  • Tokenizers 0.15.0