tobikoi-classifier-alpha1

This model is a fine-tuned version of google/vit-base-patch16-224-in21k on an image dataset loaded with the imagefolder builder. It achieves the following results on the evaluation set:

  • Loss: 0.0002
  • Accuracy: 1.0
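
The checkpoint can be tried with the 🤗 Transformers image-classification pipeline. A minimal sketch, assuming the Hub repository id ppicazo/tobikoi-classifier-alpha1 shown on this page; the image path is a placeholder:

```python
from transformers import pipeline

# Load the fine-tuned ViT checkpoint from the Hub.
classifier = pipeline(
    "image-classification",
    model="ppicazo/tobikoi-classifier-alpha1",
)

# "my_koi.jpg" is a hypothetical local file; any PIL-readable image works.
predictions = classifier("my_koi.jpg")
print(predictions)  # list of {"label": ..., "score": ...} dicts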

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
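
The card only states that the data came from an imagefolder dataset. For reference, this is roughly how such a class-per-directory dataset is loaded with 🤗 Datasets; the data_dir path and directory layout below are assumptions, not the author's actual setup:

```python
from datasets import load_dataset

# Hypothetical layout: one subdirectory per class, e.g.
#   data/tobikoi/img001.jpg, data/other/img002.jpg, ...
dataset = load_dataset("imagefolder", data_dir="data")  # path is assumed

# The builder infers class labels from the directory names.
print(dataset["train"].features["label"].names)
```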

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto TrainingArguments follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 1337
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 150.0
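
As a rough reconstruction, the list above corresponds to the following transformers.TrainingArguments; output_dir is a placeholder and anything not listed is left at its library default:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="tobikoi-classifier-alpha1",  # placeholder, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=1337,
    adam_beta1=0.9,          # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=150.0,
)
```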

Training results

| Training Loss | Epoch | Step | Accuracy | Validation Loss |
|:-------------:|:-----:|:----:|:--------:|:---------------:|
| 0.4014 | 1.0 | 54 | 0.7632 | 0.3552 |
| 0.2253 | 2.0 | 108 | 0.9737 | 0.1712 |
| 0.0768 | 3.0 | 162 | 0.9868 | 0.0763 |
| 0.0694 | 4.0 | 216 | 0.9868 | 0.0615 |
| 0.0433 | 5.0 | 270 | 0.9868 | 0.0504 |
| 0.1045 | 6.0 | 324 | 0.9868 | 0.0323 |
| 0.0148 | 7.0 | 378 | 0.9868 | 0.0436 |
| 0.0156 | 8.0 | 432 | 0.9868 | 0.0271 |
| 0.0109 | 9.0 | 486 | 0.9868 | 0.0511 |
| 0.0142 | 10.0 | 540 | 0.9868 | 0.0563 |
| 0.0307 | 11.0 | 594 | 0.9868 | 0.0633 |
| 0.0092 | 12.0 | 648 | 0.9868 | 0.0430 |
| 0.007 | 13.0 | 702 | 0.9868 | 0.0508 |
| 0.0059 | 14.0 | 756 | 0.9868 | 0.0598 |
| 0.0057 | 15.0 | 810 | 0.9868 | 0.0639 |
| 0.0513 | 16.0 | 864 | 0.9868 | 0.0579 |
| 0.0259 | 17.0 | 918 | 0.9868 | 0.0707 |
| 0.0111 | 18.0 | 972 | 0.9868 | 0.0611 |
| 0.014 | 19.0 | 1026 | 0.9868 | 0.0620 |
| 0.004 | 20.0 | 1080 | 1.0 | 0.0058 |
| 0.0036 | 21.0 | 1134 | 1.0 | 0.0044 |
| 0.0545 | 22.0 | 1188 | 1.0 | 0.0114 |
| 0.0131 | 23.0 | 1242 | 0.9868 | 0.0621 |
| 0.0651 | 24.0 | 1296 | 0.9868 | 0.0692 |
| 0.0047 | 25.0 | 1350 | 1.0 | 0.0034 |
| 0.0374 | 26.0 | 1404 | 1.0 | 0.0031 |
| 0.0482 | 27.0 | 1458 | 1.0 | 0.0045 |
| 0.0026 | 28.0 | 1512 | 1.0 | 0.0028 |
| 0.0038 | 29.0 | 1566 | 1.0 | 0.0025 |
| 0.0027 | 30.0 | 1620 | 1.0 | 0.0023 |
| 0.0145 | 31.0 | 1674 | 0.9868 | 0.0698 |
| 0.0022 | 32.0 | 1728 | 0.9868 | 0.0255 |
| 0.0025 | 33.0 | 1782 | 1.0 | 0.0095 |
| 0.0022 | 34.0 | 1836 | 0.9868 | 0.0725 |
| 0.0019 | 35.0 | 1890 | 0.9868 | 0.0592 |
| 0.0159 | 36.0 | 1944 | 0.9868 | 0.0747 |
| 0.0018 | 37.0 | 1998 | 0.9868 | 0.0244 |
| 0.0016 | 38.0 | 2052 | 1.0 | 0.0019 |
| 0.0017 | 39.0 | 2106 | 1.0 | 0.0018 |
| 0.053 | 40.0 | 2160 | 1.0 | 0.0023 |
| 0.0016 | 41.0 | 2214 | 1.0 | 0.0061 |
| 0.0015 | 42.0 | 2268 | 1.0 | 0.0102 |
| 0.0015 | 43.0 | 2322 | 1.0 | 0.0019 |
| 0.0015 | 44.0 | 2376 | 1.0 | 0.0062 |
| 0.0014 | 45.0 | 2430 | 1.0 | 0.0014 |
| 0.0015 | 46.0 | 2484 | 1.0 | 0.0015 |
| 0.0013 | 47.0 | 2538 | 0.9868 | 0.0672 |
| 0.0012 | 48.0 | 2592 | 1.0 | 0.0015 |
| 0.0012 | 49.0 | 2646 | 0.9868 | 0.0700 |
| 0.0012 | 50.0 | 2700 | 0.9868 | 0.0579 |
| 0.0011 | 51.0 | 2754 | 0.9868 | 0.0571 |
| 0.001 | 52.0 | 2808 | 0.9868 | 0.0670 |
| 0.001 | 53.0 | 2862 | 0.9868 | 0.0730 |
| 0.0013 | 54.0 | 2916 | 0.9868 | 0.0135 |
| 0.001 | 55.0 | 2970 | 0.9868 | 0.0836 |
| 0.0009 | 56.0 | 3024 | 1.0 | 0.0010 |
| 0.0009 | 57.0 | 3078 | 0.9868 | 0.0122 |
| 0.001 | 58.0 | 3132 | 0.9868 | 0.0105 |
| 0.0017 | 59.0 | 3186 | 1.0 | 0.0074 |
| 0.0009 | 60.0 | 3240 | 1.0 | 0.0010 |
| 0.0009 | 61.0 | 3294 | 1.0 | 0.0009 |
| 0.0381 | 62.0 | 3348 | 1.0 | 0.0020 |
| 0.0008 | 63.0 | 3402 | 1.0 | 0.0008 |
| 0.0099 | 64.0 | 3456 | 1.0 | 0.0008 |
| 0.0007 | 65.0 | 3510 | 0.9868 | 0.0757 |
| 0.0008 | 66.0 | 3564 | 0.9868 | 0.0764 |
| 0.0007 | 67.0 | 3618 | 0.9737 | 0.1257 |
| 0.0007 | 68.0 | 3672 | 0.9868 | 0.0098 |
| 0.0736 | 69.0 | 3726 | 1.0 | 0.0008 |
| 0.0007 | 70.0 | 3780 | 0.9868 | 0.0605 |
| 0.0006 | 71.0 | 3834 | 1.0 | 0.0012 |
| 0.001 | 72.0 | 3888 | 0.9737 | 0.1666 |
| 0.0042 | 73.0 | 3942 | 1.0 | 0.0007 |
| 0.0006 | 74.0 | 3996 | 1.0 | 0.0007 |
| 0.0007 | 75.0 | 4050 | 1.0 | 0.0007 |
| 0.0006 | 76.0 | 4104 | 0.9868 | 0.0331 |
| 0.0006 | 77.0 | 4158 | 0.9868 | 0.0169 |
| 0.0345 | 78.0 | 4212 | 1.0 | 0.0006 |
| 0.0005 | 79.0 | 4266 | 0.9868 | 0.0762 |
| 0.0005 | 80.0 | 4320 | 1.0 | 0.0007 |
| 0.0005 | 81.0 | 4374 | 1.0 | 0.0005 |
| 0.0005 | 82.0 | 4428 | 1.0 | 0.0006 |
| 0.0005 | 83.0 | 4482 | 1.0 | 0.0005 |
| 0.0005 | 84.0 | 4536 | 1.0 | 0.0005 |
| 0.0047 | 85.0 | 4590 | 1.0 | 0.0007 |
| 0.0005 | 86.0 | 4644 | 1.0 | 0.0005 |
| 0.0005 | 87.0 | 4698 | 1.0 | 0.0005 |
| 0.0004 | 88.0 | 4752 | 1.0 | 0.0004 |
| 0.0004 | 89.0 | 4806 | 1.0 | 0.0004 |
| 0.0005 | 90.0 | 4860 | 1.0 | 0.0005 |
| 0.0004 | 91.0 | 4914 | 1.0 | 0.0005 |
| 0.0067 | 92.0 | 4968 | 1.0 | 0.0004 |
| 0.0004 | 93.0 | 5022 | 1.0 | 0.0004 |
| 0.0004 | 94.0 | 5076 | 1.0 | 0.0004 |
| 0.0004 | 95.0 | 5130 | 1.0 | 0.0004 |
| 0.0004 | 96.0 | 5184 | 1.0 | 0.0004 |
| 0.0004 | 97.0 | 5238 | 1.0 | 0.0004 |
| 0.0004 | 98.0 | 5292 | 1.0 | 0.0004 |
| 0.0003 | 99.0 | 5346 | 1.0 | 0.0004 |
| 0.0003 | 100.0 | 5400 | 1.0 | 0.0003 |
| 0.0003 | 101.0 | 5454 | 1.0 | 0.0004 |
| 0.0004 | 102.0 | 5508 | 1.0 | 0.0005 |
| 0.0004 | 103.0 | 5562 | 1.0 | 0.0005 |
| 0.0004 | 104.0 | 5616 | 1.0 | 0.0004 |
| 0.0006 | 105.0 | 5670 | 1.0 | 0.0003 |
| 0.0005 | 106.0 | 5724 | 1.0 | 0.0003 |
| 0.0003 | 107.0 | 5778 | 1.0 | 0.0003 |
| 0.0003 | 108.0 | 5832 | 1.0 | 0.0003 |
| 0.0003 | 109.0 | 5886 | 1.0 | 0.0003 |
| 0.0003 | 110.0 | 5940 | 1.0 | 0.0003 |
| 0.0003 | 111.0 | 5994 | 1.0 | 0.0003 |
| 0.0003 | 112.0 | 6048 | 1.0 | 0.0003 |
| 0.0003 | 113.0 | 6102 | 1.0 | 0.0003 |
| 0.0003 | 114.0 | 6156 | 1.0 | 0.0003 |
| 0.0003 | 115.0 | 6210 | 1.0 | 0.0003 |
| 0.0003 | 116.0 | 6264 | 1.0 | 0.0003 |
| 0.0003 | 117.0 | 6318 | 1.0 | 0.0003 |
| 0.0003 | 118.0 | 6372 | 1.0 | 0.0003 |
| 0.0002 | 119.0 | 6426 | 1.0 | 0.0002 |
| 0.0002 | 120.0 | 6480 | 1.0 | 0.0002 |
| 0.0002 | 121.0 | 6534 | 1.0 | 0.0002 |
| 0.0003 | 122.0 | 6588 | 1.0 | 0.0002 |
| 0.0002 | 123.0 | 6642 | 1.0 | 0.0002 |
| 0.0002 | 124.0 | 6696 | 1.0 | 0.0002 |
| 0.0002 | 125.0 | 6750 | 1.0 | 0.0002 |
| 0.0002 | 126.0 | 6804 | 1.0 | 0.0002 |
| 0.0712 | 127.0 | 6858 | 1.0 | 0.0002 |
| 0.0002 | 128.0 | 6912 | 1.0 | 0.0002 |
| 0.0002 | 129.0 | 6966 | 1.0 | 0.0002 |
| 0.0002 | 130.0 | 7020 | 1.0 | 0.0002 |
| 0.0002 | 131.0 | 7074 | 1.0 | 0.0002 |
| 0.0002 | 132.0 | 7128 | 1.0 | 0.0002 |
| 0.0002 | 133.0 | 7182 | 1.0 | 0.0002 |
| 0.0002 | 134.0 | 7236 | 1.0 | 0.0002 |
| 0.0002 | 135.0 | 7290 | 1.0 | 0.0002 |
| 0.0003 | 136.0 | 7344 | 1.0 | 0.0002 |
| 0.0002 | 137.0 | 7398 | 1.0 | 0.0002 |
| 0.0002 | 138.0 | 7452 | 1.0 | 0.0002 |
| 0.0028 | 139.0 | 7506 | 1.0 | 0.0002 |
| 0.0006 | 140.0 | 7560 | 1.0 | 0.0002 |
| 0.0002 | 141.0 | 7614 | 1.0 | 0.0002 |
| 0.0002 | 142.0 | 7668 | 1.0 | 0.0002 |
| 0.0004 | 143.0 | 7722 | 1.0 | 0.0002 |
| 0.0002 | 144.0 | 7776 | 1.0 | 0.0002 |
| 0.0002 | 145.0 | 7830 | 1.0 | 0.0002 |
| 0.1028 | 146.0 | 7884 | 1.0 | 0.0002 |
| 0.0002 | 147.0 | 7938 | 1.0 | 0.0002 |
| 0.0002 | 148.0 | 7992 | 1.0 | 0.0002 |
| 0.0002 | 149.0 | 8046 | 1.0 | 0.0002 |
| 0.0002 | 150.0 | 8100 | 1.0 | 0.0002 |

Framework versions

  • Transformers 4.36.0.dev0
  • Pytorch 2.1.1+cu121
  • Datasets 2.15.0
  • Tokenizers 0.15.0

Model size: 85.8M parameters (Safetensors, F32)
