---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: meat_calssify_fresh_crop_fixed_epoch100_V_0_3
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.759493670886076
---

meat_calssify_fresh_crop_fixed_epoch100_V_0_3

This model is a fine-tuned version of google/vit-base-patch16-224-in21k on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8015
  • Accuracy: 0.7595

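Since the card does not include a usage example, the block below is a minimal inference sketch. It assumes the checkpoint is published under a repo id such as talli96123/meat_calssify_fresh_crop_fixed_epoch100_V_0_3 (inferred from this card's title, not confirmed) and that a real image path is substituted for the placeholder.

```python
# Minimal sketch, not the author's original code: load the fine-tuned ViT
# checkpoint and classify a single image. The repo id is an assumption based
# on this card's title; replace it, or point at a local checkpoint directory.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "talli96123/meat_calssify_fresh_crop_fixed_epoch100_V_0_3"  # assumed

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label[pred])
```
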
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch reproducing them appears after the list):

  • learning_rate: 5e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100

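For reference, the block below is a minimal sketch of a Trainer configuration that reproduces these hyperparameters; it is not the author's original script. The data directory, train/validation split, image preprocessing, and the use of the evaluate library for the accuracy metric are assumptions filled in for illustration.

```python
# Minimal sketch of a Trainer setup matching the hyperparameters above.
# The data directory, the 80/20 split, and the preprocessing are assumptions;
# the card only states that an "imagefolder" dataset was used.
import numpy as np
import torch
import evaluate
from datasets import load_dataset
from transformers import (
    AutoImageProcessor,
    AutoModelForImageClassification,
    Trainer,
    TrainingArguments,
)

base = "google/vit-base-patch16-224-in21k"
data = load_dataset("imagefolder", data_dir="path/to/images")   # assumed path
data = data["train"].train_test_split(test_size=0.2, seed=42)   # assumed split

processor = AutoImageProcessor.from_pretrained(base)
labels = data["train"].features["label"].names
model = AutoModelForImageClassification.from_pretrained(
    base,
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={name: i for i, name in enumerate(labels)},
)

def transform(batch):
    # Resize/normalize PIL images into the pixel_values ViT expects.
    batch["pixel_values"] = processor(
        [img.convert("RGB") for img in batch["image"]], return_tensors="pt"
    )["pixel_values"]
    return batch

data = data.with_transform(transform)

def collate_fn(examples):
    return {
        "pixel_values": torch.stack([ex["pixel_values"] for ex in examples]),
        "labels": torch.tensor([ex["label"] for ex in examples]),
    }

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    preds = np.argmax(eval_pred.predictions, axis=-1)
    return accuracy.compute(predictions=preds, references=eval_pred.label_ids)

args = TrainingArguments(
    output_dir="meat_calssify_fresh_crop_fixed_epoch100_V_0_3",
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    eval_strategy="epoch",
    remove_unused_columns=False,  # keep the "image" column for the transform
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=data["train"],
    eval_dataset=data["test"],
    data_collator=collate_fn,
    compute_metrics=compute_metrics,
)
trainer.train()
```
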
Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.1038 | 1.0 | 10 | 1.1003 | 0.3165 |
| 1.0874 | 2.0 | 20 | 1.0824 | 0.4367 |
| 1.0699 | 3.0 | 30 | 1.0695 | 0.4051 |
| 1.0363 | 4.0 | 40 | 1.0423 | 0.4620 |
| 0.9912 | 5.0 | 50 | 0.9911 | 0.5253 |
| 0.9378 | 6.0 | 60 | 0.9630 | 0.5380 |
| 0.8829 | 7.0 | 70 | 0.9239 | 0.5696 |
| 0.8168 | 8.0 | 80 | 0.9127 | 0.5759 |
| 0.758 | 9.0 | 90 | 0.8583 | 0.6392 |
| 0.7032 | 10.0 | 100 | 0.7911 | 0.6772 |
| 0.6348 | 11.0 | 110 | 0.8342 | 0.5949 |
| 0.6082 | 12.0 | 120 | 0.8187 | 0.6203 |
| 0.5936 | 13.0 | 130 | 0.6830 | 0.7468 |
| 0.5726 | 14.0 | 140 | 0.8194 | 0.6962 |
| 0.5774 | 15.0 | 150 | 0.7164 | 0.6709 |
| 0.4617 | 16.0 | 160 | 0.8145 | 0.6456 |
| 0.4399 | 17.0 | 170 | 0.6810 | 0.7215 |
| 0.4065 | 18.0 | 180 | 0.7049 | 0.7089 |
| 0.4012 | 19.0 | 190 | 0.7462 | 0.6962 |
| 0.3553 | 20.0 | 200 | 0.7550 | 0.6835 |
| 0.419 | 21.0 | 210 | 0.9119 | 0.5886 |
| 0.3862 | 22.0 | 220 | 0.7467 | 0.6646 |
| 0.3218 | 23.0 | 230 | 0.7734 | 0.7025 |
| 0.2856 | 24.0 | 240 | 0.7714 | 0.6772 |
| 0.2853 | 25.0 | 250 | 0.7505 | 0.7215 |
| 0.3005 | 26.0 | 260 | 0.8128 | 0.7152 |
| 0.3128 | 27.0 | 270 | 0.8520 | 0.6456 |
| 0.3146 | 28.0 | 280 | 0.8034 | 0.6962 |
| 0.3091 | 29.0 | 290 | 0.8961 | 0.6835 |
| 0.3128 | 30.0 | 300 | 0.8976 | 0.6772 |
| 0.2696 | 31.0 | 310 | 0.7489 | 0.7089 |
| 0.286 | 32.0 | 320 | 0.6697 | 0.7532 |
| 0.3084 | 33.0 | 330 | 0.8061 | 0.7089 |
| 0.2527 | 34.0 | 340 | 0.6714 | 0.7658 |
| 0.2239 | 35.0 | 350 | 0.6858 | 0.7595 |
| 0.2251 | 36.0 | 360 | 0.7142 | 0.7405 |
| 0.2049 | 37.0 | 370 | 0.6644 | 0.7785 |
| 0.2141 | 38.0 | 380 | 0.6722 | 0.7532 |
| 0.2265 | 39.0 | 390 | 0.8172 | 0.7089 |
| 0.1847 | 40.0 | 400 | 0.5876 | 0.7848 |
| 0.1692 | 41.0 | 410 | 0.7485 | 0.7468 |
| 0.1759 | 42.0 | 420 | 0.6973 | 0.7468 |
| 0.2126 | 43.0 | 430 | 0.7180 | 0.7468 |
| 0.2217 | 44.0 | 440 | 0.8617 | 0.6772 |
| 0.1662 | 45.0 | 450 | 0.7264 | 0.7468 |
| 0.1168 | 46.0 | 460 | 0.6226 | 0.7848 |
| 0.1737 | 47.0 | 470 | 0.7201 | 0.7658 |
| 0.1673 | 48.0 | 480 | 0.7411 | 0.7658 |
| 0.1992 | 49.0 | 490 | 0.6667 | 0.7722 |
| 0.1327 | 50.0 | 500 | 0.8436 | 0.6962 |
| 0.1409 | 51.0 | 510 | 0.8467 | 0.6899 |
| 0.1325 | 52.0 | 520 | 0.8331 | 0.7278 |
| 0.1247 | 53.0 | 530 | 0.6017 | 0.7722 |
| 0.1215 | 54.0 | 540 | 0.5934 | 0.8038 |
| 0.1556 | 55.0 | 550 | 0.8121 | 0.7215 |
| 0.1615 | 56.0 | 560 | 0.5814 | 0.7911 |
| 0.1268 | 57.0 | 570 | 0.6809 | 0.7468 |
| 0.1258 | 58.0 | 580 | 0.5749 | 0.7975 |
| 0.1128 | 59.0 | 590 | 0.6332 | 0.8101 |
| 0.1519 | 60.0 | 600 | 0.7176 | 0.7785 |
| 0.1303 | 61.0 | 610 | 0.6800 | 0.7405 |
| 0.1256 | 62.0 | 620 | 0.7101 | 0.7468 |
| 0.1139 | 63.0 | 630 | 0.7587 | 0.7532 |
| 0.0914 | 64.0 | 640 | 0.6320 | 0.8038 |
| 0.1314 | 65.0 | 650 | 0.7287 | 0.7658 |
| 0.1402 | 66.0 | 660 | 0.9050 | 0.7025 |
| 0.0947 | 67.0 | 670 | 0.5996 | 0.7785 |
| 0.0902 | 68.0 | 680 | 0.6142 | 0.8038 |
| 0.1101 | 69.0 | 690 | 0.8431 | 0.7405 |
| 0.1138 | 70.0 | 700 | 0.6796 | 0.7658 |
| 0.0875 | 71.0 | 710 | 0.8089 | 0.7405 |
| 0.1006 | 72.0 | 720 | 0.6522 | 0.7532 |
| 0.0811 | 73.0 | 730 | 0.7060 | 0.7975 |
| 0.0919 | 74.0 | 740 | 0.7761 | 0.7658 |
| 0.0717 | 75.0 | 750 | 0.8626 | 0.7089 |
| 0.0961 | 76.0 | 760 | 0.6235 | 0.7975 |
| 0.0869 | 77.0 | 770 | 0.6554 | 0.7975 |
| 0.096 | 78.0 | 780 | 0.6839 | 0.7658 |
| 0.0926 | 79.0 | 790 | 0.6482 | 0.8038 |
| 0.0702 | 80.0 | 800 | 0.8970 | 0.7405 |
| 0.0802 | 81.0 | 810 | 0.7076 | 0.7785 |
| 0.0631 | 82.0 | 820 | 0.5784 | 0.8228 |
| 0.0874 | 83.0 | 830 | 0.6042 | 0.8038 |
| 0.092 | 84.0 | 840 | 0.6569 | 0.8038 |
| 0.0827 | 85.0 | 850 | 0.7801 | 0.7848 |
| 0.0796 | 86.0 | 860 | 0.7321 | 0.7975 |
| 0.0731 | 87.0 | 870 | 0.6231 | 0.7911 |
| 0.0731 | 88.0 | 880 | 0.6244 | 0.8038 |
| 0.0694 | 89.0 | 890 | 0.6433 | 0.7975 |
| 0.0601 | 90.0 | 900 | 0.7026 | 0.7785 |
| 0.0715 | 91.0 | 910 | 0.5609 | 0.8165 |
| 0.0782 | 92.0 | 920 | 0.5387 | 0.8481 |
| 0.0685 | 93.0 | 930 | 0.5740 | 0.8165 |
| 0.0508 | 94.0 | 940 | 0.6352 | 0.8291 |
| 0.0871 | 95.0 | 950 | 0.6687 | 0.8038 |
| 0.0533 | 96.0 | 960 | 0.5791 | 0.8165 |
| 0.0525 | 97.0 | 970 | 0.8043 | 0.7532 |
| 0.0884 | 98.0 | 980 | 0.7164 | 0.7911 |
| 0.0619 | 99.0 | 990 | 0.7417 | 0.7785 |
| 0.0703 | 100.0 | 1000 | 0.8015 | 0.7595 |

Framework versions

  • Transformers 4.41.2
  • PyTorch 2.3.1
  • Datasets 2.20.0
  • Tokenizers 0.19.1