
meat_calssify_fresh_crop_fixed_epoch120_V_0_1

This model is a fine-tuned version of google/vit-base-patch16-224-in21k on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5969
  • Accuracy: 0.8165
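
The card has no usage section; as a hedged sketch, a checkpoint of this kind can typically be loaded with the transformers image-classification pipeline. The checkpoint location and the input image filename below are placeholders, not values taken from this card.

```python
from transformers import pipeline

# Placeholder: replace with the local output directory or the full Hub repo id
# of meat_calssify_fresh_crop_fixed_epoch120_V_0_1.
checkpoint = "meat_calssify_fresh_crop_fixed_epoch120_V_0_1"

classifier = pipeline("image-classification", model=checkpoint)

# Returns a list of {"label": ..., "score": ...} dicts, highest score first.
predictions = classifier("example_meat_image.jpg")  # placeholder image path
print(predictions)
```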

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
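
The exact dataset is not documented, but the imagefolder loader named above expects one sub-directory per class. A minimal loading sketch, in which the data directory and class folder names are assumptions rather than values from this card:

```python
from datasets import load_dataset

# Hypothetical layout: the data directory contains one sub-folder per freshness
# class, e.g. meat_images/fresh/, meat_images/not_fresh/ (names are assumptions).
dataset = load_dataset("imagefolder", data_dir="meat_images")

# Class labels are inferred from the sub-folder names.
print(dataset["train"].features["label"].names)
```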

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 120
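
For reference, these settings map onto transformers TrainingArguments roughly as below. The output_dir, the per-device reading of the batch sizes, and the per-epoch evaluation strategy are assumptions inferred from the card, not stated in it.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="meat_calssify_fresh_crop_fixed_epoch120_V_0_1",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=64,   # card lists a train batch size of 64
    per_device_eval_batch_size=64,    # card lists an eval batch size of 64
    seed=42,
    # The card lists Adam with these betas/epsilon; Trainer's default AdamW
    # optimizer takes the same hyperparameters.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=120,
    eval_strategy="epoch",            # inferred from the per-epoch validation rows below
)
```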

Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy
1.0907 1.0 10 1.0937 0.3291
1.0777 2.0 20 1.0915 0.3354
1.0596 3.0 30 1.0801 0.3544
1.0333 4.0 40 1.0463 0.4810
0.9965 5.0 50 0.9945 0.5316
0.9588 6.0 60 0.9807 0.5127
0.9183 7.0 70 0.9944 0.5063
0.8966 8.0 80 1.0074 0.5127
0.8199 9.0 90 0.9472 0.5506
0.8089 10.0 100 0.9102 0.5886
0.7151 11.0 110 0.7697 0.6962
0.6578 12.0 120 0.9557 0.5570
0.6375 13.0 130 0.7262 0.6709
0.5994 14.0 140 0.7609 0.6709
0.5086 15.0 150 0.7112 0.7089
0.4771 16.0 160 0.6239 0.7658
0.4631 17.0 170 0.7229 0.7278
0.5158 18.0 180 0.8230 0.6835
0.4938 19.0 190 0.6708 0.7278
0.4287 20.0 200 0.5675 0.7722
0.3256 21.0 210 0.6100 0.7468
0.3555 22.0 220 0.6967 0.7342
0.3453 23.0 230 0.6273 0.7532
0.3391 24.0 240 0.7153 0.7468
0.3129 25.0 250 0.7745 0.6835
0.3296 26.0 260 0.7254 0.7089
0.2695 27.0 270 0.6191 0.7658
0.2704 28.0 280 0.6397 0.7342
0.2466 29.0 290 0.7419 0.7342
0.2655 30.0 300 0.6966 0.7532
0.2448 31.0 310 0.6931 0.7532
0.2286 32.0 320 0.5090 0.7975
0.2419 33.0 330 0.5987 0.7405
0.1965 34.0 340 0.7482 0.7405
0.197 35.0 350 0.8851 0.7089
0.1829 36.0 360 0.5874 0.7848
0.187 37.0 370 0.7249 0.7405
0.2011 38.0 380 0.6821 0.7468
0.1927 39.0 390 0.7133 0.7848
0.2043 40.0 400 0.6956 0.7468
0.1661 41.0 410 0.6911 0.7532
0.1625 42.0 420 0.7424 0.7405
0.1876 43.0 430 0.6222 0.7722
0.1865 44.0 440 0.6250 0.8101
0.1604 45.0 450 0.6950 0.7405
0.2072 46.0 460 0.8475 0.7025
0.1816 47.0 470 0.5856 0.7975
0.1683 48.0 480 0.6331 0.7911
0.1698 49.0 490 0.8147 0.7278
0.1724 50.0 500 0.5908 0.7848
0.1231 51.0 510 0.4522 0.8291
0.1121 52.0 520 0.5198 0.8291
0.1335 53.0 530 0.6377 0.7785
0.1414 54.0 540 0.6638 0.7722
0.1263 55.0 550 0.6548 0.7722
0.1427 56.0 560 0.6146 0.7785
0.1137 57.0 570 0.5344 0.8101
0.1504 58.0 580 0.7023 0.7785
0.1549 59.0 590 0.8293 0.7152
0.1142 60.0 600 0.7865 0.7658
0.1271 61.0 610 0.6282 0.8038
0.1134 62.0 620 0.7117 0.7658
0.0954 63.0 630 0.5500 0.8165
0.1011 64.0 640 0.5801 0.7911
0.0878 65.0 650 0.4268 0.8418
0.1065 66.0 660 0.6277 0.8038
0.1298 67.0 670 0.5940 0.8038
0.111 68.0 680 0.6945 0.7785
0.0955 69.0 690 0.6320 0.8038
0.0728 70.0 700 0.6484 0.7975
0.0893 71.0 710 0.5842 0.8165
0.0962 72.0 720 0.5417 0.8354
0.0963 73.0 730 0.7366 0.7848
0.1103 74.0 740 0.5413 0.8354
0.1145 75.0 750 0.6310 0.7911
0.1093 76.0 760 0.5101 0.8481
0.0934 77.0 770 0.5049 0.8101
0.0914 78.0 780 0.6828 0.7975
0.0739 79.0 790 0.8685 0.7595
0.1098 80.0 800 0.5542 0.7975
0.0748 81.0 810 0.4864 0.8354
0.0763 82.0 820 0.6708 0.7785
0.0805 83.0 830 0.5730 0.8291
0.0936 84.0 840 0.5680 0.8228
0.0664 85.0 850 0.5645 0.8228
0.0932 86.0 860 0.4601 0.8608
0.0846 87.0 870 0.7324 0.7911
0.0799 88.0 880 0.6234 0.8101
0.0707 89.0 890 0.4808 0.8544
0.0626 90.0 900 0.6119 0.7848
0.066 91.0 910 0.5173 0.8228
0.0701 92.0 920 0.7111 0.7722
0.0862 93.0 930 0.6035 0.7975
0.0397 94.0 940 0.5329 0.8418
0.095 95.0 950 0.6635 0.7911
0.0688 96.0 960 0.4878 0.8734
0.07 97.0 970 0.5253 0.8608
0.0704 98.0 980 0.4443 0.8608
0.0883 99.0 990 0.5571 0.8165
0.064 100.0 1000 0.7047 0.7911
0.0547 101.0 1010 0.6558 0.8101
0.0686 102.0 1020 0.6330 0.8165
0.0806 103.0 1030 0.5754 0.8354
0.0481 104.0 1040 0.5074 0.8544
0.0499 105.0 1050 0.6701 0.8165
0.0703 106.0 1060 0.6151 0.8291
0.0921 107.0 1070 0.5935 0.8354
0.0426 108.0 1080 0.6534 0.7975
0.0618 109.0 1090 0.5265 0.8165
0.0597 110.0 1100 0.5604 0.8354
0.0471 111.0 1110 0.5451 0.8354
0.0541 112.0 1120 0.5182 0.8544
0.0369 113.0 1130 0.5276 0.8291
0.0571 114.0 1140 0.4766 0.8354
0.0469 115.0 1150 0.6508 0.8101
0.0877 116.0 1160 0.5894 0.8418
0.0681 117.0 1170 0.4952 0.8418
0.0303 118.0 1180 0.5804 0.8418
0.0536 119.0 1190 0.7055 0.8101
0.0576 120.0 1200 0.5969 0.8165
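
The accuracy column is consistent with argmax classification accuracy over the validation set. A minimal compute_metrics sketch that would report such a value, assuming the standard `evaluate` accuracy metric (the actual metric code is not included in this card):

```python
import numpy as np
import evaluate

# Assumption: accuracy computed with the `evaluate` library's accuracy metric.
accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)  # predicted class per example
    return accuracy.compute(predictions=predictions, references=labels)
```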

Framework versions

  • Transformers 4.41.2
  • PyTorch 2.3.1
  • Datasets 2.20.0
  • Tokenizers 0.19.1
Model size

  • 85.8M params
  • Tensor type: F32 (Safetensors)