
invitrace-vit-food

This model is a fine-tuned version of google/vit-base-patch16-224-in21k on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.3123
  • Accuracy: 0.7150
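
The card does not yet include a usage snippet; below is a minimal inference sketch, assuming the checkpoint is published under the repo id Tuu-invitrace/invitrace-vit-food and that `food.jpg` is a placeholder path for any food photo:

```python
from transformers import pipeline

# Load the fine-tuned ViT classifier from the Hub.
# "Tuu-invitrace/invitrace-vit-food" is the repo id for this card; use a local
# path instead if the checkpoint lives on disk.
classifier = pipeline("image-classification", model="Tuu-invitrace/invitrace-vit-food")

# "food.jpg" is a placeholder; any image path, URL, or PIL.Image works.
for pred in classifier("food.jpg", top_k=5):
    print(f"{pred['label']}: {pred['score']:.3f}")
```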

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
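
The card only states that training used the generic imagefolder builder. A sketch of how such a dataset is typically loaded and split with the datasets library, assuming a one-sub-directory-per-class layout; the directory path and the 90/10 split below are illustrative, not the documented setup:

```python
from datasets import load_dataset

# "imagefolder" is the generic datasets builder named in this card; it infers
# class labels from sub-directory names, e.g. path/to/food_images/<class>/*.jpg.
dataset = load_dataset("imagefolder", data_dir="path/to/food_images")

# Hold out 10% for evaluation; the split actually used for this model is not documented.
splits = dataset["train"].train_test_split(test_size=0.1, seed=42)
train_ds, eval_ds = splits["train"], splits["test"]

print(train_ds.features["label"].names[:5])  # class names inferred from folder names
```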

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 8
  • mixed_precision_training: Native AMP
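
Under the standard transformers Trainer setup that auto-generates cards like this one, the settings above correspond roughly to the following TrainingArguments. This is a sketch, not the original training script: the output directory is an assumption, the 200-step evaluation cadence is inferred from the results table below, and the Adam betas/epsilon listed above are the optimizer defaults.

```python
from transformers import TrainingArguments

# Sketch of the configuration implied by the hyperparameters above.
# fp16=True ("Native AMP") requires a CUDA device at runtime.
training_args = TrainingArguments(
    output_dir="invitrace-vit-food",   # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=8,
    fp16=True,                         # mixed_precision_training: Native AMP
    eval_strategy="steps",             # the results table reports eval every 200 steps
    eval_steps=200,
    logging_steps=200,                 # assumed to match the eval cadence
)
```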

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 5.0709 | 0.0803 | 200 | 5.0587 | 0.0460 |
| 4.953 | 0.1605 | 400 | 4.9295 | 0.1174 |
| 4.8249 | 0.2408 | 600 | 4.7934 | 0.2099 |
| 4.6523 | 0.3210 | 800 | 4.6597 | 0.2266 |
| 4.5648 | 0.4013 | 1000 | 4.5348 | 0.2814 |
| 4.5212 | 0.4815 | 1200 | 4.4125 | 0.2882 |
| 4.2975 | 0.5618 | 1400 | 4.2945 | 0.3345 |
| 4.2548 | 0.6421 | 1600 | 4.1838 | 0.3761 |
| 4.0395 | 0.7223 | 1800 | 4.0774 | 0.3873 |
| 3.9615 | 0.8026 | 2000 | 3.9822 | 0.4080 |
| 3.9325 | 0.8828 | 2200 | 3.8886 | 0.4124 |
| 3.8862 | 0.9631 | 2400 | 3.7977 | 0.4479 |
| 3.3958 | 1.0433 | 2600 | 3.6938 | 0.4618 |
| 3.4241 | 1.1236 | 2800 | 3.6019 | 0.4722 |
| 3.3326 | 1.2039 | 3000 | 3.5157 | 0.4951 |
| 3.2437 | 1.2841 | 3200 | 3.4209 | 0.5139 |
| 3.2519 | 1.3644 | 3400 | 3.3291 | 0.5198 |
| 3.2528 | 1.4446 | 3600 | 3.2425 | 0.5308 |
| 3.117 | 1.5249 | 3800 | 3.1715 | 0.5505 |
| 3.014 | 1.6051 | 4000 | 3.0965 | 0.5408 |
| 2.8688 | 1.6854 | 4200 | 3.0171 | 0.5577 |
| 2.9096 | 1.7657 | 4400 | 2.9386 | 0.5748 |
| 2.8936 | 1.8459 | 4600 | 2.8630 | 0.5788 |
| 2.7947 | 1.9262 | 4800 | 2.7981 | 0.5842 |
| 2.7247 | 2.0064 | 5000 | 2.7218 | 0.5910 |
| 2.3716 | 2.0867 | 5200 | 2.6467 | 0.6091 |
| 2.3813 | 2.1669 | 5400 | 2.5855 | 0.6037 |
| 2.1125 | 2.2472 | 5600 | 2.5160 | 0.6091 |
| 2.0332 | 2.3274 | 5800 | 2.4506 | 0.6207 |
| 2.0413 | 2.4077 | 6000 | 2.3949 | 0.6263 |
| 2.3041 | 2.4880 | 6200 | 2.3396 | 0.6183 |
| 1.7894 | 2.5682 | 6400 | 2.2855 | 0.6372 |
| 1.9194 | 2.6485 | 6600 | 2.2341 | 0.6386 |
| 2.0286 | 2.7287 | 6800 | 2.1997 | 0.6392 |
| 1.7409 | 2.8090 | 7000 | 2.1385 | 0.6478 |
| 1.794 | 2.8892 | 7200 | 2.0876 | 0.6528 |
| 1.6189 | 2.9695 | 7400 | 2.0540 | 0.6570 |
| 1.5587 | 3.0498 | 7600 | 2.0068 | 0.6629 |
| 1.2941 | 3.1300 | 7800 | 1.9610 | 0.6701 |
| 1.3048 | 3.2103 | 8000 | 1.9325 | 0.6639 |
| 1.1526 | 3.2905 | 8200 | 1.8883 | 0.6699 |
| 1.2333 | 3.3708 | 8400 | 1.8505 | 0.6693 |
| 1.1094 | 3.4510 | 8600 | 1.8273 | 0.6703 |
| 1.4851 | 3.5313 | 8800 | 1.7896 | 0.6829 |
| 1.1991 | 3.6116 | 9000 | 1.7648 | 0.6829 |
| 1.1898 | 3.6918 | 9200 | 1.7250 | 0.6903 |
| 0.973 | 3.7721 | 9400 | 1.7261 | 0.6769 |
| 1.2646 | 3.8523 | 9600 | 1.6804 | 0.6920 |
| 1.0756 | 3.9326 | 9800 | 1.6639 | 0.6934 |
| 1.0885 | 4.0128 | 10000 | 1.6324 | 0.6924 |
| 0.8466 | 4.0931 | 10200 | 1.6131 | 0.6994 |
| 0.8781 | 4.1734 | 10400 | 1.5981 | 0.6942 |
| 0.8557 | 4.2536 | 10600 | 1.5804 | 0.6988 |
| 0.852 | 4.3339 | 10800 | 1.5532 | 0.7014 |
| 0.7597 | 4.4141 | 11000 | 1.5395 | 0.7036 |
| 0.9044 | 4.4944 | 11200 | 1.5195 | 0.7034 |
| 0.7762 | 4.5746 | 11400 | 1.5106 | 0.7014 |
| 0.6486 | 4.6549 | 11600 | 1.4979 | 0.7034 |
| 0.7373 | 4.7352 | 11800 | 1.4804 | 0.7032 |
| 0.9194 | 4.8154 | 12000 | 1.4659 | 0.7038 |
| 0.6513 | 4.8957 | 12200 | 1.4487 | 0.7050 |
| 0.7235 | 4.9759 | 12400 | 1.4307 | 0.7082 |
| 0.4407 | 5.0562 | 12600 | 1.4304 | 0.7082 |
| 0.5979 | 5.1364 | 12800 | 1.4227 | 0.7112 |
| 0.6776 | 5.2167 | 13000 | 1.4237 | 0.7048 |
| 0.5239 | 5.2970 | 13200 | 1.4098 | 0.7066 |
| 0.5614 | 5.3772 | 13400 | 1.3947 | 0.7110 |
| 0.5483 | 5.4575 | 13600 | 1.3901 | 0.7114 |
| 0.4797 | 5.5377 | 13800 | 1.3844 | 0.7082 |
| 0.5795 | 5.6180 | 14000 | 1.3816 | 0.7120 |
| 0.5108 | 5.6982 | 14200 | 1.3748 | 0.7098 |
| 0.3919 | 5.7785 | 14400 | 1.3662 | 0.7138 |
| 0.5572 | 5.8587 | 14600 | 1.3542 | 0.7154 |
| 0.5333 | 5.9390 | 14800 | 1.3451 | 0.7152 |
| 0.2997 | 6.0193 | 15000 | 1.3406 | 0.7217 |
| 0.3923 | 6.0995 | 15200 | 1.3472 | 0.7162 |
| 0.4682 | 6.1798 | 15400 | 1.3437 | 0.7170 |
| 0.3758 | 6.2600 | 15600 | 1.3396 | 0.7162 |
| 0.3123 | 6.3403 | 15800 | 1.3393 | 0.7182 |
| 0.2974 | 6.4205 | 16000 | 1.3303 | 0.7150 |
| 0.3374 | 6.5008 | 16200 | 1.3275 | 0.7170 |
| 0.5128 | 6.5811 | 16400 | 1.3322 | 0.7126 |
| 0.4074 | 6.6613 | 16600 | 1.3254 | 0.7162 |
| 0.4761 | 6.7416 | 16800 | 1.3249 | 0.7144 |
| 0.2215 | 6.8218 | 17000 | 1.3247 | 0.7134 |
| 0.4581 | 6.9021 | 17200 | 1.3237 | 0.7120 |
| 0.2686 | 6.9823 | 17400 | 1.3138 | 0.7162 |
| 0.375 | 7.0626 | 17600 | 1.3197 | 0.7146 |
| 0.2512 | 7.1429 | 17800 | 1.3172 | 0.7146 |
| 0.3274 | 7.2231 | 18000 | 1.3222 | 0.7134 |
| 0.3209 | 7.3034 | 18200 | 1.3272 | 0.7126 |
| 0.2441 | 7.3836 | 18400 | 1.3216 | 0.7124 |
| 0.2725 | 7.4639 | 18600 | 1.3156 | 0.7132 |
| 0.2326 | 7.5441 | 18800 | 1.3155 | 0.7132 |
| 0.3594 | 7.6244 | 19000 | 1.3140 | 0.7162 |
| 0.2297 | 7.7047 | 19200 | 1.3133 | 0.7152 |
| 0.3722 | 7.7849 | 19400 | 1.3160 | 0.7130 |
| 0.202 | 7.8652 | 19600 | 1.3131 | 0.7142 |
| 0.2272 | 7.9454 | 19800 | 1.3123 | 0.7150 |

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.2.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1