# invitrace-ilivewell
This model is a fine-tuned version of google/vit-base-patch16-224-in21k on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 1.3020
- Accuracy: 0.7233
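
The checkpoint can be used for image classification with the standard `transformers` pipeline. The sketch below is a minimal usage example, assuming the repository id `Invitrace/I-live-well-foodai` listed on the model page; the image path is a hypothetical placeholder.

```python
# Minimal inference sketch for this fine-tuned ViT classifier.
from transformers import pipeline
from PIL import Image

classifier = pipeline(
    "image-classification",
    model="Invitrace/I-live-well-foodai",  # repo id from the model page
)

image = Image.open("example_food_photo.jpg")  # hypothetical local image
for prediction in classifier(image, top_k=5):
    print(f"{prediction['label']}: {prediction['score']:.3f}")
```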
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the configuration sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 8
- mixed_precision_training: Native AMP
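
A minimal `TrainingArguments` sketch matching the hyperparameters above (Transformers 4.41.2). The `output_dir` and the surrounding model/dataset wiring are assumptions, not taken from the original training script; "Native AMP" is expressed here as `fp16=True`.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="invitrace-ilivewell",  # hypothetical output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=8,
    fp16=True,  # mixed-precision training via native AMP
)
```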
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
5.0735 | 0.0803 | 200 | 5.0486 | 0.0462 |
4.947 | 0.1605 | 400 | 4.9254 | 0.1136 |
4.8134 | 0.2408 | 600 | 4.7931 | 0.2101 |
4.653 | 0.3210 | 800 | 4.6667 | 0.2133 |
4.5977 | 0.4013 | 1000 | 4.5528 | 0.2777 |
4.557 | 0.4815 | 1200 | 4.4413 | 0.2940 |
4.2948 | 0.5618 | 1400 | 4.3298 | 0.3233 |
4.2969 | 0.6421 | 1600 | 4.2270 | 0.3502 |
4.0779 | 0.7223 | 1800 | 4.1218 | 0.3743 |
4.0473 | 0.8026 | 2000 | 4.0337 | 0.3897 |
4.0347 | 0.8828 | 2200 | 3.9426 | 0.3879 |
3.9185 | 0.9631 | 2400 | 3.8602 | 0.4158 |
3.5221 | 1.0433 | 2600 | 3.7700 | 0.4275 |
3.5359 | 1.1236 | 2800 | 3.6728 | 0.4553 |
3.4622 | 1.2039 | 3000 | 3.5906 | 0.4541 |
3.431 | 1.2841 | 3200 | 3.5025 | 0.4732 |
3.3443 | 1.3644 | 3400 | 3.4168 | 0.4869 |
3.4344 | 1.4446 | 3600 | 3.3382 | 0.5025 |
3.249 | 1.5249 | 3800 | 3.2703 | 0.5160 |
3.2028 | 1.6051 | 4000 | 3.2017 | 0.4927 |
3.0678 | 1.6854 | 4200 | 3.1264 | 0.5152 |
3.0626 | 1.7657 | 4400 | 3.0487 | 0.5410 |
2.953 | 1.8459 | 4600 | 2.9699 | 0.5414 |
3.0011 | 1.9262 | 4800 | 2.9165 | 0.5503 |
2.9428 | 2.0064 | 5000 | 2.8433 | 0.5665 |
2.6717 | 2.0867 | 5200 | 2.7818 | 0.5569 |
2.6253 | 2.1669 | 5400 | 2.7195 | 0.5715 |
2.3998 | 2.2472 | 5600 | 2.6458 | 0.5874 |
2.2518 | 2.3274 | 5800 | 2.5901 | 0.5922 |
2.4229 | 2.4077 | 6000 | 2.5301 | 0.5912 |
2.592 | 2.4880 | 6200 | 2.4855 | 0.5984 |
2.0625 | 2.5682 | 6400 | 2.4277 | 0.6002 |
2.22 | 2.6485 | 6600 | 2.3784 | 0.6087 |
2.3326 | 2.7287 | 6800 | 2.3250 | 0.6123 |
2.1592 | 2.8090 | 7000 | 2.2958 | 0.6095 |
2.1491 | 2.8892 | 7200 | 2.2226 | 0.6303 |
2.0644 | 2.9695 | 7400 | 2.1923 | 0.6231 |
1.916 | 3.0498 | 7600 | 2.1510 | 0.6352 |
1.7435 | 3.1300 | 7800 | 2.0985 | 0.6388 |
1.761 | 3.2103 | 8000 | 2.0753 | 0.6404 |
1.5321 | 3.2905 | 8200 | 2.0396 | 0.6426 |
1.6117 | 3.3708 | 8400 | 1.9855 | 0.6530 |
1.5593 | 3.4510 | 8600 | 1.9805 | 0.6352 |
1.9288 | 3.5313 | 8800 | 1.9188 | 0.6564 |
1.5736 | 3.6116 | 9000 | 1.9141 | 0.6556 |
1.5544 | 3.6918 | 9200 | 1.8633 | 0.6619 |
1.3811 | 3.7721 | 9400 | 1.8466 | 0.6621 |
1.608 | 3.8523 | 9600 | 1.8116 | 0.6687 |
1.533 | 3.9326 | 9800 | 1.7784 | 0.6733 |
1.5496 | 4.0128 | 10000 | 1.7532 | 0.6755 |
1.3532 | 4.0931 | 10200 | 1.7399 | 0.6779 |
1.3787 | 4.1734 | 10400 | 1.6996 | 0.6795 |
1.4278 | 4.2536 | 10600 | 1.6893 | 0.6771 |
1.3531 | 4.3339 | 10800 | 1.6629 | 0.6759 |
1.2811 | 4.4141 | 11000 | 1.6493 | 0.6801 |
1.3787 | 4.4944 | 11200 | 1.6278 | 0.6855 |
1.2663 | 4.5746 | 11400 | 1.6101 | 0.6926 |
1.0892 | 4.6549 | 11600 | 1.5842 | 0.6887 |
1.3045 | 4.7352 | 11800 | 1.5758 | 0.6911 |
1.4239 | 4.8154 | 12000 | 1.5647 | 0.6930 |
1.065 | 4.8957 | 12200 | 1.5403 | 0.6905 |
1.1467 | 4.9759 | 12400 | 1.5257 | 0.6986 |
0.8755 | 5.0562 | 12600 | 1.5075 | 0.6964 |
1.0427 | 5.1364 | 12800 | 1.4977 | 0.7074 |
1.264 | 5.2167 | 13000 | 1.4951 | 0.6956 |
0.9822 | 5.2970 | 13200 | 1.4787 | 0.6990 |
1.1234 | 5.3772 | 13400 | 1.4673 | 0.7008 |
0.9394 | 5.4575 | 13600 | 1.4632 | 0.6998 |
0.9231 | 5.5377 | 13800 | 1.4346 | 0.7074 |
1.1829 | 5.6180 | 14000 | 1.4364 | 0.7092 |
0.9687 | 5.6982 | 14200 | 1.4231 | 0.7080 |
0.8915 | 5.7785 | 14400 | 1.4166 | 0.7104 |
1.013 | 5.8587 | 14600 | 1.4056 | 0.7110 |
1.0437 | 5.9390 | 14800 | 1.3840 | 0.7186 |
0.8936 | 6.0193 | 15000 | 1.3896 | 0.7142 |
0.8968 | 6.0995 | 15200 | 1.3853 | 0.7118 |
0.8978 | 6.1798 | 15400 | 1.3748 | 0.7154 |
0.8638 | 6.2600 | 15600 | 1.3686 | 0.7190 |
0.7187 | 6.3403 | 15800 | 1.3664 | 0.7186 |
0.7554 | 6.4205 | 16000 | 1.3672 | 0.7124 |
0.7664 | 6.5008 | 16200 | 1.3484 | 0.7192 |
0.9791 | 6.5811 | 16400 | 1.3500 | 0.7178 |
0.8325 | 6.6613 | 16600 | 1.3387 | 0.7184 |
1.0476 | 6.7416 | 16800 | 1.3390 | 0.7174 |
0.7053 | 6.8218 | 17000 | 1.3268 | 0.7217 |
0.9869 | 6.9021 | 17200 | 1.3270 | 0.7204 |
0.8179 | 6.9823 | 17400 | 1.3169 | 0.7297 |
0.9584 | 7.0626 | 17600 | 1.3119 | 0.7271 |
0.6394 | 7.1429 | 17800 | 1.3158 | 0.7243 |
0.9094 | 7.2231 | 18000 | 1.3056 | 0.7231 |
0.7837 | 7.3034 | 18200 | 1.3174 | 0.7239 |
0.7168 | 7.3836 | 18400 | 1.3088 | 0.7265 |
0.8603 | 7.4639 | 18600 | 1.3149 | 0.7204 |
0.6326 | 7.5441 | 18800 | 1.3041 | 0.7253 |
0.8656 | 7.6244 | 19000 | 1.3075 | 0.7253 |
0.7517 | 7.7047 | 19200 | 1.3181 | 0.7227 |
0.8719 | 7.7849 | 19400 | 1.2977 | 0.7273 |
0.6939 | 7.8652 | 19600 | 1.2965 | 0.7249 |
0.8371 | 7.9454 | 19800 | 1.3020 | 0.7233 |
### Framework versions
- Transformers 4.41.2
- Pytorch 2.2.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1