
114-tiny_tobacco3482_kd

This model is a fine-tuned version of WinKawaks/vit-tiny-patch16-224 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how the calibration metrics can be computed follows the list):

  • Loss: 0.1493
  • Accuracy: 0.71
  • Brier Loss: 0.5252
  • NLL: 1.5452
  • F1 Micro: 0.7100
  • F1 Macro: 0.6059
  • ECE: 0.3891
  • AURC: 0.0888
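
For reference, the calibration-style metrics above (Brier loss, ECE) can be reproduced from the model's predicted class probabilities. The sketch below is a generic implementation, not necessarily the exact evaluation code used for this model; `probs` and `labels` are placeholder names for the softmax outputs and ground-truth class ids.

```python
import numpy as np

def brier_and_ece(probs, labels, n_bins=10):
    """probs: (N, C) softmax probabilities, labels: (N,) integer class ids."""
    probs = np.asarray(probs, dtype=float)
    labels = np.asarray(labels, dtype=int)

    # Multi-class Brier score: mean squared error against one-hot targets
    # (one common convention; some implementations also average over classes).
    onehot = np.eye(probs.shape[1])[labels]
    brier = np.mean(np.sum((probs - onehot) ** 2, axis=1))

    # Expected calibration error: bin predictions by confidence and take the
    # weighted average gap between accuracy and mean confidence per bin.
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():
            ece += in_bin.mean() * abs(correct[in_bin].mean() - conf[in_bin].mean())
    return brier, ece
```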

Model description

More information needed

Intended uses & limitations

More information needed
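
Although no usage guidance is provided, the checkpoint can be loaded with the standard Transformers image-classification API. The following is a minimal sketch assuming the model is published on the Hugging Face Hub; the repository id and the image path are placeholders.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repository id -- substitute the actual Hub namespace.
model_id = "your-namespace/114-tiny_tobacco3482_kd"

processor = AutoImageProcessor.from_pretrained(model_id)  # AutoFeatureExtractor on older versions
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("example_document.png").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

pred_id = logits.argmax(-1).item()
print(model.config.id2label[pred_id])
```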

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after the list):

  • learning_rate: 0.0001
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
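
For reference, these settings map onto a Hugging Face TrainingArguments object roughly as follows. This is a reconstruction from the list above, not the original training script; the output directory is a placeholder.

```python
from transformers import TrainingArguments

# Reconstructed from the hyperparameter list above; output_dir is a placeholder.
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the TrainingArguments defaults.
training_args = TrainingArguments(
    output_dir="114-tiny_tobacco3482_kd",
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    evaluation_strategy="epoch",
)
```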

Training results

Training Loss Epoch Step Validation Loss Accuracy Brier Loss NLL F1 Micro F1 Macro ECE AURC
No log 1.0 13 0.9704 0.23 0.8929 7.3757 0.23 0.1649 0.2840 0.7790
No log 2.0 26 0.4936 0.34 0.8279 4.5780 0.34 0.2115 0.3215 0.5666
No log 3.0 39 0.3759 0.455 0.7568 3.7623 0.455 0.3073 0.3262 0.3769
No log 4.0 52 0.3165 0.545 0.7159 2.7078 0.545 0.4004 0.3944 0.2710
No log 5.0 65 0.2807 0.6 0.6537 2.7061 0.6 0.4506 0.3539 0.2091
No log 6.0 78 0.2714 0.575 0.6476 2.6202 0.575 0.4324 0.3576 0.2081
No log 7.0 91 0.2473 0.64 0.6163 2.4990 0.64 0.5080 0.3882 0.1616
No log 8.0 104 0.2752 0.625 0.5837 2.5795 0.625 0.5022 0.3291 0.1892
No log 9.0 117 0.3128 0.59 0.6027 2.7893 0.59 0.4859 0.2944 0.2296
No log 10.0 130 0.2292 0.66 0.5612 2.3152 0.66 0.5253 0.3822 0.1155
No log 11.0 143 0.2676 0.665 0.5632 2.6937 0.665 0.5479 0.3608 0.1422
No log 12.0 156 0.2512 0.65 0.5543 2.2519 0.65 0.5533 0.3324 0.1300
No log 13.0 169 0.2053 0.67 0.5555 1.9904 0.67 0.5739 0.3659 0.1162
No log 14.0 182 0.2281 0.68 0.5613 2.2343 0.68 0.5508 0.3683 0.1193
No log 15.0 195 0.2029 0.705 0.5511 1.6184 0.705 0.5984 0.4175 0.0937
No log 16.0 208 0.2090 0.71 0.5459 1.9750 0.7100 0.6052 0.3911 0.0983
No log 17.0 221 0.1828 0.705 0.5385 1.8272 0.705 0.5973 0.3700 0.0969
No log 18.0 234 0.1739 0.73 0.5358 1.6202 0.7300 0.6180 0.4115 0.0962
No log 19.0 247 0.1847 0.685 0.5300 2.1083 0.685 0.5717 0.3582 0.1047
No log 20.0 260 0.1839 0.69 0.5390 1.8560 0.69 0.5932 0.3708 0.1090
No log 21.0 273 0.1756 0.72 0.5417 1.7203 0.72 0.6132 0.4000 0.0855
No log 22.0 286 0.1727 0.69 0.5212 1.9503 0.69 0.5853 0.3574 0.1041
No log 23.0 299 0.1684 0.72 0.5333 1.5951 0.72 0.6229 0.3922 0.0943
No log 24.0 312 0.1652 0.735 0.5263 1.6768 0.735 0.6519 0.4001 0.0920
No log 25.0 325 0.1637 0.735 0.5363 1.6879 0.735 0.6514 0.4079 0.0819
No log 26.0 338 0.1609 0.675 0.5299 1.5660 0.675 0.5602 0.3593 0.0989
No log 27.0 351 0.1581 0.725 0.5210 1.5886 0.7250 0.6206 0.3739 0.0847
No log 28.0 364 0.1591 0.71 0.5286 1.7728 0.7100 0.6076 0.3868 0.0921
No log 29.0 377 0.1544 0.715 0.5251 1.6215 0.715 0.6201 0.3813 0.0948
No log 30.0 390 0.1618 0.705 0.5340 1.5824 0.705 0.6064 0.3853 0.1000
No log 31.0 403 0.1580 0.705 0.5202 1.7228 0.705 0.5949 0.3710 0.0963
No log 32.0 416 0.1531 0.72 0.5257 1.6330 0.72 0.6137 0.3857 0.0904
No log 33.0 429 0.1521 0.72 0.5248 1.6212 0.72 0.6349 0.3928 0.0898
No log 34.0 442 0.1526 0.71 0.5261 1.4652 0.7100 0.6141 0.3829 0.0905
No log 35.0 455 0.1529 0.7 0.5256 1.5784 0.7 0.5926 0.3885 0.0887
No log 36.0 468 0.1526 0.735 0.5268 1.5163 0.735 0.6514 0.3991 0.0878
No log 37.0 481 0.1497 0.695 0.5222 1.6068 0.695 0.5785 0.3918 0.0919
No log 38.0 494 0.1488 0.72 0.5248 1.5401 0.72 0.6115 0.3905 0.0891
0.1554 39.0 507 0.1504 0.715 0.5208 1.5917 0.715 0.6152 0.3730 0.0894
0.1554 40.0 520 0.1487 0.725 0.5260 1.5258 0.7250 0.6399 0.3998 0.0879
0.1554 41.0 533 0.1484 0.71 0.5250 1.6093 0.7100 0.6073 0.3908 0.0880
0.1554 42.0 546 0.1481 0.715 0.5245 1.5711 0.715 0.6096 0.3857 0.0860
0.1554 43.0 559 0.1493 0.705 0.5243 1.6261 0.705 0.6000 0.3727 0.0901
0.1554 44.0 572 0.1495 0.71 0.5242 1.5942 0.7100 0.6080 0.3808 0.0868
0.1554 45.0 585 0.1495 0.71 0.5242 1.5417 0.7100 0.6059 0.3813 0.0881
0.1554 46.0 598 0.1490 0.715 0.5239 1.5403 0.715 0.6134 0.3826 0.0893
0.1554 47.0 611 0.1486 0.715 0.5248 1.5387 0.715 0.6112 0.3754 0.0883
0.1554 48.0 624 0.1491 0.71 0.5252 1.5527 0.7100 0.6059 0.3761 0.0889
0.1554 49.0 637 0.1491 0.71 0.5249 1.5545 0.7100 0.6059 0.3880 0.0885
0.1554 50.0 650 0.1489 0.71 0.5247 1.5376 0.7100 0.6059 0.3900 0.0895
0.1554 51.0 663 0.1492 0.71 0.5257 1.5385 0.7100 0.6059 0.3857 0.0890
0.1554 52.0 676 0.1491 0.71 0.5251 1.5460 0.7100 0.6059 0.3816 0.0888
0.1554 53.0 689 0.1491 0.71 0.5248 1.5429 0.7100 0.6059 0.3806 0.0886
0.1554 54.0 702 0.1489 0.71 0.5247 1.5426 0.7100 0.6059 0.3949 0.0887
0.1554 55.0 715 0.1492 0.71 0.5258 1.5550 0.7100 0.6059 0.3921 0.0890
0.1554 56.0 728 0.1492 0.71 0.5248 1.5470 0.7100 0.6059 0.3859 0.0888
0.1554 57.0 741 0.1491 0.71 0.5251 1.5447 0.7100 0.6059 0.4035 0.0888
0.1554 58.0 754 0.1491 0.71 0.5248 1.5440 0.7100 0.6059 0.4033 0.0886
0.1554 59.0 767 0.1491 0.71 0.5246 1.5561 0.7100 0.6059 0.3920 0.0890
0.1554 60.0 780 0.1492 0.71 0.5251 1.5461 0.7100 0.6059 0.3847 0.0889
0.1554 61.0 793 0.1493 0.71 0.5251 1.5455 0.7100 0.6059 0.3931 0.0887
0.1554 62.0 806 0.1493 0.71 0.5252 1.5443 0.7100 0.6059 0.3912 0.0889
0.1554 63.0 819 0.1493 0.71 0.5253 1.5441 0.7100 0.6059 0.3944 0.0887
0.1554 64.0 832 0.1492 0.71 0.5249 1.5444 0.7100 0.6059 0.3891 0.0888
0.1554 65.0 845 0.1492 0.71 0.5255 1.5430 0.7100 0.6059 0.3995 0.0888
0.1554 66.0 858 0.1493 0.71 0.5250 1.5435 0.7100 0.6059 0.3991 0.0890
0.1554 67.0 871 0.1493 0.71 0.5252 1.5449 0.7100 0.6059 0.3991 0.0890
0.1554 68.0 884 0.1492 0.71 0.5251 1.5458 0.7100 0.6059 0.3968 0.0889
0.1554 69.0 897 0.1493 0.71 0.5250 1.5468 0.7100 0.6059 0.4036 0.0888
0.1554 70.0 910 0.1494 0.71 0.5253 1.5464 0.7100 0.6059 0.3889 0.0887
0.1554 71.0 923 0.1493 0.71 0.5251 1.5452 0.7100 0.6059 0.3888 0.0887
0.1554 72.0 936 0.1493 0.71 0.5250 1.5457 0.7100 0.6059 0.3928 0.0888
0.1554 73.0 949 0.1494 0.71 0.5253 1.5455 0.7100 0.6059 0.3946 0.0889
0.1554 74.0 962 0.1493 0.71 0.5251 1.5441 0.7100 0.6059 0.3928 0.0888
0.1554 75.0 975 0.1493 0.71 0.5252 1.5455 0.7100 0.6059 0.3929 0.0891
0.1554 76.0 988 0.1493 0.71 0.5252 1.5449 0.7100 0.6059 0.3940 0.0886
0.0002 77.0 1001 0.1493 0.71 0.5253 1.5455 0.7100 0.6059 0.3891 0.0887
0.0002 78.0 1014 0.1493 0.71 0.5251 1.5468 0.7100 0.6059 0.3889 0.0887
0.0002 79.0 1027 0.1494 0.71 0.5252 1.5462 0.7100 0.6059 0.3891 0.0888
0.0002 80.0 1040 0.1493 0.71 0.5252 1.5443 0.7100 0.6059 0.3994 0.0886
0.0002 81.0 1053 0.1493 0.71 0.5251 1.5451 0.7100 0.6059 0.3890 0.0887
0.0002 82.0 1066 0.1493 0.71 0.5251 1.5448 0.7100 0.6059 0.3938 0.0886
0.0002 83.0 1079 0.1493 0.71 0.5251 1.5453 0.7100 0.6059 0.3890 0.0886
0.0002 84.0 1092 0.1493 0.71 0.5252 1.5462 0.7100 0.6059 0.3890 0.0888
0.0002 85.0 1105 0.1493 0.71 0.5252 1.5454 0.7100 0.6059 0.3891 0.0887
0.0002 86.0 1118 0.1493 0.71 0.5252 1.5452 0.7100 0.6059 0.3890 0.0887
0.0002 87.0 1131 0.1494 0.71 0.5252 1.5452 0.7100 0.6059 0.3891 0.0888
0.0002 88.0 1144 0.1494 0.71 0.5252 1.5453 0.7100 0.6059 0.3891 0.0886
0.0002 89.0 1157 0.1493 0.71 0.5252 1.5451 0.7100 0.6059 0.3891 0.0887
0.0002 90.0 1170 0.1493 0.71 0.5252 1.5449 0.7100 0.6059 0.3891 0.0887
0.0002 91.0 1183 0.1493 0.71 0.5252 1.5454 0.7100 0.6059 0.3890 0.0887
0.0002 92.0 1196 0.1493 0.71 0.5252 1.5450 0.7100 0.6059 0.3891 0.0887
0.0002 93.0 1209 0.1493 0.71 0.5252 1.5454 0.7100 0.6059 0.3891 0.0888
0.0002 94.0 1222 0.1493 0.71 0.5252 1.5453 0.7100 0.6059 0.3890 0.0886
0.0002 95.0 1235 0.1493 0.71 0.5252 1.5454 0.7100 0.6059 0.3891 0.0887
0.0002 96.0 1248 0.1493 0.71 0.5252 1.5450 0.7100 0.6059 0.3890 0.0887
0.0002 97.0 1261 0.1493 0.71 0.5252 1.5455 0.7100 0.6059 0.3891 0.0888
0.0002 98.0 1274 0.1494 0.71 0.5252 1.5453 0.7100 0.6059 0.3891 0.0886
0.0002 99.0 1287 0.1493 0.71 0.5252 1.5452 0.7100 0.6059 0.3891 0.0887
0.0002 100.0 1300 0.1493 0.71 0.5252 1.5452 0.7100 0.6059 0.3891 0.0888

Framework versions

  • Transformers 4.26.1
  • Pytorch 1.13.1.post200
  • Datasets 2.9.0
  • Tokenizers 0.13.2