vit-base-patch16-224-in21k-finetune-os300_norm

This model is a fine-tuned version of google/vit-base-patch16-224-in21k on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3499
  • Accuracy: 0.8577
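
As a minimal usage sketch (not part of the original training setup), the checkpoint can be loaded with the Transformers image-classification pipeline; the model id below is a placeholder for wherever this checkpoint is hosted, and the image path is only illustrative:

```python
from transformers import pipeline

# Placeholder model id: point this at the actual repository or local path of the checkpoint.
classifier = pipeline(
    "image-classification",
    model="vit-base-patch16-224-in21k-finetune-os300_norm",
)

# Classify a single image (the file name is illustrative).
predictions = classifier("example.jpg")
for pred in predictions:
    print(f"{pred['label']}: {pred['score']:.4f}")
```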

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.005
  • train_batch_size: 128
  • eval_batch_size: 128
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 512
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 200
  • mixed_precision_training: Native AMP
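
For reference, the settings above roughly correspond to the `TrainingArguments` sketch below; the output directory is a placeholder, the actual training script is not included in this card, and the Adam betas/epsilon match the library defaults:

```python
from transformers import TrainingArguments

# Sketch of a TrainingArguments configuration matching the listed hyperparameters.
# output_dir is a placeholder; fp16 is assumed for the "Native AMP" mixed precision setting.
training_args = TrainingArguments(
    output_dir="vit-base-patch16-224-in21k-finetune-os300_norm",
    learning_rate=5e-3,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    gradient_accumulation_steps=4,   # effective train batch size: 128 * 4 = 512
    num_train_epochs=200,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,
)
```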

Training results

Training Loss Epoch Step Validation Loss Accuracy
1.038 0.98 11 0.7215 0.6568
0.7212 1.96 22 0.7280 0.6568
0.7201 2.93 33 0.7285 0.6568
0.7308 4.0 45 0.7297 0.6568
0.7341 4.98 56 0.7277 0.6568
0.7255 5.96 67 0.7350 0.6568
0.7274 6.93 78 0.7258 0.6568
0.7189 8.0 90 0.7205 0.6568
0.7194 8.98 101 0.7117 0.6568
0.7437 9.96 112 0.7340 0.6568
0.7578 10.93 123 0.7317 0.6568
0.7307 12.0 135 0.7288 0.6568
0.7279 12.98 146 0.7246 0.6568
0.727 13.96 157 0.7166 0.6568
0.7161 14.93 168 0.7306 0.5117
0.6775 16.0 180 0.6360 0.6568
0.6487 16.98 191 0.6166 0.7113
0.607 17.96 202 0.5871 0.7240
0.5961 18.93 213 0.5606 0.7183
0.5681 20.0 225 0.5459 0.7381
0.5756 20.98 236 0.5375 0.7481
0.5666 21.96 247 0.5720 0.7042
0.5658 22.93 258 0.5127 0.7481
0.5461 24.0 270 0.5254 0.7360
0.5484 24.98 281 0.5124 0.7431
0.5442 25.96 292 0.5665 0.7282
0.5573 26.93 303 0.5019 0.7594
0.535 28.0 315 0.5112 0.7792
0.5319 28.98 326 0.4729 0.7856
0.4953 29.96 337 0.6292 0.7318
0.5408 30.93 348 0.5083 0.7877
0.5215 32.0 360 0.5131 0.7799
0.5291 32.98 371 0.4867 0.7983
0.4971 33.96 382 0.4742 0.7962
0.5004 34.93 393 0.4930 0.7806
0.4868 36.0 405 0.4550 0.8061
0.4784 36.98 416 0.4667 0.7912
0.469 37.96 427 0.4915 0.7856
0.455 38.93 438 0.5016 0.7537
0.4903 40.0 450 0.4874 0.7877
0.4904 40.98 461 0.5222 0.7629
0.513 41.96 472 0.4772 0.7877
0.4913 42.93 483 0.5386 0.7629
0.5216 44.0 495 0.4830 0.7827
0.4931 44.98 506 0.4692 0.7948
0.4835 45.96 517 0.4941 0.7757
0.5035 46.93 528 0.4716 0.7884
0.5068 48.0 540 0.5210 0.7841
0.5207 48.98 551 0.4656 0.8132
0.4753 49.96 562 0.4529 0.8025
0.4718 50.93 573 0.4403 0.8075
0.4757 52.0 585 0.4305 0.8132
0.4352 52.98 596 0.4104 0.8245
0.4349 53.96 607 0.4390 0.8125
0.4508 54.93 618 0.4409 0.8011
0.4596 56.0 630 0.4131 0.8323
0.4321 56.98 641 0.4257 0.8188
0.4433 57.96 652 0.4421 0.7969
0.4423 58.93 663 0.4430 0.7990
0.446 60.0 675 0.4328 0.8181
0.425 60.98 686 0.4385 0.8011
0.4363 61.96 697 0.4225 0.8139
0.4358 62.93 708 0.4114 0.8224
0.415 64.0 720 0.4110 0.8174
0.423 64.98 731 0.4090 0.8238
0.4161 65.96 742 0.4011 0.8160
0.4103 66.93 753 0.4207 0.8188
0.4254 68.0 765 0.4503 0.8004
0.429 68.98 776 0.4392 0.8033
0.4341 69.96 787 0.4159 0.8209
0.4574 70.93 798 0.4165 0.8224
0.4136 72.0 810 0.3954 0.8337
0.4226 72.98 821 0.3996 0.8301
0.4124 73.96 832 0.4205 0.8089
0.4209 74.93 843 0.4288 0.8146
0.4493 76.0 855 0.4193 0.8167
0.4302 76.98 866 0.4239 0.8132
0.4385 77.96 877 0.4187 0.8160
0.4388 78.93 888 0.4379 0.8047
0.4294 80.0 900 0.4048 0.8309
0.4207 80.98 911 0.4287 0.8139
0.4316 81.96 922 0.4183 0.8202
0.4283 82.93 933 0.4091 0.8224
0.4227 84.0 945 0.4070 0.8231
0.4335 84.98 956 0.4184 0.8224
0.4433 85.96 967 0.4148 0.8132
0.4287 86.93 978 0.4188 0.8167
0.4327 88.0 990 0.4091 0.8224
0.427 88.98 1001 0.4118 0.8202
0.4194 89.96 1012 0.4220 0.8153
0.4213 90.93 1023 0.4195 0.8096
0.4288 92.0 1035 0.4023 0.8188
0.4123 92.98 1046 0.4005 0.8393
0.4172 93.96 1057 0.3812 0.8309
0.4109 94.93 1068 0.3838 0.8294
0.4128 96.0 1080 0.3878 0.8294
0.3976 96.98 1091 0.4023 0.8259
0.4097 97.96 1102 0.3979 0.8153
0.4059 98.93 1113 0.3953 0.8294
0.4011 100.0 1125 0.3804 0.8344
0.4126 100.98 1136 0.3915 0.8259
0.425 101.96 1147 0.4140 0.8160
0.4066 102.93 1158 0.4207 0.8238
0.4265 104.0 1170 0.4016 0.8259
0.4225 104.98 1181 0.4059 0.8252
0.4201 105.96 1192 0.3980 0.8309
0.408 106.93 1203 0.4171 0.8202
0.422 108.0 1215 0.4475 0.8096
0.4251 108.98 1226 0.4139 0.8224
0.4261 109.96 1237 0.4113 0.8167
0.4147 110.93 1248 0.4355 0.8089
0.4407 112.0 1260 0.4453 0.8146
0.4167 112.98 1271 0.3987 0.8372
0.4152 113.96 1282 0.4008 0.8273
0.3952 114.93 1293 0.3843 0.8351
0.4159 116.0 1305 0.3949 0.8330
0.4014 116.98 1316 0.4113 0.8040
0.4203 117.96 1327 0.3988 0.8309
0.4159 118.93 1338 0.4037 0.8351
0.4065 120.0 1350 0.3847 0.8393
0.3938 120.98 1361 0.4023 0.8280
0.4202 121.96 1372 0.4015 0.8301
0.4316 122.93 1383 0.4156 0.8174
0.416 124.0 1395 0.3924 0.8344
0.4141 124.98 1406 0.3839 0.8358
0.4157 125.96 1417 0.3940 0.8224
0.3906 126.93 1428 0.3826 0.8287
0.4051 128.0 1440 0.3807 0.8316
0.3835 128.98 1451 0.3866 0.8386
0.3976 129.96 1462 0.3832 0.8457
0.3939 130.93 1473 0.3745 0.8351
0.3862 132.0 1485 0.3897 0.8408
0.3919 132.98 1496 0.3841 0.8429
0.3928 133.96 1507 0.3744 0.8507
0.3976 134.93 1518 0.3610 0.8535
0.3834 136.0 1530 0.3711 0.8422
0.3827 136.98 1541 0.3860 0.8422
0.4036 137.96 1552 0.3973 0.8301
0.3862 138.93 1563 0.3720 0.8429
0.3876 140.0 1575 0.3701 0.8478
0.3941 140.98 1586 0.3579 0.8500
0.3692 141.96 1597 0.3609 0.8521
0.3791 142.93 1608 0.3666 0.8493
0.3774 144.0 1620 0.3601 0.8521
0.3708 144.98 1631 0.3592 0.8549
0.3943 145.96 1642 0.3593 0.8493
0.3856 146.93 1653 0.3686 0.8429
0.381 148.0 1665 0.3755 0.8429
0.3965 148.98 1676 0.3698 0.8471
0.3862 149.96 1687 0.3641 0.8485
0.3825 150.93 1698 0.3652 0.8528
0.3751 152.0 1710 0.3672 0.8422
0.3812 152.98 1721 0.3626 0.8507
0.3805 153.96 1732 0.3615 0.8493
0.3755 154.93 1743 0.3678 0.8500
0.3802 156.0 1755 0.3682 0.8478
0.3781 156.98 1766 0.3802 0.8485
0.3845 157.96 1777 0.3753 0.8507
0.3893 158.93 1788 0.3694 0.8485
0.3676 160.0 1800 0.3652 0.8493
0.4114 160.98 1811 0.4020 0.8309
0.39 161.96 1822 0.3615 0.8528
0.3831 162.93 1833 0.3570 0.8535
0.3651 164.0 1845 0.3642 0.8401
0.3662 164.98 1856 0.3557 0.8577
0.3878 165.96 1867 0.3650 0.8457
0.376 166.93 1878 0.3601 0.8500
0.3724 168.0 1890 0.3617 0.8570
0.3661 168.98 1901 0.3677 0.8535
0.3869 169.96 1912 0.3617 0.8500
0.3717 170.93 1923 0.3594 0.8436
0.3698 172.0 1935 0.3632 0.8514
0.3761 172.98 1946 0.3614 0.8471
0.3847 173.96 1957 0.3566 0.8535
0.3716 174.93 1968 0.3570 0.8528
0.3695 176.0 1980 0.3557 0.8556
0.3702 176.98 1991 0.3544 0.8556
0.372 177.96 2002 0.3522 0.8542
0.3648 178.93 2013 0.3562 0.8493
0.3744 180.0 2025 0.3577 0.8507
0.3546 180.98 2036 0.3524 0.8535
0.3613 181.96 2047 0.3478 0.8528
0.3581 182.93 2058 0.3534 0.8549
0.3709 184.0 2070 0.3637 0.8521
0.3699 184.98 2081 0.3544 0.8549
0.3701 185.96 2092 0.3506 0.8613
0.3634 186.93 2103 0.3559 0.8592
0.3668 188.0 2115 0.3510 0.8585
0.3629 188.98 2126 0.3485 0.8592
0.3544 189.96 2137 0.3478 0.8627
0.3714 190.93 2148 0.3512 0.8592
0.3681 192.0 2160 0.3522 0.8592
0.3466 192.98 2171 0.3523 0.8570
0.3727 193.96 2182 0.3504 0.8606
0.3564 194.93 2193 0.3501 0.8577
0.3616 195.56 2200 0.3499 0.8577

Framework versions

  • Transformers 4.39.0
  • PyTorch 2.2.1+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2