---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: FASHION-vision
    results: []
---

# FASHION-vision

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.3565
- Accuracy: 0.9122
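
Since the usage sections of this card are still placeholders, here is a minimal inference sketch. The repo id `nathanReitinger/FASHION-vision` is an assumption inferred from the card's name and author, `example.jpg` stands in for any local test image, and the label set depends on the (undocumented) fine-tuning dataset:

```python
# Minimal inference sketch (assumed usage, not from the original card).
from transformers import AutoImageProcessor, AutoModelForImageClassification
from PIL import Image
import torch

repo_id = "nathanReitinger/FASHION-vision"  # hypothetical repo id

# Load the preprocessing config and fine-tuned weights from the Hub.
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

# Preprocess a local image and run a forward pass without gradients.
image = Image.open("example.jpg").convert("RGB")  # any 3-channel test image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map the top logit back to its class name.
print(model.config.id2label[logits.argmax(-1).item()])
```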

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
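
These values map directly onto `transformers.TrainingArguments`. The sketch below is a hedged reconstruction, not the author's actual training script; `output_dir` and the per-epoch evaluation strategy are assumptions, and note that the total train batch size follows from 32 × 4 = 128 (per-device batch × gradient accumulation steps):

```python
# Reconstruction of the listed hyperparameters (assumed, not the original
# script). Unlisted settings keep library defaults; the Adam betas and
# epsilon shown above are already the defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="FASHION-vision",      # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,    # 32 * 4 = 128 total train batch size
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    evaluation_strategy="epoch",      # assumed: results table logs one eval per epoch
)
```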

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 1.463         | 1.0   | 375   | 1.4452          | 0.7076   |
| 0.8341        | 2.0   | 750   | 0.8374          | 0.7926   |
| 0.6173        | 3.0   | 1125  | 0.6483          | 0.8257   |
| 0.5593        | 4.0   | 1500  | 0.5328          | 0.8436   |
| 0.4456        | 5.0   | 1875  | 0.4808          | 0.8488   |
| 0.3895        | 6.0   | 2250  | 0.4191          | 0.8617   |
| 0.3257        | 7.0   | 2625  | 0.3950          | 0.8638   |
| 0.3644        | 8.0   | 3000  | 0.3657          | 0.8733   |
| 0.3603        | 9.0   | 3375  | 0.3515          | 0.878    |
| 0.3574        | 10.0  | 3750  | 0.3482          | 0.8782   |
| 0.2885        | 11.0  | 4125  | 0.3352          | 0.8823   |
| 0.3217        | 12.0  | 4500  | 0.3236          | 0.8833   |
| 0.2861        | 13.0  | 4875  | 0.3292          | 0.8811   |
| 0.263         | 14.0  | 5250  | 0.3083          | 0.8944   |
| 0.2265        | 15.0  | 5625  | 0.3035          | 0.8938   |
| 0.2407        | 16.0  | 6000  | 0.3094          | 0.8897   |
| 0.251         | 17.0  | 6375  | 0.3113          | 0.8894   |
| 0.2251        | 18.0  | 6750  | 0.2934          | 0.8951   |
| 0.2124        | 19.0  | 7125  | 0.3084          | 0.895    |
| 0.1974        | 20.0  | 7500  | 0.3144          | 0.8936   |
| 0.1907        | 21.0  | 7875  | 0.3048          | 0.8972   |
| 0.1857        | 22.0  | 8250  | 0.3046          | 0.896    |
| 0.1696        | 23.0  | 8625  | 0.3014          | 0.8982   |
| 0.2066        | 24.0  | 9000  | 0.2943          | 0.8985   |
| 0.2106        | 25.0  | 9375  | 0.3057          | 0.8984   |
| 0.2036        | 26.0  | 9750  | 0.3103          | 0.8968   |
| 0.1629        | 27.0  | 10125 | 0.3100          | 0.9003   |
| 0.1711        | 28.0  | 10500 | 0.3112          | 0.8978   |
| 0.144         | 29.0  | 10875 | 0.3285          | 0.897    |
| 0.1738        | 30.0  | 11250 | 0.3250          | 0.8968   |
| 0.1616        | 31.0  | 11625 | 0.3205          | 0.8979   |
| 0.1504        | 32.0  | 12000 | 0.3321          | 0.8947   |
| 0.1894        | 33.0  | 12375 | 0.3121          | 0.8963   |
| 0.1346        | 34.0  | 12750 | 0.3079          | 0.9017   |
| 0.1538        | 35.0  | 13125 | 0.3131          | 0.9045   |
| 0.1453        | 36.0  | 13500 | 0.3180          | 0.9042   |
| 0.1467        | 37.0  | 13875 | 0.3125          | 0.9042   |
| 0.1667        | 38.0  | 14250 | 0.3107          | 0.9035   |
| 0.1149        | 39.0  | 14625 | 0.3427          | 0.899    |
| 0.1248        | 40.0  | 15000 | 0.3152          | 0.9033   |
| 0.155         | 41.0  | 15375 | 0.3235          | 0.9015   |
| 0.1321        | 42.0  | 15750 | 0.3220          | 0.9065   |
| 0.156         | 43.0  | 16125 | 0.3326          | 0.9024   |
| 0.1511        | 44.0  | 16500 | 0.3351          | 0.8988   |
| 0.1039        | 45.0  | 16875 | 0.3309          | 0.9052   |
| 0.1277        | 46.0  | 17250 | 0.3552          | 0.9001   |
| 0.1147        | 47.0  | 17625 | 0.3462          | 0.9032   |
| 0.13          | 48.0  | 18000 | 0.3374          | 0.9009   |
| 0.1348        | 49.0  | 18375 | 0.3475          | 0.9006   |
| 0.1188        | 50.0  | 18750 | 0.3419          | 0.9067   |
| 0.1532        | 51.0  | 19125 | 0.3444          | 0.9025   |
| 0.1173        | 52.0  | 19500 | 0.3387          | 0.9034   |
| 0.1189        | 53.0  | 19875 | 0.3407          | 0.9033   |
| 0.13          | 54.0  | 20250 | 0.3614          | 0.9016   |
| 0.1206        | 55.0  | 20625 | 0.3404          | 0.9047   |
| 0.0989        | 56.0  | 21000 | 0.3560          | 0.903    |
| 0.1036        | 57.0  | 21375 | 0.3462          | 0.9056   |
| 0.1095        | 58.0  | 21750 | 0.3497          | 0.9031   |
| 0.143         | 59.0  | 22125 | 0.3364          | 0.9064   |
| 0.0889        | 60.0  | 22500 | 0.3544          | 0.9047   |
| 0.1008        | 61.0  | 22875 | 0.3510          | 0.904    |
| 0.1343        | 62.0  | 23250 | 0.3461          | 0.9069   |
| 0.1019        | 63.0  | 23625 | 0.3365          | 0.9058   |
| 0.1125        | 64.0  | 24000 | 0.3372          | 0.9086   |
| 0.143         | 65.0  | 24375 | 0.3433          | 0.9072   |
| 0.0971        | 66.0  | 24750 | 0.3390          | 0.9102   |
| 0.1147        | 67.0  | 25125 | 0.3493          | 0.9091   |
| 0.0931        | 68.0  | 25500 | 0.3469          | 0.9093   |
| 0.1127        | 69.0  | 25875 | 0.3421          | 0.9069   |
| 0.0935        | 70.0  | 26250 | 0.3535          | 0.9058   |
| 0.1152        | 71.0  | 26625 | 0.3313          | 0.9093   |
| 0.1288        | 72.0  | 27000 | 0.3661          | 0.9069   |
| 0.1244        | 73.0  | 27375 | 0.3405          | 0.9103   |
| 0.1158        | 74.0  | 27750 | 0.3345          | 0.9104   |
| 0.1255        | 75.0  | 28125 | 0.3367          | 0.9091   |
| 0.0886        | 76.0  | 28500 | 0.3657          | 0.9096   |
| 0.1008        | 77.0  | 28875 | 0.3468          | 0.9086   |
| 0.1209        | 78.0  | 29250 | 0.3489          | 0.9096   |
| 0.0944        | 79.0  | 29625 | 0.3511          | 0.9058   |
| 0.0928        | 80.0  | 30000 | 0.3509          | 0.9097   |
| 0.0932        | 81.0  | 30375 | 0.3485          | 0.9097   |
| 0.0973        | 82.0  | 30750 | 0.3584          | 0.9075   |
| 0.0964        | 83.0  | 31125 | 0.3459          | 0.9107   |
| 0.1262        | 84.0  | 31500 | 0.3648          | 0.9107   |
| 0.113         | 85.0  | 31875 | 0.3483          | 0.9083   |
| 0.0828        | 86.0  | 32250 | 0.3396          | 0.9116   |
| 0.1104        | 87.0  | 32625 | 0.3370          | 0.9119   |
| 0.0804        | 88.0  | 33000 | 0.3596          | 0.9117   |
| 0.0905        | 89.0  | 33375 | 0.3538          | 0.912    |
| 0.1064        | 90.0  | 33750 | 0.3497          | 0.9112   |
| 0.0917        | 91.0  | 34125 | 0.3392          | 0.9139   |
| 0.0813        | 92.0  | 34500 | 0.3561          | 0.9109   |
| 0.074         | 93.0  | 34875 | 0.3475          | 0.9098   |
| 0.0922        | 94.0  | 35250 | 0.3482          | 0.9114   |
| 0.0752        | 95.0  | 35625 | 0.3751          | 0.9097   |
| 0.0751        | 96.0  | 36000 | 0.3530          | 0.9103   |
| 0.0818        | 97.0  | 36375 | 0.3477          | 0.9137   |
| 0.0677        | 98.0  | 36750 | 0.3495          | 0.9115   |
| 0.0838        | 99.0  | 37125 | 0.3533          | 0.9114   |
| 0.0772        | 100.0 | 37500 | 0.3565          | 0.9122   |

### Framework versions

- Transformers 4.40.1
- Pytorch 2.2.2
- Datasets 2.19.0
- Tokenizers 0.19.1
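
To reproduce results, a quick sanity check that an environment matches these pins can help (a convenience snippet, not part of the original card; PyTorch builds often carry a CUDA suffix such as `2.2.2+cu121`, hence the prefix match):

```python
# Compare installed library versions against the pins listed above.
import transformers, torch, datasets, tokenizers

expected = {
    "transformers": "4.40.1",
    "torch": "2.2.2",
    "datasets": "2.19.0",
    "tokenizers": "0.19.1",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    got = installed[name]
    # Prefix match tolerates local build tags like "+cu121" on torch.
    status = "ok" if got.startswith(want) else f"mismatch (want {want})"
    print(f"{name} {got}: {status}")
```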