---
library_name: transformers
license: apache-2.0
base_model: facebook/vit-msn-small
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: vit-msn-small-lateral_flow_ivalidation_train_test_6
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: test
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.8791208791208791
---

vit-msn-small-lateral_flow_ivalidation_train_test_6

This model is a fine-tuned version of facebook/vit-msn-small on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4371
  • Accuracy: 0.8791
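
A minimal inference sketch for trying the checkpoint. The Hub repo id below is an assumption inferred from the model name, and `example.jpg` is a hypothetical input; substitute your own paths:

```python
# Minimal inference sketch; the repo id is assumed from the model name.
from transformers import AutoImageProcessor, AutoModelForImageClassification
from PIL import Image
import torch

repo_id = "Melo1512/vit-msn-small-lateral_flow_ivalidation_train_test_6"  # assumed
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("example.jpg")  # hypothetical lateral-flow test image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```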

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
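
The card does not document the data beyond the `imagefolder` dataset type. For reference, such datasets are typically loaded from a class-per-subfolder directory; a hedged sketch follows (the directory path and split layout are assumptions):

```python
# Hedged sketch of loading an `imagefolder`-style dataset; the data_dir
# path and the presence of train/test subfolders are assumptions.
from datasets import load_dataset

dataset = load_dataset("imagefolder", data_dir="path/to/lateral_flow_images")
train_ds = dataset["train"]
test_ds = dataset["test"]  # the model-index metadata reports results on a `test` split
print(train_ds.features["label"].names)  # class names inferred from subfolder names
```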

Training procedure

Training hyperparameters

The following hyperparameters were used during training (mirrored in the TrainingArguments sketch after this list):

  • learning_rate: 5e-07
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.3
  • num_epochs: 100
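
A hedged `TrainingArguments` sketch of the settings above (argument names follow the Transformers 4.44 API; `output_dir` is an assumption, and this is not the original training script):

```python
# Sketch reproducing the reported hyperparameters; not the original script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-msn-small-lateral_flow_ivalidation_train_test_6",  # assumed
    learning_rate=5e-7,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    gradient_accumulation_steps=2,  # 64 * 2 = 128 effective train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.3,
    num_train_epochs=100,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the optimizer defaults.
)
```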

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 0.6672        | 0.9231  | 6    | 0.6980          | 0.4212   |
| 0.6617        | 2.0     | 13   | 0.6965          | 0.4249   |
| 0.6699        | 2.9231  | 19   | 0.6944          | 0.4396   |
| 0.662         | 4.0     | 26   | 0.6910          | 0.4396   |
| 0.6548        | 4.9231  | 32   | 0.6873          | 0.4579   |
| 0.6541        | 6.0     | 39   | 0.6825          | 0.4835   |
| 0.6222        | 6.9231  | 45   | 0.6777          | 0.5311   |
| 0.6555        | 8.0     | 52   | 0.6719          | 0.5421   |
| 0.6226        | 8.9231  | 58   | 0.6665          | 0.5861   |
| 0.5989        | 10.0    | 65   | 0.6603          | 0.6154   |
| 0.5754        | 10.9231 | 71   | 0.6555          | 0.6264   |
| 0.6251        | 12.0    | 78   | 0.6493          | 0.6484   |
| 0.5796        | 12.9231 | 84   | 0.6446          | 0.6667   |
| 0.5763        | 14.0    | 91   | 0.6390          | 0.6667   |
| 0.5952        | 14.9231 | 97   | 0.6333          | 0.6850   |
| 0.5675        | 16.0    | 104  | 0.6269          | 0.7033   |
| 0.5453        | 16.9231 | 110  | 0.6211          | 0.7106   |
| 0.5199        | 18.0    | 117  | 0.6150          | 0.7143   |
| 0.541         | 18.9231 | 123  | 0.6090          | 0.7216   |
| 0.5273        | 20.0    | 130  | 0.6007          | 0.7289   |
| 0.495         | 20.9231 | 136  | 0.5934          | 0.7289   |
| 0.4855        | 22.0    | 143  | 0.5855          | 0.7473   |
| 0.4763        | 22.9231 | 149  | 0.5787          | 0.7363   |
| 0.4287        | 24.0    | 156  | 0.5693          | 0.7509   |
| 0.445         | 24.9231 | 162  | 0.5619          | 0.7692   |
| 0.4343        | 26.0    | 169  | 0.5540          | 0.7802   |
| 0.3748        | 26.9231 | 175  | 0.5467          | 0.7875   |
| 0.4041        | 28.0    | 182  | 0.5421          | 0.8022   |
| 0.3543        | 28.9231 | 188  | 0.5291          | 0.8205   |
| 0.3972        | 30.0    | 195  | 0.5134          | 0.8278   |
| 0.3716        | 30.9231 | 201  | 0.5150          | 0.8242   |
| 0.3871        | 32.0    | 208  | 0.5100          | 0.8315   |
| 0.3729        | 32.9231 | 214  | 0.4986          | 0.8352   |
| 0.3286        | 34.0    | 221  | 0.4946          | 0.8462   |
| 0.4261        | 34.9231 | 227  | 0.4957          | 0.8388   |
| 0.4014        | 36.0    | 234  | 0.4850          | 0.8535   |
| 0.3514        | 36.9231 | 240  | 0.4807          | 0.8535   |
| 0.3883        | 38.0    | 247  | 0.4767          | 0.8535   |
| 0.3219        | 38.9231 | 253  | 0.4763          | 0.8535   |
| 0.4351        | 40.0    | 260  | 0.4738          | 0.8571   |
| 0.3068        | 40.9231 | 266  | 0.4688          | 0.8645   |
| 0.3356        | 42.0    | 273  | 0.4585          | 0.8645   |
| 0.345         | 42.9231 | 279  | 0.4541          | 0.8681   |
| 0.3254        | 44.0    | 286  | 0.4584          | 0.8645   |
| 0.3164        | 44.9231 | 292  | 0.4592          | 0.8571   |
| 0.3657        | 46.0    | 299  | 0.4534          | 0.8608   |
| 0.2655        | 46.9231 | 305  | 0.4502          | 0.8645   |
| 0.2981        | 48.0    | 312  | 0.4452          | 0.8645   |
| 0.3508        | 48.9231 | 318  | 0.4371          | 0.8791   |
| 0.3419        | 50.0    | 325  | 0.4394          | 0.8755   |
| 0.2668        | 50.9231 | 331  | 0.4430          | 0.8755   |
| 0.2972        | 52.0    | 338  | 0.4395          | 0.8718   |
| 0.3514        | 52.9231 | 344  | 0.4371          | 0.8755   |
| 0.3012        | 54.0    | 351  | 0.4330          | 0.8791   |
| 0.2725        | 54.9231 | 357  | 0.4298          | 0.8791   |
| 0.2547        | 56.0    | 364  | 0.4289          | 0.8718   |
| 0.2896        | 56.9231 | 370  | 0.4282          | 0.8718   |
| 0.3469        | 58.0    | 377  | 0.4273          | 0.8718   |
| 0.3528        | 58.9231 | 383  | 0.4269          | 0.8718   |
| 0.2552        | 60.0    | 390  | 0.4324          | 0.8681   |
| 0.239         | 60.9231 | 396  | 0.4319          | 0.8645   |
| 0.3321        | 62.0    | 403  | 0.4270          | 0.8718   |
| 0.3115        | 62.9231 | 409  | 0.4184          | 0.8718   |
| 0.306         | 64.0    | 416  | 0.4169          | 0.8718   |
| 0.3086        | 64.9231 | 422  | 0.4176          | 0.8718   |
| 0.4256        | 66.0    | 429  | 0.4196          | 0.8718   |
| 0.2798        | 66.9231 | 435  | 0.4219          | 0.8718   |
| 0.3016        | 68.0    | 442  | 0.4224          | 0.8718   |
| 0.2791        | 68.9231 | 448  | 0.4207          | 0.8718   |
| 0.2651        | 70.0    | 455  | 0.4189          | 0.8718   |
| 0.2466        | 70.9231 | 461  | 0.4178          | 0.8718   |
| 0.1913        | 72.0    | 468  | 0.4177          | 0.8718   |
| 0.2719        | 72.9231 | 474  | 0.4164          | 0.8718   |
| 0.3364        | 74.0    | 481  | 0.4166          | 0.8718   |
| 0.283         | 74.9231 | 487  | 0.4179          | 0.8755   |
| 0.2891        | 76.0    | 494  | 0.4174          | 0.8755   |
| 0.2625        | 76.9231 | 500  | 0.4180          | 0.8755   |
| 0.2843        | 78.0    | 507  | 0.4184          | 0.8718   |
| 0.375         | 78.9231 | 513  | 0.4167          | 0.8755   |
| 0.3107        | 80.0    | 520  | 0.4150          | 0.8755   |
| 0.3742        | 80.9231 | 526  | 0.4145          | 0.8718   |
| 0.2574        | 82.0    | 533  | 0.4145          | 0.8755   |
| 0.329         | 82.9231 | 539  | 0.4149          | 0.8755   |
| 0.2727        | 84.0    | 546  | 0.4145          | 0.8755   |
| 0.2977        | 84.9231 | 552  | 0.4149          | 0.8755   |
| 0.2611        | 86.0    | 559  | 0.4160          | 0.8718   |
| 0.2542        | 86.9231 | 565  | 0.4170          | 0.8718   |
| 0.2665        | 88.0    | 572  | 0.4171          | 0.8718   |
| 0.2654        | 88.9231 | 578  | 0.4170          | 0.8718   |
| 0.3059        | 90.0    | 585  | 0.4172          | 0.8718   |
| 0.2377        | 90.9231 | 591  | 0.4173          | 0.8718   |
| 0.2896        | 92.0    | 598  | 0.4172          | 0.8718   |
| 0.3133        | 92.3077 | 600  | 0.4172          | 0.8718   |

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.19.1