cvt-21-384-22k-finetuned-LeafType

This model is a fine-tuned version of microsoft/cvt-21-384-22k on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2232
  • Accuracy: 0.9310
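
A minimal inference sketch, assuming the model is loaded through the standard transformers image-classification pipeline; the checkpoint path below is a placeholder, not the actual repo id, and "leaf.jpg" is an illustrative input file:

```python
from PIL import Image
from transformers import pipeline

# Placeholder path: point this at the actual location of the fine-tuned checkpoint.
classifier = pipeline(
    "image-classification",
    model="path/to/cvt-21-384-22k-finetuned-LeafType",
)

image = Image.open("leaf.jpg")
predictions = classifier(image)  # list of {"label": ..., "score": ...} dicts
print(predictions)
```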

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 160
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 60
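
For reference, these settings correspond roughly to the `TrainingArguments` configuration below. This is a hedged reconstruction, not the original training script; `output_dir` and `evaluation_strategy` are assumptions not stated in the card, and the Adam betas/epsilon are spelled out even though they match the Trainer defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="cvt-21-384-22k-finetuned-LeafType",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=160,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=60,
    evaluation_strategy="epoch",  # assumed; the card reports per-epoch metrics
)
```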

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 11   | 0.4857          | 0.7586   |
| No log        | 2.0   | 22   | 0.2275          | 0.9163   |
| No log        | 3.0   | 33   | 0.2791          | 0.8867   |
| No log        | 4.0   | 44   | 0.3083          | 0.8522   |
| No log        | 5.0   | 55   | 0.1496          | 0.9458   |
| No log        | 6.0   | 66   | 0.2092          | 0.8966   |
| No log        | 7.0   | 77   | 0.3349          | 0.8522   |
| No log        | 8.0   | 88   | 0.1513          | 0.9557   |
| No log        | 9.0   | 99   | 0.1385          | 0.9557   |
| No log        | 10.0  | 110  | 0.1676          | 0.9360   |
| No log        | 11.0  | 121  | 0.1284          | 0.9507   |
| No log        | 12.0  | 132  | 0.1981          | 0.9015   |
| No log        | 13.0  | 143  | 0.1346          | 0.9360   |
| No log        | 14.0  | 154  | 0.3235          | 0.8621   |
| No log        | 15.0  | 165  | 0.1506          | 0.9310   |
| No log        | 16.0  | 176  | 0.1536          | 0.9360   |
| No log        | 17.0  | 187  | 0.1529          | 0.9458   |
| No log        | 18.0  | 198  | 0.2431          | 0.8966   |
| No log        | 19.0  | 209  | 0.2251          | 0.9163   |
| No log        | 20.0  | 220  | 0.2068          | 0.9163   |
| No log        | 21.0  | 231  | 0.1429          | 0.9458   |
| No log        | 22.0  | 242  | 0.2078          | 0.9163   |
| No log        | 23.0  | 253  | 0.1428          | 0.9360   |
| No log        | 24.0  | 264  | 0.2391          | 0.8916   |
| No log        | 25.0  | 275  | 0.1512          | 0.9458   |
| No log        | 26.0  | 286  | 0.1512          | 0.9458   |
| No log        | 27.0  | 297  | 0.1303          | 0.9606   |
| No log        | 28.0  | 308  | 0.2381          | 0.8867   |
| No log        | 29.0  | 319  | 0.1544          | 0.9458   |
| No log        | 30.0  | 330  | 0.2149          | 0.8916   |
| No log        | 31.0  | 341  | 0.1387          | 0.9557   |
| No log        | 32.0  | 352  | 0.1698          | 0.9360   |
| No log        | 33.0  | 363  | 0.1750          | 0.9409   |
| No log        | 34.0  | 374  | 0.2041          | 0.9113   |
| No log        | 35.0  | 385  | 0.1888          | 0.9360   |
| No log        | 36.0  | 396  | 0.1527          | 0.9458   |
| No log        | 37.0  | 407  | 0.1721          | 0.9360   |
| No log        | 38.0  | 418  | 0.1720          | 0.9409   |
| No log        | 39.0  | 429  | 0.1810          | 0.9360   |
| No log        | 40.0  | 440  | 0.2041          | 0.9261   |
| No log        | 41.0  | 451  | 0.1945          | 0.9261   |
| No log        | 42.0  | 462  | 0.2211          | 0.9163   |
| No log        | 43.0  | 473  | 0.2284          | 0.9163   |
| No log        | 44.0  | 484  | 0.1905          | 0.9261   |
| No log        | 45.0  | 495  | 0.2329          | 0.9113   |
| 0.2716        | 46.0  | 506  | 0.1839          | 0.9360   |
| 0.2716        | 47.0  | 517  | 0.1854          | 0.9310   |
| 0.2716        | 48.0  | 528  | 0.1923          | 0.9310   |
| 0.2716        | 49.0  | 539  | 0.2254          | 0.9261   |
| 0.2716        | 50.0  | 550  | 0.2677          | 0.9015   |
| 0.2716        | 51.0  | 561  | 0.2620          | 0.9015   |
| 0.2716        | 52.0  | 572  | 0.2224          | 0.9261   |
| 0.2716        | 53.0  | 583  | 0.2238          | 0.9310   |
| 0.2716        | 54.0  | 594  | 0.2355          | 0.9212   |
| 0.2716        | 55.0  | 605  | 0.2533          | 0.9261   |
| 0.2716        | 56.0  | 616  | 0.2237          | 0.9310   |
| 0.2716        | 57.0  | 627  | 0.2296          | 0.9310   |
| 0.2716        | 58.0  | 638  | 0.2361          | 0.9310   |
| 0.2716        | 59.0  | 649  | 0.2358          | 0.9310   |
| 0.2716        | 60.0  | 660  | 0.2232          | 0.9310   |

Framework versions

  • Transformers 4.38.1
  • PyTorch 1.10.0+cu111
  • Datasets 2.17.1
  • Tokenizers 0.15.2