deit-base-patch16-224-finetuned-ind-17-imbalanced-aadhaarmask-14687

This model is a fine-tuned version of facebook/deit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3950
  • Accuracy: 0.8310
  • Recall: 0.8310
  • F1: 0.8298
  • Precision: 0.8360
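
Accuracy and Recall are reported as the same value (0.8310). That is expected when recall is support-weighted (as with `recall_score(average="weighted")`): weighting each class's recall by its share of the examples collapses algebraically to overall accuracy. A minimal pure-Python sketch with toy labels (not the evaluation data):

```python
from collections import Counter

def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def weighted_recall(y_true, y_pred):
    # Per-class recall weighted by class support, as in
    # sklearn's recall_score(average="weighted").
    support = Counter(y_true)
    n = len(y_true)
    total = 0.0
    for cls, count in support.items():
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == cls and p == cls)
        total += (count / n) * (tp / count)
    return total

y_true = [0, 0, 0, 1, 1, 2]   # imbalanced toy labels
y_pred = [0, 0, 1, 1, 1, 0]
assert abs(accuracy(y_true, y_pred) - weighted_recall(y_true, y_pred)) < 1e-12
```

F1 and precision need not match accuracy, which is why they differ slightly in the list above.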

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
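
The derived quantities above follow directly: the effective batch size is train_batch_size × gradient_accumulation_steps, and with lr_scheduler_warmup_ratio 0.1 the linear warmup covers the first 10% of the roughly 14,650 optimizer steps shown in the training-results table. A quick check of the arithmetic:

```python
train_batch_size = 8
gradient_accumulation_steps = 4
total_train_batch_size = train_batch_size * gradient_accumulation_steps
assert total_train_batch_size == 32  # matches the value reported above

total_steps = 14650          # final step in the training-results table
warmup_ratio = 0.1
warmup_steps = int(total_steps * warmup_ratio)
assert warmup_steps == 1465  # LR rises linearly for ~1,465 steps, then decays
```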

Training results

| Training Loss | Epoch   | Step  | Validation Loss | Accuracy | Recall | F1     | Precision |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|:------:|:------:|:---------:|
| 0.8293        | 0.9974  | 293   | 0.7793          | 0.7680   | 0.7680 | 0.7403 | 0.7277    |
| 0.5921        | 1.9983  | 587   | 0.5663          | 0.7940   | 0.7940 | 0.7843 | 0.7839    |
| 0.4308        | 2.9991  | 881   | 0.4589          | 0.8208   | 0.8208 | 0.8161 | 0.8213    |
| 0.3999        | 4.0     | 1175  | 0.4772          | 0.8263   | 0.8263 | 0.8216 | 0.8337    |
| 0.4801        | 4.9974  | 1468  | 0.4258          | 0.8378   | 0.8378 | 0.8306 | 0.8463    |
| 0.4201        | 5.9983  | 1762  | 0.4120          | 0.8246   | 0.8246 | 0.8213 | 0.8394    |
| 0.3233        | 6.9991  | 2056  | 0.3989          | 0.8306   | 0.8306 | 0.8268 | 0.8445    |
| 0.3954        | 8.0     | 2350  | 0.3794          | 0.8365   | 0.8365 | 0.8341 | 0.8383    |
| 0.2835        | 8.9974  | 2643  | 0.4438          | 0.8318   | 0.8318 | 0.8278 | 0.8434    |
| 0.2913        | 9.9983  | 2937  | 0.3799          | 0.8416   | 0.8416 | 0.8404 | 0.8451    |
| 0.3261        | 10.9991 | 3231  | 0.3694          | 0.8297   | 0.8297 | 0.8272 | 0.8306    |
| 0.3299        | 12.0    | 3525  | 0.3637          | 0.8442   | 0.8442 | 0.8425 | 0.8529    |
| 0.3273        | 12.9974 | 3818  | 0.3649          | 0.8421   | 0.8421 | 0.8411 | 0.8482    |
| 0.2596        | 13.9983 | 4112  | 0.4152          | 0.8259   | 0.8259 | 0.8213 | 0.8281    |
| 0.2813        | 14.9991 | 4406  | 0.3578          | 0.8429   | 0.8429 | 0.8409 | 0.8491    |
| 0.2406        | 16.0    | 4700  | 0.3813          | 0.8323   | 0.8323 | 0.8285 | 0.8362    |
| 0.2263        | 16.9974 | 4993  | 0.3808          | 0.8318   | 0.8318 | 0.8275 | 0.8377    |
| 0.3192        | 17.9983 | 5287  | 0.3625          | 0.8412   | 0.8412 | 0.8372 | 0.8484    |
| 0.2003        | 18.9991 | 5581  | 0.3549          | 0.8438   | 0.8438 | 0.8430 | 0.8462    |
| 0.2431        | 20.0    | 5875  | 0.3620          | 0.8425   | 0.8425 | 0.8408 | 0.8467    |
| 0.2654        | 20.9974 | 6168  | 0.3865          | 0.8340   | 0.8340 | 0.8320 | 0.8338    |
| 0.2989        | 21.9983 | 6462  | 0.3632          | 0.8463   | 0.8463 | 0.8449 | 0.8498    |
| 0.2403        | 22.9991 | 6756  | 0.3824          | 0.8301   | 0.8301 | 0.8267 | 0.8304    |
| 0.2393        | 24.0    | 7050  | 0.3607          | 0.8489   | 0.8489 | 0.8473 | 0.8519    |
| 0.2305        | 24.9974 | 7343  | 0.3758          | 0.8365   | 0.8365 | 0.8350 | 0.8401    |
| 0.2654        | 25.9983 | 7637  | 0.3652          | 0.8421   | 0.8421 | 0.8392 | 0.8415    |
| 0.176         | 26.9991 | 7931  | 0.3929          | 0.8306   | 0.8306 | 0.8289 | 0.8385    |
| 0.1893        | 28.0    | 8225  | 0.3794          | 0.8374   | 0.8374 | 0.8365 | 0.8404    |
| 0.2652        | 28.9974 | 8518  | 0.3995          | 0.8387   | 0.8387 | 0.8372 | 0.8423    |
| 0.2029        | 29.9983 | 8812  | 0.3981          | 0.8433   | 0.8433 | 0.8411 | 0.8430    |
| 0.1799        | 30.9991 | 9106  | 0.3554          | 0.8352   | 0.8352 | 0.8340 | 0.8368    |
| 0.2002        | 32.0    | 9400  | 0.3618          | 0.8310   | 0.8310 | 0.8300 | 0.8322    |
| 0.1525        | 32.9974 | 9693  | 0.3629          | 0.8348   | 0.8348 | 0.8343 | 0.8381    |
| 0.1663        | 33.9983 | 9987  | 0.3664          | 0.8425   | 0.8425 | 0.8410 | 0.8427    |
| 0.1728        | 34.9991 | 10281 | 0.3928          | 0.8429   | 0.8429 | 0.8415 | 0.8468    |
| 0.2252        | 36.0    | 10575 | 0.3842          | 0.8421   | 0.8421 | 0.8420 | 0.8443    |
| 0.1554        | 36.9974 | 10868 | 0.3889          | 0.8301   | 0.8301 | 0.8294 | 0.8349    |
| 0.2179        | 37.9983 | 11162 | 0.3775          | 0.8399   | 0.8399 | 0.8389 | 0.8429    |
| 0.1771        | 38.9991 | 11456 | 0.3906          | 0.8306   | 0.8306 | 0.8291 | 0.8324    |
| 0.2167        | 40.0    | 11750 | 0.3870          | 0.8404   | 0.8404 | 0.8382 | 0.8456    |
| 0.1563        | 40.9974 | 12043 | 0.3779          | 0.8284   | 0.8284 | 0.8277 | 0.8288    |
| 0.1419        | 41.9983 | 12337 | 0.4049          | 0.8340   | 0.8340 | 0.8327 | 0.8360    |
| 0.2083        | 42.9991 | 12631 | 0.3800          | 0.8421   | 0.8421 | 0.8410 | 0.8427    |
| 0.2185        | 44.0    | 12925 | 0.3964          | 0.8433   | 0.8433 | 0.8422 | 0.8441    |
| 0.1989        | 44.9974 | 13218 | 0.3870          | 0.8340   | 0.8340 | 0.8339 | 0.8357    |
| 0.1731        | 45.9983 | 13512 | 0.4206          | 0.8340   | 0.8340 | 0.8335 | 0.8357    |
| 0.1831        | 46.9991 | 13806 | 0.4027          | 0.8429   | 0.8429 | 0.8422 | 0.8439    |
| 0.1471        | 48.0    | 14100 | 0.4016          | 0.8318   | 0.8318 | 0.8307 | 0.8320    |
| 0.1879        | 48.9974 | 14393 | 0.3877          | 0.8438   | 0.8438 | 0.8441 | 0.8468    |
| 0.1775        | 49.8723 | 14650 | 0.3984          | 0.8421   | 0.8421 | 0.8408 | 0.8428    |

Framework versions

  • Transformers 4.40.1
  • Pytorch 2.2.0a0+81ea7a4
  • Datasets 2.18.0
  • Tokenizers 0.19.1
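
With the framework versions above, the checkpoint can be used through a standard Transformers image-classification pipeline. This is a sketch, not the card's documented usage: the hub id is assumed from this card's title (prefix it with the correct namespace) and the image path is a placeholder.

```python
def classify_image(image_path, model_id):
    """Run a fine-tuned checkpoint through a Transformers
    image-classification pipeline (downloads the model on first use)."""
    from transformers import pipeline  # deferred: needs transformers + torch
    classifier = pipeline("image-classification", model=model_id)
    return classifier(image_path)

def top_label(predictions):
    # pipeline output is a list of {"label": ..., "score": ...} dicts;
    # pick the highest-scoring entry.
    return max(predictions, key=lambda p: p["score"])["label"]

# Hypothetical usage -- hub id assumed from the card title, placeholder image:
# preds = classify_image(
#     "document_scan.jpg",
#     "deit-base-patch16-224-finetuned-ind-17-imbalanced-aadhaarmask-14687",
# )
# print(top_label(preds))
```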