convnext-tiny-224-finetuned-brs2

This model is a fine-tuned version of facebook/convnext-tiny-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.2502
  • Accuracy: 0.7925
  • F1: 0.7556
  • Precision (PPV): 0.8095
  • Recall (sensitivity): 0.7083
  • Specificity: 0.8621
  • NPV: 0.7812
  • AUC: 0.7852

Model description

More information needed

Intended uses & limitations

More information needed
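
The card does not spell out intended uses, but the model is a ConvNeXT image classifier, so inference follows the standard Transformers pattern. The sketch below is illustrative only: the Hub repo id and image path are placeholders, and AutoFeatureExtractor is chosen to match the pinned Transformers 4.24 (later versions would use AutoImageProcessor).

```python
# Minimal inference sketch. The repo id and image path are placeholders,
# not values taken from this card.
import torch
from PIL import Image
from transformers import AutoFeatureExtractor, AutoModelForImageClassification

repo_id = "your-username/convnext-tiny-224-finetuned-brs2"  # hypothetical Hub path

extractor = AutoFeatureExtractor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # placeholder input
inputs = extractor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label[pred])
```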

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch reproducing them follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 4
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
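
For reference, here is how those values map onto TrainingArguments in Transformers 4.24. This is a reconstruction from the list above, not the original training script; anything not listed (such as output_dir or the evaluation strategy) is an assumption.

```python
# Sketch of TrainingArguments mirroring the hyperparameters listed above.
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults
# (adam_beta1, adam_beta2, adam_epsilon), so they need no explicit setting.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="convnext-tiny-224-finetuned-brs2",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 1 x 4 = 4
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```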

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision (PPV) | Recall (sensitivity) | Specificity | NPV | AUC |
|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 0.6884 | 1.89 | 100 | 0.6907 | 0.5472 | 0.4286 | 0.5 | 0.375 | 0.6897 | 0.5714 | 0.5323 |
| 0.5868 | 3.77 | 200 | 0.6604 | 0.6415 | 0.4242 | 0.7778 | 0.2917 | 0.9310 | 0.6136 | 0.6114 |
| 0.4759 | 5.66 | 300 | 0.6273 | 0.6604 | 0.5 | 0.75 | 0.375 | 0.8966 | 0.6341 | 0.6358 |
| 0.3599 | 7.55 | 400 | 0.6520 | 0.6604 | 0.5 | 0.75 | 0.375 | 0.8966 | 0.6341 | 0.6358 |
| 0.3248 | 9.43 | 500 | 0.9115 | 0.6415 | 0.4571 | 0.7273 | 0.3333 | 0.8966 | 0.6190 | 0.6149 |
| 0.3117 | 11.32 | 600 | 0.8608 | 0.6604 | 0.5263 | 0.7143 | 0.4167 | 0.8621 | 0.6410 | 0.6394 |
| 0.4208 | 13.21 | 700 | 0.8774 | 0.6792 | 0.5641 | 0.7333 | 0.4583 | 0.8621 | 0.6579 | 0.6602 |
| 0.5267 | 15.09 | 800 | 1.0131 | 0.6792 | 0.5405 | 0.7692 | 0.4167 | 0.8966 | 0.65 | 0.6566 |
| 0.234 | 16.98 | 900 | 1.1498 | 0.6981 | 0.5556 | 0.8333 | 0.4167 | 0.9310 | 0.6585 | 0.6739 |
| 0.7581 | 18.87 | 1000 | 1.0952 | 0.7170 | 0.6154 | 0.8 | 0.5 | 0.8966 | 0.6842 | 0.6983 |
| 0.1689 | 20.75 | 1100 | 1.1653 | 0.6981 | 0.5789 | 0.7857 | 0.4583 | 0.8966 | 0.6667 | 0.6774 |
| 0.0765 | 22.64 | 1200 | 1.1245 | 0.7170 | 0.6667 | 0.7143 | 0.625 | 0.7931 | 0.7188 | 0.7091 |
| 0.6287 | 24.53 | 1300 | 1.2222 | 0.6981 | 0.6 | 0.75 | 0.5 | 0.8621 | 0.6757 | 0.6810 |
| 0.0527 | 26.42 | 1400 | 1.2350 | 0.7358 | 0.6818 | 0.75 | 0.625 | 0.8276 | 0.7273 | 0.7263 |
| 0.3622 | 28.3 | 1500 | 1.1022 | 0.7547 | 0.6667 | 0.8667 | 0.5417 | 0.9310 | 0.7105 | 0.7364 |
| 0.3227 | 30.19 | 1600 | 1.1541 | 0.7170 | 0.6154 | 0.8 | 0.5 | 0.8966 | 0.6842 | 0.6983 |
| 0.3849 | 32.08 | 1700 | 1.2818 | 0.7170 | 0.6154 | 0.8 | 0.5 | 0.8966 | 0.6842 | 0.6983 |
| 0.4528 | 33.96 | 1800 | 1.3213 | 0.6981 | 0.5789 | 0.7857 | 0.4583 | 0.8966 | 0.6667 | 0.6774 |
| 0.1824 | 35.85 | 1900 | 1.3171 | 0.7170 | 0.6512 | 0.7368 | 0.5833 | 0.8276 | 0.7059 | 0.7055 |
| 0.0367 | 37.74 | 2000 | 1.4484 | 0.7170 | 0.6154 | 0.8 | 0.5 | 0.8966 | 0.6842 | 0.6983 |
| 0.07 | 39.62 | 2100 | 1.3521 | 0.7547 | 0.6977 | 0.7895 | 0.625 | 0.8621 | 0.7353 | 0.7435 |
| 0.0696 | 41.51 | 2200 | 1.2636 | 0.7358 | 0.65 | 0.8125 | 0.5417 | 0.8966 | 0.7027 | 0.7191 |
| 0.1554 | 43.4 | 2300 | 1.2225 | 0.7358 | 0.6667 | 0.7778 | 0.5833 | 0.8621 | 0.7143 | 0.7227 |
| 0.2346 | 45.28 | 2400 | 1.2627 | 0.7547 | 0.6829 | 0.8235 | 0.5833 | 0.8966 | 0.7222 | 0.7399 |
| 0.097 | 47.17 | 2500 | 1.4892 | 0.7170 | 0.6667 | 0.7143 | 0.625 | 0.7931 | 0.7188 | 0.7091 |
| 0.2494 | 49.06 | 2600 | 1.5282 | 0.7170 | 0.6512 | 0.7368 | 0.5833 | 0.8276 | 0.7059 | 0.7055 |
| 0.0734 | 50.94 | 2700 | 1.3989 | 0.7170 | 0.6341 | 0.7647 | 0.5417 | 0.8621 | 0.6944 | 0.7019 |
| 0.1077 | 52.83 | 2800 | 1.5155 | 0.6792 | 0.5641 | 0.7333 | 0.4583 | 0.8621 | 0.6579 | 0.6602 |
| 0.2456 | 54.72 | 2900 | 1.4400 | 0.7170 | 0.6512 | 0.7368 | 0.5833 | 0.8276 | 0.7059 | 0.7055 |
| 0.0823 | 56.6 | 3000 | 1.4511 | 0.7358 | 0.65 | 0.8125 | 0.5417 | 0.8966 | 0.7027 | 0.7191 |
| 0.0471 | 58.49 | 3100 | 1.5114 | 0.7547 | 0.6829 | 0.8235 | 0.5833 | 0.8966 | 0.7222 | 0.7399 |
| 0.0144 | 60.38 | 3200 | 1.4412 | 0.7925 | 0.7317 | 0.8824 | 0.625 | 0.9310 | 0.75 | 0.7780 |
| 0.1235 | 62.26 | 3300 | 1.2029 | 0.7547 | 0.6977 | 0.7895 | 0.625 | 0.8621 | 0.7353 | 0.7435 |
| 0.0121 | 64.15 | 3400 | 1.4925 | 0.7358 | 0.6667 | 0.7778 | 0.5833 | 0.8621 | 0.7143 | 0.7227 |
| 0.2126 | 66.04 | 3500 | 1.3614 | 0.7547 | 0.6667 | 0.8667 | 0.5417 | 0.9310 | 0.7105 | 0.7364 |
| 0.0496 | 67.92 | 3600 | 1.2960 | 0.7736 | 0.7143 | 0.8333 | 0.625 | 0.8966 | 0.7429 | 0.7608 |
| 0.1145 | 69.81 | 3700 | 1.3763 | 0.7547 | 0.6829 | 0.8235 | 0.5833 | 0.8966 | 0.7222 | 0.7399 |
| 0.1272 | 71.7 | 3800 | 1.6328 | 0.7170 | 0.5946 | 0.8462 | 0.4583 | 0.9310 | 0.675 | 0.6947 |
| 0.0007 | 73.58 | 3900 | 1.5622 | 0.7547 | 0.6977 | 0.7895 | 0.625 | 0.8621 | 0.7353 | 0.7435 |
| 0.0101 | 75.47 | 4000 | 1.1811 | 0.7925 | 0.7442 | 0.8421 | 0.6667 | 0.8966 | 0.7647 | 0.7816 |
| 0.0002 | 77.36 | 4100 | 1.8533 | 0.6981 | 0.5789 | 0.7857 | 0.4583 | 0.8966 | 0.6667 | 0.6774 |
| 0.0423 | 79.25 | 4200 | 1.2510 | 0.7547 | 0.6977 | 0.7895 | 0.625 | 0.8621 | 0.7353 | 0.7435 |
| 0.0036 | 81.13 | 4300 | 1.3443 | 0.7547 | 0.6829 | 0.8235 | 0.5833 | 0.8966 | 0.7222 | 0.7399 |
| 0.0432 | 83.02 | 4400 | 1.2864 | 0.7736 | 0.7273 | 0.8 | 0.6667 | 0.8621 | 0.7576 | 0.7644 |
| 0.0021 | 84.91 | 4500 | 0.8999 | 0.7925 | 0.7755 | 0.76 | 0.7917 | 0.7931 | 0.8214 | 0.7924 |
| 0.0002 | 86.79 | 4600 | 1.3634 | 0.7925 | 0.7442 | 0.8421 | 0.6667 | 0.8966 | 0.7647 | 0.7816 |
| 0.0044 | 88.68 | 4700 | 1.7830 | 0.7358 | 0.65 | 0.8125 | 0.5417 | 0.8966 | 0.7027 | 0.7191 |
| 0.0003 | 90.57 | 4800 | 1.2640 | 0.7736 | 0.7273 | 0.8 | 0.6667 | 0.8621 | 0.7576 | 0.7644 |
| 0.0253 | 92.45 | 4900 | 1.2649 | 0.7925 | 0.7442 | 0.8421 | 0.6667 | 0.8966 | 0.7647 | 0.7816 |
| 0.0278 | 94.34 | 5000 | 1.7485 | 0.7170 | 0.6512 | 0.7368 | 0.5833 | 0.8276 | 0.7059 | 0.7055 |
| 0.1608 | 96.23 | 5100 | 1.2641 | 0.8113 | 0.7727 | 0.85 | 0.7083 | 0.8966 | 0.7879 | 0.8024 |
| 0.0017 | 98.11 | 5200 | 1.6380 | 0.7170 | 0.6667 | 0.7143 | 0.625 | 0.7931 | 0.7188 | 0.7091 |
| 0.001 | 100.0 | 5300 | 1.2502 | 0.7925 | 0.7556 | 0.8095 | 0.7083 | 0.8621 | 0.7812 | 0.7852 |
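
The card does not show how these columns were computed, but the figures are internally consistent with a binary classification task, and in every row the reported AUC equals the mean of recall and specificity (final row: (0.7083 + 0.8621) / 2 = 0.7852), which suggests AUC was computed from hard class predictions rather than probabilities. Below is a compute_metrics sketch along those lines; the use of scikit-learn is an assumption, as the card does not name a metrics library.

```python
# Plausible compute_metrics reproducing the table's columns for a binary
# task. A reconstruction, not the card author's code.
import numpy as np
from sklearn.metrics import (
    accuracy_score,
    confusion_matrix,
    f1_score,
    precision_score,
    recall_score,
    roc_auc_score,
)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    tn, fp, fn, tp = confusion_matrix(labels, preds).ravel()
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1_score(labels, preds),
        "precision (ppv)": precision_score(labels, preds),
        "recall (sensitivity)": recall_score(labels, preds),
        "specificity": tn / (tn + fp),
        "npv": tn / (tn + fn),
        # With hard 0/1 predictions, roc_auc_score reduces to
        # (sensitivity + specificity) / 2, matching the table.
        "auc": roc_auc_score(labels, preds),
    }
```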

Framework versions

  • Transformers 4.24.0
  • Pytorch 1.12.1+cu113
  • Datasets 2.6.1
  • Tokenizers 0.13.1