
segformer-b4-crack-segmentation-dataset

This model is a fine-tuned version of nvidia/mit-b0 on a crack segmentation dataset that is not further specified in this card. It achieves the following results on the evaluation set (a minimal inference sketch follows the metrics below):

  • Loss: 0.0594
  • Mean Iou: 0.3346
  • Mean Accuracy: 0.6691
  • Overall Accuracy: 0.6691
  • Accuracy Background: nan
  • Accuracy Crack: 0.6691
  • Iou Background: 0.0
  • Iou Crack: 0.6691
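
The card does not yet include usage code, so the following is a minimal inference sketch. It assumes the checkpoint is published as `varcoder/segformer-b4-crack-segmentation-dataset`, loads with the standard SegFormer classes, and uses a binary label mapping (0 = background, 1 = crack); the input image path and the label mapping are assumptions for illustration only.

```python
# Minimal inference sketch (assumed repo id and label mapping; not an official usage example).
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

checkpoint = "varcoder/segformer-b4-crack-segmentation-dataset"  # assumed repo id
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("concrete_surface.jpg").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample the logits to the original image size and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # assumed mapping: 0 = background, 1 = crack
print(pred_mask.shape, pred_mask.unique())
```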

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows this list):

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 1
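
As a rough illustration, the settings above correspond approximately to the following Hugging Face `TrainingArguments` configuration; this is a sketch, not the original training script, and the output directory and evaluation cadence are assumptions (the 100-step cadence simply matches the results table below).

```python
# Approximate reconstruction of the listed hyperparameters (sketch only).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b4-crack-segmentation-dataset",  # assumed output directory
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    num_train_epochs=1,
    lr_scheduler_type="linear",
    adam_beta1=0.9,        # Adam betas and epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",  # assumption: evaluate every 100 steps
    eval_steps=100,
)
```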

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Crack | Iou Background | Iou Crack |
|---|---|---|---|---|---|---|---|---|---|---|
| 0.2287 | 0.02 | 100 | 0.2515 | 0.1734 | 0.3468 | 0.3468 | nan | 0.3468 | 0.0 | 0.3468 |
| 0.1792 | 0.04 | 200 | 0.1594 | 0.1671 | 0.3342 | 0.3342 | nan | 0.3342 | 0.0 | 0.3342 |
| 0.1177 | 0.06 | 300 | 0.1762 | 0.1044 | 0.2088 | 0.2088 | nan | 0.2088 | 0.0 | 0.2088 |
| 0.0821 | 0.08 | 400 | 0.1706 | 0.2065 | 0.4130 | 0.4130 | nan | 0.4130 | 0.0 | 0.4130 |
| 0.0666 | 0.1 | 500 | 0.1507 | 0.1931 | 0.3863 | 0.3863 | nan | 0.3863 | 0.0 | 0.3863 |
| 0.0675 | 0.12 | 600 | 0.1374 | 0.3114 | 0.6227 | 0.6227 | nan | 0.6227 | 0.0 | 0.6227 |
| 0.0267 | 0.15 | 700 | 0.1400 | 0.2171 | 0.4342 | 0.4342 | nan | 0.4342 | 0.0 | 0.4342 |
| 0.0192 | 0.17 | 800 | 0.1067 | 0.1594 | 0.3187 | 0.3187 | nan | 0.3187 | 0.0 | 0.3187 |
| 0.0711 | 0.19 | 900 | 0.1002 | 0.2915 | 0.5830 | 0.5830 | nan | 0.5830 | 0.0 | 0.5830 |
| 0.0761 | 0.21 | 1000 | 0.0785 | 0.3099 | 0.6199 | 0.6199 | nan | 0.6199 | 0.0 | 0.6199 |
| 0.0802 | 0.23 | 1100 | 0.0829 | 0.3086 | 0.6173 | 0.6173 | nan | 0.6173 | 0.0 | 0.6173 |
| 0.1058 | 0.25 | 1200 | 0.0895 | 0.2139 | 0.4278 | 0.4278 | nan | 0.4278 | 0.0 | 0.4278 |
| 0.0409 | 0.27 | 1300 | 0.0792 | 0.3237 | 0.6475 | 0.6475 | nan | 0.6475 | 0.0 | 0.6475 |
| 0.063 | 0.29 | 1400 | 0.0739 | 0.3084 | 0.6168 | 0.6168 | nan | 0.6168 | 0.0 | 0.6168 |
| 0.0669 | 0.31 | 1500 | 0.0747 | 0.3326 | 0.6653 | 0.6653 | nan | 0.6653 | 0.0 | 0.6653 |
| 0.1277 | 0.33 | 1600 | 0.0735 | 0.3149 | 0.6297 | 0.6297 | nan | 0.6297 | 0.0 | 0.6297 |
| 0.0388 | 0.35 | 1700 | 0.0708 | 0.2525 | 0.5050 | 0.5050 | nan | 0.5050 | 0.0 | 0.5050 |
| 0.0332 | 0.37 | 1800 | 0.0726 | 0.2908 | 0.5816 | 0.5816 | nan | 0.5816 | 0.0 | 0.5816 |
| 0.0435 | 0.4 | 1900 | 0.0673 | 0.2893 | 0.5786 | 0.5786 | nan | 0.5786 | 0.0 | 0.5786 |
| 0.1297 | 0.42 | 2000 | 0.0698 | 0.3438 | 0.6877 | 0.6877 | nan | 0.6877 | 0.0 | 0.6877 |
| 0.1202 | 0.44 | 2100 | 0.0745 | 0.2899 | 0.5798 | 0.5798 | nan | 0.5798 | 0.0 | 0.5798 |
| 0.0549 | 0.46 | 2200 | 0.0657 | 0.3522 | 0.7044 | 0.7044 | nan | 0.7044 | 0.0 | 0.7044 |
| 0.0223 | 0.48 | 2300 | 0.0808 | 0.2686 | 0.5372 | 0.5372 | nan | 0.5372 | 0.0 | 0.5372 |
| 0.0464 | 0.5 | 2400 | 0.0631 | 0.3221 | 0.6442 | 0.6442 | nan | 0.6442 | 0.0 | 0.6442 |
| 0.0364 | 0.52 | 2500 | 0.0778 | 0.3410 | 0.6820 | 0.6820 | nan | 0.6820 | 0.0 | 0.6820 |
| 0.047 | 0.54 | 2600 | 0.0689 | 0.3489 | 0.6978 | 0.6978 | nan | 0.6978 | 0.0 | 0.6978 |
| 0.0322 | 0.56 | 2700 | 0.0640 | 0.2863 | 0.5727 | 0.5727 | nan | 0.5727 | 0.0 | 0.5727 |
| 0.0453 | 0.58 | 2800 | 0.0574 | 0.3340 | 0.6681 | 0.6681 | nan | 0.6681 | 0.0 | 0.6681 |
| 0.0347 | 0.6 | 2900 | 0.0611 | 0.3289 | 0.6578 | 0.6578 | nan | 0.6578 | 0.0 | 0.6578 |
| 0.0916 | 0.62 | 3000 | 0.0609 | 0.3357 | 0.6714 | 0.6714 | nan | 0.6714 | 0.0 | 0.6714 |
| 0.0523 | 0.65 | 3100 | 0.0557 | 0.3318 | 0.6637 | 0.6637 | nan | 0.6637 | 0.0 | 0.6637 |
| 0.1246 | 0.67 | 3200 | 0.0558 | 0.3294 | 0.6588 | 0.6588 | nan | 0.6588 | 0.0 | 0.6588 |
| 0.0501 | 0.69 | 3300 | 0.0697 | 0.2955 | 0.5910 | 0.5910 | nan | 0.5910 | 0.0 | 0.5910 |
| 0.0312 | 0.71 | 3400 | 0.0604 | 0.3414 | 0.6827 | 0.6827 | nan | 0.6827 | 0.0 | 0.6827 |
| 0.0449 | 0.73 | 3500 | 0.0612 | 0.3305 | 0.6611 | 0.6611 | nan | 0.6611 | 0.0 | 0.6611 |
| 0.0111 | 0.75 | 3600 | 0.0617 | 0.2930 | 0.5860 | 0.5860 | nan | 0.5860 | 0.0 | 0.5860 |
| 0.0206 | 0.77 | 3700 | 0.0627 | 0.3663 | 0.7326 | 0.7326 | nan | 0.7326 | 0.0 | 0.7326 |
| 0.051 | 0.79 | 3800 | 0.0649 | 0.3159 | 0.6318 | 0.6318 | nan | 0.6318 | 0.0 | 0.6318 |
| 0.0243 | 0.81 | 3900 | 0.0600 | 0.3370 | 0.6740 | 0.6740 | nan | 0.6740 | 0.0 | 0.6740 |
| 0.0108 | 0.83 | 4000 | 0.0614 | 0.3595 | 0.7190 | 0.7190 | nan | 0.7190 | 0.0 | 0.7190 |
| 0.0951 | 0.85 | 4100 | 0.0564 | 0.3571 | 0.7142 | 0.7142 | nan | 0.7142 | 0.0 | 0.7142 |
| 0.0731 | 0.87 | 4200 | 0.0597 | 0.3497 | 0.6994 | 0.6994 | nan | 0.6994 | 0.0 | 0.6994 |
| 0.0307 | 0.9 | 4300 | 0.0636 | 0.3468 | 0.6937 | 0.6937 | nan | 0.6937 | 0.0 | 0.6937 |
| 0.1039 | 0.92 | 4400 | 0.0594 | 0.3397 | 0.6795 | 0.6795 | nan | 0.6795 | 0.0 | 0.6795 |
| 0.0083 | 0.94 | 4500 | 0.0606 | 0.3512 | 0.7024 | 0.7024 | nan | 0.7024 | 0.0 | 0.7024 |
| 0.0113 | 0.96 | 4600 | 0.0597 | 0.3288 | 0.6576 | 0.6576 | nan | 0.6576 | 0.0 | 0.6576 |
| 0.0417 | 0.98 | 4700 | 0.0595 | 0.3405 | 0.6811 | 0.6811 | nan | 0.6811 | 0.0 | 0.6811 |
| 0.1944 | 1.0 | 4800 | 0.0594 | 0.3346 | 0.6691 | 0.6691 | nan | 0.6691 | 0.0 | 0.6691 |
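
The columns above follow the output format of the `mean_iou` metric from the Hugging Face `evaluate` library (mean and per-category IoU and accuracy). Below is a minimal sketch of how such values can be computed; the label ids, mask sizes, and `ignore_index` choice are assumptions for the example only, and the `nan` background accuracy / 0.0 background IoU in the table suggest the background class was ignored or unpopulated during evaluation.

```python
# Sketch of computing segmentation metrics like those reported above
# (assumptions: two labels, 0 = background, 1 = crack; ignore_index=255 for unlabeled pixels).
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Hypothetical predicted and ground-truth masks (H x W arrays of label ids).
pred_mask = np.zeros((64, 64), dtype=np.int64)
true_mask = np.zeros((64, 64), dtype=np.int64)
pred_mask[10:20, 10:40] = 1  # predicted crack pixels
true_mask[12:22, 10:40] = 1  # annotated crack pixels

results = metric.compute(
    predictions=[pred_mask],
    references=[true_mask],
    num_labels=2,
    ignore_index=255,
)
print(results["mean_iou"], results["per_category_iou"])  # per-category order: background, crack
```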

Framework versions

  • Transformers 4.30.2
  • Pytorch 2.0.1+cu118
  • Datasets 2.13.1
  • Tokenizers 0.13.3