CrackSeg-MIT-b0-aug

This model is a fine-tuned version of nvidia/mit-b0 on an unspecified crack-segmentation dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0578
  • Mean Iou: 0.3169
  • Mean Accuracy: 0.6337
  • Overall Accuracy: 0.6337
  • Accuracy Background: nan
  • Accuracy Crack: 0.6337
  • Iou Background: 0.0
  • Iou Crack: 0.6337

Model description

More information needed

Intended uses & limitations

More information needed
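
Pending more detail from the authors, the sketch below shows one plausible way to run the checkpoint for binary crack segmentation with the transformers API. The repo id `CrackSeg-MIT-b0-aug` (without a namespace) and the image path are placeholders, and the 0 = background / 1 = crack label order is an assumption inferred from the metrics above.

```python
from PIL import Image
import torch
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

model_id = "CrackSeg-MIT-b0-aug"  # placeholder: substitute the full hub path of this checkpoint

processor = SegformerImageProcessor.from_pretrained(model_id)
model = SegformerForSemanticSegmentation.from_pretrained(model_id)
model.eval()

image = Image.open("pavement.jpg").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 of the input resolution; upsample the logits
# back to the image size before taking the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
mask = upsampled.argmax(dim=1)[0]  # assumed label order: 0 = background, 1 = crack
```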

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 2
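
For reproducibility, these values map onto transformers TrainingArguments roughly as sketched below. The output_dir and the eval/logging cadence are assumptions (the 100-step eval interval is inferred from the results table that follows); Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer's default optimizer settings, so no override is shown.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="CrackSeg-MIT-b0-aug",  # assumed output directory
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2,
    evaluation_strategy="steps",       # assumed: the table reports eval every 100 steps
    eval_steps=100,
    logging_steps=100,
    # Adam betas/epsilon listed above are the Trainer defaults.
)
```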

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Crack | Iou Background | Iou Crack |
|---------------|-------|------|-----------------|----------|---------------|------------------|---------------------|----------------|----------------|-----------|
| 0.2102 | 0.04 | 100 | 0.1362 | 0.1116 | 0.2232 | 0.2232 | nan | 0.2232 | 0.0 | 0.2232 |
| 0.065 | 0.08 | 200 | 0.1125 | 0.0153 | 0.0305 | 0.0305 | nan | 0.0305 | 0.0 | 0.0305 |
| 0.1738 | 0.12 | 300 | 0.1165 | 0.1976 | 0.3953 | 0.3953 | nan | 0.3953 | 0.0 | 0.3953 |
| 0.0476 | 0.17 | 400 | 0.1979 | 0.0120 | 0.0241 | 0.0241 | nan | 0.0241 | 0.0 | 0.0241 |
| 0.0524 | 0.21 | 500 | 0.1063 | 0.0533 | 0.1066 | 0.1066 | nan | 0.1066 | 0.0 | 0.1066 |
| 0.0496 | 0.25 | 600 | 0.1154 | 0.1646 | 0.3292 | 0.3292 | nan | 0.3292 | 0.0 | 0.3292 |
| 0.0497 | 0.29 | 700 | 0.0795 | 0.3184 | 0.6368 | 0.6368 | nan | 0.6368 | 0.0 | 0.6368 |
| 0.032 | 0.33 | 800 | 0.0905 | 0.1792 | 0.3583 | 0.3583 | nan | 0.3583 | 0.0 | 0.3583 |
| 0.1207 | 0.37 | 900 | 0.0738 | 0.2401 | 0.4802 | 0.4802 | nan | 0.4802 | 0.0 | 0.4802 |
| 0.0511 | 0.41 | 1000 | 0.0883 | 0.2591 | 0.5182 | 0.5182 | nan | 0.5182 | 0.0 | 0.5182 |
| 0.0264 | 0.46 | 1100 | 0.0815 | 0.1655 | 0.3309 | 0.3309 | nan | 0.3309 | 0.0 | 0.3309 |
| 0.0719 | 0.5 | 1200 | 0.0772 | 0.3040 | 0.6080 | 0.6080 | nan | 0.6080 | 0.0 | 0.6080 |
| 0.042 | 0.54 | 1300 | 0.0707 | 0.2797 | 0.5593 | 0.5593 | nan | 0.5593 | 0.0 | 0.5593 |
| 0.167 | 0.58 | 1400 | 0.0685 | 0.3609 | 0.7218 | 0.7218 | nan | 0.7218 | 0.0 | 0.7218 |
| 0.0206 | 0.62 | 1500 | 0.0655 | 0.2469 | 0.4937 | 0.4937 | nan | 0.4937 | 0.0 | 0.4937 |
| 0.0211 | 0.66 | 1600 | 0.0937 | 0.3334 | 0.6668 | 0.6668 | nan | 0.6668 | 0.0 | 0.6668 |
| 0.0659 | 0.7 | 1700 | 0.0750 | 0.2382 | 0.4764 | 0.4764 | nan | 0.4764 | 0.0 | 0.4764 |
| 0.0478 | 0.75 | 1800 | 0.0693 | 0.2944 | 0.5888 | 0.5888 | nan | 0.5888 | 0.0 | 0.5888 |
| 0.0287 | 0.79 | 1900 | 0.0710 | 0.2395 | 0.4790 | 0.4790 | nan | 0.4790 | 0.0 | 0.4790 |
| 0.0359 | 0.83 | 2000 | 0.0580 | 0.3385 | 0.6771 | 0.6771 | nan | 0.6771 | 0.0 | 0.6771 |
| 0.0309 | 0.87 | 2100 | 0.0744 | 0.2153 | 0.4305 | 0.4305 | nan | 0.4305 | 0.0 | 0.4305 |
| 0.0039 | 0.91 | 2200 | 0.0636 | 0.2974 | 0.5947 | 0.5947 | nan | 0.5947 | 0.0 | 0.5947 |
| 0.0152 | 0.95 | 2300 | 0.0635 | 0.3215 | 0.6430 | 0.6430 | nan | 0.6430 | 0.0 | 0.6430 |
| 0.0233 | 0.99 | 2400 | 0.0668 | 0.3039 | 0.6077 | 0.6077 | nan | 0.6077 | 0.0 | 0.6077 |
| 0.0088 | 1.04 | 2500 | 0.0673 | 0.3352 | 0.6704 | 0.6704 | nan | 0.6704 | 0.0 | 0.6704 |
| 0.0756 | 1.08 | 2600 | 0.0599 | 0.3310 | 0.6621 | 0.6621 | nan | 0.6621 | 0.0 | 0.6621 |
| 0.0522 | 1.12 | 2700 | 0.0674 | 0.2943 | 0.5885 | 0.5885 | nan | 0.5885 | 0.0 | 0.5885 |
| 0.0595 | 1.16 | 2800 | 0.0828 | 0.2382 | 0.4763 | 0.4763 | nan | 0.4763 | 0.0 | 0.4763 |
| 0.0135 | 1.2 | 2900 | 0.0574 | 0.2901 | 0.5802 | 0.5802 | nan | 0.5802 | 0.0 | 0.5802 |
| 0.0289 | 1.24 | 3000 | 0.0700 | 0.3186 | 0.6372 | 0.6372 | nan | 0.6372 | 0.0 | 0.6372 |
| 0.0403 | 1.28 | 3100 | 0.0761 | 0.3741 | 0.7483 | 0.7483 | nan | 0.7483 | 0.0 | 0.7483 |
| 0.0131 | 1.33 | 3200 | 0.0600 | 0.3285 | 0.6570 | 0.6570 | nan | 0.6570 | 0.0 | 0.6570 |
| 0.0957 | 1.37 | 3300 | 0.0633 | 0.3400 | 0.6801 | 0.6801 | nan | 0.6801 | 0.0 | 0.6801 |
| 0.0152 | 1.41 | 3400 | 0.0678 | 0.3479 | 0.6958 | 0.6958 | nan | 0.6958 | 0.0 | 0.6958 |
| 0.0235 | 1.45 | 3500 | 0.0636 | 0.3416 | 0.6832 | 0.6832 | nan | 0.6832 | 0.0 | 0.6832 |
| 0.0304 | 1.49 | 3600 | 0.0596 | 0.3606 | 0.7211 | 0.7211 | nan | 0.7211 | 0.0 | 0.7211 |
| 0.0012 | 1.53 | 3700 | 0.0605 | 0.2992 | 0.5983 | 0.5983 | nan | 0.5983 | 0.0 | 0.5983 |
| 0.0435 | 1.57 | 3800 | 0.0563 | 0.3283 | 0.6566 | 0.6566 | nan | 0.6566 | 0.0 | 0.6566 |
| 0.05 | 1.61 | 3900 | 0.0601 | 0.3314 | 0.6628 | 0.6628 | nan | 0.6628 | 0.0 | 0.6628 |
| 0.063 | 1.66 | 4000 | 0.0617 | 0.3307 | 0.6614 | 0.6614 | nan | 0.6614 | 0.0 | 0.6614 |
| 0.0552 | 1.7 | 4100 | 0.0626 | 0.3580 | 0.7161 | 0.7161 | nan | 0.7161 | 0.0 | 0.7161 |
| 0.0153 | 1.74 | 4200 | 0.0622 | 0.2864 | 0.5728 | 0.5728 | nan | 0.5728 | 0.0 | 0.5728 |
| 0.0446 | 1.78 | 4300 | 0.0612 | 0.3224 | 0.6448 | 0.6448 | nan | 0.6448 | 0.0 | 0.6448 |
| 0.0203 | 1.82 | 4400 | 0.0589 | 0.3167 | 0.6334 | 0.6334 | nan | 0.6334 | 0.0 | 0.6334 |
| 0.0424 | 1.86 | 4500 | 0.0567 | 0.3443 | 0.6887 | 0.6887 | nan | 0.6887 | 0.0 | 0.6887 |
| 0.0103 | 1.9 | 4600 | 0.0591 | 0.3282 | 0.6563 | 0.6563 | nan | 0.6563 | 0.0 | 0.6563 |
| 0.0831 | 1.95 | 4700 | 0.0573 | 0.3224 | 0.6447 | 0.6447 | nan | 0.6447 | 0.0 | 0.6447 |
| 0.1301 | 1.99 | 4800 | 0.0578 | 0.3169 | 0.6337 | 0.6337 | nan | 0.6337 | 0.0 | 0.6337 |
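
The nan background accuracy alongside a 0.0 background IoU suggests that background pixels were excluded from the per-class accuracy computation, possibly via a label reduction that maps the background class to the ignore index. Under that reading, Mean Accuracy, Overall Accuracy, and Accuracy Crack collapse to the same number, and Mean IoU is exactly half of Iou Crack (0.6337 / 2 ≈ 0.3169). Below is a minimal sketch of computing these metrics with the evaluate library's mean_iou metric, assuming two labels and 255 as the ignore index; the masks are placeholders.

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Placeholder arrays; in practice these are the model's argmax masks and the
# ground-truth annotation masks at the same resolution.
pred_mask = np.zeros((512, 512), dtype=np.int64)
label_mask = np.zeros((512, 512), dtype=np.int64)

results = metric.compute(
    predictions=[pred_mask],
    references=[label_mask],
    num_labels=2,       # background, crack
    ignore_index=255,   # assumed ignore label for unannotated pixels
    reduce_labels=False,
)
print(results["mean_iou"], results["per_category_iou"], results["per_category_accuracy"])
```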

Framework versions

  • Transformers 4.32.0
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.4
  • Tokenizers 0.13.3