
SegFormer_Clean_Set1_95images_mit-b5

This model is a fine-tuned version of nvidia/mit-b5 on the Hasano20/Clean_Set1_95images dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0390
  • Mean IoU: 0.9468
  • Mean Accuracy: 0.9733
  • Overall Accuracy: 0.9860
  • Accuracy Background: 0.9960
  • Accuracy Melt: 0.9390
  • Accuracy Substrate: 0.9850
  • IoU Background: 0.9899
  • IoU Melt: 0.8763
  • IoU Substrate: 0.9743
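
The checkpoint can be used for semantic segmentation inference with the Transformers library. The sketch below assumes the model is published under the repo id Hasano20/SegFormer_Clean_Set1_95images_mit-b5 (inferred from the card title) and that the three classes are background, melt, and substrate; adjust the repo id and image path to your setup.

```python
# Minimal inference sketch with the Transformers library. The repo id is
# inferred from the card title and may need adjusting; the image path is a
# placeholder.
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

checkpoint = "Hasano20/SegFormer_Clean_Set1_95images_mit-b5"  # assumed repo id
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("sample.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the per-pixel argmax to get a
# class-index mask (background / melt / substrate).
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]
```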

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
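
The training script itself is not part of this card, so the sketch below only illustrates how the hyperparameters listed above would typically map onto `transformers.TrainingArguments`; the output directory and the 10-step evaluation cadence (visible in the results table) are assumptions.

```python
# Illustrative mapping of the listed hyperparameters onto TrainingArguments.
# This is not the original training script; output_dir and the evaluation
# cadence are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="SegFormer_Clean_Set1_95images_mit-b5",  # hypothetical
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    num_train_epochs=20,
    lr_scheduler_type="linear",
    adam_beta1=0.9,          # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",   # the results table reports evaluation every 10 steps
    eval_steps=10,
    logging_steps=10,
)
```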

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean IoU | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Melt | Accuracy Substrate | IoU Background | IoU Melt | IoU Substrate |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:------------------:|:--------------:|:--------:|:-------------:|
| 0.3883 | 0.5882 | 10 | 0.7088 | 0.5294 | 0.6161 | 0.8428 | 0.8644 | 0.0 | 0.9840 | 0.8428 | 0.0 | 0.7456 |
| 0.6271 | 1.1765 | 20 | 0.4185 | 0.5763 | 0.6455 | 0.8828 | 0.9472 | 0.0011 | 0.9882 | 0.9297 | 0.0011 | 0.7980 |
| 0.1779 | 1.7647 | 30 | 0.2746 | 0.6105 | 0.6712 | 0.9000 | 0.9943 | 0.0499 | 0.9694 | 0.9534 | 0.0474 | 0.8307 |
| 0.228 | 2.3529 | 40 | 0.2865 | 0.6102 | 0.6723 | 0.8897 | 0.9635 | 0.0820 | 0.9716 | 0.9359 | 0.0692 | 0.8254 |
| 0.1099 | 2.9412 | 50 | 0.2432 | 0.6646 | 0.7305 | 0.9018 | 0.9879 | 0.2657 | 0.9380 | 0.9495 | 0.2073 | 0.8369 |
| 0.1448 | 3.5294 | 60 | 0.3321 | 0.5993 | 0.6606 | 0.8987 | 0.9744 | 0.0140 | 0.9934 | 0.9613 | 0.0139 | 0.8226 |
| 0.2412 | 4.1176 | 70 | 0.2053 | 0.6581 | 0.7115 | 0.9150 | 0.9906 | 0.1590 | 0.9850 | 0.9734 | 0.1485 | 0.8525 |
| 0.1585 | 4.7059 | 80 | 0.2824 | 0.7094 | 0.8614 | 0.8838 | 0.9775 | 0.8013 | 0.8055 | 0.9504 | 0.3927 | 0.7851 |
| 0.2025 | 5.2941 | 90 | 0.2405 | 0.7011 | 0.8139 | 0.8924 | 0.9982 | 0.6013 | 0.8423 | 0.9387 | 0.3501 | 0.8144 |
| 0.2516 | 5.8824 | 100 | 0.2134 | 0.7488 | 0.8852 | 0.9083 | 0.9937 | 0.8227 | 0.8391 | 0.9721 | 0.4533 | 0.8212 |
| 0.275 | 6.4706 | 110 | 0.2856 | 0.7243 | 0.8793 | 0.8910 | 0.9965 | 0.8484 | 0.7932 | 0.9543 | 0.4339 | 0.7848 |
| 0.0721 | 7.0588 | 120 | 0.1417 | 0.7758 | 0.8225 | 0.9428 | 0.9913 | 0.4956 | 0.9804 | 0.9789 | 0.4530 | 0.8955 |
| 0.1478 | 7.6471 | 130 | 0.1383 | 0.7811 | 0.8383 | 0.9412 | 0.9828 | 0.5588 | 0.9733 | 0.9715 | 0.4727 | 0.8992 |
| 0.0541 | 8.2353 | 140 | 0.1654 | 0.7353 | 0.7778 | 0.9368 | 0.9958 | 0.3461 | 0.9915 | 0.9805 | 0.3400 | 0.8854 |
| 0.1068 | 8.8235 | 150 | 0.1001 | 0.8481 | 0.8900 | 0.9607 | 0.9977 | 0.6982 | 0.9742 | 0.9813 | 0.6358 | 0.9272 |
| 0.0879 | 9.4118 | 160 | 0.1177 | 0.8272 | 0.8658 | 0.9568 | 0.9914 | 0.6186 | 0.9875 | 0.9798 | 0.5785 | 0.9232 |
| 0.0855 | 10.0 | 170 | 0.0929 | 0.8763 | 0.9444 | 0.9650 | 0.9910 | 0.8886 | 0.9537 | 0.9848 | 0.7113 | 0.9327 |
| 0.102 | 10.5882 | 180 | 0.0770 | 0.8935 | 0.9405 | 0.9715 | 0.9962 | 0.8565 | 0.9689 | 0.9851 | 0.7486 | 0.9468 |
| 0.1044 | 11.1765 | 190 | 0.1401 | 0.7868 | 0.8367 | 0.9441 | 0.9696 | 0.5446 | 0.9957 | 0.9672 | 0.4853 | 0.9080 |
| 0.0705 | 11.7647 | 200 | 0.0822 | 0.8836 | 0.9507 | 0.9674 | 0.9924 | 0.9057 | 0.9542 | 0.9853 | 0.7276 | 0.9380 |
| 0.0583 | 12.3529 | 210 | 0.0670 | 0.9102 | 0.9489 | 0.9757 | 0.9957 | 0.8760 | 0.9750 | 0.9841 | 0.7914 | 0.9550 |
| 0.0337 | 12.9412 | 220 | 0.0718 | 0.9048 | 0.9384 | 0.9751 | 0.9960 | 0.8389 | 0.9803 | 0.9858 | 0.7756 | 0.9530 |
| 0.0237 | 13.5294 | 230 | 0.0634 | 0.9106 | 0.9419 | 0.9769 | 0.9957 | 0.8467 | 0.9832 | 0.9878 | 0.7879 | 0.9562 |
| 0.2478 | 14.1176 | 240 | 0.0724 | 0.8949 | 0.9289 | 0.9726 | 0.9958 | 0.8103 | 0.9806 | 0.9855 | 0.7514 | 0.9478 |
| 0.0237 | 14.7059 | 250 | 0.0570 | 0.9230 | 0.9610 | 0.9790 | 0.9950 | 0.9124 | 0.9757 | 0.9861 | 0.8226 | 0.9604 |
| 0.0237 | 15.2941 | 260 | 0.0564 | 0.9251 | 0.9650 | 0.9798 | 0.9957 | 0.9248 | 0.9745 | 0.9887 | 0.8253 | 0.9612 |
| 0.0414 | 15.8824 | 270 | 0.0786 | 0.8738 | 0.8997 | 0.9693 | 0.9926 | 0.7107 | 0.9959 | 0.9893 | 0.6917 | 0.9405 |
| 0.0444 | 16.4706 | 280 | 0.0431 | 0.9383 | 0.9686 | 0.9840 | 0.9962 | 0.9269 | 0.9828 | 0.9908 | 0.8539 | 0.9702 |
| 0.0307 | 17.0588 | 290 | 0.0416 | 0.9438 | 0.9719 | 0.9855 | 0.9942 | 0.9350 | 0.9864 | 0.9900 | 0.8675 | 0.9741 |
| 0.0335 | 17.6471 | 300 | 0.0420 | 0.9402 | 0.9635 | 0.9846 | 0.9943 | 0.9062 | 0.9900 | 0.9900 | 0.8589 | 0.9716 |
| 0.0717 | 18.2353 | 310 | 0.0448 | 0.9375 | 0.9651 | 0.9837 | 0.9971 | 0.9144 | 0.9837 | 0.9891 | 0.8533 | 0.9702 |
| 0.0225 | 18.8235 | 320 | 0.0403 | 0.9405 | 0.9635 | 0.9847 | 0.9947 | 0.9058 | 0.9899 | 0.9904 | 0.8595 | 0.9716 |
| 0.0315 | 19.4118 | 330 | 0.0394 | 0.9444 | 0.9686 | 0.9855 | 0.9956 | 0.9230 | 0.9873 | 0.9901 | 0.8698 | 0.9732 |
| 0.0178 | 20.0 | 340 | 0.0390 | 0.9468 | 0.9733 | 0.9860 | 0.9960 | 0.9390 | 0.9850 | 0.9899 | 0.8763 | 0.9743 |
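
The card does not state which implementation produced the IoU and accuracy figures above; a common choice for SegFormer fine-tuning is the `mean_iou` metric from the `evaluate` library, sketched below under the assumption of three labels (background, melt, substrate) and 255 as the ignore index.

```python
# Sketch of computing mean IoU / accuracy metrics with evaluate's "mean_iou".
# The label order and ignore_index are assumptions, not taken from the card.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# pred_mask / gt_mask: (H, W) integer arrays of class ids
# (0 = background, 1 = melt, 2 = substrate in this sketch).
pred_mask = np.zeros((512, 512), dtype=np.int64)  # placeholder prediction
gt_mask = np.zeros((512, 512), dtype=np.int64)    # placeholder ground truth

results = metric.compute(
    predictions=[pred_mask],
    references=[gt_mask],
    num_labels=3,
    ignore_index=255,     # commonly used for unlabeled pixels
    reduce_labels=False,
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"], results["per_category_accuracy"])
```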

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.0.1+cu117
  • Datasets 2.19.2
  • Tokenizers 0.19.1