---
license: other
base_model: nvidia/mit-b5
tags:
  - vision
  - image-segmentation
  - generated_from_trainer
model-index:
  - name: segformer_Clean_Set1_95images_mit-b5
    results: []
---

# segformer_Clean_Set1_95images_mit-b5

This model is a fine-tuned version of nvidia/mit-b5 on the Hasano20/Clean_Set1_95images dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the list):

- Loss: 0.0169
- Mean Iou: 0.6481
- Mean Accuracy: 0.9819
- Overall Accuracy: 0.9935
- Accuracy Background: nan
- Accuracy Melt: 0.9668
- Accuracy Substrate: 0.9970
- Iou Background: 0.0
- Iou Melt: 0.9507
- Iou Substrate: 0.9937
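
The card does not yet include usage instructions, so here is a minimal inference sketch using the Transformers SegFormer classes. The repository id (inferred from the uploader and model name) and the example image path are assumptions, not confirmed by this card.

```python
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation
from PIL import Image
import torch

# Repository id assumed from the model name and uploader; adjust if different.
repo_id = "Hasano20/segformer_Clean_Set1_95images_mit-b5"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)

image = Image.open("sample.png").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
```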

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
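
As a convenience, a minimal sketch of how the hyperparameters above map onto `TrainingArguments`. Only the values listed above come from this card; the output directory and the evaluation cadence (every 20 steps, inferred from the results table below) are assumptions, and the exact training script is not included here.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer_Clean_Set1_95images_mit-b5",  # hypothetical
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    adam_beta1=0.9,      # Adam betas/epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",  # inferred from the 20-step eval interval below
    eval_steps=20,
)
```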

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Melt | Accuracy Substrate | Iou Background | Iou Melt | Iou Substrate |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:------------------:|:--------------:|:--------:|:-------------:|
| 0.2276        | 1.1765  | 20   | 0.2657          | 0.3456   | 0.5675        | 0.8925           | nan                 | 0.1416        | 0.9935             | 0.0            | 0.1374   | 0.8994        |
| 0.3964        | 2.3529  | 40   | 0.1808          | 0.3540   | 0.5688        | 0.8852           | nan                 | 0.1542        | 0.9835             | 0.0            | 0.1476   | 0.9145        |
| 0.2669        | 3.5294  | 60   | 0.1312          | 0.3929   | 0.6246        | 0.9080           | nan                 | 0.2530        | 0.9961             | 0.0            | 0.2488   | 0.9298        |
| 0.0785        | 4.7059  | 80   | 0.1141          | 0.4822   | 0.7742        | 0.9255           | nan                 | 0.5758        | 0.9725             | 0.0            | 0.4933   | 0.9533        |
| 0.1552        | 5.8824  | 100  | 0.0904          | 0.5549   | 0.9259        | 0.9567           | nan                 | 0.8857        | 0.9662             | 0.0            | 0.7116   | 0.9532        |
| 0.1163        | 7.0588  | 120  | 0.0988          | 0.5169   | 0.8101        | 0.9463           | nan                 | 0.6316        | 0.9886             | 0.0            | 0.6060   | 0.9446        |
| 0.0738        | 8.2353  | 140  | 0.2555          | 0.3735   | 0.6075        | 0.9064           | nan                 | 0.2156        | 0.9993             | 0.0            | 0.2152   | 0.9053        |
| 0.07          | 9.4118  | 160  | 0.0706          | 0.5411   | 0.8335        | 0.9589           | nan                 | 0.6691        | 0.9979             | 0.0            | 0.6629   | 0.9605        |
| 0.0432        | 10.5882 | 180  | 0.0542          | 0.5821   | 0.8942        | 0.9708           | nan                 | 0.7937        | 0.9946             | 0.0            | 0.7743   | 0.9720        |
| 0.0833        | 11.7647 | 200  | 0.0554          | 0.5863   | 0.8937        | 0.9736           | nan                 | 0.7890        | 0.9984             | 0.0            | 0.7823   | 0.9765        |
| 0.0488        | 12.9412 | 220  | 0.0325          | 0.6218   | 0.9654        | 0.9824           | nan                 | 0.9431        | 0.9877             | 0.0            | 0.8817   | 0.9836        |
| 0.0401        | 14.1176 | 240  | 0.0409          | 0.6276   | 0.9531        | 0.9874           | nan                 | 0.9081        | 0.9981             | 0.0            | 0.8966   | 0.9863        |
| 0.0192        | 15.2941 | 260  | 0.0219          | 0.6383   | 0.9686        | 0.9902           | nan                 | 0.9402        | 0.9969             | 0.0            | 0.9242   | 0.9908        |
| 0.0639        | 16.4706 | 280  | 0.0500          | 0.5965   | 0.9125        | 0.9749           | nan                 | 0.8306        | 0.9943             | 0.0            | 0.8014   | 0.9882        |
| 0.0237        | 17.6471 | 300  | 0.0246          | 0.6300   | 0.9558        | 0.9864           | nan                 | 0.9156        | 0.9959             | 0.0            | 0.9005   | 0.9894        |
| 0.014         | 18.8235 | 320  | 0.0207          | 0.6441   | 0.9757        | 0.9921           | nan                 | 0.9543        | 0.9971             | 0.0            | 0.9404   | 0.9920        |
| 0.0362        | 20.0    | 340  | 0.0226          | 0.6348   | 0.9639        | 0.9888           | nan                 | 0.9312        | 0.9966             | 0.0            | 0.9157   | 0.9889        |
| 0.0195        | 21.1765 | 360  | 0.0203          | 0.6437   | 0.9754        | 0.9923           | nan                 | 0.9532        | 0.9976             | 0.0            | 0.9392   | 0.9919        |
| 0.0123        | 22.3529 | 380  | 0.0176          | 0.6415   | 0.9745        | 0.9910           | nan                 | 0.9529        | 0.9962             | 0.0            | 0.9317   | 0.9929        |
| 0.0103        | 23.5294 | 400  | 0.0212          | 0.6427   | 0.9781        | 0.9918           | nan                 | 0.9600        | 0.9961             | 0.0            | 0.9364   | 0.9916        |
| 0.0098        | 24.7059 | 420  | 0.0157          | 0.6467   | 0.9831        | 0.9929           | nan                 | 0.9702        | 0.9960             | 0.0            | 0.9465   | 0.9935        |
| 0.0074        | 25.8824 | 440  | 0.0168          | 0.6438   | 0.9730        | 0.9920           | nan                 | 0.9482        | 0.9979             | 0.0            | 0.9384   | 0.9930        |
| 0.0078        | 27.0588 | 460  | 0.0179          | 0.6441   | 0.9752        | 0.9922           | nan                 | 0.9530        | 0.9974             | 0.0            | 0.9396   | 0.9926        |
| 0.0084        | 28.2353 | 480  | 0.0188          | 0.6416   | 0.9808        | 0.9909           | nan                 | 0.9675        | 0.9941             | 0.0            | 0.9333   | 0.9916        |
| 0.0096        | 29.4118 | 500  | 0.0187          | 0.6449   | 0.9866        | 0.9924           | nan                 | 0.9790        | 0.9942             | 0.0            | 0.9422   | 0.9923        |
| 0.0059        | 30.5882 | 520  | 0.0209          | 0.6415   | 0.9718        | 0.9914           | nan                 | 0.9460        | 0.9975             | 0.0            | 0.9331   | 0.9915        |
| 0.0092        | 31.7647 | 540  | 0.0227          | 0.6383   | 0.9652        | 0.9903           | nan                 | 0.9323        | 0.9981             | 0.0            | 0.9239   | 0.9910        |
| 0.0107        | 32.9412 | 560  | 0.0177          | 0.6438   | 0.9747        | 0.9920           | nan                 | 0.9521        | 0.9973             | 0.0            | 0.9382   | 0.9931        |
| 0.0092        | 34.1176 | 580  | 0.0167          | 0.6463   | 0.9771        | 0.9929           | nan                 | 0.9563        | 0.9979             | 0.0            | 0.9455   | 0.9934        |
| 0.0076        | 35.2941 | 600  | 0.0160          | 0.6472   | 0.9791        | 0.9931           | nan                 | 0.9609        | 0.9974             | 0.0            | 0.9479   | 0.9937        |
| 0.0062        | 36.4706 | 620  | 0.0193          | 0.6423   | 0.9715        | 0.9917           | nan                 | 0.9450        | 0.9979             | 0.0            | 0.9350   | 0.9919        |
| 0.0063        | 37.6471 | 640  | 0.0160          | 0.6481   | 0.9824        | 0.9933           | nan                 | 0.9680        | 0.9967             | 0.0            | 0.9503   | 0.9939        |
| 0.0064        | 38.8235 | 660  | 0.0164          | 0.6489   | 0.9846        | 0.9935           | nan                 | 0.9730        | 0.9963             | 0.0            | 0.9530   | 0.9936        |
| 0.009         | 40.0    | 680  | 0.0167          | 0.6487   | 0.9829        | 0.9937           | nan                 | 0.9687        | 0.9971             | 0.0            | 0.9521   | 0.9938        |
| 0.0062        | 41.1765 | 700  | 0.0169          | 0.6478   | 0.9801        | 0.9934           | nan                 | 0.9626        | 0.9975             | 0.0            | 0.9497   | 0.9936        |
| 0.0047        | 42.3529 | 720  | 0.0170          | 0.6481   | 0.9814        | 0.9934           | nan                 | 0.9657        | 0.9972             | 0.0            | 0.9507   | 0.9935        |
| 0.0053        | 43.5294 | 740  | 0.0166          | 0.6490   | 0.9832        | 0.9939           | nan                 | 0.9693        | 0.9972             | 0.0            | 0.9529   | 0.9941        |
| 0.0076        | 44.7059 | 760  | 0.0165          | 0.6484   | 0.9828        | 0.9934           | nan                 | 0.9688        | 0.9968             | 0.0            | 0.9513   | 0.9938        |
| 0.0066        | 45.8824 | 780  | 0.0166          | 0.6488   | 0.9835        | 0.9937           | nan                 | 0.9702        | 0.9969             | 0.0            | 0.9523   | 0.9940        |
| 0.0048        | 47.0588 | 800  | 0.0169          | 0.6482   | 0.9824        | 0.9935           | nan                 | 0.9678        | 0.9969             | 0.0            | 0.9508   | 0.9937        |
| 0.0061        | 48.2353 | 820  | 0.0170          | 0.6481   | 0.9821        | 0.9934           | nan                 | 0.9674        | 0.9969             | 0.0            | 0.9506   | 0.9937        |
| 0.0087        | 49.4118 | 840  | 0.0169          | 0.6481   | 0.9819        | 0.9935           | nan                 | 0.9668        | 0.9970             | 0.0            | 0.9507   | 0.9937        |
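
For context on the metrics above: the Mean Iou of 0.6481 is the unweighted average of the three per-class IoUs (0.0 background, 0.9507 melt, 0.9937 substrate), so the zero background IoU pulls it well below the pixel accuracies. These numbers have the shape of the output of the Hugging Face `evaluate` library's `mean_iou` metric; the sketch below shows how such values are computed. The label count, label order, and `ignore_index` are assumptions, not stated in this card.

```python
import evaluate
import numpy as np

# Hypothetical predicted/ground-truth masks; in practice these come from the
# model's upsampled argmax output and the dataset's annotation maps.
pred_mask = np.array([[1, 1, 2], [2, 2, 2]])
true_mask = np.array([[1, 1, 2], [1, 2, 2]])

metric = evaluate.load("mean_iou")
results = metric.compute(
    predictions=[pred_mask],
    references=[true_mask],
    num_labels=3,      # background, melt, substrate (assumed label set)
    ignore_index=255,  # assumed ignore value; adjust to the dataset's convention
)
print(results["mean_iou"], results["per_category_iou"])
```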

### Framework versions

- Transformers 4.41.2
- Pytorch 2.0.1+cu117
- Datasets 2.19.2
- Tokenizers 0.19.1