Hasano20 committed
Commit debd4f5
1 Parent(s): 711f809

End of training

Files changed (3):
  1. README.md +48 -56
  2. model.safetensors +1 -1
  3. training_args.bin +1 -1
README.md CHANGED
@@ -1,32 +1,32 @@
  ---
  license: other
+ base_model: nvidia/mit-b5
  tags:
  - vision
  - image-segmentation
  - generated_from_trainer
- base_model: nvidia/mit-b5
  model-index:
- - name: segformer_Clean_Set1_95images_mit-b5
+ - name: SegFormer_Clean_Set1_95images_mit-b5
  results: []
  ---

  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment. -->

- # segformer_Clean_Set1_95images_mit-b5
+ # SegFormer_Clean_Set1_95images_mit-b5

  This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the Hasano20/Clean_Set1_95images dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.0169
- - Mean Iou: 0.6481
- - Mean Accuracy: 0.9819
- - Overall Accuracy: 0.9935
- - Accuracy Background: nan
- - Accuracy Melt: 0.9668
- - Accuracy Substrate: 0.9970
- - Iou Background: 0.0
- - Iou Melt: 0.9507
- - Iou Substrate: 0.9937
+ - Loss: 0.0390
+ - Mean Iou: 0.9468
+ - Mean Accuracy: 0.9733
+ - Overall Accuracy: 0.9860
+ - Accuracy Background: 0.9960
+ - Accuracy Melt: 0.9390
+ - Accuracy Substrate: 0.9850
+ - Iou Background: 0.9899
+ - Iou Melt: 0.8763
+ - Iou Substrate: 0.9743

  ## Model description

@@ -51,54 +51,46 @@ The following hyperparameters were used during training:
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 50
+ - num_epochs: 20

  ### Training results

  | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Melt | Accuracy Substrate | Iou Background | Iou Melt | Iou Substrate |
  |:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:-------------:|:------------------:|:--------------:|:--------:|:-------------:|
- | 0.2276 | 1.1765 | 20 | 0.2657 | 0.3456 | 0.5675 | 0.8925 | nan | 0.1416 | 0.9935 | 0.0 | 0.1374 | 0.8994 |
- | 0.3964 | 2.3529 | 40 | 0.1808 | 0.3540 | 0.5688 | 0.8852 | nan | 0.1542 | 0.9835 | 0.0 | 0.1476 | 0.9145 |
- | 0.2669 | 3.5294 | 60 | 0.1312 | 0.3929 | 0.6246 | 0.9080 | nan | 0.2530 | 0.9961 | 0.0 | 0.2488 | 0.9298 |
- | 0.0785 | 4.7059 | 80 | 0.1141 | 0.4822 | 0.7742 | 0.9255 | nan | 0.5758 | 0.9725 | 0.0 | 0.4933 | 0.9533 |
- | 0.1552 | 5.8824 | 100 | 0.0904 | 0.5549 | 0.9259 | 0.9567 | nan | 0.8857 | 0.9662 | 0.0 | 0.7116 | 0.9532 |
- | 0.1163 | 7.0588 | 120 | 0.0988 | 0.5169 | 0.8101 | 0.9463 | nan | 0.6316 | 0.9886 | 0.0 | 0.6060 | 0.9446 |
- | 0.0738 | 8.2353 | 140 | 0.2555 | 0.3735 | 0.6075 | 0.9064 | nan | 0.2156 | 0.9993 | 0.0 | 0.2152 | 0.9053 |
- | 0.07 | 9.4118 | 160 | 0.0706 | 0.5411 | 0.8335 | 0.9589 | nan | 0.6691 | 0.9979 | 0.0 | 0.6629 | 0.9605 |
- | 0.0432 | 10.5882 | 180 | 0.0542 | 0.5821 | 0.8942 | 0.9708 | nan | 0.7937 | 0.9946 | 0.0 | 0.7743 | 0.9720 |
- | 0.0833 | 11.7647 | 200 | 0.0554 | 0.5863 | 0.8937 | 0.9736 | nan | 0.7890 | 0.9984 | 0.0 | 0.7823 | 0.9765 |
- | 0.0488 | 12.9412 | 220 | 0.0325 | 0.6218 | 0.9654 | 0.9824 | nan | 0.9431 | 0.9877 | 0.0 | 0.8817 | 0.9836 |
- | 0.0401 | 14.1176 | 240 | 0.0409 | 0.6276 | 0.9531 | 0.9874 | nan | 0.9081 | 0.9981 | 0.0 | 0.8966 | 0.9863 |
- | 0.0192 | 15.2941 | 260 | 0.0219 | 0.6383 | 0.9686 | 0.9902 | nan | 0.9402 | 0.9969 | 0.0 | 0.9242 | 0.9908 |
- | 0.0639 | 16.4706 | 280 | 0.0500 | 0.5965 | 0.9125 | 0.9749 | nan | 0.8306 | 0.9943 | 0.0 | 0.8014 | 0.9882 |
- | 0.0237 | 17.6471 | 300 | 0.0246 | 0.6300 | 0.9558 | 0.9864 | nan | 0.9156 | 0.9959 | 0.0 | 0.9005 | 0.9894 |
- | 0.014 | 18.8235 | 320 | 0.0207 | 0.6441 | 0.9757 | 0.9921 | nan | 0.9543 | 0.9971 | 0.0 | 0.9404 | 0.9920 |
- | 0.0362 | 20.0 | 340 | 0.0226 | 0.6348 | 0.9639 | 0.9888 | nan | 0.9312 | 0.9966 | 0.0 | 0.9157 | 0.9889 |
- | 0.0195 | 21.1765 | 360 | 0.0203 | 0.6437 | 0.9754 | 0.9923 | nan | 0.9532 | 0.9976 | 0.0 | 0.9392 | 0.9919 |
- | 0.0123 | 22.3529 | 380 | 0.0176 | 0.6415 | 0.9745 | 0.9910 | nan | 0.9529 | 0.9962 | 0.0 | 0.9317 | 0.9929 |
- | 0.0103 | 23.5294 | 400 | 0.0212 | 0.6427 | 0.9781 | 0.9918 | nan | 0.9600 | 0.9961 | 0.0 | 0.9364 | 0.9916 |
- | 0.0098 | 24.7059 | 420 | 0.0157 | 0.6467 | 0.9831 | 0.9929 | nan | 0.9702 | 0.9960 | 0.0 | 0.9465 | 0.9935 |
- | 0.0074 | 25.8824 | 440 | 0.0168 | 0.6438 | 0.9730 | 0.9920 | nan | 0.9482 | 0.9979 | 0.0 | 0.9384 | 0.9930 |
- | 0.0078 | 27.0588 | 460 | 0.0179 | 0.6441 | 0.9752 | 0.9922 | nan | 0.9530 | 0.9974 | 0.0 | 0.9396 | 0.9926 |
- | 0.0084 | 28.2353 | 480 | 0.0188 | 0.6416 | 0.9808 | 0.9909 | nan | 0.9675 | 0.9941 | 0.0 | 0.9333 | 0.9916 |
- | 0.0096 | 29.4118 | 500 | 0.0187 | 0.6449 | 0.9866 | 0.9924 | nan | 0.9790 | 0.9942 | 0.0 | 0.9422 | 0.9923 |
- | 0.0059 | 30.5882 | 520 | 0.0209 | 0.6415 | 0.9718 | 0.9914 | nan | 0.9460 | 0.9975 | 0.0 | 0.9331 | 0.9915 |
- | 0.0092 | 31.7647 | 540 | 0.0227 | 0.6383 | 0.9652 | 0.9903 | nan | 0.9323 | 0.9981 | 0.0 | 0.9239 | 0.9910 |
- | 0.0107 | 32.9412 | 560 | 0.0177 | 0.6438 | 0.9747 | 0.9920 | nan | 0.9521 | 0.9973 | 0.0 | 0.9382 | 0.9931 |
- | 0.0092 | 34.1176 | 580 | 0.0167 | 0.6463 | 0.9771 | 0.9929 | nan | 0.9563 | 0.9979 | 0.0 | 0.9455 | 0.9934 |
- | 0.0076 | 35.2941 | 600 | 0.0160 | 0.6472 | 0.9791 | 0.9931 | nan | 0.9609 | 0.9974 | 0.0 | 0.9479 | 0.9937 |
- | 0.0062 | 36.4706 | 620 | 0.0193 | 0.6423 | 0.9715 | 0.9917 | nan | 0.9450 | 0.9979 | 0.0 | 0.9350 | 0.9919 |
- | 0.0063 | 37.6471 | 640 | 0.0160 | 0.6481 | 0.9824 | 0.9933 | nan | 0.9680 | 0.9967 | 0.0 | 0.9503 | 0.9939 |
- | 0.0064 | 38.8235 | 660 | 0.0164 | 0.6489 | 0.9846 | 0.9935 | nan | 0.9730 | 0.9963 | 0.0 | 0.9530 | 0.9936 |
- | 0.009 | 40.0 | 680 | 0.0167 | 0.6487 | 0.9829 | 0.9937 | nan | 0.9687 | 0.9971 | 0.0 | 0.9521 | 0.9938 |
- | 0.0062 | 41.1765 | 700 | 0.0169 | 0.6478 | 0.9801 | 0.9934 | nan | 0.9626 | 0.9975 | 0.0 | 0.9497 | 0.9936 |
- | 0.0047 | 42.3529 | 720 | 0.0170 | 0.6481 | 0.9814 | 0.9934 | nan | 0.9657 | 0.9972 | 0.0 | 0.9507 | 0.9935 |
- | 0.0053 | 43.5294 | 740 | 0.0166 | 0.6490 | 0.9832 | 0.9939 | nan | 0.9693 | 0.9972 | 0.0 | 0.9529 | 0.9941 |
- | 0.0076 | 44.7059 | 760 | 0.0165 | 0.6484 | 0.9828 | 0.9934 | nan | 0.9688 | 0.9968 | 0.0 | 0.9513 | 0.9938 |
- | 0.0066 | 45.8824 | 780 | 0.0166 | 0.6488 | 0.9835 | 0.9937 | nan | 0.9702 | 0.9969 | 0.0 | 0.9523 | 0.9940 |
- | 0.0048 | 47.0588 | 800 | 0.0169 | 0.6482 | 0.9824 | 0.9935 | nan | 0.9678 | 0.9969 | 0.0 | 0.9508 | 0.9937 |
- | 0.0061 | 48.2353 | 820 | 0.0170 | 0.6481 | 0.9821 | 0.9934 | nan | 0.9674 | 0.9969 | 0.0 | 0.9506 | 0.9937 |
- | 0.0087 | 49.4118 | 840 | 0.0169 | 0.6481 | 0.9819 | 0.9935 | nan | 0.9668 | 0.9970 | 0.0 | 0.9507 | 0.9937 |
+ | 0.3883 | 0.5882 | 10 | 0.7088 | 0.5294 | 0.6161 | 0.8428 | 0.8644 | 0.0 | 0.9840 | 0.8428 | 0.0 | 0.7456 |
+ | 0.6271 | 1.1765 | 20 | 0.4185 | 0.5763 | 0.6455 | 0.8828 | 0.9472 | 0.0011 | 0.9882 | 0.9297 | 0.0011 | 0.7980 |
+ | 0.1779 | 1.7647 | 30 | 0.2746 | 0.6105 | 0.6712 | 0.9000 | 0.9943 | 0.0499 | 0.9694 | 0.9534 | 0.0474 | 0.8307 |
+ | 0.228 | 2.3529 | 40 | 0.2865 | 0.6102 | 0.6723 | 0.8897 | 0.9635 | 0.0820 | 0.9716 | 0.9359 | 0.0692 | 0.8254 |
+ | 0.1099 | 2.9412 | 50 | 0.2432 | 0.6646 | 0.7305 | 0.9018 | 0.9879 | 0.2657 | 0.9380 | 0.9495 | 0.2073 | 0.8369 |
+ | 0.1448 | 3.5294 | 60 | 0.3321 | 0.5993 | 0.6606 | 0.8987 | 0.9744 | 0.0140 | 0.9934 | 0.9613 | 0.0139 | 0.8226 |
+ | 0.2412 | 4.1176 | 70 | 0.2053 | 0.6581 | 0.7115 | 0.9150 | 0.9906 | 0.1590 | 0.9850 | 0.9734 | 0.1485 | 0.8525 |
+ | 0.1585 | 4.7059 | 80 | 0.2824 | 0.7094 | 0.8614 | 0.8838 | 0.9775 | 0.8013 | 0.8055 | 0.9504 | 0.3927 | 0.7851 |
+ | 0.2025 | 5.2941 | 90 | 0.2405 | 0.7011 | 0.8139 | 0.8924 | 0.9982 | 0.6013 | 0.8423 | 0.9387 | 0.3501 | 0.8144 |
+ | 0.2516 | 5.8824 | 100 | 0.2134 | 0.7488 | 0.8852 | 0.9083 | 0.9937 | 0.8227 | 0.8391 | 0.9721 | 0.4533 | 0.8212 |
+ | 0.275 | 6.4706 | 110 | 0.2856 | 0.7243 | 0.8793 | 0.8910 | 0.9965 | 0.8484 | 0.7932 | 0.9543 | 0.4339 | 0.7848 |
+ | 0.0721 | 7.0588 | 120 | 0.1417 | 0.7758 | 0.8225 | 0.9428 | 0.9913 | 0.4956 | 0.9804 | 0.9789 | 0.4530 | 0.8955 |
+ | 0.1478 | 7.6471 | 130 | 0.1383 | 0.7811 | 0.8383 | 0.9412 | 0.9828 | 0.5588 | 0.9733 | 0.9715 | 0.4727 | 0.8992 |
+ | 0.0541 | 8.2353 | 140 | 0.1654 | 0.7353 | 0.7778 | 0.9368 | 0.9958 | 0.3461 | 0.9915 | 0.9805 | 0.3400 | 0.8854 |
+ | 0.1068 | 8.8235 | 150 | 0.1001 | 0.8481 | 0.8900 | 0.9607 | 0.9977 | 0.6982 | 0.9742 | 0.9813 | 0.6358 | 0.9272 |
+ | 0.0879 | 9.4118 | 160 | 0.1177 | 0.8272 | 0.8658 | 0.9568 | 0.9914 | 0.6186 | 0.9875 | 0.9798 | 0.5785 | 0.9232 |
+ | 0.0855 | 10.0 | 170 | 0.0929 | 0.8763 | 0.9444 | 0.9650 | 0.9910 | 0.8886 | 0.9537 | 0.9848 | 0.7113 | 0.9327 |
+ | 0.102 | 10.5882 | 180 | 0.0770 | 0.8935 | 0.9405 | 0.9715 | 0.9962 | 0.8565 | 0.9689 | 0.9851 | 0.7486 | 0.9468 |
+ | 0.1044 | 11.1765 | 190 | 0.1401 | 0.7868 | 0.8367 | 0.9441 | 0.9696 | 0.5446 | 0.9957 | 0.9672 | 0.4853 | 0.9080 |
+ | 0.0705 | 11.7647 | 200 | 0.0822 | 0.8836 | 0.9507 | 0.9674 | 0.9924 | 0.9057 | 0.9542 | 0.9853 | 0.7276 | 0.9380 |
+ | 0.0583 | 12.3529 | 210 | 0.0670 | 0.9102 | 0.9489 | 0.9757 | 0.9957 | 0.8760 | 0.9750 | 0.9841 | 0.7914 | 0.9550 |
+ | 0.0337 | 12.9412 | 220 | 0.0718 | 0.9048 | 0.9384 | 0.9751 | 0.9960 | 0.8389 | 0.9803 | 0.9858 | 0.7756 | 0.9530 |
+ | 0.0237 | 13.5294 | 230 | 0.0634 | 0.9106 | 0.9419 | 0.9769 | 0.9957 | 0.8467 | 0.9832 | 0.9878 | 0.7879 | 0.9562 |
+ | 0.2478 | 14.1176 | 240 | 0.0724 | 0.8949 | 0.9289 | 0.9726 | 0.9958 | 0.8103 | 0.9806 | 0.9855 | 0.7514 | 0.9478 |
+ | 0.0237 | 14.7059 | 250 | 0.0570 | 0.9230 | 0.9610 | 0.9790 | 0.9950 | 0.9124 | 0.9757 | 0.9861 | 0.8226 | 0.9604 |
+ | 0.0237 | 15.2941 | 260 | 0.0564 | 0.9251 | 0.9650 | 0.9798 | 0.9957 | 0.9248 | 0.9745 | 0.9887 | 0.8253 | 0.9612 |
+ | 0.0414 | 15.8824 | 270 | 0.0786 | 0.8738 | 0.8997 | 0.9693 | 0.9926 | 0.7107 | 0.9959 | 0.9893 | 0.6917 | 0.9405 |
+ | 0.0444 | 16.4706 | 280 | 0.0431 | 0.9383 | 0.9686 | 0.9840 | 0.9962 | 0.9269 | 0.9828 | 0.9908 | 0.8539 | 0.9702 |
+ | 0.0307 | 17.0588 | 290 | 0.0416 | 0.9438 | 0.9719 | 0.9855 | 0.9942 | 0.9350 | 0.9864 | 0.9900 | 0.8675 | 0.9741 |
+ | 0.0335 | 17.6471 | 300 | 0.0420 | 0.9402 | 0.9635 | 0.9846 | 0.9943 | 0.9062 | 0.9900 | 0.9900 | 0.8589 | 0.9716 |
+ | 0.0717 | 18.2353 | 310 | 0.0448 | 0.9375 | 0.9651 | 0.9837 | 0.9971 | 0.9144 | 0.9837 | 0.9891 | 0.8533 | 0.9702 |
+ | 0.0225 | 18.8235 | 320 | 0.0403 | 0.9405 | 0.9635 | 0.9847 | 0.9947 | 0.9058 | 0.9899 | 0.9904 | 0.8595 | 0.9716 |
+ | 0.0315 | 19.4118 | 330 | 0.0394 | 0.9444 | 0.9686 | 0.9855 | 0.9956 | 0.9230 | 0.9873 | 0.9901 | 0.8698 | 0.9732 |
+ | 0.0178 | 20.0 | 340 | 0.0390 | 0.9468 | 0.9733 | 0.9860 | 0.9960 | 0.9390 | 0.9850 | 0.9899 | 0.8763 | 0.9743 |


  ### Framework versions
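As a sanity check on the new evaluation block, the card's Mean Iou and Mean Accuracy are the unweighted averages of the three per-class scores (background, melt, substrate). A minimal sketch using the final reported values (the helper name is ours, not part of the card):

```python
# Recompute the card's mean metrics from the per-class values it reports.
# Scores are taken from the final evaluation block of the new README.
per_class_iou = {"background": 0.9899, "melt": 0.8763, "substrate": 0.9743}
per_class_acc = {"background": 0.9960, "melt": 0.9390, "substrate": 0.9850}

def unweighted_mean(scores):
    """Unweighted average over classes, as used for Mean Iou / Mean Accuracy."""
    return sum(scores.values()) / len(scores)

mean_iou = unweighted_mean(per_class_iou)  # rounds to 0.9468, as reported
mean_acc = unweighted_mean(per_class_acc)  # rounds to 0.9733, as reported
```

The same relation holds for the old card (e.g. Mean Iou 0.6481 = (0.0 + 0.9507 + 0.9937) / 3), which explains why the old mean was dragged down by the always-zero background IoU.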
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:c31f42117fbe24ad35b88518c1cf2aacf3fe11dbce3446e3de630523c18e511a
+ oid sha256:93d74de79e00d80c93b13cfaf0b09cab4f3201ffa82edaddb8e6be2b130c25a8
  size 338531516
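`model.safetensors` is stored via Git LFS, so the diff above compares three-line pointer files (version, oid, size) rather than the 338 MB weights themselves. A minimal sketch of reading such a pointer, assuming exactly the format shown:

```python
# Parse a Git LFS pointer file of the form shown in the diff:
#   version https://git-lfs.github.com/spec/v1
#   oid sha256:<64 hex chars>
#   size <bytes>
def parse_lfs_pointer(text):
    # Each line is "<key> <value>"; split on the first space only.
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algo, digest = fields["oid"].split(":", 1)
    return {"version": fields["version"], "algo": algo,
            "oid": digest, "size": int(fields["size"])}

pointer = parse_lfs_pointer(
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:93d74de79e00d80c93b13cfaf0b09cab4f3201ffa82edaddb8e6be2b130c25a8\n"
    "size 338531516\n"
)
```

Note that the size is unchanged between the two commits; only the oid differs, which is what you expect when retraining overwrites weights of identical shape.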
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:b151ebebe795159385b837f0d9d64e1a949bc4ba71002e2a5ffddefacc0b190e
+ oid sha256:81730d3137c1f6ca8b94903061bf78926a81dd0623447ed257805fcc695a4678
  size 4667
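When an LFS-tracked blob such as `training_args.bin` is present locally, it can be checked against the pointer's oid and size fields. A sketch using only the standard library (the demo writes a small temporary file rather than the real 4667-byte blob):

```python
import hashlib
import os
import tempfile

def verify_against_pointer(path, expected_sha256, expected_size):
    """Check a local blob's size and SHA-256 digest against LFS pointer fields."""
    if os.path.getsize(path) != expected_size:
        return False
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Hash in 1 MiB chunks so large weight files need not fit in memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest() == expected_sha256

# Demo with a throwaway file standing in for training_args.bin.
blob = b"example bytes"
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(blob)
ok = verify_against_pointer(tmp.name, hashlib.sha256(blob).hexdigest(), len(blob))
```

The size check is a cheap first pass; the digest comparison is what actually detects a stale or corrupted download.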