
segformer-b2-p142-cvat-2

This model is a fine-tuned version of nvidia/mit-b2 on the vigneshgs7/segformer_open_cv_RGB_L_0_1 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0222
  • Mean Iou: 0.4959
  • Mean Accuracy: 0.9919
  • Overall Accuracy: 0.9919
  • Accuracy Background: nan
  • Accuracy Object: 0.9919
  • Iou Background: 0.0
  • Iou Object: 0.9919

The NaN background accuracy together with a background IoU of 0.0 suggests that the background class never appears in the evaluation labels (only in predictions), so the mean IoU is effectively half the object IoU; a toy reproduction of this pattern is sketched below.
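
These values match the output format of the `evaluate` library's `mean_iou` metric, which is commonly used when fine-tuning SegFormer (an assumption; the card does not state how the metrics were computed). A minimal sketch reproducing the NaN-background pattern:

```python
# Toy sketch of the metric computation; assumes the `evaluate` library's
# mean_iou metric was used (not stated in the card).
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")
results = metric.compute(
    predictions=[np.array([[1, 1], [1, 0]])],  # one pixel predicted as background
    references=[np.array([[1, 1], [1, 1]])],   # background absent from the labels
    num_labels=2,
    ignore_index=255,
)
# Background never appears in the references, so its accuracy is NaN and its
# IoU is 0.0 -- the same pattern as the evaluation results above.
print(results["per_category_accuracy"])  # [nan, 0.75]
print(results["per_category_iou"])       # [0.0, 0.75]
```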

Model description

More information needed

Intended uses & limitations

More information needed
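
Judging by the metric names, the model performs binary (background/object) semantic segmentation. A minimal inference sketch, assuming the checkpoint is available on the Hub as vigneshgs7/segformer-b2-p142-cvat-2 (the input filename is hypothetical):

```python
# Minimal inference sketch; assumes the checkpoint is published on the Hub
# and that "example.png" is a local RGB image (hypothetical filename).
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

checkpoint = "vigneshgs7/segformer-b2-p142-cvat-2"
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)  # logits at 1/4 of the input resolution

# Upsample the logits back to the image size and take the per-pixel argmax
# (0 = background, 1 = object, per the metric names above).
mask = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]  # (height, width)
)[0]
```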

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent TrainingArguments sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
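
As a rough reconstruction, the listed values map onto transformers.TrainingArguments as follows (a sketch, not the author's actual script; output_dir is hypothetical):

```python
# Sketch mapping the hyperparameters above onto TrainingArguments; only the
# listed values come from the card, everything else is hypothetical.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b2-p142-cvat-2",  # hypothetical output path
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    adam_beta1=0.9,                         # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                      # and epsilon=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```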

Training results

Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Object | Iou Background | Iou Object
0.4097 0.06 20 0.4634 0.4794 0.9589 0.9589 nan 0.9589 0.0 0.9589
0.4192 0.11 40 0.2595 0.4800 0.9601 0.9601 nan 0.9601 0.0 0.9601
0.4005 0.17 60 0.1546 0.4720 0.9441 0.9441 nan 0.9441 0.0 0.9441
0.1912 0.23 80 0.1395 0.4780 0.9560 0.9560 nan 0.9560 0.0 0.9560
0.1286 0.29 100 0.1182 0.4775 0.9551 0.9551 nan 0.9551 0.0 0.9551
0.1012 0.34 120 0.0902 0.4738 0.9477 0.9477 nan 0.9477 0.0 0.9477
0.0798 0.4 140 0.0777 0.4812 0.9624 0.9624 nan 0.9624 0.0 0.9624
0.0593 0.46 160 0.0716 0.4849 0.9697 0.9697 nan 0.9697 0.0 0.9697
0.107 0.52 180 0.0675 0.4900 0.9800 0.9800 nan 0.9800 0.0 0.9800
0.0521 0.57 200 0.0553 0.4811 0.9621 0.9621 nan 0.9621 0.0 0.9621
0.045 0.63 220 0.0527 0.4915 0.9829 0.9829 nan 0.9829 0.0 0.9829
0.0447 0.69 240 0.0481 0.4785 0.9571 0.9571 nan 0.9571 0.0 0.9571
0.0381 0.74 260 0.0405 0.4878 0.9755 0.9755 nan 0.9755 0.0 0.9755
0.0392 0.8 280 0.0409 0.4861 0.9723 0.9723 nan 0.9723 0.0 0.9723
0.0364 0.86 300 0.0377 0.4878 0.9755 0.9755 nan 0.9755 0.0 0.9755
0.0481 0.92 320 0.0383 0.4920 0.9840 0.9840 nan 0.9840 0.0 0.9840
0.0424 0.97 340 0.0355 0.4909 0.9818 0.9818 nan 0.9818 0.0 0.9818
0.0371 1.03 360 0.0358 0.4866 0.9732 0.9732 nan 0.9732 0.0 0.9732
0.0224 1.09 380 0.0355 0.4897 0.9794 0.9794 nan 0.9794 0.0 0.9794
0.0358 1.15 400 0.0359 0.4885 0.9769 0.9769 nan 0.9769 0.0 0.9769
0.0235 1.2 420 0.0340 0.4877 0.9753 0.9753 nan 0.9753 0.0 0.9753
0.1746 1.26 440 0.0335 0.4927 0.9854 0.9854 nan 0.9854 0.0 0.9854
0.0253 1.32 460 0.0321 0.4889 0.9778 0.9778 nan 0.9778 0.0 0.9778
0.0247 1.38 480 0.0299 0.4907 0.9814 0.9814 nan 0.9814 0.0 0.9814
0.0351 1.43 500 0.0303 0.4907 0.9813 0.9813 nan 0.9813 0.0 0.9813
0.0203 1.49 520 0.0300 0.4906 0.9812 0.9812 nan 0.9812 0.0 0.9812
0.0254 1.55 540 0.0327 0.4859 0.9718 0.9718 nan 0.9718 0.0 0.9718
0.0272 1.6 560 0.0293 0.4908 0.9816 0.9816 nan 0.9816 0.0 0.9816
0.0295 1.66 580 0.0284 0.4908 0.9816 0.9816 nan 0.9816 0.0 0.9816
0.025 1.72 600 0.0286 0.4890 0.9779 0.9779 nan 0.9779 0.0 0.9779
0.0225 1.78 620 0.0283 0.4899 0.9799 0.9799 nan 0.9799 0.0 0.9799
0.1922 1.83 640 0.0264 0.4917 0.9834 0.9834 nan 0.9834 0.0 0.9834
0.0349 1.89 660 0.0265 0.4935 0.9871 0.9871 nan 0.9871 0.0 0.9871
0.023 1.95 680 0.0281 0.4887 0.9774 0.9774 nan 0.9774 0.0 0.9774
0.024 2.01 700 0.0262 0.4936 0.9872 0.9872 nan 0.9872 0.0 0.9872
0.0278 2.06 720 0.0261 0.4923 0.9846 0.9846 nan 0.9846 0.0 0.9846
0.0276 2.12 740 0.0263 0.4923 0.9845 0.9845 nan 0.9845 0.0 0.9845
0.0208 2.18 760 0.0262 0.4903 0.9806 0.9806 nan 0.9806 0.0 0.9806
0.0206 2.23 780 0.0258 0.4896 0.9792 0.9792 nan 0.9792 0.0 0.9792
0.017 2.29 800 0.0265 0.4887 0.9775 0.9775 nan 0.9775 0.0 0.9775
0.1898 2.35 820 0.0260 0.4902 0.9803 0.9803 nan 0.9803 0.0 0.9803
0.0167 2.41 840 0.0256 0.4942 0.9883 0.9883 nan 0.9883 0.0 0.9883
0.0212 2.46 860 0.0263 0.4892 0.9784 0.9784 nan 0.9784 0.0 0.9784
0.0182 2.52 880 0.0252 0.4900 0.9800 0.9800 nan 0.9800 0.0 0.9800
0.0218 2.58 900 0.0241 0.4918 0.9836 0.9836 nan 0.9836 0.0 0.9836
0.0197 2.64 920 0.0249 0.4895 0.9791 0.9791 nan 0.9791 0.0 0.9791
0.0254 2.69 940 0.0241 0.4910 0.9819 0.9819 nan 0.9819 0.0 0.9819
0.0276 2.75 960 0.0249 0.4908 0.9816 0.9816 nan 0.9816 0.0 0.9816
0.0167 2.81 980 0.0241 0.4929 0.9858 0.9858 nan 0.9858 0.0 0.9858
0.0173 2.87 1000 0.0241 0.4903 0.9806 0.9806 nan 0.9806 0.0 0.9806
0.081 2.92 1020 0.0251 0.4892 0.9783 0.9783 nan 0.9783 0.0 0.9783
0.0273 2.98 1040 0.0230 0.4921 0.9842 0.9842 nan 0.9842 0.0 0.9842
0.0384 3.04 1060 0.0232 0.4941 0.9881 0.9881 nan 0.9881 0.0 0.9881
0.0229 3.09 1080 0.0235 0.4932 0.9863 0.9863 nan 0.9863 0.0 0.9863
0.0329 3.15 1100 0.0231 0.4941 0.9882 0.9882 nan 0.9882 0.0 0.9882
0.0149 3.21 1120 0.0232 0.4942 0.9883 0.9883 nan 0.9883 0.0 0.9883
0.0163 3.27 1140 0.0237 0.4906 0.9813 0.9813 nan 0.9813 0.0 0.9813
0.0144 3.32 1160 0.0237 0.4903 0.9807 0.9807 nan 0.9807 0.0 0.9807
0.0196 3.38 1180 0.0225 0.4926 0.9851 0.9851 nan 0.9851 0.0 0.9851
0.0194 3.44 1200 0.0224 0.4921 0.9841 0.9841 nan 0.9841 0.0 0.9841
0.0182 3.5 1220 0.0224 0.4916 0.9832 0.9832 nan 0.9832 0.0 0.9832
0.0178 3.55 1240 0.0230 0.4954 0.9909 0.9909 nan 0.9909 0.0 0.9909
0.0291 3.61 1260 0.0221 0.4920 0.9840 0.9840 nan 0.9840 0.0 0.9840
0.0167 3.67 1280 0.0219 0.4934 0.9868 0.9868 nan 0.9868 0.0 0.9868
0.0142 3.72 1300 0.0216 0.4943 0.9886 0.9886 nan 0.9886 0.0 0.9886
0.0183 3.78 1320 0.0217 0.4927 0.9855 0.9855 nan 0.9855 0.0 0.9855
0.0156 3.84 1340 0.0216 0.4946 0.9892 0.9892 nan 0.9892 0.0 0.9892
0.0438 3.9 1360 0.0215 0.4932 0.9863 0.9863 nan 0.9863 0.0 0.9863
0.0265 3.95 1380 0.0217 0.4952 0.9904 0.9904 nan 0.9904 0.0 0.9904
0.0481 4.01 1400 0.0231 0.4943 0.9885 0.9885 nan 0.9885 0.0 0.9885
0.0163 4.07 1420 0.0227 0.4948 0.9896 0.9896 nan 0.9896 0.0 0.9896
0.0399 4.13 1440 0.0210 0.4941 0.9881 0.9881 nan 0.9881 0.0 0.9881
0.0178 4.18 1460 0.0221 0.4947 0.9894 0.9894 nan 0.9894 0.0 0.9894
0.0159 4.24 1480 0.0220 0.4940 0.9880 0.9880 nan 0.9880 0.0 0.9880
0.0159 4.3 1500 0.0212 0.4952 0.9903 0.9903 nan 0.9903 0.0 0.9903
0.0241 4.36 1520 0.0214 0.4945 0.9890 0.9890 nan 0.9890 0.0 0.9890
0.0159 4.41 1540 0.0215 0.4941 0.9882 0.9882 nan 0.9882 0.0 0.9882
0.0202 4.47 1560 0.0233 0.4953 0.9907 0.9907 nan 0.9907 0.0 0.9907
0.037 4.53 1580 0.0225 0.4950 0.9900 0.9900 nan 0.9900 0.0 0.9900
0.0203 4.58 1600 0.0229 0.4944 0.9889 0.9889 nan 0.9889 0.0 0.9889
0.0244 4.64 1620 0.0210 0.4948 0.9896 0.9896 nan 0.9896 0.0 0.9896
0.0202 4.7 1640 0.0209 0.4954 0.9909 0.9909 nan 0.9909 0.0 0.9909
0.0137 4.76 1660 0.0211 0.4940 0.9879 0.9879 nan 0.9879 0.0 0.9879
0.0152 4.81 1680 0.0210 0.4934 0.9868 0.9868 nan 0.9868 0.0 0.9868
0.0159 4.87 1700 0.0206 0.4955 0.9910 0.9910 nan 0.9910 0.0 0.9910
0.0202 4.93 1720 0.0207 0.4930 0.9861 0.9861 nan 0.9861 0.0 0.9861
0.0453 4.99 1740 0.0211 0.4929 0.9859 0.9859 nan 0.9859 0.0 0.9859
0.0203 5.04 1760 0.0207 0.4952 0.9904 0.9904 nan 0.9904 0.0 0.9904
0.014 5.1 1780 0.0207 0.4957 0.9913 0.9913 nan 0.9913 0.0 0.9913
0.0458 5.16 1800 0.0217 0.4959 0.9918 0.9918 nan 0.9918 0.0 0.9918
0.012 5.21 1820 0.0218 0.4945 0.9889 0.9889 nan 0.9889 0.0 0.9889
0.0444 5.27 1840 0.0227 0.4949 0.9897 0.9897 nan 0.9897 0.0 0.9897
0.0791 5.33 1860 0.0226 0.4942 0.9884 0.9884 nan 0.9884 0.0 0.9884
0.0349 5.39 1880 0.0222 0.4932 0.9865 0.9865 nan 0.9865 0.0 0.9865
0.0175 5.44 1900 0.0225 0.4943 0.9885 0.9885 nan 0.9885 0.0 0.9885
0.0191 5.5 1920 0.0222 0.4939 0.9878 0.9878 nan 0.9878 0.0 0.9878
0.0219 5.56 1940 0.0217 0.4950 0.9900 0.9900 nan 0.9900 0.0 0.9900
0.0251 5.62 1960 0.0225 0.4947 0.9895 0.9895 nan 0.9895 0.0 0.9895
0.0317 5.67 1980 0.0232 0.4943 0.9887 0.9887 nan 0.9887 0.0 0.9887
0.0177 5.73 2000 0.0232 0.4946 0.9892 0.9892 nan 0.9892 0.0 0.9892
0.0172 5.79 2020 0.0205 0.4939 0.9879 0.9879 nan 0.9879 0.0 0.9879
0.028 5.85 2040 0.0224 0.4968 0.9936 0.9936 nan 0.9936 0.0 0.9936
0.0144 5.9 2060 0.0202 0.4939 0.9877 0.9877 nan 0.9877 0.0 0.9877
0.0143 5.96 2080 0.0203 0.4953 0.9906 0.9906 nan 0.9906 0.0 0.9906
0.0161 6.02 2100 0.0199 0.4945 0.9890 0.9890 nan 0.9890 0.0 0.9890
0.014 6.07 2120 0.0202 0.4953 0.9905 0.9905 nan 0.9905 0.0 0.9905
0.0299 6.13 2140 0.0203 0.4932 0.9863 0.9863 nan 0.9863 0.0 0.9863
0.0152 6.19 2160 0.0201 0.4954 0.9908 0.9908 nan 0.9908 0.0 0.9908
0.0159 6.25 2180 0.0200 0.4956 0.9913 0.9913 nan 0.9913 0.0 0.9913
0.0135 6.3 2200 0.0214 0.4960 0.9920 0.9920 nan 0.9920 0.0 0.9920
0.0122 6.36 2220 0.0211 0.4939 0.9879 0.9879 nan 0.9879 0.0 0.9879
0.0198 6.42 2240 0.0203 0.4955 0.9911 0.9911 nan 0.9911 0.0 0.9911
0.0205 6.48 2260 0.0207 0.4948 0.9897 0.9897 nan 0.9897 0.0 0.9897
0.0144 6.53 2280 0.0205 0.4947 0.9893 0.9893 nan 0.9893 0.0 0.9893
0.0138 6.59 2300 0.0207 0.4956 0.9912 0.9912 nan 0.9912 0.0 0.9912
0.0228 6.65 2320 0.0224 0.4953 0.9906 0.9906 nan 0.9906 0.0 0.9906
0.0126 6.7 2340 0.0206 0.4949 0.9899 0.9899 nan 0.9899 0.0 0.9899
0.0134 6.76 2360 0.0208 0.4950 0.9900 0.9900 nan 0.9900 0.0 0.9900
0.0105 6.82 2380 0.0229 0.4954 0.9909 0.9909 nan 0.9909 0.0 0.9909
0.0407 6.88 2400 0.0219 0.4952 0.9905 0.9905 nan 0.9905 0.0 0.9905
0.0148 6.93 2420 0.0212 0.4948 0.9897 0.9897 nan 0.9897 0.0 0.9897
0.011 6.99 2440 0.0216 0.4955 0.9909 0.9909 nan 0.9909 0.0 0.9909
0.0149 7.05 2460 0.0221 0.4948 0.9895 0.9895 nan 0.9895 0.0 0.9895
0.0312 7.11 2480 0.0243 0.4956 0.9912 0.9912 nan 0.9912 0.0 0.9912
0.0146 7.16 2500 0.0236 0.4963 0.9927 0.9927 nan 0.9927 0.0 0.9927
0.0132 7.22 2520 0.0221 0.4954 0.9908 0.9908 nan 0.9908 0.0 0.9908
0.0314 7.28 2540 0.0214 0.4939 0.9878 0.9878 nan 0.9878 0.0 0.9878
0.0177 7.34 2560 0.0221 0.4951 0.9903 0.9903 nan 0.9903 0.0 0.9903
0.0213 7.39 2580 0.0223 0.4956 0.9912 0.9912 nan 0.9912 0.0 0.9912
0.0135 7.45 2600 0.0212 0.4953 0.9906 0.9906 nan 0.9906 0.0 0.9906
0.0361 7.51 2620 0.0223 0.4962 0.9924 0.9924 nan 0.9924 0.0 0.9924
0.0457 7.56 2640 0.0221 0.4957 0.9914 0.9914 nan 0.9914 0.0 0.9914
0.0191 7.62 2660 0.0238 0.4960 0.9919 0.9919 nan 0.9919 0.0 0.9919
0.0141 7.68 2680 0.0222 0.4951 0.9902 0.9902 nan 0.9902 0.0 0.9902
0.012 7.74 2700 0.0232 0.4959 0.9918 0.9918 nan 0.9918 0.0 0.9918
0.0134 7.79 2720 0.0226 0.4952 0.9904 0.9904 nan 0.9904 0.0 0.9904
0.0174 7.85 2740 0.0226 0.4957 0.9913 0.9913 nan 0.9913 0.0 0.9913
0.0163 7.91 2760 0.0215 0.4948 0.9895 0.9895 nan 0.9895 0.0 0.9895
0.0159 7.97 2780 0.0213 0.4960 0.9920 0.9920 nan 0.9920 0.0 0.9920
0.0122 8.02 2800 0.0206 0.4950 0.9900 0.9900 nan 0.9900 0.0 0.9900
0.0272 8.08 2820 0.0207 0.4947 0.9893 0.9893 nan 0.9893 0.0 0.9893
0.0178 8.14 2840 0.0214 0.4953 0.9907 0.9907 nan 0.9907 0.0 0.9907
0.1188 8.19 2860 0.0211 0.4946 0.9892 0.9892 nan 0.9892 0.0 0.9892
0.0128 8.25 2880 0.0222 0.4962 0.9924 0.9924 nan 0.9924 0.0 0.9924
0.0171 8.31 2900 0.0222 0.4955 0.9909 0.9909 nan 0.9909 0.0 0.9909
0.0522 8.37 2920 0.0227 0.4959 0.9918 0.9918 nan 0.9918 0.0 0.9918
0.0142 8.42 2940 0.0237 0.4960 0.9920 0.9920 nan 0.9920 0.0 0.9920
0.0422 8.48 2960 0.0234 0.4950 0.9901 0.9901 nan 0.9901 0.0 0.9901
0.0362 8.54 2980 0.0226 0.4954 0.9908 0.9908 nan 0.9908 0.0 0.9908
0.0187 8.6 3000 0.0220 0.4952 0.9903 0.9903 nan 0.9903 0.0 0.9903
0.0154 8.65 3020 0.0216 0.4948 0.9896 0.9896 nan 0.9896 0.0 0.9896
0.0387 8.71 3040 0.0219 0.4956 0.9912 0.9912 nan 0.9912 0.0 0.9912
0.038 8.77 3060 0.0214 0.4948 0.9896 0.9896 nan 0.9896 0.0 0.9896
0.0145 8.83 3080 0.0213 0.4955 0.9910 0.9910 nan 0.9910 0.0 0.9910
0.0129 8.88 3100 0.0210 0.4953 0.9906 0.9906 nan 0.9906 0.0 0.9906
0.0129 8.94 3120 0.0213 0.4953 0.9907 0.9907 nan 0.9907 0.0 0.9907
0.0148 9.0 3140 0.0220 0.4958 0.9916 0.9916 nan 0.9916 0.0 0.9916
0.0133 9.05 3160 0.0210 0.4946 0.9891 0.9891 nan 0.9891 0.0 0.9891
0.0158 9.11 3180 0.0213 0.4954 0.9908 0.9908 nan 0.9908 0.0 0.9908
0.0155 9.17 3200 0.0217 0.4957 0.9914 0.9914 nan 0.9914 0.0 0.9914
0.0202 9.23 3220 0.0218 0.4955 0.9911 0.9911 nan 0.9911 0.0 0.9911
0.0128 9.28 3240 0.0211 0.4953 0.9905 0.9905 nan 0.9905 0.0 0.9905
0.0304 9.34 3260 0.0218 0.4959 0.9918 0.9918 nan 0.9918 0.0 0.9918
0.0354 9.4 3280 0.0214 0.4954 0.9908 0.9908 nan 0.9908 0.0 0.9908
0.0188 9.46 3300 0.0214 0.4952 0.9903 0.9903 nan 0.9903 0.0 0.9903
0.0117 9.51 3320 0.0223 0.4961 0.9921 0.9921 nan 0.9921 0.0 0.9921
0.0175 9.57 3340 0.0215 0.4954 0.9907 0.9907 nan 0.9907 0.0 0.9907
0.0304 9.63 3360 0.0217 0.4954 0.9909 0.9909 nan 0.9909 0.0 0.9909
0.0166 9.68 3380 0.0216 0.4955 0.9909 0.9909 nan 0.9909 0.0 0.9909
0.0899 9.74 3400 0.0221 0.4962 0.9923 0.9923 nan 0.9923 0.0 0.9923
0.0128 9.8 3420 0.0216 0.4955 0.9910 0.9910 nan 0.9910 0.0 0.9910
0.0149 9.86 3440 0.0217 0.4955 0.9910 0.9910 nan 0.9910 0.0 0.9910
0.0192 9.91 3460 0.0216 0.4953 0.9906 0.9906 nan 0.9906 0.0 0.9906
0.0454 9.97 3480 0.0222 0.4959 0.9919 0.9919 nan 0.9919 0.0 0.9919

Framework versions

  • Transformers 4.35.0
  • Pytorch 2.2.2
  • Datasets 2.14.6
  • Tokenizers 0.14.1
