
# safety-utcustom-train-SF-RGBD-b5

This model is a fine-tuned version of nvidia/mit-b5 on the sam1120/safety-utcustom-TRAIN dataset. It achieves the following results on the evaluation set (`nan` entries correspond to the unlabeled class, which is excluded from the averaged metrics):

- Loss: 0.0867
- Mean Iou: 0.7280
- Mean Accuracy: 0.7762
- Overall Accuracy: 0.9818
- Accuracy Unlabeled: nan
- Accuracy Safe: 0.5578
- Accuracy Unsafe: 0.9947
- Iou Unlabeled: nan
- Iou Safe: 0.4745
- Iou Unsafe: 0.9814
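The summary metrics above are related in a fixed way: each per-class IoU and accuracy comes from the pixel confusion matrix, and the "Mean" values average over the counted classes (safe and unsafe). A minimal pure-Python sketch of those formulas; the helper name and the toy confusion matrix are illustrative, not the evaluation code this card was generated from:

```python
def segmentation_metrics(confusion):
    """confusion[i][j] = number of pixels of true class i predicted as class j."""
    n = len(confusion)
    iou, acc = [], []
    for c in range(n):
        tp = confusion[c][c]
        fn = sum(confusion[c]) - tp                          # missed pixels of class c
        fp = sum(confusion[r][c] for r in range(n)) - tp     # pixels wrongly called c
        iou.append(tp / (tp + fp + fn))
        acc.append(tp / (tp + fn))
    total = sum(sum(row) for row in confusion)
    return {
        "per_class_iou": iou,
        "per_class_accuracy": acc,
        "mean_iou": sum(iou) / n,
        "mean_accuracy": sum(acc) / n,
        "overall_accuracy": sum(confusion[c][c] for c in range(n)) / total,
    }

# Toy 2-class (safe, unsafe) pixel counts, for illustration only:
m = segmentation_metrics([[60, 40],     # true safe
                          [10, 890]])   # true unsafe
```

With the per-class values reported above, (0.4745 + 0.9814) / 2 ≈ 0.7280 and (0.5578 + 0.9947) / 2 ≈ 0.7762, matching the listed Mean Iou and Mean Accuracy.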

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 4e-06
- train_batch_size: 15
- eval_batch_size: 15
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
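A linear scheduler with `lr_scheduler_warmup_ratio: 0.05` ramps the learning rate up linearly over the first 5% of optimizer steps, then decays it linearly to zero. A pure-Python sketch of that curve, assuming it mirrors `get_linear_schedule_with_warmup` from transformers and taking the 1320 total steps from the training log (both are inferences, not code from this run):

```python
# LR curve implied by the hyperparameters above (sketch, not Trainer internals).
BASE_LR = 4e-06
TOTAL_STEPS = 1320                       # 120 epochs, per the training log
WARMUP_STEPS = int(0.05 * TOTAL_STEPS)   # warmup_ratio 0.05 -> 66 steps

def lr_at(step):
    """Learning rate after `step` optimizer steps."""
    if step < WARMUP_STEPS:
        return BASE_LR * step / WARMUP_STEPS              # linear warmup from 0
    frac = (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)
    return BASE_LR * max(0.0, frac)                       # linear decay to 0
```

So the peak learning rate of 4e-06 is reached at step 66 and the schedule hits zero exactly at the final step.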

### Training results

| Training Loss | Epoch | Step | Accuracy Safe | Accuracy Unlabeled | Accuracy Unsafe | Iou Safe | Iou Unlabeled | Iou Unsafe | Validation Loss | Mean Accuracy | Mean Iou | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.789 | 0.91 | 10 | 0.0203 | nan | 0.8957 | 0.0095 | 0.0 | 0.8722 | 0.9555 | 0.4580 | 0.2939 | 0.8698 |
| 0.7579 | 1.82 | 20 | 0.0117 | nan | 0.9614 | 0.0069 | 0.0 | 0.9338 | 0.8322 | 0.4866 | 0.3136 | 0.9334 |
| 0.7103 | 2.73 | 30 | 0.0051 | nan | 0.9893 | 0.0043 | 0.0 | 0.9604 | 0.6729 | 0.4972 | 0.3216 | 0.9602 |
| 0.676 | 3.64 | 40 | 0.0021 | nan | 0.9969 | 0.0020 | 0.0 | 0.9675 | 0.5336 | 0.4995 | 0.3232 | 0.9675 |
| 0.5955 | 4.55 | 50 | 0.0001 | nan | 0.9993 | 0.0001 | 0.0 | 0.9698 | 0.4440 | 0.4997 | 0.3233 | 0.9698 |
| 0.5691 | 5.45 | 60 | 0.0000 | nan | 0.9997 | 0.0000 | 0.0 | 0.9702 | 0.3812 | 0.4999 | 0.3234 | 0.9702 |
| 0.5067 | 6.36 | 70 | 0.0 | nan | 0.9996 | 0.0 | 0.0 | 0.9701 | 0.3590 | 0.4998 | 0.3234 | 0.9701 |
| 0.4656 | 7.27 | 80 | 0.0 | nan | 0.9999 | 0.0 | 0.0 | 0.9703 | 0.3247 | 0.4999 | 0.3234 | 0.9703 |
| 0.4227 | 8.18 | 90 | 0.0 | nan | 0.9998 | 0.0 | 0.0 | 0.9702 | 0.3171 | 0.4999 | 0.3234 | 0.9702 |
| 0.3898 | 9.09 | 100 | 0.0004 | nan | 0.9996 | 0.0004 | 0.0 | 0.9701 | 0.3122 | 0.5000 | 0.3235 | 0.9701 |
| 0.3513 | 10.0 | 110 | 0.0 | nan | 0.9999 | 0.0 | 0.0 | 0.9703 | 0.2876 | 0.4999 | 0.3234 | 0.9703 |
| 0.4157 | 10.91 | 120 | 0.0000 | nan | 0.9998 | 0.0000 | 0.0 | 0.9703 | 0.2820 | 0.4999 | 0.3234 | 0.9703 |
| 0.3317 | 11.82 | 130 | 0.0 | nan | 0.9999 | 0.0 | 0.0 | 0.9703 | 0.2693 | 0.4999 | 0.3234 | 0.9703 |
| 0.321 | 12.73 | 140 | 0.0 | nan | 0.9999 | 0.0 | 0.0 | 0.9704 | 0.2647 | 0.4999 | 0.3235 | 0.9704 |
| 0.2887 | 13.64 | 150 | 0.0 | nan | 0.9999 | 0.0 | 0.0 | 0.9704 | 0.2539 | 0.5000 | 0.3235 | 0.9704 |
| 0.3008 | 14.55 | 160 | 0.0 | nan | 0.9999 | 0.0 | 0.0 | 0.9704 | 0.2536 | 0.5000 | 0.3235 | 0.9704 |
| 0.2853 | 15.45 | 170 | 0.0 | nan | 0.9999 | 0.0 | 0.0 | 0.9704 | 0.2397 | 0.5000 | 0.3235 | 0.9704 |
| 0.2684 | 16.36 | 180 | 0.0 | nan | 0.9999 | 0.0 | 0.0 | 0.9704 | 0.2321 | 0.5000 | 0.3235 | 0.9704 |
| 0.2585 | 17.27 | 190 | 0.0000 | nan | 0.9999 | 0.0000 | 0.0 | 0.9704 | 0.2208 | 0.5000 | 0.3235 | 0.9704 |
| 0.2088 | 18.18 | 200 | 0.0084 | nan | 0.9997 | 0.0083 | 0.0 | 0.9704 | 0.2011 | 0.5041 | 0.3262 | 0.9704 |
| 0.2518 | 19.09 | 210 | 0.0468 | nan | 0.9989 | 0.0451 | 0.0 | 0.9707 | 0.2026 | 0.5228 | 0.3386 | 0.9707 |
| 0.218 | 20.0 | 220 | 0.0879 | nan | 0.9984 | 0.0834 | nan | 0.9714 | 0.1889 | 0.5431 | 0.5274 | 0.9715 |
| 0.2046 | 20.91 | 230 | 0.1931 | nan | 0.9969 | 0.1752 | nan | 0.9730 | 0.1847 | 0.5950 | 0.5741 | 0.9732 |
| 0.2147 | 21.82 | 240 | 0.2042 | nan | 0.9968 | 0.1850 | nan | 0.9733 | 0.1766 | 0.6005 | 0.5791 | 0.9734 |
| 0.188 | 22.73 | 250 | 0.2020 | nan | 0.9972 | 0.1849 | nan | 0.9735 | 0.1726 | 0.5996 | 0.5792 | 0.9737 |
| 0.2175 | 23.64 | 260 | 0.1898 | nan | 0.9974 | 0.1748 | nan | 0.9734 | 0.1706 | 0.5936 | 0.5741 | 0.9735 |
| 0.2059 | 24.55 | 270 | 0.3006 | nan | 0.9962 | 0.2670 | nan | 0.9754 | 0.1689 | 0.6484 | 0.6212 | 0.9756 |
| 0.1776 | 25.45 | 280 | 0.2870 | nan | 0.9967 | 0.2587 | nan | 0.9755 | 0.1612 | 0.6418 | 0.6171 | 0.9757 |
| 0.1585 | 26.36 | 290 | 0.4254 | nan | 0.9944 | 0.3593 | nan | 0.9773 | 0.1537 | 0.7099 | 0.6683 | 0.9776 |
| 0.1588 | 27.27 | 300 | 0.2798 | nan | 0.9970 | 0.2548 | nan | 0.9756 | 0.1527 | 0.6384 | 0.6152 | 0.9758 |
| 0.153 | 28.18 | 310 | 0.4288 | nan | 0.9946 | 0.3646 | nan | 0.9776 | 0.1452 | 0.7117 | 0.6711 | 0.9779 |
| 0.1623 | 29.09 | 320 | 0.4401 | nan | 0.9945 | 0.3726 | nan | 0.9778 | 0.1442 | 0.7173 | 0.6752 | 0.9781 |
| 0.1603 | 30.0 | 330 | 0.4050 | nan | 0.9958 | 0.3562 | nan | 0.9781 | 0.1407 | 0.7004 | 0.6671 | 0.9784 |
| 0.1694 | 30.91 | 340 | 0.4585 | nan | 0.9948 | 0.3911 | nan | 0.9786 | 0.1343 | 0.7266 | 0.6849 | 0.9789 |
| 0.1585 | 31.82 | 350 | 0.3861 | nan | 0.9962 | 0.3433 | nan | 0.9779 | 0.1353 | 0.6912 | 0.6606 | 0.9782 |
| 0.1342 | 32.73 | 360 | 0.4963 | nan | 0.9939 | 0.4132 | nan | 0.9789 | 0.1338 | 0.7451 | 0.6961 | 0.9792 |
| 0.1358 | 33.64 | 370 | 0.5048 | nan | 0.9937 | 0.4182 | nan | 0.9789 | 0.1342 | 0.7493 | 0.6986 | 0.9793 |
| 0.1493 | 34.55 | 380 | 0.4809 | nan | 0.9946 | 0.4080 | nan | 0.9791 | 0.1297 | 0.7377 | 0.6936 | 0.9794 |
| 0.1435 | 35.45 | 390 | 0.5658 | nan | 0.9923 | 0.4518 | nan | 0.9794 | 0.1271 | 0.7791 | 0.7156 | 0.9797 |
| 0.1305 | 36.36 | 400 | 0.4157 | nan | 0.9968 | 0.3758 | nan | 0.9793 | 0.1225 | 0.7062 | 0.6776 | 0.9796 |
| 0.1496 | 37.27 | 410 | 0.5385 | nan | 0.9934 | 0.4420 | nan | 0.9796 | 0.1237 | 0.7659 | 0.7108 | 0.9799 |
| 0.1445 | 38.18 | 420 | 0.5763 | nan | 0.9924 | 0.4615 | nan | 0.9798 | 0.1207 | 0.7843 | 0.7206 | 0.9801 |
| 0.1307 | 39.09 | 430 | 0.4853 | nan | 0.9956 | 0.4244 | nan | 0.9803 | 0.1194 | 0.7404 | 0.7023 | 0.9806 |
| 0.1379 | 40.0 | 440 | 0.5722 | nan | 0.9922 | 0.4557 | nan | 0.9795 | 0.1174 | 0.7822 | 0.7176 | 0.9798 |
| 0.1202 | 40.91 | 450 | 0.5399 | nan | 0.9943 | 0.4544 | nan | 0.9805 | 0.1143 | 0.7671 | 0.7175 | 0.9809 |
| 0.1239 | 41.82 | 460 | 0.5580 | nan | 0.9932 | 0.4558 | nan | 0.9800 | 0.1150 | 0.7756 | 0.7179 | 0.9803 |
| 0.1183 | 42.73 | 470 | 0.4777 | nan | 0.9961 | 0.4236 | nan | 0.9805 | 0.1129 | 0.7369 | 0.7021 | 0.9808 |
| 0.1202 | 43.64 | 480 | 0.5933 | nan | 0.9928 | 0.4793 | nan | 0.9806 | 0.1119 | 0.7930 | 0.7300 | 0.9810 |
| 0.1276 | 44.55 | 490 | 0.5425 | nan | 0.9942 | 0.4561 | nan | 0.9806 | 0.1131 | 0.7683 | 0.7183 | 0.9809 |
| 0.1172 | 45.45 | 500 | 0.6272 | nan | 0.9898 | 0.4700 | nan | 0.9787 | 0.1135 | 0.8085 | 0.7244 | 0.9791 |
| 0.1288 | 46.36 | 510 | 0.4236 | nan | 0.9974 | 0.3898 | nan | 0.9802 | 0.1105 | 0.7105 | 0.6850 | 0.9804 |
| 0.1185 | 47.27 | 520 | 0.6035 | nan | 0.9914 | 0.4711 | nan | 0.9796 | 0.1130 | 0.7975 | 0.7254 | 0.9800 |
| 0.1045 | 48.18 | 530 | 0.5750 | nan | 0.9930 | 0.4679 | nan | 0.9804 | 0.1102 | 0.7840 | 0.7241 | 0.9807 |
| 0.1211 | 49.09 | 540 | 0.5812 | nan | 0.9929 | 0.4715 | nan | 0.9804 | 0.1069 | 0.7870 | 0.7260 | 0.9808 |
| 0.1206 | 50.0 | 550 | 0.5221 | nan | 0.9953 | 0.4528 | nan | 0.9811 | 0.1071 | 0.7587 | 0.7169 | 0.9814 |
| 0.1193 | 50.91 | 560 | 0.4956 | nan | 0.9961 | 0.4398 | nan | 0.9811 | 0.1053 | 0.7459 | 0.7105 | 0.9814 |
| 0.1116 | 51.82 | 570 | 0.5257 | nan | 0.9951 | 0.4528 | nan | 0.9809 | 0.1043 | 0.7604 | 0.7169 | 0.9812 |
| 0.1218 | 52.73 | 580 | 0.5936 | nan | 0.9922 | 0.4724 | nan | 0.9801 | 0.1078 | 0.7929 | 0.7262 | 0.9804 |
| 0.1284 | 53.64 | 590 | 0.5872 | nan | 0.9924 | 0.4696 | nan | 0.9801 | 0.1054 | 0.7898 | 0.7248 | 0.9804 |
| 0.096 | 54.55 | 600 | 0.5451 | nan | 0.9942 | 0.4580 | nan | 0.9806 | 0.1028 | 0.7697 | 0.7193 | 0.9809 |
| 0.1091 | 55.45 | 610 | 0.6014 | nan | 0.9917 | 0.4725 | nan | 0.9798 | 0.1022 | 0.7965 | 0.7261 | 0.9802 |
| 0.1068 | 56.36 | 620 | 0.4926 | nan | 0.9962 | 0.4374 | nan | 0.9810 | 0.1015 | 0.7444 | 0.7092 | 0.9813 |
| 0.106 | 57.27 | 630 | 0.5713 | nan | 0.9937 | 0.4731 | nan | 0.9809 | 0.1011 | 0.7825 | 0.7270 | 0.9812 |
| 0.1009 | 58.18 | 640 | 0.4512 | nan | 0.9969 | 0.4089 | nan | 0.9805 | 0.1028 | 0.7240 | 0.6947 | 0.9807 |
| 0.1018 | 59.09 | 650 | 0.6053 | nan | 0.9919 | 0.4779 | nan | 0.9801 | 0.1022 | 0.7986 | 0.7290 | 0.9805 |
| 0.1012 | 60.0 | 660 | 0.5167 | nan | 0.9949 | 0.4427 | nan | 0.9805 | 0.1016 | 0.7558 | 0.7116 | 0.9808 |
| 0.1052 | 60.91 | 670 | 0.5464 | nan | 0.9943 | 0.4604 | nan | 0.9808 | 0.0999 | 0.7703 | 0.7206 | 0.9811 |
| 0.1229 | 61.82 | 680 | 0.5706 | nan | 0.9939 | 0.4750 | nan | 0.9810 | 0.0993 | 0.7822 | 0.7280 | 0.9814 |
| 0.0963 | 62.73 | 690 | 0.5746 | nan | 0.9936 | 0.4754 | nan | 0.9809 | 0.0974 | 0.7841 | 0.7282 | 0.9813 |
| 0.1115 | 63.64 | 700 | 0.5239 | nan | 0.9955 | 0.4562 | nan | 0.9813 | 0.0974 | 0.7597 | 0.7187 | 0.9816 |
| 0.1025 | 64.55 | 710 | 0.5845 | nan | 0.9935 | 0.4813 | nan | 0.9811 | 0.0964 | 0.7890 | 0.7312 | 0.9814 |
| 0.0916 | 65.45 | 720 | 0.5493 | nan | 0.9947 | 0.4685 | nan | 0.9813 | 0.0962 | 0.7720 | 0.7249 | 0.9816 |
| 0.1055 | 66.36 | 730 | 0.5273 | nan | 0.9953 | 0.4571 | nan | 0.9812 | 0.0947 | 0.7613 | 0.7191 | 0.9815 |
| 0.1081 | 67.27 | 740 | 0.6093 | nan | 0.9919 | 0.4813 | nan | 0.9802 | 0.0964 | 0.8006 | 0.7308 | 0.9806 |
| 0.1039 | 68.18 | 750 | 0.5405 | nan | 0.9945 | 0.4573 | nan | 0.9807 | 0.0950 | 0.7675 | 0.7190 | 0.9811 |
| 0.106 | 69.09 | 760 | 0.5564 | nan | 0.9943 | 0.4682 | nan | 0.9810 | 0.0939 | 0.7753 | 0.7246 | 0.9813 |
| 0.0912 | 70.0 | 770 | 0.5377 | nan | 0.9949 | 0.4612 | nan | 0.9811 | 0.0936 | 0.7663 | 0.7212 | 0.9814 |
| 0.0951 | 70.91 | 780 | 0.5600 | nan | 0.9941 | 0.4689 | nan | 0.9809 | 0.0938 | 0.7771 | 0.7249 | 0.9813 |
| 0.0998 | 71.82 | 790 | 0.5573 | nan | 0.9944 | 0.4705 | nan | 0.9812 | 0.0928 | 0.7759 | 0.7258 | 0.9815 |
| 0.0889 | 72.73 | 800 | 0.5398 | nan | 0.9949 | 0.4628 | nan | 0.9812 | 0.0931 | 0.7674 | 0.7220 | 0.9815 |
| 0.0906 | 73.64 | 810 | 0.5151 | nan | 0.9958 | 0.4528 | nan | 0.9813 | 0.0928 | 0.7555 | 0.7171 | 0.9816 |
| 0.0911 | 74.55 | 820 | 0.5682 | nan | 0.9938 | 0.4722 | nan | 0.9809 | 0.0924 | 0.7810 | 0.7265 | 0.9812 |
| 0.0907 | 75.45 | 830 | 0.4864 | nan | 0.9965 | 0.4365 | nan | 0.9812 | 0.0929 | 0.7415 | 0.7089 | 0.9815 |
| 0.1117 | 76.36 | 840 | 0.5239 | nan | 0.9956 | 0.4576 | nan | 0.9814 | 0.0934 | 0.7598 | 0.7195 | 0.9817 |
| 0.0812 | 77.27 | 850 | 0.5279 | nan | 0.9956 | 0.4605 | nan | 0.9814 | 0.0915 | 0.7617 | 0.7210 | 0.9817 |
| 0.0888 | 78.18 | 860 | 0.5615 | nan | 0.9942 | 0.4720 | nan | 0.9811 | 0.0915 | 0.7778 | 0.7266 | 0.9814 |
| 0.09 | 79.09 | 870 | 0.5414 | nan | 0.9948 | 0.4628 | nan | 0.9811 | 0.0920 | 0.7681 | 0.7220 | 0.9814 |
| 0.1052 | 80.0 | 880 | 0.5866 | nan | 0.9932 | 0.4790 | nan | 0.9808 | 0.0917 | 0.7899 | 0.7299 | 0.9812 |
| 0.0867 | 80.91 | 890 | 0.5252 | nan | 0.9955 | 0.4573 | nan | 0.9813 | 0.0912 | 0.7603 | 0.7193 | 0.9816 |
| 0.0942 | 81.82 | 900 | 0.5091 | nan | 0.9959 | 0.4490 | nan | 0.9813 | 0.0925 | 0.7525 | 0.7152 | 0.9815 |
| 0.0917 | 82.73 | 910 | 0.5454 | nan | 0.9950 | 0.4682 | nan | 0.9814 | 0.0908 | 0.7702 | 0.7248 | 0.9817 |
| 0.103 | 83.64 | 920 | 0.5452 | nan | 0.9949 | 0.4672 | nan | 0.9813 | 0.0912 | 0.7701 | 0.7243 | 0.9816 |
| 0.0939 | 84.55 | 930 | 0.5539 | nan | 0.9947 | 0.4717 | nan | 0.9814 | 0.0900 | 0.7743 | 0.7265 | 0.9817 |
| 0.0892 | 85.45 | 940 | 0.5330 | nan | 0.9954 | 0.4635 | nan | 0.9815 | 0.0900 | 0.7642 | 0.7225 | 0.9818 |
| 0.0899 | 86.36 | 950 | 0.5756 | nan | 0.9938 | 0.4778 | nan | 0.9811 | 0.0905 | 0.7847 | 0.7295 | 0.9814 |
| 0.0877 | 87.27 | 960 | 0.5771 | nan | 0.9937 | 0.4787 | nan | 0.9811 | 0.0893 | 0.7854 | 0.7299 | 0.9814 |
| 0.0851 | 88.18 | 970 | 0.5087 | nan | 0.9961 | 0.4512 | nan | 0.9814 | 0.0897 | 0.7524 | 0.7163 | 0.9817 |
| 0.0857 | 89.09 | 980 | 0.5363 | nan | 0.9953 | 0.4644 | nan | 0.9814 | 0.0894 | 0.7658 | 0.7229 | 0.9817 |
| 0.0821 | 90.0 | 990 | 0.5333 | nan | 0.9953 | 0.4623 | nan | 0.9814 | 0.0895 | 0.7643 | 0.7218 | 0.9817 |
| 0.0931 | 90.91 | 1000 | 0.5581 | nan | 0.9944 | 0.4718 | nan | 0.9812 | 0.0895 | 0.7763 | 0.7265 | 0.9815 |
| 0.0787 | 91.82 | 1010 | 0.5525 | nan | 0.9946 | 0.4689 | nan | 0.9812 | 0.0889 | 0.7735 | 0.7251 | 0.9815 |
| 0.0865 | 92.73 | 1020 | 0.5659 | nan | 0.9941 | 0.4746 | nan | 0.9812 | 0.0883 | 0.7800 | 0.7279 | 0.9815 |
| 0.0939 | 93.64 | 1030 | 0.5583 | nan | 0.9945 | 0.4723 | nan | 0.9813 | 0.0891 | 0.7764 | 0.7268 | 0.9816 |
| 0.0874 | 94.55 | 1040 | 0.5258 | nan | 0.9955 | 0.4580 | nan | 0.9813 | 0.0893 | 0.7607 | 0.7197 | 0.9816 |
| 0.0927 | 95.45 | 1050 | 0.5319 | nan | 0.9953 | 0.4608 | nan | 0.9813 | 0.0894 | 0.7636 | 0.7211 | 0.9816 |
| 0.0808 | 96.36 | 1060 | 0.5444 | nan | 0.9949 | 0.4665 | nan | 0.9813 | 0.0897 | 0.7696 | 0.7239 | 0.9816 |
| 0.0924 | 97.27 | 1070 | 0.5445 | nan | 0.9950 | 0.4671 | nan | 0.9814 | 0.0892 | 0.7697 | 0.7243 | 0.9817 |
| 0.08 | 98.18 | 1080 | 0.5522 | nan | 0.9947 | 0.4703 | nan | 0.9813 | 0.0884 | 0.7735 | 0.7258 | 0.9816 |
| 0.0798 | 99.09 | 1090 | 0.5745 | nan | 0.9939 | 0.4788 | nan | 0.9812 | 0.0880 | 0.7842 | 0.7300 | 0.9815 |
| 0.0789 | 100.0 | 1100 | 0.5383 | nan | 0.9952 | 0.4647 | nan | 0.9814 | 0.0877 | 0.7668 | 0.7231 | 0.9817 |
| 0.0801 | 100.91 | 1110 | 0.5404 | nan | 0.9951 | 0.4650 | nan | 0.9813 | 0.0885 | 0.7677 | 0.7232 | 0.9816 |
| 0.1043 | 101.82 | 1120 | 0.5445 | nan | 0.9950 | 0.4670 | nan | 0.9813 | 0.0891 | 0.7697 | 0.7242 | 0.9816 |
| 0.0893 | 102.73 | 1130 | 0.5508 | nan | 0.9949 | 0.4712 | nan | 0.9814 | 0.0882 | 0.7728 | 0.7263 | 0.9817 |
| 0.0923 | 103.64 | 1140 | 0.5048 | nan | 0.9960 | 0.4457 | nan | 0.9812 | 0.0892 | 0.7504 | 0.7134 | 0.9815 |
| 0.0915 | 104.55 | 1150 | 0.5646 | nan | 0.9944 | 0.4772 | nan | 0.9814 | 0.0884 | 0.7795 | 0.7293 | 0.9817 |
| 0.0859 | 105.45 | 1160 | 0.5949 | nan | 0.9932 | 0.4869 | nan | 0.9811 | 0.0880 | 0.7941 | 0.7340 | 0.9815 |
| 0.0872 | 106.36 | 1170 | 0.5688 | nan | 0.9942 | 0.4783 | nan | 0.9814 | 0.0872 | 0.7815 | 0.7298 | 0.9817 |
| 0.0845 | 107.27 | 1180 | 0.5746 | nan | 0.9940 | 0.4806 | nan | 0.9813 | 0.0881 | 0.7843 | 0.7310 | 0.9817 |
| 0.0842 | 108.18 | 1190 | 0.5584 | nan | 0.9947 | 0.4755 | nan | 0.9815 | 0.0869 | 0.7766 | 0.7285 | 0.9818 |
| 0.0906 | 109.09 | 1200 | 0.5560 | nan | 0.9947 | 0.4740 | nan | 0.9815 | 0.0875 | 0.7754 | 0.7277 | 0.9818 |
| 0.0953 | 110.0 | 1210 | 0.5608 | nan | 0.9946 | 0.4764 | nan | 0.9815 | 0.0878 | 0.7777 | 0.7289 | 0.9818 |
| 0.0988 | 110.91 | 1220 | 0.5674 | nan | 0.9944 | 0.4790 | nan | 0.9815 | 0.0880 | 0.7809 | 0.7303 | 0.9818 |
| 0.0894 | 111.82 | 1230 | 0.5657 | nan | 0.9945 | 0.4785 | nan | 0.9815 | 0.0869 | 0.7801 | 0.7300 | 0.9818 |
| 0.0788 | 112.73 | 1240 | 0.5569 | nan | 0.9948 | 0.4750 | nan | 0.9815 | 0.0868 | 0.7758 | 0.7283 | 0.9818 |
| 0.0793 | 113.64 | 1250 | 0.5569 | nan | 0.9947 | 0.4747 | nan | 0.9815 | 0.0870 | 0.7758 | 0.7281 | 0.9818 |
| 0.084 | 114.55 | 1260 | 0.5675 | nan | 0.9943 | 0.4777 | nan | 0.9814 | 0.0874 | 0.7809 | 0.7295 | 0.9817 |
| 0.0832 | 115.45 | 1270 | 0.5574 | nan | 0.9946 | 0.4739 | nan | 0.9814 | 0.0875 | 0.7760 | 0.7277 | 0.9817 |
| 0.0833 | 116.36 | 1280 | 0.5563 | nan | 0.9947 | 0.4735 | nan | 0.9814 | 0.0873 | 0.7755 | 0.7274 | 0.9817 |
| 0.0786 | 117.27 | 1290 | 0.5561 | nan | 0.9947 | 0.4740 | nan | 0.9815 | 0.0867 | 0.7754 | 0.7277 | 0.9818 |
| 0.0839 | 118.18 | 1300 | 0.5613 | nan | 0.9945 | 0.4755 | nan | 0.9814 | 0.0865 | 0.7779 | 0.7285 | 0.9817 |
| 0.0847 | 119.09 | 1310 | 0.5691 | nan | 0.9941 | 0.4773 | nan | 0.9813 | 0.0877 | 0.7816 | 0.7293 | 0.9816 |
| 0.0933 | 120.0 | 1320 | 0.5578 | nan | 0.9947 | 0.4745 | nan | 0.9814 | 0.0867 | 0.7762 | 0.7280 | 0.9818 |
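A quick sanity check on the log above: 1320 steps over 120 epochs gives 11 optimizer steps per epoch, which together with `train_batch_size: 15` bounds the size of the training split. The exact image count is not stated in this card; the arithmetic below is an inference from the log, not a documented figure.

```python
import math

steps_per_epoch = 1320 // 120   # 11 optimizer steps per epoch, from the log
batch_size = 15

# ceil(n / 15) == 11 holds exactly for training-set sizes n in [151, 165].
n_min = batch_size * (steps_per_epoch - 1) + 1
n_max = batch_size * steps_per_epoch
assert all(math.ceil(n / batch_size) == steps_per_epoch
           for n in range(n_min, n_max + 1))
```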

### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3