---
license: other
tags:
- generated_from_trainer
model-index:
- name: safety-utcustom-train-SF-RGBD-b5
  results: []
---

# safety-utcustom-train-SF-RGBD-b5

This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on an unspecified dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics):

- Loss: 0.0875
- Mean Iou: 0.7231
- Mean Accuracy: 0.7658
- Overall Accuracy: 0.9818
- Accuracy Unlabeled: nan
- Accuracy Safe: 0.5363
- Accuracy Unsafe: 0.9953
- Iou Unlabeled: nan
- Iou Safe: 0.4647
- Iou Unsafe: 0.9815
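
Since the card itself omits usage details, here is a minimal inference sketch. It assumes the checkpoint is published under the repo id `sam1120/safety-utcustom-train-SF-RGBD-b5` (inferred from the model name, not confirmed by the card) and loads it like any SegFormer segmentation checkpoint. How the depth (D) channel was fused with RGB during training is not documented here, so plain RGB input is shown.

```python
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

# Hypothetical repo id, inferred from the model name above.
repo_id = "sam1120/safety-utcustom-train-SF-RGBD-b5"

processor = SegformerImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

# Plain RGB input; how the depth (D) channel enters the pipeline is
# not documented in this card.
image = Image.open("scene.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample the logits to the input resolution and take the per-pixel
# argmax to get a class-id map over the safe/unsafe labels.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class ids
```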

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of how they map onto `TrainingArguments` follows the list):

- learning_rate: 4e-06
- train_batch_size: 15
- eval_batch_size: 15
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 100
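
These values correspond to the standard `transformers` `Trainer` setup. The sketch below is a reconstruction of how they might be passed to `TrainingArguments`; it is not the author's actual training script, and the `output_dir` is assumed.

```python
from transformers import TrainingArguments

# Reconstruction of the reported hyperparameters; dataset loading and
# model construction are omitted.
training_args = TrainingArguments(
    output_dir="safety-utcustom-train-SF-RGBD-b5",  # assumed
    learning_rate=4e-6,
    per_device_train_batch_size=15,
    per_device_eval_batch_size=15,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=100,
    # Adam betas/epsilon are the library defaults, matching the card.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```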

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Safe | Accuracy Unsafe | Iou Unlabeled | Iou Safe | Iou Unsafe |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 0.789 | 0.91 | 10 | 0.9555 | 0.2939 | 0.4580 | 0.8698 | nan | 0.0203 | 0.8957 | 0.0 | 0.0095 | 0.8722 |
| 0.7579 | 1.82 | 20 | 0.8322 | 0.3136 | 0.4866 | 0.9334 | nan | 0.0117 | 0.9614 | 0.0 | 0.0069 | 0.9338 |
| 0.7103 | 2.73 | 30 | 0.6729 | 0.3216 | 0.4972 | 0.9602 | nan | 0.0051 | 0.9893 | 0.0 | 0.0043 | 0.9604 |
| 0.676 | 3.64 | 40 | 0.5336 | 0.3232 | 0.4995 | 0.9675 | nan | 0.0021 | 0.9969 | 0.0 | 0.0020 | 0.9675 |
| 0.5955 | 4.55 | 50 | 0.4440 | 0.3233 | 0.4997 | 0.9698 | nan | 0.0001 | 0.9993 | 0.0 | 0.0001 | 0.9698 |
| 0.5691 | 5.45 | 60 | 0.3812 | 0.3234 | 0.4999 | 0.9702 | nan | 0.0000 | 0.9997 | 0.0 | 0.0000 | 0.9702 |
| 0.5067 | 6.36 | 70 | 0.3590 | 0.3234 | 0.4998 | 0.9701 | nan | 0.0 | 0.9996 | 0.0 | 0.0 | 0.9701 |
| 0.4656 | 7.27 | 80 | 0.3247 | 0.3234 | 0.4999 | 0.9703 | nan | 0.0 | 0.9999 | 0.0 | 0.0 | 0.9703 |
| 0.4227 | 8.18 | 90 | 0.3171 | 0.3234 | 0.4999 | 0.9702 | nan | 0.0 | 0.9998 | 0.0 | 0.0 | 0.9702 |
| 0.3898 | 9.09 | 100 | 0.3122 | 0.3235 | 0.5000 | 0.9701 | nan | 0.0004 | 0.9996 | 0.0 | 0.0004 | 0.9701 |
| 0.3513 | 10.0 | 110 | 0.2876 | 0.3234 | 0.4999 | 0.9703 | nan | 0.0 | 0.9999 | 0.0 | 0.0 | 0.9703 |
| 0.4157 | 10.91 | 120 | 0.2820 | 0.3234 | 0.4999 | 0.9703 | nan | 0.0000 | 0.9998 | 0.0 | 0.0000 | 0.9703 |
| 0.3317 | 11.82 | 130 | 0.2693 | 0.3234 | 0.4999 | 0.9703 | nan | 0.0 | 0.9999 | 0.0 | 0.0 | 0.9703 |
| 0.321 | 12.73 | 140 | 0.2647 | 0.3235 | 0.4999 | 0.9704 | nan | 0.0 | 0.9999 | 0.0 | 0.0 | 0.9704 |
| 0.2887 | 13.64 | 150 | 0.2539 | 0.3235 | 0.5000 | 0.9704 | nan | 0.0 | 0.9999 | 0.0 | 0.0 | 0.9704 |
| 0.3008 | 14.55 | 160 | 0.2536 | 0.3235 | 0.5000 | 0.9704 | nan | 0.0 | 0.9999 | 0.0 | 0.0 | 0.9704 |
| 0.2853 | 15.45 | 170 | 0.2397 | 0.3235 | 0.5000 | 0.9704 | nan | 0.0 | 0.9999 | 0.0 | 0.0 | 0.9704 |
| 0.2684 | 16.36 | 180 | 0.2321 | 0.3235 | 0.5000 | 0.9704 | nan | 0.0 | 0.9999 | 0.0 | 0.0 | 0.9704 |
| 0.2585 | 17.27 | 190 | 0.2208 | 0.3235 | 0.5000 | 0.9704 | nan | 0.0000 | 0.9999 | 0.0 | 0.0000 | 0.9704 |
| 0.2088 | 18.18 | 200 | 0.2011 | 0.3262 | 0.5041 | 0.9704 | nan | 0.0084 | 0.9997 | 0.0 | 0.0083 | 0.9704 |
| 0.2518 | 19.09 | 210 | 0.2026 | 0.3386 | 0.5228 | 0.9707 | nan | 0.0468 | 0.9989 | 0.0 | 0.0451 | 0.9707 |
| 0.218 | 20.0 | 220 | 0.1889 | 0.5274 | 0.5431 | 0.9715 | nan | 0.0879 | 0.9984 | nan | 0.0834 | 0.9714 |
| 0.2046 | 20.91 | 230 | 0.1847 | 0.5741 | 0.5950 | 0.9732 | nan | 0.1931 | 0.9969 | nan | 0.1752 | 0.9730 |
| 0.2147 | 21.82 | 240 | 0.1766 | 0.5791 | 0.6005 | 0.9734 | nan | 0.2042 | 0.9968 | nan | 0.1850 | 0.9733 |
| 0.188 | 22.73 | 250 | 0.1726 | 0.5792 | 0.5996 | 0.9737 | nan | 0.2020 | 0.9972 | nan | 0.1849 | 0.9735 |
| 0.2175 | 23.64 | 260 | 0.1706 | 0.5741 | 0.5936 | 0.9735 | nan | 0.1898 | 0.9974 | nan | 0.1748 | 0.9734 |
| 0.2059 | 24.55 | 270 | 0.1689 | 0.6212 | 0.6484 | 0.9756 | nan | 0.3006 | 0.9962 | nan | 0.2670 | 0.9754 |
| 0.1776 | 25.45 | 280 | 0.1612 | 0.6171 | 0.6418 | 0.9757 | nan | 0.2870 | 0.9967 | nan | 0.2587 | 0.9755 |
| 0.1585 | 26.36 | 290 | 0.1537 | 0.6683 | 0.7099 | 0.9776 | nan | 0.4254 | 0.9944 | nan | 0.3593 | 0.9773 |
| 0.1588 | 27.27 | 300 | 0.1527 | 0.6152 | 0.6384 | 0.9758 | nan | 0.2798 | 0.9970 | nan | 0.2548 | 0.9756 |
| 0.153 | 28.18 | 310 | 0.1452 | 0.6711 | 0.7117 | 0.9779 | nan | 0.4288 | 0.9946 | nan | 0.3646 | 0.9776 |
| 0.1623 | 29.09 | 320 | 0.1442 | 0.6752 | 0.7173 | 0.9781 | nan | 0.4401 | 0.9945 | nan | 0.3726 | 0.9778 |
| 0.1603 | 30.0 | 330 | 0.1407 | 0.6671 | 0.7004 | 0.9784 | nan | 0.4050 | 0.9958 | nan | 0.3562 | 0.9781 |
| 0.1694 | 30.91 | 340 | 0.1343 | 0.6849 | 0.7266 | 0.9789 | nan | 0.4585 | 0.9948 | nan | 0.3911 | 0.9786 |
| 0.1585 | 31.82 | 350 | 0.1353 | 0.6606 | 0.6912 | 0.9782 | nan | 0.3861 | 0.9962 | nan | 0.3433 | 0.9779 |
| 0.1342 | 32.73 | 360 | 0.1338 | 0.6961 | 0.7451 | 0.9792 | nan | 0.4963 | 0.9939 | nan | 0.4132 | 0.9789 |
| 0.1358 | 33.64 | 370 | 0.1342 | 0.6986 | 0.7493 | 0.9793 | nan | 0.5048 | 0.9937 | nan | 0.4182 | 0.9789 |
| 0.1493 | 34.55 | 380 | 0.1297 | 0.6936 | 0.7377 | 0.9794 | nan | 0.4809 | 0.9946 | nan | 0.4080 | 0.9791 |
| 0.1435 | 35.45 | 390 | 0.1271 | 0.7156 | 0.7791 | 0.9797 | nan | 0.5658 | 0.9923 | nan | 0.4518 | 0.9794 |
| 0.1305 | 36.36 | 400 | 0.1225 | 0.6776 | 0.7062 | 0.9796 | nan | 0.4157 | 0.9968 | nan | 0.3758 | 0.9793 |
| 0.1496 | 37.27 | 410 | 0.1237 | 0.7108 | 0.7659 | 0.9799 | nan | 0.5385 | 0.9934 | nan | 0.4420 | 0.9796 |
| 0.1445 | 38.18 | 420 | 0.1207 | 0.7206 | 0.7843 | 0.9801 | nan | 0.5763 | 0.9924 | nan | 0.4615 | 0.9798 |
| 0.1307 | 39.09 | 430 | 0.1194 | 0.7023 | 0.7404 | 0.9806 | nan | 0.4853 | 0.9956 | nan | 0.4244 | 0.9803 |
| 0.1379 | 40.0 | 440 | 0.1174 | 0.7176 | 0.7822 | 0.9798 | nan | 0.5722 | 0.9922 | nan | 0.4557 | 0.9795 |
| 0.1202 | 40.91 | 450 | 0.1143 | 0.7175 | 0.7671 | 0.9809 | nan | 0.5399 | 0.9943 | nan | 0.4544 | 0.9805 |
| 0.1239 | 41.82 | 460 | 0.1150 | 0.7179 | 0.7756 | 0.9803 | nan | 0.5580 | 0.9932 | nan | 0.4558 | 0.9800 |
| 0.1183 | 42.73 | 470 | 0.1129 | 0.7021 | 0.7369 | 0.9808 | nan | 0.4777 | 0.9961 | nan | 0.4236 | 0.9805 |
| 0.1202 | 43.64 | 480 | 0.1119 | 0.7300 | 0.7930 | 0.9810 | nan | 0.5933 | 0.9928 | nan | 0.4793 | 0.9806 |
| 0.1276 | 44.55 | 490 | 0.1131 | 0.7183 | 0.7683 | 0.9809 | nan | 0.5425 | 0.9942 | nan | 0.4561 | 0.9806 |
| 0.1172 | 45.45 | 500 | 0.1135 | 0.7244 | 0.8085 | 0.9791 | nan | 0.6272 | 0.9898 | nan | 0.4700 | 0.9787 |
| 0.1288 | 46.36 | 510 | 0.1105 | 0.6850 | 0.7105 | 0.9804 | nan | 0.4236 | 0.9974 | nan | 0.3898 | 0.9802 |
| 0.1185 | 47.27 | 520 | 0.1130 | 0.7254 | 0.7975 | 0.9800 | nan | 0.6035 | 0.9914 | nan | 0.4711 | 0.9796 |
| 0.1045 | 48.18 | 530 | 0.1102 | 0.7241 | 0.7840 | 0.9807 | nan | 0.5750 | 0.9930 | nan | 0.4679 | 0.9804 |
| 0.1211 | 49.09 | 540 | 0.1069 | 0.7260 | 0.7870 | 0.9808 | nan | 0.5812 | 0.9929 | nan | 0.4715 | 0.9804 |
| 0.1206 | 50.0 | 550 | 0.1071 | 0.7169 | 0.7587 | 0.9814 | nan | 0.5221 | 0.9953 | nan | 0.4528 | 0.9811 |
| 0.1193 | 50.91 | 560 | 0.1053 | 0.7105 | 0.7459 | 0.9814 | nan | 0.4956 | 0.9961 | nan | 0.4398 | 0.9811 |
| 0.1116 | 51.82 | 570 | 0.1043 | 0.7169 | 0.7604 | 0.9812 | nan | 0.5257 | 0.9951 | nan | 0.4528 | 0.9809 |
| 0.1218 | 52.73 | 580 | 0.1078 | 0.7262 | 0.7929 | 0.9804 | nan | 0.5936 | 0.9922 | nan | 0.4724 | 0.9801 |
| 0.1284 | 53.64 | 590 | 0.1054 | 0.7248 | 0.7898 | 0.9804 | nan | 0.5872 | 0.9924 | nan | 0.4696 | 0.9801 |
| 0.096 | 54.55 | 600 | 0.1028 | 0.7193 | 0.7697 | 0.9809 | nan | 0.5451 | 0.9942 | nan | 0.4580 | 0.9806 |
| 0.1091 | 55.45 | 610 | 0.1022 | 0.7261 | 0.7965 | 0.9802 | nan | 0.6014 | 0.9917 | nan | 0.4725 | 0.9798 |
| 0.1068 | 56.36 | 620 | 0.1015 | 0.7092 | 0.7444 | 0.9813 | nan | 0.4926 | 0.9962 | nan | 0.4374 | 0.9810 |
| 0.106 | 57.27 | 630 | 0.1011 | 0.7270 | 0.7825 | 0.9812 | nan | 0.5713 | 0.9937 | nan | 0.4731 | 0.9809 |
| 0.1009 | 58.18 | 640 | 0.1028 | 0.6947 | 0.7240 | 0.9807 | nan | 0.4512 | 0.9969 | nan | 0.4089 | 0.9805 |
| 0.1018 | 59.09 | 650 | 0.1022 | 0.7290 | 0.7986 | 0.9805 | nan | 0.6053 | 0.9919 | nan | 0.4779 | 0.9801 |
| 0.1012 | 60.0 | 660 | 0.1016 | 0.7116 | 0.7558 | 0.9808 | nan | 0.5167 | 0.9949 | nan | 0.4427 | 0.9805 |
| 0.1052 | 60.91 | 670 | 0.0999 | 0.7206 | 0.7703 | 0.9811 | nan | 0.5464 | 0.9943 | nan | 0.4604 | 0.9808 |
| 0.1229 | 61.82 | 680 | 0.0993 | 0.7280 | 0.7822 | 0.9814 | nan | 0.5706 | 0.9939 | nan | 0.4750 | 0.9810 |
| 0.0963 | 62.73 | 690 | 0.0974 | 0.7282 | 0.7841 | 0.9813 | nan | 0.5746 | 0.9936 | nan | 0.4754 | 0.9809 |
| 0.1115 | 63.64 | 700 | 0.0974 | 0.7187 | 0.7597 | 0.9816 | nan | 0.5239 | 0.9955 | nan | 0.4562 | 0.9813 |
| 0.1025 | 64.55 | 710 | 0.0964 | 0.7312 | 0.7890 | 0.9814 | nan | 0.5845 | 0.9935 | nan | 0.4813 | 0.9811 |
| 0.0916 | 65.45 | 720 | 0.0962 | 0.7249 | 0.7720 | 0.9816 | nan | 0.5493 | 0.9947 | nan | 0.4685 | 0.9813 |
| 0.1055 | 66.36 | 730 | 0.0947 | 0.7191 | 0.7613 | 0.9815 | nan | 0.5273 | 0.9953 | nan | 0.4571 | 0.9812 |
| 0.1081 | 67.27 | 740 | 0.0964 | 0.7308 | 0.8006 | 0.9806 | nan | 0.6093 | 0.9919 | nan | 0.4813 | 0.9802 |
| 0.1039 | 68.18 | 750 | 0.0950 | 0.7190 | 0.7675 | 0.9811 | nan | 0.5405 | 0.9945 | nan | 0.4573 | 0.9807 |
| 0.106 | 69.09 | 760 | 0.0939 | 0.7246 | 0.7753 | 0.9813 | nan | 0.5564 | 0.9943 | nan | 0.4682 | 0.9810 |
| 0.0912 | 70.0 | 770 | 0.0936 | 0.7212 | 0.7663 | 0.9814 | nan | 0.5377 | 0.9949 | nan | 0.4612 | 0.9811 |
| 0.0951 | 70.91 | 780 | 0.0938 | 0.7249 | 0.7771 | 0.9813 | nan | 0.5600 | 0.9941 | nan | 0.4689 | 0.9809 |
| 0.0998 | 71.82 | 790 | 0.0928 | 0.7258 | 0.7759 | 0.9815 | nan | 0.5573 | 0.9944 | nan | 0.4705 | 0.9812 |
| 0.0889 | 72.73 | 800 | 0.0931 | 0.7220 | 0.7674 | 0.9815 | nan | 0.5398 | 0.9949 | nan | 0.4628 | 0.9812 |
| 0.0906 | 73.64 | 810 | 0.0928 | 0.7171 | 0.7555 | 0.9816 | nan | 0.5151 | 0.9958 | nan | 0.4528 | 0.9813 |
| 0.0911 | 74.55 | 820 | 0.0924 | 0.7265 | 0.7810 | 0.9812 | nan | 0.5682 | 0.9938 | nan | 0.4722 | 0.9809 |
| 0.0907 | 75.45 | 830 | 0.0929 | 0.7089 | 0.7415 | 0.9815 | nan | 0.4864 | 0.9965 | nan | 0.4365 | 0.9812 |
| 0.1117 | 76.36 | 840 | 0.0934 | 0.7195 | 0.7598 | 0.9817 | nan | 0.5239 | 0.9956 | nan | 0.4576 | 0.9814 |
| 0.0812 | 77.27 | 850 | 0.0915 | 0.7210 | 0.7617 | 0.9817 | nan | 0.5279 | 0.9956 | nan | 0.4605 | 0.9814 |
| 0.0888 | 78.18 | 860 | 0.0915 | 0.7266 | 0.7778 | 0.9814 | nan | 0.5615 | 0.9942 | nan | 0.4720 | 0.9811 |
| 0.09 | 79.09 | 870 | 0.0920 | 0.7220 | 0.7681 | 0.9814 | nan | 0.5414 | 0.9948 | nan | 0.4628 | 0.9811 |
| 0.1052 | 80.0 | 880 | 0.0917 | 0.7299 | 0.7899 | 0.9812 | nan | 0.5866 | 0.9932 | nan | 0.4790 | 0.9808 |
| 0.0867 | 80.91 | 890 | 0.0912 | 0.7193 | 0.7603 | 0.9816 | nan | 0.5252 | 0.9955 | nan | 0.4573 | 0.9813 |
| 0.0942 | 81.82 | 900 | 0.0925 | 0.7152 | 0.7525 | 0.9815 | nan | 0.5091 | 0.9959 | nan | 0.4490 | 0.9813 |
| 0.0917 | 82.73 | 910 | 0.0908 | 0.7248 | 0.7702 | 0.9817 | nan | 0.5454 | 0.9950 | nan | 0.4682 | 0.9814 |
| 0.103 | 83.64 | 920 | 0.0912 | 0.7243 | 0.7701 | 0.9816 | nan | 0.5452 | 0.9949 | nan | 0.4672 | 0.9813 |
| 0.0939 | 84.55 | 930 | 0.0900 | 0.7265 | 0.7743 | 0.9817 | nan | 0.5539 | 0.9947 | nan | 0.4717 | 0.9814 |
| 0.0892 | 85.45 | 940 | 0.0900 | 0.7225 | 0.7642 | 0.9818 | nan | 0.5330 | 0.9954 | nan | 0.4635 | 0.9815 |
| 0.0899 | 86.36 | 950 | 0.0905 | 0.7295 | 0.7847 | 0.9814 | nan | 0.5756 | 0.9938 | nan | 0.4778 | 0.9811 |
| 0.0877 | 87.27 | 960 | 0.0893 | 0.7299 | 0.7854 | 0.9814 | nan | 0.5771 | 0.9937 | nan | 0.4787 | 0.9811 |
| 0.0851 | 88.18 | 970 | 0.0897 | 0.7163 | 0.7524 | 0.9817 | nan | 0.5087 | 0.9961 | nan | 0.4512 | 0.9814 |
| 0.0857 | 89.09 | 980 | 0.0894 | 0.7229 | 0.7658 | 0.9817 | nan | 0.5363 | 0.9953 | nan | 0.4644 | 0.9814 |
| 0.0821 | 90.0 | 990 | 0.0895 | 0.7218 | 0.7643 | 0.9817 | nan | 0.5333 | 0.9953 | nan | 0.4623 | 0.9814 |
| 0.0931 | 90.91 | 1000 | 0.0895 | 0.7265 | 0.7763 | 0.9815 | nan | 0.5581 | 0.9944 | nan | 0.4718 | 0.9812 |
| 0.0787 | 91.82 | 1010 | 0.0889 | 0.7251 | 0.7735 | 0.9815 | nan | 0.5525 | 0.9946 | nan | 0.4689 | 0.9812 |
| 0.0865 | 92.73 | 1020 | 0.0883 | 0.7279 | 0.7800 | 0.9815 | nan | 0.5659 | 0.9941 | nan | 0.4746 | 0.9812 |
| 0.0939 | 93.64 | 1030 | 0.0891 | 0.7268 | 0.7764 | 0.9816 | nan | 0.5583 | 0.9945 | nan | 0.4723 | 0.9813 |
| 0.0874 | 94.55 | 1040 | 0.0893 | 0.7197 | 0.7607 | 0.9816 | nan | 0.5258 | 0.9955 | nan | 0.4580 | 0.9813 |
| 0.0927 | 95.45 | 1050 | 0.0894 | 0.7211 | 0.7636 | 0.9816 | nan | 0.5319 | 0.9953 | nan | 0.4608 | 0.9813 |
| 0.0808 | 96.36 | 1060 | 0.0897 | 0.7239 | 0.7696 | 0.9816 | nan | 0.5444 | 0.9949 | nan | 0.4665 | 0.9813 |
| 0.0924 | 97.27 | 1070 | 0.0892 | 0.7243 | 0.7697 | 0.9817 | nan | 0.5445 | 0.9950 | nan | 0.4671 | 0.9814 |
| 0.08 | 98.18 | 1080 | 0.0884 | 0.7258 | 0.7735 | 0.9816 | nan | 0.5522 | 0.9947 | nan | 0.4703 | 0.9813 |
| 0.0789 | 99.09 | 1090 | 0.0883 | 0.7259 | 0.7733 | 0.9816 | nan | 0.5519 | 0.9947 | nan | 0.4704 | 0.9813 |
| 0.0796 | 100.0 | 1100 | 0.0875 | 0.7231 | 0.7658 | 0.9818 | nan | 0.5363 | 0.9953 | nan | 0.4647 | 0.9815 |
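
The headline Mean IoU and Mean Accuracy are unweighted means over the two labeled classes (the unlabeled class is `nan` and excluded from the average), so the 0.7231 Mean IoU hides a large gap between the Safe (0.4647) and Unsafe (0.9815) class IoUs. A quick check against the final-epoch row:

```python
# Final-epoch per-class values from the last row of the table above.
iou_safe, iou_unsafe = 0.4647, 0.9815
acc_safe, acc_unsafe = 0.5363, 0.9953

# Unweighted two-class means reproduce the headline numbers.
print(round((iou_safe + iou_unsafe) / 2, 4))  # 0.7231 (Mean Iou)
print(round((acc_safe + acc_unsafe) / 2, 4))  # 0.7658 (Mean Accuracy)
```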

### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3