
INTERNAL_BEST-safety-utcustom-train-SF-RGB-b5

This model is a fine-tuned version of nvidia/mit-b5 on the sam1120/safety-utcustom-TRAIN dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0472
  • Mean Iou: 0.8728
  • Mean Accuracy: 0.9195
  • Overall Accuracy: 0.9919
  • Accuracy Unlabeled: nan
  • Accuracy Safe: 0.8426
  • Accuracy Unsafe: 0.9964
  • Iou Unlabeled: nan
  • Iou Safe: 0.7540
  • Iou Unsafe: 0.9917
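
Since the card does not yet include usage instructions, below is a minimal inference sketch for a SegFormer-style checkpoint like this one. The repository id and input image path are placeholders, not values stated in this card.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Placeholder repo id -- replace with the actual Hub path of this checkpoint.
model_id = "sam1120/safety-utcustom-train-SF-RGB-b5"

processor = AutoImageProcessor.from_pretrained(model_id)
model = SegformerForSemanticSegmentation.from_pretrained(model_id)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the per-pixel argmax
# to obtain a label map over the safe / unsafe classes.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]
```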

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.05
  • num_epochs: 2000
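
For reference, here is a sketch of how these values might map onto transformers TrainingArguments. The output_dir is a placeholder, and any setting not listed above is left at its default, so this is an approximation rather than the exact training script.

```python
from transformers import TrainingArguments

# Approximate reconstruction of the hyperparameters listed above.
# output_dir is a placeholder; logging, saving, and evaluation cadence
# are not documented in this card and are left at their defaults.
training_args = TrainingArguments(
    output_dir="safety-utcustom-train-SF-RGB-b5",
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=2000,
)
```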

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Safe | Accuracy Unsafe | Iou Unlabeled | Iou Safe | Iou Unsafe |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 1.2199 | 2.0 | 20 | 1.1474 | 0.0765 | 0.4399 | 0.2168 | nan | 0.6770 | 0.2028 | 0.0 | 0.0279 | 0.2014 |
| 1.1542 | 4.0 | 40 | 1.0616 | 0.1558 | 0.6082 | 0.4365 | nan | 0.7908 | 0.4257 | 0.0 | 0.0436 | 0.4237 |
| 1.0324 | 6.0 | 60 | 0.9140 | 0.2569 | 0.6886 | 0.7091 | nan | 0.6667 | 0.7104 | 0.0 | 0.0672 | 0.7036 |
| 0.8058 | 8.0 | 80 | 0.7629 | 0.2970 | 0.6780 | 0.8115 | nan | 0.5360 | 0.8199 | 0.0 | 0.0823 | 0.8086 |
| 0.681 | 10.0 | 100 | 0.5545 | 0.3510 | 0.6964 | 0.9135 | nan | 0.4657 | 0.9271 | 0.0 | 0.1407 | 0.9123 |
| 0.5248 | 12.0 | 120 | 0.4153 | 0.3738 | 0.6747 | 0.9462 | nan | 0.3861 | 0.9633 | 0.0 | 0.1757 | 0.9456 |
| 0.3372 | 14.0 | 140 | 0.3050 | 0.3896 | 0.6548 | 0.9628 | nan | 0.3276 | 0.9821 | 0.0 | 0.2064 | 0.9624 |
| 0.2818 | 16.0 | 160 | 0.2346 | 0.4313 | 0.7331 | 0.9703 | nan | 0.4810 | 0.9852 | 0.0 | 0.3239 | 0.9699 |
| 0.2081 | 18.0 | 180 | 0.1802 | 0.4762 | 0.7973 | 0.9782 | nan | 0.6050 | 0.9896 | 0.0 | 0.4509 | 0.9778 |
| 0.141 | 20.0 | 200 | 0.1362 | 0.4968 | 0.7945 | 0.9830 | nan | 0.5943 | 0.9948 | 0.0 | 0.5078 | 0.9827 |
| 0.0963 | 22.0 | 220 | 0.1032 | 0.7409 | 0.7652 | 0.9841 | nan | 0.5326 | 0.9979 | nan | 0.4978 | 0.9839 |
| 0.0842 | 24.0 | 240 | 0.0770 | 0.7840 | 0.8200 | 0.9863 | nan | 0.6432 | 0.9968 | nan | 0.5818 | 0.9861 |
| 0.0702 | 26.0 | 260 | 0.0669 | 0.7836 | 0.8193 | 0.9863 | nan | 0.6417 | 0.9968 | nan | 0.5812 | 0.9861 |
| 0.0706 | 28.0 | 280 | 0.0671 | 0.8065 | 0.8593 | 0.9872 | nan | 0.7234 | 0.9953 | nan | 0.6261 | 0.9870 |
| 0.0747 | 30.0 | 300 | 0.0551 | 0.7808 | 0.7980 | 0.9870 | nan | 0.5971 | 0.9988 | nan | 0.5748 | 0.9867 |
| 0.057 | 32.0 | 320 | 0.0492 | 0.8267 | 0.8736 | 0.9888 | nan | 0.7511 | 0.9961 | nan | 0.6648 | 0.9886 |
| 0.0435 | 34.0 | 340 | 0.0507 | 0.7956 | 0.8134 | 0.9878 | nan | 0.6280 | 0.9988 | nan | 0.6035 | 0.9876 |
| 0.0326 | 36.0 | 360 | 0.0418 | 0.8422 | 0.8895 | 0.9898 | nan | 0.7830 | 0.9961 | nan | 0.6947 | 0.9896 |
| 0.0262 | 38.0 | 380 | 0.0420 | 0.8280 | 0.8550 | 0.9895 | nan | 0.7120 | 0.9979 | nan | 0.6667 | 0.9893 |
| 0.0268 | 40.0 | 400 | 0.0392 | 0.8407 | 0.8822 | 0.9899 | nan | 0.7676 | 0.9967 | nan | 0.6918 | 0.9897 |
| 0.0395 | 42.0 | 420 | 0.0466 | 0.5436 | 0.8370 | 0.9889 | nan | 0.6755 | 0.9984 | 0.0 | 0.6422 | 0.9887 |
| 0.0279 | 44.0 | 440 | 0.0439 | 0.8321 | 0.8946 | 0.9887 | nan | 0.7945 | 0.9947 | nan | 0.6758 | 0.9885 |
| 0.0468 | 46.0 | 460 | 0.0360 | 0.8480 | 0.8894 | 0.9904 | nan | 0.7822 | 0.9967 | nan | 0.7059 | 0.9901 |
| 0.0233 | 48.0 | 480 | 0.0376 | 0.8507 | 0.8962 | 0.9905 | nan | 0.7960 | 0.9964 | nan | 0.7113 | 0.9902 |
| 0.0288 | 50.0 | 500 | 0.0386 | 0.8404 | 0.8845 | 0.9898 | nan | 0.7725 | 0.9964 | nan | 0.6913 | 0.9896 |
| 0.0266 | 52.0 | 520 | 0.0361 | 0.8455 | 0.8768 | 0.9905 | nan | 0.7560 | 0.9976 | nan | 0.7008 | 0.9902 |
| 0.0241 | 54.0 | 540 | 0.0367 | 0.8504 | 0.9039 | 0.9902 | nan | 0.8122 | 0.9957 | nan | 0.7109 | 0.9900 |
| 0.0239 | 56.0 | 560 | 0.0414 | 0.8401 | 0.8809 | 0.9899 | nan | 0.7650 | 0.9967 | nan | 0.6906 | 0.9896 |
| 0.0221 | 58.0 | 580 | 0.0375 | 0.8536 | 0.8971 | 0.9907 | nan | 0.7977 | 0.9966 | nan | 0.7167 | 0.9905 |
| 0.0324 | 60.0 | 600 | 0.0390 | 0.8566 | 0.9017 | 0.9908 | nan | 0.8069 | 0.9964 | nan | 0.7226 | 0.9906 |
| 0.0264 | 62.0 | 620 | 0.0405 | 0.8506 | 0.9088 | 0.9901 | nan | 0.8224 | 0.9952 | nan | 0.7114 | 0.9899 |
| 0.0146 | 64.0 | 640 | 0.0356 | 0.8627 | 0.9158 | 0.9911 | nan | 0.8358 | 0.9958 | nan | 0.7346 | 0.9909 |
| 0.0252 | 66.0 | 660 | 0.0310 | 0.8667 | 0.9010 | 0.9917 | nan | 0.8046 | 0.9974 | nan | 0.7418 | 0.9915 |
| 0.0155 | 68.0 | 680 | 0.0359 | 0.8567 | 0.9056 | 0.9908 | nan | 0.8152 | 0.9961 | nan | 0.7229 | 0.9905 |
| 0.0169 | 70.0 | 700 | 0.0490 | 0.8476 | 0.9182 | 0.9896 | nan | 0.8422 | 0.9941 | nan | 0.7058 | 0.9894 |
| 0.0142 | 72.0 | 720 | 0.0357 | 0.8442 | 0.8771 | 0.9903 | nan | 0.7568 | 0.9974 | nan | 0.6982 | 0.9901 |
| 0.0244 | 74.0 | 740 | 0.0400 | 0.8523 | 0.9070 | 0.9903 | nan | 0.8183 | 0.9956 | nan | 0.7146 | 0.9901 |
| 0.016 | 76.0 | 760 | 0.0302 | 0.8644 | 0.9037 | 0.9915 | nan | 0.8105 | 0.9970 | nan | 0.7376 | 0.9913 |
| 0.0137 | 78.0 | 780 | 0.0325 | 0.8664 | 0.9118 | 0.9915 | nan | 0.8271 | 0.9965 | nan | 0.7415 | 0.9913 |
| 0.0115 | 80.0 | 800 | 0.0347 | 0.8678 | 0.9162 | 0.9915 | nan | 0.8362 | 0.9962 | nan | 0.7443 | 0.9913 |
| 0.0117 | 82.0 | 820 | 0.0320 | 0.8697 | 0.9084 | 0.9918 | nan | 0.8197 | 0.9971 | nan | 0.7478 | 0.9916 |
| 0.0108 | 84.0 | 840 | 0.0348 | 0.8691 | 0.9192 | 0.9916 | nan | 0.8423 | 0.9961 | nan | 0.7468 | 0.9913 |
| 0.0101 | 86.0 | 860 | 0.0342 | 0.8683 | 0.9081 | 0.9917 | nan | 0.8192 | 0.9970 | nan | 0.7452 | 0.9915 |
| 0.0085 | 88.0 | 880 | 0.0441 | 0.8639 | 0.9214 | 0.9911 | nan | 0.8474 | 0.9954 | nan | 0.7370 | 0.9908 |
| 0.009 | 90.0 | 900 | 0.0428 | 0.8619 | 0.9086 | 0.9912 | nan | 0.8209 | 0.9963 | nan | 0.7330 | 0.9909 |
| 0.009 | 92.0 | 920 | 0.0444 | 0.8620 | 0.9089 | 0.9912 | nan | 0.8215 | 0.9963 | nan | 0.7331 | 0.9909 |
| 0.0089 | 94.0 | 940 | 0.0410 | 0.8645 | 0.9147 | 0.9913 | nan | 0.8334 | 0.9961 | nan | 0.7380 | 0.9910 |
| 0.0091 | 96.0 | 960 | 0.0418 | 0.8663 | 0.9155 | 0.9914 | nan | 0.8349 | 0.9962 | nan | 0.7413 | 0.9912 |
| 0.0079 | 98.0 | 980 | 0.0398 | 0.8629 | 0.9085 | 0.9913 | nan | 0.8205 | 0.9965 | nan | 0.7348 | 0.9910 |
| 0.0084 | 100.0 | 1000 | 0.0497 | 0.8553 | 0.9109 | 0.9905 | nan | 0.8262 | 0.9955 | nan | 0.7204 | 0.9903 |
| 0.0088 | 102.0 | 1020 | 0.0399 | 0.8558 | 0.9058 | 0.9907 | nan | 0.8156 | 0.9960 | nan | 0.7212 | 0.9905 |
| 0.0089 | 104.0 | 1040 | 0.0388 | 0.8678 | 0.9225 | 0.9914 | nan | 0.8494 | 0.9957 | nan | 0.7444 | 0.9912 |
| 0.008 | 106.0 | 1060 | 0.0449 | 0.8622 | 0.9225 | 0.9909 | nan | 0.8498 | 0.9952 | nan | 0.7337 | 0.9907 |
| 0.0084 | 108.0 | 1080 | 0.0429 | 0.8687 | 0.9233 | 0.9914 | nan | 0.8510 | 0.9957 | nan | 0.7462 | 0.9912 |
| 0.0084 | 110.0 | 1100 | 0.0405 | 0.8687 | 0.9169 | 0.9916 | nan | 0.8375 | 0.9963 | nan | 0.7460 | 0.9914 |
| 0.007 | 112.0 | 1120 | 0.0544 | 0.8620 | 0.9180 | 0.9910 | nan | 0.8404 | 0.9956 | nan | 0.7333 | 0.9907 |
| 0.0079 | 114.0 | 1140 | 0.0501 | 0.8602 | 0.9176 | 0.9908 | nan | 0.8399 | 0.9954 | nan | 0.7299 | 0.9906 |
| 0.0084 | 116.0 | 1160 | 0.0508 | 0.8605 | 0.9212 | 0.9908 | nan | 0.8473 | 0.9951 | nan | 0.7304 | 0.9905 |
| 0.0113 | 118.0 | 1180 | 0.0511 | 0.8601 | 0.9228 | 0.9907 | nan | 0.8507 | 0.9950 | nan | 0.7298 | 0.9905 |
| 0.0076 | 120.0 | 1200 | 0.0556 | 0.8602 | 0.9264 | 0.9906 | nan | 0.8582 | 0.9947 | nan | 0.7299 | 0.9904 |
| 0.0081 | 122.0 | 1220 | 0.0471 | 0.8665 | 0.9256 | 0.9912 | nan | 0.8559 | 0.9953 | nan | 0.7420 | 0.9910 |
| 0.0054 | 124.0 | 1240 | 0.0504 | 0.8652 | 0.9174 | 0.9913 | nan | 0.8389 | 0.9959 | nan | 0.7394 | 0.9910 |
| 0.0054 | 126.0 | 1260 | 0.0502 | 0.8666 | 0.9209 | 0.9913 | nan | 0.8461 | 0.9957 | nan | 0.7420 | 0.9911 |
| 0.0092 | 128.0 | 1280 | 0.0540 | 0.8642 | 0.9223 | 0.9911 | nan | 0.8492 | 0.9954 | nan | 0.7376 | 0.9908 |
| 0.007 | 130.0 | 1300 | 0.0533 | 0.8637 | 0.9207 | 0.9911 | nan | 0.8459 | 0.9955 | nan | 0.7366 | 0.9908 |
| 0.0063 | 132.0 | 1320 | 0.0526 | 0.8644 | 0.9259 | 0.9910 | nan | 0.8567 | 0.9951 | nan | 0.7380 | 0.9908 |
| 0.0101 | 134.0 | 1340 | 0.0462 | 0.8653 | 0.9067 | 0.9915 | nan | 0.8166 | 0.9968 | nan | 0.7393 | 0.9913 |
| 0.0183 | 136.0 | 1360 | 0.0516 | 0.5675 | 0.9024 | 0.9904 | nan | 0.8089 | 0.9959 | 0.0 | 0.7123 | 0.9901 |
| 0.0102 | 138.0 | 1380 | 0.0388 | 0.8366 | 0.8716 | 0.9898 | nan | 0.7460 | 0.9972 | nan | 0.6837 | 0.9896 |
| 0.0277 | 140.0 | 1400 | 0.0649 | 0.8159 | 0.9590 | 0.9850 | nan | 0.9313 | 0.9866 | nan | 0.6472 | 0.9846 |
| 0.0169 | 142.0 | 1420 | 0.0340 | 0.8444 | 0.9148 | 0.9894 | nan | 0.8355 | 0.9941 | nan | 0.6996 | 0.9891 |
| 0.0359 | 144.0 | 1440 | 0.0314 | 0.8667 | 0.8987 | 0.9918 | nan | 0.7997 | 0.9976 | nan | 0.7419 | 0.9916 |
| 0.0117 | 146.0 | 1460 | 0.0307 | 0.8517 | 0.8869 | 0.9908 | nan | 0.7765 | 0.9973 | nan | 0.7129 | 0.9905 |
| 0.0097 | 148.0 | 1480 | 0.0323 | 0.8757 | 0.9070 | 0.9924 | nan | 0.8162 | 0.9977 | nan | 0.7593 | 0.9922 |
| 0.0063 | 150.0 | 1500 | 0.0302 | 0.8808 | 0.9155 | 0.9926 | nan | 0.8335 | 0.9975 | nan | 0.7692 | 0.9924 |
| 0.0085 | 152.0 | 1520 | 0.0352 | 0.8697 | 0.9113 | 0.9918 | nan | 0.8258 | 0.9968 | nan | 0.7478 | 0.9916 |
| 0.0078 | 154.0 | 1540 | 0.0428 | 0.8649 | 0.9190 | 0.9912 | nan | 0.8423 | 0.9957 | nan | 0.7388 | 0.9910 |
| 0.0056 | 156.0 | 1560 | 0.0340 | 0.8709 | 0.9170 | 0.9918 | nan | 0.8376 | 0.9965 | nan | 0.7502 | 0.9916 |
| 0.0063 | 158.0 | 1580 | 0.0359 | 0.8661 | 0.9201 | 0.9913 | nan | 0.8445 | 0.9958 | nan | 0.7412 | 0.9911 |
| 0.0083 | 160.0 | 1600 | 0.0375 | 0.8684 | 0.9186 | 0.9915 | nan | 0.8410 | 0.9961 | nan | 0.7456 | 0.9913 |
| 0.0065 | 162.0 | 1620 | 0.0370 | 0.8699 | 0.9210 | 0.9916 | nan | 0.8459 | 0.9960 | nan | 0.7484 | 0.9914 |
| 0.0063 | 164.0 | 1640 | 0.0388 | 0.8699 | 0.9228 | 0.9916 | nan | 0.8498 | 0.9959 | nan | 0.7484 | 0.9913 |
| 0.0056 | 166.0 | 1660 | 0.0386 | 0.8702 | 0.9238 | 0.9916 | nan | 0.8517 | 0.9958 | nan | 0.7491 | 0.9914 |
| 0.0049 | 168.0 | 1680 | 0.0394 | 0.8703 | 0.9199 | 0.9917 | nan | 0.8436 | 0.9962 | nan | 0.7491 | 0.9914 |
| 0.0054 | 170.0 | 1700 | 0.0400 | 0.8704 | 0.9195 | 0.9917 | nan | 0.8428 | 0.9962 | nan | 0.7494 | 0.9915 |
| 0.0046 | 172.0 | 1720 | 0.0398 | 0.8728 | 0.9187 | 0.9919 | nan | 0.8410 | 0.9965 | nan | 0.7539 | 0.9917 |
| 0.0058 | 174.0 | 1740 | 0.0402 | 0.8711 | 0.9166 | 0.9918 | nan | 0.8367 | 0.9965 | nan | 0.7507 | 0.9916 |
| 0.005 | 176.0 | 1760 | 0.0400 | 0.8720 | 0.9196 | 0.9918 | nan | 0.8428 | 0.9963 | nan | 0.7525 | 0.9916 |
| 0.0061 | 178.0 | 1780 | 0.0417 | 0.8714 | 0.9226 | 0.9917 | nan | 0.8492 | 0.9960 | nan | 0.7513 | 0.9915 |
| 0.0061 | 180.0 | 1800 | 0.0407 | 0.8731 | 0.9249 | 0.9918 | nan | 0.8538 | 0.9960 | nan | 0.7545 | 0.9916 |
| 0.0065 | 182.0 | 1820 | 0.0420 | 0.8712 | 0.9235 | 0.9917 | nan | 0.8511 | 0.9959 | nan | 0.7509 | 0.9914 |
| 0.0045 | 184.0 | 1840 | 0.0421 | 0.8718 | 0.9250 | 0.9917 | nan | 0.8541 | 0.9959 | nan | 0.7522 | 0.9915 |
| 0.0056 | 186.0 | 1860 | 0.0435 | 0.8703 | 0.9157 | 0.9917 | nan | 0.8349 | 0.9965 | nan | 0.7490 | 0.9915 |
| 0.0059 | 188.0 | 1880 | 0.0436 | 0.8707 | 0.9191 | 0.9917 | nan | 0.8419 | 0.9963 | nan | 0.7499 | 0.9915 |
| 0.0042 | 190.0 | 1900 | 0.0436 | 0.8707 | 0.9210 | 0.9917 | nan | 0.8458 | 0.9961 | nan | 0.7499 | 0.9915 |
| 0.006 | 192.0 | 1920 | 0.0426 | 0.8697 | 0.9193 | 0.9916 | nan | 0.8425 | 0.9962 | nan | 0.7480 | 0.9914 |
| 0.0053 | 194.0 | 1940 | 0.0447 | 0.8697 | 0.9199 | 0.9916 | nan | 0.8437 | 0.9961 | nan | 0.7480 | 0.9914 |
| 0.0044 | 196.0 | 1960 | 0.0441 | 0.8710 | 0.9238 | 0.9916 | nan | 0.8516 | 0.9959 | nan | 0.7505 | 0.9914 |
| 0.0049 | 198.0 | 1980 | 0.0453 | 0.8693 | 0.9219 | 0.9915 | nan | 0.8479 | 0.9959 | nan | 0.7473 | 0.9913 |
| 0.0059 | 200.0 | 2000 | 0.0444 | 0.8726 | 0.9233 | 0.9918 | nan | 0.8506 | 0.9961 | nan | 0.7537 | 0.9916 |
| 0.005 | 202.0 | 2020 | 0.0447 | 0.8717 | 0.9256 | 0.9917 | nan | 0.8555 | 0.9958 | nan | 0.7519 | 0.9914 |
| 0.005 | 204.0 | 2040 | 0.0451 | 0.8711 | 0.9227 | 0.9917 | nan | 0.8494 | 0.9960 | nan | 0.7507 | 0.9915 |
| 0.0043 | 206.0 | 2060 | 0.0458 | 0.8707 | 0.9220 | 0.9916 | nan | 0.8480 | 0.9960 | nan | 0.7499 | 0.9914 |
| 0.0043 | 208.0 | 2080 | 0.0462 | 0.8696 | 0.9221 | 0.9916 | nan | 0.8482 | 0.9959 | nan | 0.7479 | 0.9913 |
| 0.0062 | 210.0 | 2100 | 0.0451 | 0.8715 | 0.9211 | 0.9917 | nan | 0.8461 | 0.9962 | nan | 0.7515 | 0.9915 |
| 0.0056 | 212.0 | 2120 | 0.0470 | 0.8706 | 0.9213 | 0.9917 | nan | 0.8466 | 0.9961 | nan | 0.7498 | 0.9914 |
| 0.0049 | 214.0 | 2140 | 0.0480 | 0.8679 | 0.9229 | 0.9914 | nan | 0.8500 | 0.9957 | nan | 0.7447 | 0.9912 |
| 0.0038 | 216.0 | 2160 | 0.0474 | 0.8700 | 0.9194 | 0.9916 | nan | 0.8427 | 0.9962 | nan | 0.7486 | 0.9914 |
| 0.0043 | 218.0 | 2180 | 0.0472 | 0.8693 | 0.9231 | 0.9915 | nan | 0.8503 | 0.9958 | nan | 0.7474 | 0.9913 |
| 0.005 | 220.0 | 2200 | 0.0471 | 0.8695 | 0.9156 | 0.9917 | nan | 0.8348 | 0.9964 | nan | 0.7475 | 0.9915 |
| 0.0041 | 222.0 | 2220 | 0.0472 | 0.8719 | 0.9187 | 0.9918 | nan | 0.8411 | 0.9964 | nan | 0.7522 | 0.9916 |
| 0.0041 | 224.0 | 2240 | 0.0471 | 0.8720 | 0.9219 | 0.9918 | nan | 0.8477 | 0.9962 | nan | 0.7525 | 0.9916 |
| 0.005 | 226.0 | 2260 | 0.0479 | 0.8720 | 0.9191 | 0.9918 | nan | 0.8418 | 0.9964 | nan | 0.7524 | 0.9916 |
| 0.0041 | 228.0 | 2280 | 0.0489 | 0.8706 | 0.9183 | 0.9917 | nan | 0.8403 | 0.9963 | nan | 0.7498 | 0.9915 |
| 0.005 | 230.0 | 2300 | 0.0472 | 0.8728 | 0.9195 | 0.9919 | nan | 0.8426 | 0.9964 | nan | 0.7540 | 0.9917 |
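
The card does not state which tool produced these metrics, but the reported fields (Mean Iou, Mean Accuracy, Overall Accuracy, and the per-class values) match the output of the mean_iou metric from the evaluate library, so they could plausibly be reproduced with something like the sketch below. The num_labels and ignore_index values are assumptions based on the unlabeled / safe / unsafe classes above.

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# predictions and references are lists of (H, W) integer label maps.
# Assumed label ids: 0 = unlabeled, 1 = safe, 2 = unsafe (toy arrays here).
predictions = [np.zeros((64, 64), dtype=np.int64)]
references = [np.zeros((64, 64), dtype=np.int64)]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=3,        # assumption: unlabeled, safe, unsafe
    ignore_index=255,    # assumption: pixels with this id are skipped
    reduce_labels=False,
)

print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"], results["per_category_accuracy"])
```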

Framework versions

  • Transformers 4.30.2
  • Pytorch 2.0.1+cu117
  • Datasets 2.13.1
  • Tokenizers 0.13.3