# INTERNAL_BEST-safety-utcustom-train-SF-RGBD-b5
This model is a fine-tuned version of nvidia/mit-b5 on the sam1120/safety-utcustom-TRAIN dataset. It achieves the following results on the evaluation set:
- Loss: 0.0506
- Mean Iou: 0.8519
- Mean Accuracy: 0.9125
- Overall Accuracy: 0.9902
- Accuracy Unlabeled: nan
- Accuracy Safe: 0.8300
- Accuracy Unsafe: 0.9950
- Iou Unlabeled: nan
- Iou Safe: 0.7138
- Iou Unsafe: 0.9899
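The summary metrics above are consistent with plain class-wise averaging over the two labeled classes; the unlabeled class is ignored during evaluation, which is why its entries are `nan`. A quick arithmetic check in Python (values copied from the list above):

```python
# Reported per-class scores from the evaluation set above.
iou_safe, iou_unsafe = 0.7138, 0.9899
acc_safe, acc_unsafe = 0.8300, 0.9950

# Mean IoU and Mean Accuracy average over the labeled classes only;
# the "unlabeled" class is nan and excluded from the mean.
mean_iou = (iou_safe + iou_unsafe) / 2
mean_acc = (acc_safe + acc_unsafe) / 2

print(mean_iou)  # ~0.8519, matching the reported Mean Iou (up to rounding)
print(mean_acc)  # ~0.9125, matching the reported Mean Accuracy
```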
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 2000
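With `lr_scheduler_type: linear` and `lr_scheduler_warmup_ratio: 0.05`, the learning rate ramps linearly from 0 to 1e-4 over the first 5% of total optimizer steps, then decays linearly back to 0. A minimal sketch of that schedule as a pure function (the `total_steps` value below is illustrative, not taken from this run; the table logs roughly 10 steps per epoch):

```python
def linear_lr_with_warmup(step: int, total_steps: int,
                          base_lr: float = 1e-4,
                          warmup_ratio: float = 0.05) -> float:
    """Linear warmup to base_lr, then linear decay to 0, mirroring the
    shape of the 'linear' scheduler with a warmup ratio."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        # Ramp up: 0 -> base_lr over the warmup window.
        return base_lr * step / max(1, warmup_steps)
    # Decay: base_lr at end of warmup -> 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

total = 1000  # illustrative step count
print(linear_lr_with_warmup(0, total))     # 0.0 at the start
print(linear_lr_with_warmup(50, total))    # peak of 1e-4 after warmup
print(linear_lr_with_warmup(1000, total))  # 0.0 at the end
```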
### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Safe | Accuracy Unsafe | Iou Unlabeled | Iou Safe | Iou Unsafe |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
1.4012 | 2.0 | 20 | 1.1653 | 0.0379 | 0.0923 | 0.0926 | nan | 0.0920 | 0.0926 | 0.0 | 0.0214 | 0.0923 |
1.2461 | 4.0 | 40 | 0.9899 | 0.2255 | 0.3670 | 0.6307 | nan | 0.0868 | 0.6473 | 0.0 | 0.0379 | 0.6386 |
1.0596 | 6.0 | 60 | 0.7738 | 0.2703 | 0.4188 | 0.7941 | nan | 0.0199 | 0.8177 | 0.0 | 0.0143 | 0.7967 |
0.8267 | 8.0 | 80 | 0.6767 | 0.2902 | 0.4512 | 0.8507 | nan | 0.0265 | 0.8758 | 0.0 | 0.0183 | 0.8522 |
0.7282 | 10.0 | 100 | 0.5637 | 0.3086 | 0.4776 | 0.9098 | nan | 0.0183 | 0.9370 | 0.0 | 0.0149 | 0.9110 |
0.5124 | 12.0 | 120 | 0.4667 | 0.3254 | 0.5053 | 0.9512 | nan | 0.0314 | 0.9792 | 0.0 | 0.0247 | 0.9516 |
0.3126 | 14.0 | 140 | 0.3585 | 0.3325 | 0.5147 | 0.9662 | nan | 0.0349 | 0.9945 | 0.0 | 0.0313 | 0.9663 |
0.2862 | 16.0 | 160 | 0.2890 | 0.3346 | 0.5168 | 0.9703 | nan | 0.0349 | 0.9988 | 0.0 | 0.0336 | 0.9703 |
0.2374 | 18.0 | 180 | 0.2102 | 0.3647 | 0.5637 | 0.9725 | nan | 0.1291 | 0.9982 | 0.0 | 0.1218 | 0.9724 |
0.1583 | 20.0 | 200 | 0.1730 | 0.6293 | 0.6574 | 0.9761 | nan | 0.3186 | 0.9961 | nan | 0.2827 | 0.9759 |
0.1082 | 22.0 | 220 | 0.1317 | 0.6306 | 0.6566 | 0.9765 | nan | 0.3166 | 0.9966 | nan | 0.2849 | 0.9763 |
0.1025 | 24.0 | 240 | 0.1116 | 0.6494 | 0.6766 | 0.9777 | nan | 0.3565 | 0.9967 | nan | 0.3212 | 0.9775 |
0.1158 | 26.0 | 260 | 0.0965 | 0.7200 | 0.7978 | 0.9791 | nan | 0.6051 | 0.9905 | nan | 0.4612 | 0.9787 |
0.0882 | 28.0 | 280 | 0.0857 | 0.7356 | 0.7857 | 0.9822 | nan | 0.5769 | 0.9946 | nan | 0.4893 | 0.9819 |
0.07 | 30.0 | 300 | 0.0829 | 0.6717 | 0.6934 | 0.9799 | nan | 0.3890 | 0.9979 | nan | 0.3637 | 0.9797 |
0.0911 | 32.0 | 320 | 0.0677 | 0.7680 | 0.8244 | 0.9843 | nan | 0.6545 | 0.9944 | nan | 0.5521 | 0.9840 |
0.0807 | 34.0 | 340 | 0.0696 | 0.7779 | 0.8716 | 0.9834 | nan | 0.7528 | 0.9904 | nan | 0.5727 | 0.9830 |
0.0531 | 36.0 | 360 | 0.0611 | 0.7761 | 0.8781 | 0.9829 | nan | 0.7668 | 0.9895 | nan | 0.5698 | 0.9825 |
0.0407 | 38.0 | 380 | 0.0567 | 0.7828 | 0.8396 | 0.9854 | nan | 0.6846 | 0.9945 | nan | 0.5805 | 0.9851 |
0.0449 | 40.0 | 400 | 0.0639 | 0.7725 | 0.8200 | 0.9851 | nan | 0.6446 | 0.9954 | nan | 0.5602 | 0.9848 |
0.0932 | 42.0 | 420 | 0.0503 | 0.7726 | 0.7983 | 0.9861 | nan | 0.5987 | 0.9979 | nan | 0.5593 | 0.9858 |
0.0362 | 44.0 | 440 | 0.0634 | 0.7553 | 0.8670 | 0.9805 | nan | 0.7464 | 0.9876 | nan | 0.5306 | 0.9801 |
0.0324 | 46.0 | 460 | 0.0501 | 0.8024 | 0.8615 | 0.9867 | nan | 0.7284 | 0.9946 | nan | 0.6184 | 0.9864 |
0.036 | 48.0 | 480 | 0.0454 | 0.8010 | 0.8454 | 0.9872 | nan | 0.6947 | 0.9961 | nan | 0.6151 | 0.9869 |
0.0356 | 50.0 | 500 | 0.0495 | 0.8061 | 0.8760 | 0.9866 | nan | 0.7585 | 0.9936 | nan | 0.6260 | 0.9863 |
0.0333 | 52.0 | 520 | 0.0483 | 0.7743 | 0.8128 | 0.9856 | nan | 0.6292 | 0.9964 | nan | 0.5632 | 0.9853 |
0.0277 | 54.0 | 540 | 0.0445 | 0.7714 | 0.7932 | 0.9862 | nan | 0.5880 | 0.9983 | nan | 0.5569 | 0.9859 |
0.0298 | 56.0 | 560 | 0.0460 | 0.8034 | 0.8518 | 0.9872 | nan | 0.7078 | 0.9957 | nan | 0.6198 | 0.9869 |
0.0256 | 58.0 | 580 | 0.0416 | 0.8181 | 0.8548 | 0.9886 | nan | 0.7126 | 0.9970 | nan | 0.6479 | 0.9883 |
0.0336 | 60.0 | 600 | 0.0442 | 0.7957 | 0.8168 | 0.9877 | nan | 0.6351 | 0.9984 | nan | 0.6039 | 0.9875 |
0.0283 | 62.0 | 620 | 0.0425 | 0.8141 | 0.8812 | 0.9873 | nan | 0.7684 | 0.9940 | nan | 0.6413 | 0.9870 |
0.0198 | 64.0 | 640 | 0.0455 | 0.8059 | 0.8401 | 0.9879 | nan | 0.6830 | 0.9971 | nan | 0.6242 | 0.9876 |
0.0181 | 66.0 | 660 | 0.0444 | 0.8144 | 0.8733 | 0.9876 | nan | 0.7519 | 0.9948 | nan | 0.6415 | 0.9873 |
0.0188 | 68.0 | 680 | 0.0456 | 0.8179 | 0.8696 | 0.9881 | nan | 0.7436 | 0.9955 | nan | 0.6479 | 0.9878 |
0.0165 | 70.0 | 700 | 0.0431 | 0.8208 | 0.8985 | 0.9875 | nan | 0.8040 | 0.9930 | nan | 0.6544 | 0.9872 |
0.0184 | 72.0 | 720 | 0.0421 | 0.8165 | 0.8785 | 0.9876 | nan | 0.7625 | 0.9945 | nan | 0.6457 | 0.9874 |
0.0336 | 74.0 | 740 | 0.0441 | 0.8081 | 0.8792 | 0.9867 | nan | 0.7650 | 0.9935 | nan | 0.6298 | 0.9864 |
0.0165 | 76.0 | 760 | 0.0374 | 0.8200 | 0.8555 | 0.9887 | nan | 0.7139 | 0.9971 | nan | 0.6515 | 0.9885 |
0.0127 | 78.0 | 780 | 0.0402 | 0.8222 | 0.8780 | 0.9882 | nan | 0.7608 | 0.9952 | nan | 0.6563 | 0.9880 |
0.0152 | 80.0 | 800 | 0.0430 | 0.8230 | 0.8687 | 0.9886 | nan | 0.7413 | 0.9961 | nan | 0.6576 | 0.9883 |
0.0143 | 82.0 | 820 | 0.0410 | 0.8087 | 0.8422 | 0.9881 | nan | 0.6873 | 0.9972 | nan | 0.6297 | 0.9878 |
0.0134 | 84.0 | 840 | 0.0335 | 0.8429 | 0.8893 | 0.9899 | nan | 0.7823 | 0.9962 | nan | 0.6962 | 0.9897 |
0.0122 | 86.0 | 860 | 0.0396 | 0.8312 | 0.8749 | 0.9892 | nan | 0.7534 | 0.9964 | nan | 0.6734 | 0.9890 |
0.0126 | 88.0 | 880 | 0.0405 | 0.8341 | 0.8805 | 0.9893 | nan | 0.7649 | 0.9962 | nan | 0.6791 | 0.9891 |
0.0121 | 90.0 | 900 | 0.0400 | 0.8390 | 0.8810 | 0.9898 | nan | 0.7654 | 0.9966 | nan | 0.6884 | 0.9895 |
0.0104 | 92.0 | 920 | 0.0372 | 0.8453 | 0.8990 | 0.9899 | nan | 0.8024 | 0.9956 | nan | 0.7010 | 0.9896 |
0.0128 | 94.0 | 940 | 0.0394 | 0.8411 | 0.8893 | 0.9897 | nan | 0.7825 | 0.9961 | nan | 0.6927 | 0.9895 |
0.0124 | 96.0 | 960 | 0.0409 | 0.8395 | 0.8948 | 0.9895 | nan | 0.7943 | 0.9954 | nan | 0.6899 | 0.9892 |
0.0095 | 98.0 | 980 | 0.0413 | 0.8258 | 0.8903 | 0.9882 | nan | 0.7863 | 0.9944 | nan | 0.6637 | 0.9880 |
0.0147 | 100.0 | 1000 | 0.0468 | 0.8181 | 0.9044 | 0.9870 | nan | 0.8167 | 0.9922 | nan | 0.6496 | 0.9867 |
0.0125 | 102.0 | 1020 | 0.0379 | 0.8213 | 0.8961 | 0.9876 | nan | 0.7989 | 0.9933 | nan | 0.6553 | 0.9873 |
0.0142 | 104.0 | 1040 | 0.0328 | 0.8449 | 0.9154 | 0.9894 | nan | 0.8366 | 0.9941 | nan | 0.7006 | 0.9892 |
0.0101 | 106.0 | 1060 | 0.0428 | 0.8407 | 0.9144 | 0.9891 | nan | 0.8351 | 0.9937 | nan | 0.6927 | 0.9888 |
0.0097 | 108.0 | 1080 | 0.0397 | 0.8296 | 0.8847 | 0.9888 | nan | 0.7740 | 0.9953 | nan | 0.6707 | 0.9885 |
0.01 | 110.0 | 1100 | 0.0384 | 0.8457 | 0.8935 | 0.9901 | nan | 0.7910 | 0.9961 | nan | 0.7016 | 0.9898 |
0.0084 | 112.0 | 1120 | 0.0385 | 0.8421 | 0.8874 | 0.9899 | nan | 0.7784 | 0.9963 | nan | 0.6945 | 0.9896 |
0.0086 | 114.0 | 1140 | 0.0413 | 0.8488 | 0.8882 | 0.9905 | nan | 0.7795 | 0.9969 | nan | 0.7074 | 0.9903 |
0.0112 | 116.0 | 1160 | 0.0427 | 0.8459 | 0.8942 | 0.9901 | nan | 0.7924 | 0.9961 | nan | 0.7020 | 0.9898 |
0.0132 | 118.0 | 1180 | 0.0407 | 0.8510 | 0.9011 | 0.9904 | nan | 0.8062 | 0.9960 | nan | 0.7118 | 0.9901 |
0.0084 | 120.0 | 1200 | 0.0432 | 0.8510 | 0.9015 | 0.9903 | nan | 0.8071 | 0.9959 | nan | 0.7118 | 0.9901 |
0.008 | 122.0 | 1220 | 0.0431 | 0.8504 | 0.9077 | 0.9901 | nan | 0.8202 | 0.9953 | nan | 0.7109 | 0.9899 |
0.0069 | 124.0 | 1240 | 0.0424 | 0.8522 | 0.8982 | 0.9905 | nan | 0.8001 | 0.9963 | nan | 0.7141 | 0.9903 |
0.006 | 126.0 | 1260 | 0.0447 | 0.8537 | 0.9114 | 0.9904 | nan | 0.8275 | 0.9953 | nan | 0.7173 | 0.9901 |
0.0123 | 128.0 | 1280 | 0.0464 | 0.8529 | 0.9102 | 0.9903 | nan | 0.8250 | 0.9954 | nan | 0.7157 | 0.9901 |
0.0073 | 130.0 | 1300 | 0.0441 | 0.8520 | 0.9025 | 0.9904 | nan | 0.8090 | 0.9959 | nan | 0.7139 | 0.9902 |
0.0066 | 132.0 | 1320 | 0.0447 | 0.8524 | 0.9086 | 0.9903 | nan | 0.8217 | 0.9954 | nan | 0.7148 | 0.9901 |
0.0063 | 134.0 | 1340 | 0.0434 | 0.8546 | 0.9077 | 0.9905 | nan | 0.8197 | 0.9957 | nan | 0.7189 | 0.9903 |
0.0068 | 136.0 | 1360 | 0.0475 | 0.8518 | 0.9090 | 0.9902 | nan | 0.8226 | 0.9953 | nan | 0.7135 | 0.9900 |
0.0056 | 138.0 | 1380 | 0.0458 | 0.8549 | 0.9122 | 0.9905 | nan | 0.8291 | 0.9954 | nan | 0.7195 | 0.9902 |
0.007 | 140.0 | 1400 | 0.0455 | 0.8554 | 0.9126 | 0.9905 | nan | 0.8298 | 0.9954 | nan | 0.7205 | 0.9903 |
0.0064 | 142.0 | 1420 | 0.0476 | 0.8542 | 0.9047 | 0.9906 | nan | 0.8133 | 0.9960 | nan | 0.7180 | 0.9903 |
0.0065 | 144.0 | 1440 | 0.0437 | 0.8556 | 0.9107 | 0.9906 | nan | 0.8258 | 0.9956 | nan | 0.7210 | 0.9903 |
0.005 | 146.0 | 1460 | 0.0455 | 0.8551 | 0.9098 | 0.9905 | nan | 0.8239 | 0.9956 | nan | 0.7198 | 0.9903 |
0.005 | 148.0 | 1480 | 0.0458 | 0.8539 | 0.9084 | 0.9905 | nan | 0.8212 | 0.9956 | nan | 0.7175 | 0.9902 |
0.0048 | 150.0 | 1500 | 0.0462 | 0.8558 | 0.9041 | 0.9907 | nan | 0.8121 | 0.9962 | nan | 0.7211 | 0.9905 |
0.0063 | 152.0 | 1520 | 0.0453 | 0.8560 | 0.9175 | 0.9904 | nan | 0.8400 | 0.9950 | nan | 0.7217 | 0.9902 |
0.006 | 154.0 | 1540 | 0.0473 | 0.8531 | 0.9073 | 0.9904 | nan | 0.8190 | 0.9956 | nan | 0.7160 | 0.9902 |
0.0043 | 156.0 | 1560 | 0.0448 | 0.8562 | 0.9100 | 0.9906 | nan | 0.8243 | 0.9957 | nan | 0.7220 | 0.9904 |
0.0049 | 158.0 | 1580 | 0.0480 | 0.8518 | 0.9137 | 0.9901 | nan | 0.8324 | 0.9949 | nan | 0.7138 | 0.9899 |
0.0065 | 160.0 | 1600 | 0.0475 | 0.8556 | 0.9095 | 0.9906 | nan | 0.8233 | 0.9957 | nan | 0.7209 | 0.9903 |
0.0052 | 162.0 | 1620 | 0.0479 | 0.8531 | 0.9087 | 0.9904 | nan | 0.8218 | 0.9955 | nan | 0.7161 | 0.9901 |
0.0063 | 164.0 | 1640 | 0.0488 | 0.8571 | 0.9115 | 0.9907 | nan | 0.8273 | 0.9956 | nan | 0.7238 | 0.9904 |
0.0053 | 166.0 | 1660 | 0.0514 | 0.8515 | 0.9152 | 0.9901 | nan | 0.8357 | 0.9948 | nan | 0.7132 | 0.9898 |
0.0046 | 168.0 | 1680 | 0.0476 | 0.8540 | 0.9040 | 0.9906 | nan | 0.8119 | 0.9960 | nan | 0.7177 | 0.9903 |
0.0039 | 170.0 | 1700 | 0.0483 | 0.5699 | 0.9121 | 0.9905 | nan | 0.8289 | 0.9954 | 0.0 | 0.7195 | 0.9902 |
0.0044 | 172.0 | 1720 | 0.0494 | 0.8550 | 0.9114 | 0.9905 | nan | 0.8273 | 0.9954 | nan | 0.7197 | 0.9902 |
0.0051 | 174.0 | 1740 | 0.0503 | 0.8556 | 0.9103 | 0.9906 | nan | 0.8250 | 0.9956 | nan | 0.7208 | 0.9903 |
0.0041 | 176.0 | 1760 | 0.0499 | 0.8545 | 0.9118 | 0.9904 | nan | 0.8283 | 0.9954 | nan | 0.7188 | 0.9902 |
0.0049 | 178.0 | 1780 | 0.0525 | 0.8541 | 0.9066 | 0.9905 | nan | 0.8174 | 0.9958 | nan | 0.7179 | 0.9903 |
0.0048 | 180.0 | 1800 | 0.0496 | 0.8556 | 0.9165 | 0.9904 | nan | 0.8380 | 0.9951 | nan | 0.7210 | 0.9902 |
0.008 | 182.0 | 1820 | 0.0487 | 0.8528 | 0.9085 | 0.9904 | nan | 0.8215 | 0.9955 | nan | 0.7155 | 0.9901 |
0.0041 | 184.0 | 1840 | 0.0506 | 0.8519 | 0.9125 | 0.9902 | nan | 0.8300 | 0.9950 | nan | 0.7138 | 0.9899 |
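The per-class IoU columns in the table are Jaccard indices computed over pixels: intersection over union of the predicted and ground-truth masks for each class. A minimal sketch on tiny hypothetical masks (this is not the evaluation code used for this run, and the label ids below are illustrative):

```python
def class_iou(pred, target, class_id):
    """IoU for one class over flattened label maps:
    |pred ∩ target| / |pred ∪ target|, counting pixels of class_id."""
    inter = sum(1 for p, t in zip(pred, target) if p == class_id and t == class_id)
    union = sum(1 for p, t in zip(pred, target) if p == class_id or t == class_id)
    # A class absent from both prediction and target has undefined IoU (nan),
    # which is how the "unlabeled" column ends up nan in the table.
    return float('nan') if union == 0 else inter / union

# Tiny flattened example: 0 = unsafe, 1 = safe (hypothetical labels).
pred   = [1, 1, 0, 0, 1, 0]
target = [1, 0, 0, 0, 1, 1]
print(class_iou(pred, target, 1))  # 2 agreeing "safe" pixels out of 4 involved -> 0.5
```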
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3