
safety-utcustom-train-SF-RGB-b0

This model is a fine-tuned version of nvidia/mit-b0 on the sam1120/safety-utcustom-TRAIN dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3221
  • Mean Iou: 0.7557
  • Mean Accuracy: 0.8092
  • Overall Accuracy: 0.9835
  • Accuracy Unlabeled: nan
  • Accuracy Safe: 0.6240
  • Accuracy Unsafe: 0.9945
  • Iou Unlabeled: nan
  • Iou Safe: 0.5281
  • Iou Unsafe: 0.9832
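
Since the card does not yet include a usage snippet, below is a minimal inference sketch. The repo id is inferred from the card title (it may differ), and the sketch assumes the checkpoint follows the standard SegFormer semantic-segmentation API in transformers:

```python
# Minimal inference sketch. Assumptions: the repo id below is inferred from the
# card title and may differ; the checkpoint follows the standard SegFormer API.
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

checkpoint = "sam1120/safety-utcustom-train-SF-RGB-b0"  # hypothetical repo id
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (batch, num_labels, height/4, width/4)

# SegFormer predicts at 1/4 resolution, so upsample before taking the argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # per-pixel label ids (e.g. safe / unsafe)
```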

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 9e-06
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.05
  • num_epochs: 120
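
For reference, here is a sketch of how the values above map onto transformers TrainingArguments. Only the listed hyperparameters are taken from the card; output_dir and anything not listed is a placeholder, not recovered from the original run:

```python
from transformers import TrainingArguments

# Sketch only: reproduces the hyperparameters listed above; output_dir and any
# unlisted settings are placeholders, not recovered from the original run.
training_args = TrainingArguments(
    output_dir="safety-utcustom-train-SF-RGB-b0",  # placeholder
    learning_rate=9e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=120,
)
```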

Training results

Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Safe | Accuracy Unsafe | Iou Unlabeled | Iou Safe | Iou Unsafe
1.2069 1.0 10 1.1287 0.0406 0.3613 0.1117 nan 0.6267 0.0960 0.0 0.0261 0.0958
1.196 2.0 20 1.1408 0.0465 0.3971 0.1274 nan 0.6837 0.1105 0.0 0.0290 0.1104
1.1866 3.0 30 1.1441 0.0662 0.4586 0.1826 nan 0.7519 0.1653 0.0 0.0335 0.1652
1.1701 4.0 40 1.1350 0.1016 0.5469 0.2805 nan 0.8301 0.2638 0.0 0.0410 0.2637
1.1467 5.0 50 1.1285 0.1325 0.6266 0.3646 nan 0.9052 0.3481 0.0 0.0496 0.3481
1.1126 6.0 60 1.0914 0.1933 0.7318 0.5257 nan 0.9508 0.5128 0.0 0.0673 0.5127
1.0735 7.0 70 1.0392 0.2462 0.8075 0.6582 nan 0.9662 0.6489 0.0 0.0900 0.6487
1.0335 8.0 80 1.0015 0.2783 0.8456 0.7301 nan 0.9683 0.7228 0.0 0.1122 0.7226
1.0088 9.0 90 0.9502 0.3061 0.8736 0.7884 nan 0.9643 0.7830 0.0 0.1359 0.7825
0.9993 10.0 100 0.9158 0.3246 0.8886 0.8232 nan 0.9581 0.8191 0.0 0.1556 0.8183
0.9114 11.0 110 0.8472 0.3562 0.9061 0.8732 nan 0.9411 0.8711 0.0 0.1990 0.8697
0.9027 12.0 120 0.8073 0.3687 0.9085 0.8909 nan 0.9271 0.8898 0.0 0.2182 0.8881
0.8775 13.0 130 0.7756 0.3819 0.9011 0.9086 nan 0.8931 0.9090 0.0 0.2394 0.9062
0.8532 14.0 140 0.7544 0.3883 0.9005 0.9156 nan 0.8844 0.9166 0.0 0.2513 0.9135
0.7509 15.0 150 0.7137 0.4039 0.8965 0.9311 nan 0.8597 0.9333 0.0 0.2824 0.9294
0.7711 16.0 160 0.6837 0.4131 0.8959 0.9394 nan 0.8497 0.9422 0.0 0.3014 0.9379
0.7163 17.0 170 0.6573 0.4230 0.8859 0.9467 nan 0.8212 0.9505 0.0 0.3234 0.9454
0.6609 18.0 180 0.6698 0.4200 0.8889 0.9449 nan 0.8294 0.9484 0.0 0.3163 0.9436
0.7237 19.0 190 0.6465 0.4236 0.8821 0.9479 nan 0.8121 0.9520 0.0 0.3241 0.9467
0.6264 20.0 200 0.6300 0.4293 0.8776 0.9520 nan 0.7985 0.9566 0.0 0.3372 0.9508
0.6711 21.0 210 0.6050 0.4391 0.8731 0.9576 nan 0.7833 0.9630 0.0 0.3605 0.9567
0.626 22.0 220 0.5855 0.4409 0.8742 0.9585 nan 0.7846 0.9637 0.0 0.3653 0.9575
0.6103 23.0 230 0.5651 0.4474 0.8671 0.9623 nan 0.7658 0.9683 0.0 0.3807 0.9615
0.6462 24.0 240 0.5621 0.4489 0.8643 0.9631 nan 0.7592 0.9693 0.0 0.3844 0.9623
0.5442 25.0 250 0.5460 0.4563 0.8592 0.9668 nan 0.7450 0.9735 0.0 0.4028 0.9660
0.6764 26.0 260 0.5673 0.4544 0.8646 0.9657 nan 0.7571 0.9721 0.0 0.3983 0.9650
0.6471 27.0 270 0.5412 0.4586 0.8561 0.9679 nan 0.7374 0.9749 0.0 0.4087 0.9672
0.5589 28.0 280 0.5427 0.4573 0.8601 0.9671 nan 0.7465 0.9738 0.0 0.4057 0.9663
0.6512 29.0 290 0.5264 0.4600 0.8567 0.9681 nan 0.7384 0.9751 0.0 0.4126 0.9674
0.6146 30.0 300 0.5321 0.4616 0.8619 0.9688 nan 0.7482 0.9755 0.0 0.4167 0.9681
0.4938 31.0 310 0.5025 0.4751 0.8475 0.9744 nan 0.7127 0.9823 0.0 0.4515 0.9738
0.4868 32.0 320 0.4836 0.4781 0.8342 0.9761 nan 0.6833 0.9851 0.0 0.4586 0.9756
0.6315 33.0 330 0.4918 0.4739 0.8479 0.9740 nan 0.7139 0.9819 0.0 0.4483 0.9735
0.5529 34.0 340 0.4879 0.4753 0.8414 0.9749 nan 0.6995 0.9832 0.0 0.4516 0.9743
0.4592 35.0 350 0.4826 0.4764 0.8364 0.9754 nan 0.6887 0.9842 0.0 0.4542 0.9749
0.5904 36.0 360 0.4611 0.4859 0.8159 0.9793 nan 0.6423 0.9896 0.0 0.4789 0.9789
0.4804 37.0 370 0.4654 0.4796 0.8359 0.9764 nan 0.6865 0.9853 0.0 0.4627 0.9760
0.4701 38.0 380 0.4625 0.4846 0.8251 0.9784 nan 0.6623 0.9880 0.0 0.4758 0.9779
0.4729 39.0 390 0.4536 0.4838 0.8231 0.9783 nan 0.6582 0.9881 0.0 0.4736 0.9779
0.4219 40.0 400 0.4514 0.4838 0.8305 0.9779 nan 0.6738 0.9872 0.0 0.4739 0.9775
0.6494 41.0 410 0.4425 0.4892 0.8162 0.9801 nan 0.6420 0.9904 0.0 0.4878 0.9797
0.4616 42.0 420 0.4390 0.7316 0.8225 0.9794 nan 0.6558 0.9892 nan 0.4842 0.9790
0.4408 43.0 430 0.4419 0.7358 0.8272 0.9797 nan 0.6652 0.9893 nan 0.4923 0.9793
0.4532 44.0 440 0.4371 0.7375 0.8274 0.9800 nan 0.6651 0.9896 nan 0.4954 0.9796
0.5015 45.0 450 0.4376 0.7364 0.8276 0.9798 nan 0.6659 0.9894 nan 0.4933 0.9794
0.4965 46.0 460 0.4201 0.7405 0.8137 0.9812 nan 0.6357 0.9918 nan 0.5002 0.9809
0.4837 47.0 470 0.4281 0.7378 0.8279 0.9800 nan 0.6662 0.9896 nan 0.4961 0.9796
0.4221 48.0 480 0.4288 0.7371 0.8227 0.9802 nan 0.6553 0.9901 nan 0.4944 0.9798
0.4491 49.0 490 0.4152 0.7371 0.8074 0.9811 nan 0.6228 0.9920 nan 0.4935 0.9808
0.4121 50.0 500 0.4159 0.7367 0.8063 0.9811 nan 0.6205 0.9921 nan 0.4927 0.9808
0.4727 51.0 510 0.4199 0.7354 0.8095 0.9807 nan 0.6274 0.9915 nan 0.4905 0.9804
0.5323 52.0 520 0.4079 0.7383 0.8074 0.9813 nan 0.6227 0.9922 nan 0.4957 0.9809
0.409 53.0 530 0.4103 0.7392 0.8161 0.9809 nan 0.6409 0.9913 nan 0.4978 0.9805
0.6391 54.0 540 0.4063 0.7406 0.8133 0.9813 nan 0.6349 0.9918 nan 0.5003 0.9809
0.3905 55.0 550 0.4000 0.7409 0.8122 0.9814 nan 0.6325 0.9920 nan 0.5007 0.9810
0.4138 56.0 560 0.4028 0.7398 0.8183 0.9809 nan 0.6455 0.9911 nan 0.4990 0.9805
0.3977 57.0 570 0.3865 0.7372 0.7912 0.9821 nan 0.5884 0.9941 nan 0.4926 0.9818
0.4186 58.0 580 0.3845 0.7416 0.7994 0.9822 nan 0.6050 0.9937 nan 0.5014 0.9819
0.4921 59.0 590 0.3881 0.7427 0.8102 0.9817 nan 0.6278 0.9925 nan 0.5039 0.9814
0.3953 60.0 600 0.3823 0.7429 0.8027 0.9822 nan 0.6119 0.9935 nan 0.5039 0.9819
0.4263 61.0 610 0.3841 0.7420 0.8075 0.9818 nan 0.6222 0.9928 nan 0.5026 0.9815
0.3798 62.0 620 0.3763 0.7446 0.8054 0.9823 nan 0.6174 0.9934 nan 0.5072 0.9820
0.4208 63.0 630 0.3724 0.7437 0.7919 0.9829 nan 0.5888 0.9949 nan 0.5047 0.9826
0.3627 64.0 640 0.3760 0.7466 0.8111 0.9822 nan 0.6292 0.9930 nan 0.5112 0.9819
0.4156 65.0 650 0.3669 0.7478 0.8018 0.9829 nan 0.6092 0.9943 nan 0.5130 0.9826
0.468 66.0 660 0.3706 0.7508 0.8145 0.9826 nan 0.6359 0.9932 nan 0.5193 0.9823
0.4547 67.0 670 0.3692 0.7512 0.8189 0.9824 nan 0.6451 0.9927 nan 0.5204 0.9821
0.3604 68.0 680 0.3691 0.7520 0.8152 0.9827 nan 0.6371 0.9933 nan 0.5215 0.9824
0.4476 69.0 690 0.3679 0.7516 0.8195 0.9825 nan 0.6463 0.9927 nan 0.5210 0.9821
0.3535 70.0 700 0.3589 0.7522 0.8097 0.9831 nan 0.6255 0.9939 nan 0.5217 0.9827
0.3539 71.0 710 0.3572 0.7526 0.8091 0.9831 nan 0.6242 0.9941 nan 0.5224 0.9828
0.3675 72.0 720 0.3589 0.7518 0.8100 0.9830 nan 0.6261 0.9939 nan 0.5209 0.9827
0.4148 73.0 730 0.3536 0.7504 0.8093 0.9828 nan 0.6249 0.9937 nan 0.5182 0.9825
0.3941 74.0 740 0.3538 0.7497 0.8099 0.9827 nan 0.6263 0.9936 nan 0.5169 0.9824
0.4264 75.0 750 0.3595 0.7469 0.8197 0.9818 nan 0.6473 0.9920 nan 0.5123 0.9814
0.3815 76.0 760 0.3525 0.7492 0.8097 0.9827 nan 0.6258 0.9935 nan 0.5162 0.9823
0.3459 77.0 770 0.3443 0.7452 0.7926 0.9831 nan 0.5901 0.9951 nan 0.5076 0.9828
0.3794 78.0 780 0.3538 0.7501 0.8154 0.9825 nan 0.6377 0.9930 nan 0.5180 0.9821
0.3761 79.0 790 0.3525 0.7483 0.8169 0.9821 nan 0.6412 0.9925 nan 0.5147 0.9818
0.3612 80.0 800 0.3495 0.7513 0.8128 0.9828 nan 0.6321 0.9934 nan 0.5201 0.9824
0.405 81.0 810 0.3466 0.7502 0.8148 0.9825 nan 0.6365 0.9931 nan 0.5182 0.9822
0.4289 82.0 820 0.3458 0.7498 0.8092 0.9828 nan 0.6247 0.9937 nan 0.5171 0.9824
0.3523 83.0 830 0.3435 0.7503 0.8112 0.9827 nan 0.6288 0.9935 nan 0.5183 0.9824
0.4254 84.0 840 0.3403 0.7495 0.8000 0.9832 nan 0.6052 0.9947 nan 0.5160 0.9829
0.3399 85.0 850 0.3355 0.7492 0.8003 0.9832 nan 0.6059 0.9947 nan 0.5155 0.9829
0.3251 86.0 860 0.3395 0.7503 0.8028 0.9832 nan 0.6111 0.9945 nan 0.5178 0.9829
0.3748 87.0 870 0.3400 0.7502 0.8117 0.9827 nan 0.6299 0.9934 nan 0.5181 0.9824
0.4398 88.0 880 0.3450 0.7527 0.8197 0.9826 nan 0.6466 0.9928 nan 0.5231 0.9822
0.3782 89.0 890 0.3454 0.7547 0.8180 0.9829 nan 0.6426 0.9933 nan 0.5268 0.9826
0.4318 90.0 900 0.3424 0.7541 0.8162 0.9830 nan 0.6390 0.9934 nan 0.5255 0.9826
0.3428 91.0 910 0.3327 0.7541 0.8124 0.9832 nan 0.6309 0.9939 nan 0.5253 0.9828
0.4303 92.0 920 0.3364 0.7525 0.8108 0.9830 nan 0.6277 0.9939 nan 0.5223 0.9827
0.3624 93.0 930 0.3277 0.7531 0.8063 0.9834 nan 0.6182 0.9945 nan 0.5231 0.9830
0.3418 94.0 940 0.3315 0.7548 0.8125 0.9833 nan 0.6311 0.9940 nan 0.5267 0.9829
0.321 95.0 950 0.3266 0.7541 0.8070 0.9835 nan 0.6195 0.9945 nan 0.5251 0.9831
0.3152 96.0 960 0.3265 0.7531 0.8025 0.9836 nan 0.6101 0.9949 nan 0.5230 0.9833
0.3153 97.0 970 0.3263 0.7537 0.8048 0.9835 nan 0.6149 0.9947 nan 0.5243 0.9832
0.3158 98.0 980 0.3299 0.7553 0.8139 0.9832 nan 0.6340 0.9939 nan 0.5278 0.9829
0.3162 99.0 990 0.3248 0.7546 0.8076 0.9835 nan 0.6207 0.9945 nan 0.5260 0.9832
0.3748 100.0 1000 0.3238 0.7553 0.8077 0.9836 nan 0.6208 0.9946 nan 0.5274 0.9833
0.3598 101.0 1010 0.3221 0.7544 0.8096 0.9833 nan 0.6250 0.9943 nan 0.5257 0.9830
0.3245 102.0 1020 0.3247 0.7527 0.8156 0.9828 nan 0.6380 0.9933 nan 0.5228 0.9825
0.3527 103.0 1030 0.3275 0.7537 0.8193 0.9827 nan 0.6456 0.9930 nan 0.5250 0.9824
0.5087 104.0 1040 0.3221 0.7559 0.8105 0.9835 nan 0.6266 0.9944 nan 0.5287 0.9832
0.3331 105.0 1050 0.3183 0.7560 0.8064 0.9837 nan 0.6180 0.9948 nan 0.5285 0.9834
0.324 106.0 1060 0.3198 0.7561 0.8090 0.9836 nan 0.6235 0.9946 nan 0.5289 0.9833
0.3512 107.0 1070 0.3194 0.7549 0.8052 0.9836 nan 0.6155 0.9949 nan 0.5265 0.9833
0.3274 108.0 1080 0.3185 0.7569 0.8122 0.9835 nan 0.6301 0.9943 nan 0.5306 0.9832
0.335 109.0 1090 0.3177 0.7554 0.8081 0.9836 nan 0.6217 0.9946 nan 0.5276 0.9832
0.3581 110.0 1100 0.3204 0.7568 0.8146 0.9834 nan 0.6352 0.9940 nan 0.5306 0.9831
0.3307 111.0 1110 0.3216 0.7571 0.8138 0.9835 nan 0.6335 0.9941 nan 0.5310 0.9832
0.3162 112.0 1120 0.3227 0.7575 0.8181 0.9833 nan 0.6425 0.9937 nan 0.5320 0.9830
0.3687 113.0 1130 0.3188 0.7567 0.8124 0.9835 nan 0.6306 0.9942 nan 0.5302 0.9832
0.4099 114.0 1140 0.3151 0.7550 0.8063 0.9836 nan 0.6178 0.9947 nan 0.5266 0.9833
0.3283 115.0 1150 0.3152 0.7557 0.8088 0.9836 nan 0.6232 0.9945 nan 0.5281 0.9832
0.3118 116.0 1160 0.3180 0.7556 0.8097 0.9835 nan 0.6249 0.9944 nan 0.5280 0.9832
0.3233 117.0 1170 0.3164 0.7551 0.8070 0.9836 nan 0.6192 0.9947 nan 0.5269 0.9833
0.3401 118.0 1180 0.3192 0.7562 0.8122 0.9834 nan 0.6303 0.9942 nan 0.5292 0.9831
0.3867 119.0 1190 0.3199 0.7566 0.8160 0.9833 nan 0.6382 0.9938 nan 0.5302 0.9830
0.3217 120.0 1200 0.3221 0.7557 0.8092 0.9835 nan 0.6240 0.9945 nan 0.5281 0.9832
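
The metric columns above match the output format of the evaluate library's mean_iou metric. Below is a minimal sketch of how such numbers are computed on toy label maps; the label layout (unlabeled / safe / unsafe) and the ignore_index value are assumptions, not recovered from the original training script:

```python
import numpy as np
import evaluate

# Sketch only: the num_labels / ignore_index choices below are assumptions.
metric = evaluate.load("mean_iou")

predictions = [np.array([[1, 2], [2, 2]])]  # toy predicted label maps
references = [np.array([[1, 2], [1, 2]])]   # toy ground-truth label maps

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=3,       # unlabeled, safe, unsafe
    ignore_index=255,
    reduce_labels=False,
)
print(results["mean_iou"], results["per_category_iou"])
```

A class absent from both predictions and references yields nan for that category, which is consistent with the nan entries in the Unlabeled columns above.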

Framework versions

  • Transformers 4.30.2
  • Pytorch 2.0.1+cu117
  • Datasets 2.13.1
  • Tokenizers 0.13.3