---
license: other
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: safety-utcustom-train-SF-RGB-b5
  results: []
---

# safety-utcustom-train-SF-RGB-b5

This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the sam1120/safety-utcustom-TRAIN dataset. It achieves the following results on the evaluation set:

- Loss: 0.2537
- Mean Iou: 0.4972
- Mean Accuracy: 0.8434
- Overall Accuracy: 0.9802
- Accuracy Unlabeled: nan
- Accuracy Safe: 0.6980
- Accuracy Unsafe: 0.9888
- Iou Unlabeled: 0.0
- Iou Safe: 0.5118
- Iou Unsafe: 0.9798
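The card itself does not show usage, but a SegFormer checkpoint like this one is typically run through `SegformerForSemanticSegmentation`. A minimal sketch: the tiny randomly initialized config below is a stand-in so the snippet runs offline, and the repo id in the comment is an assumption inferred from the card name, not a verified path.

```python
import torch
from transformers import SegformerConfig, SegformerForSemanticSegmentation

# In practice you would load the published weights, e.g.
#   SegformerForSemanticSegmentation.from_pretrained("sam1120/safety-utcustom-train-SF-RGB-b5")
# (repo id assumed from the card name). A tiny random config keeps this sketch offline:
config = SegformerConfig(
    num_labels=3,  # unlabeled / safe / unsafe, matching the metrics in this card
    depths=[1, 1, 1, 1],
    hidden_sizes=[8, 16, 20, 32],
    decoder_hidden_size=32,
)
model = SegformerForSemanticSegmentation(config).eval()

pixel_values = torch.rand(1, 3, 128, 128)  # a normalized RGB batch
with torch.no_grad():
    logits = model(pixel_values=pixel_values).logits  # (1, num_labels, H/4, W/4)

# Upsample to input resolution and take the per-pixel argmax for a label map.
upsampled = torch.nn.functional.interpolate(
    logits, size=(128, 128), mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # (128, 128) tensor of class ids
```

With the real checkpoint, the predicted ids would map to the unlabeled/safe/unsafe classes evaluated above.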

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-06
- train_batch_size: 15
- eval_batch_size: 15
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 110
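For intuition, the `linear` scheduler with `lr_scheduler_warmup_ratio: 0.05` ramps the learning rate up over the first ~5% of optimizer steps and then decays it linearly to zero. A pure-Python sketch; the 1210 total steps are read off the training-results table, and the exact warmup rounding inside `transformers` may differ slightly:

```python
BASE_LR = 3e-06
TOTAL_STEPS = 1210                      # 110 epochs x ~11 optimizer steps/epoch (from the table)
WARMUP_STEPS = int(0.05 * TOTAL_STEPS)  # lr_scheduler_warmup_ratio: 0.05 -> 60 steps

def lr_at(step: int) -> float:
    """Linear warmup to BASE_LR, then linear decay to 0 (transformers' `linear` schedule)."""
    if step < WARMUP_STEPS:
        return BASE_LR * step / max(1, WARMUP_STEPS)
    return BASE_LR * max(0.0, (TOTAL_STEPS - step) / max(1, TOTAL_STEPS - WARMUP_STEPS))
```

The peak learning rate of 3e-06 is therefore reached around step 60 (epoch ~5.5) and falls back to zero by the final step.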

### Training results

| Training Loss | Epoch | Step | Accuracy Safe | Accuracy Unlabeled | Accuracy Unsafe | Iou Safe | Iou Unlabeled | Iou Unsafe | Validation Loss | Mean Accuracy | Mean Iou | Overall Accuracy |
|:-------------:|:-----:|:----:|:-------------:|:------------------:|:---------------:|:--------:|:-------------:|:----------:|:---------------:|:-------------:|:--------:|:----------------:|
| 1.2239 | 0.91 | 10 | 0.3992 | nan | 0.2951 | 0.0314 | 0.0 | 0.2939 | 1.1103 | 0.3472 | 0.1084 | 0.2982 |
| 1.1948 | 1.82 | 20 | 0.5219 | nan | 0.3705 | 0.0440 | 0.0 | 0.3689 | 1.0963 | 0.4462 | 0.1376 | 0.3750 |
| 1.1661 | 2.73 | 30 | 0.5863 | nan | 0.4988 | 0.0647 | 0.0 | 0.4961 | 1.0516 | 0.5426 | 0.1870 | 0.5014 |
| 1.1112 | 3.64 | 40 | 0.5459 | nan | 0.5794 | 0.0900 | 0.0 | 0.5754 | 1.0048 | 0.5626 | 0.2218 | 0.5784 |
| 1.0907 | 4.55 | 50 | 0.5993 | nan | 0.6367 | 0.1094 | 0.0 | 0.6321 | 0.9690 | 0.6180 | 0.2472 | 0.6356 |
| 1.047 | 5.45 | 60 | 0.6692 | nan | 0.6699 | 0.1159 | 0.0 | 0.6656 | 0.9437 | 0.6695 | 0.2605 | 0.6699 |
| 1.0112 | 6.36 | 70 | 0.6673 | nan | 0.7189 | 0.1349 | 0.0 | 0.7137 | 0.9084 | 0.6931 | 0.2829 | 0.7173 |
| 0.9925 | 7.27 | 80 | 0.6842 | nan | 0.7665 | 0.1452 | 0.0 | 0.7605 | 0.8647 | 0.7254 | 0.3019 | 0.7641 |
| 0.9395 | 8.18 | 90 | 0.6818 | nan | 0.7921 | 0.1620 | 0.0 | 0.7856 | 0.8319 | 0.7369 | 0.3159 | 0.7888 |
| 0.8902 | 9.09 | 100 | 0.6806 | nan | 0.8142 | 0.1770 | 0.0 | 0.8072 | 0.8014 | 0.7474 | 0.3281 | 0.8102 |
| 0.9057 | 10.0 | 110 | 0.6984 | nan | 0.8179 | 0.1733 | 0.0 | 0.8109 | 0.7867 | 0.7581 | 0.3281 | 0.8143 |
| 0.8321 | 10.91 | 120 | 0.6744 | nan | 0.8494 | 0.1862 | 0.0 | 0.8413 | 0.7440 | 0.7619 | 0.3425 | 0.8442 |
| 0.8152 | 11.82 | 130 | 0.6688 | nan | 0.8590 | 0.2006 | 0.0 | 0.8507 | 0.7270 | 0.7639 | 0.3504 | 0.8534 |
| 0.7929 | 12.73 | 140 | 0.6660 | nan | 0.8657 | 0.2085 | 0.0 | 0.8572 | 0.7045 | 0.7658 | 0.3553 | 0.8598 |
| 0.7568 | 13.64 | 150 | 0.6571 | nan | 0.8838 | 0.2185 | 0.0 | 0.8748 | 0.6744 | 0.7704 | 0.3644 | 0.8771 |
| 0.7085 | 14.55 | 160 | 0.6519 | nan | 0.8934 | 0.2260 | 0.0 | 0.8842 | 0.6556 | 0.7727 | 0.3701 | 0.8863 |
| 0.7147 | 15.45 | 170 | 0.6561 | nan | 0.8964 | 0.2283 | 0.0 | 0.8872 | 0.6509 | 0.7762 | 0.3718 | 0.8893 |
| 0.6991 | 16.36 | 180 | 0.6620 | nan | 0.8964 | 0.2267 | 0.0 | 0.8874 | 0.6502 | 0.7792 | 0.3714 | 0.8895 |
| 0.6357 | 17.27 | 190 | 0.6612 | nan | 0.9051 | 0.2411 | 0.0 | 0.8960 | 0.6230 | 0.7831 | 0.3790 | 0.8979 |
| 0.6815 | 18.18 | 200 | 0.6484 | nan | 0.9178 | 0.2594 | 0.0 | 0.9082 | 0.5993 | 0.7831 | 0.3892 | 0.9098 |
| 0.6398 | 19.09 | 210 | 0.6414 | nan | 0.9258 | 0.2682 | 0.0 | 0.9159 | 0.5785 | 0.7836 | 0.3947 | 0.9174 |
| 0.5845 | 20.0 | 220 | 0.6426 | nan | 0.9286 | 0.2698 | 0.0 | 0.9187 | 0.5641 | 0.7856 | 0.3962 | 0.9202 |
| 0.6062 | 20.91 | 230 | 0.6520 | nan | 0.9252 | 0.2641 | 0.0 | 0.9156 | 0.5693 | 0.7886 | 0.3932 | 0.9171 |
| 0.6071 | 21.82 | 240 | 0.6592 | nan | 0.9283 | 0.2675 | 0.0 | 0.9188 | 0.5627 | 0.7937 | 0.3955 | 0.9203 |
| 0.6209 | 22.73 | 250 | 0.6619 | nan | 0.9300 | 0.2724 | 0.0 | 0.9205 | 0.5632 | 0.7959 | 0.3977 | 0.9220 |
| 0.5609 | 23.64 | 260 | 0.6505 | nan | 0.9379 | 0.2868 | 0.0 | 0.9281 | 0.5416 | 0.7942 | 0.4050 | 0.9294 |
| 0.5752 | 24.55 | 270 | 0.6412 | nan | 0.9451 | 0.2983 | 0.0 | 0.9350 | 0.5141 | 0.7932 | 0.4111 | 0.9362 |
| 0.6004 | 25.45 | 280 | 0.6492 | nan | 0.9412 | 0.2907 | 0.0 | 0.9313 | 0.5255 | 0.7952 | 0.4073 | 0.9326 |
| 0.5524 | 26.36 | 290 | 0.6588 | nan | 0.9387 | 0.2868 | 0.0 | 0.9291 | 0.5314 | 0.7987 | 0.4053 | 0.9304 |
| 0.5758 | 27.27 | 300 | 0.6544 | nan | 0.9423 | 0.2913 | 0.0 | 0.9326 | 0.5268 | 0.7984 | 0.4080 | 0.9338 |
| 0.5598 | 28.18 | 310 | 0.6605 | nan | 0.9408 | 0.2897 | 0.0 | 0.9312 | 0.5240 | 0.8006 | 0.4070 | 0.9325 |
| 0.5505 | 29.09 | 320 | 0.6582 | nan | 0.9421 | 0.2959 | 0.0 | 0.9324 | 0.5165 | 0.8002 | 0.4094 | 0.9337 |
| 0.5754 | 30.0 | 330 | 0.6578 | nan | 0.9433 | 0.2959 | 0.0 | 0.9336 | 0.5145 | 0.8005 | 0.4098 | 0.9348 |
| 0.5284 | 30.91 | 340 | 0.6719 | nan | 0.9411 | 0.2941 | 0.0 | 0.9318 | 0.5175 | 0.8065 | 0.4086 | 0.9331 |
| 0.5463 | 31.82 | 350 | 0.6684 | nan | 0.9448 | 0.3020 | 0.0 | 0.9354 | 0.5016 | 0.8066 | 0.4125 | 0.9367 |
| 0.4923 | 32.73 | 360 | 0.6688 | nan | 0.9463 | 0.3066 | 0.0 | 0.9369 | 0.4947 | 0.8075 | 0.4145 | 0.9381 |
| 0.4922 | 33.64 | 370 | 0.6685 | nan | 0.9504 | 0.3165 | 0.0 | 0.9409 | 0.4738 | 0.8094 | 0.4191 | 0.9420 |
| 0.4976 | 34.55 | 380 | 0.6748 | nan | 0.9535 | 0.3233 | 0.0 | 0.9443 | 0.4663 | 0.8142 | 0.4225 | 0.9453 |
| 0.4922 | 35.45 | 390 | 0.6509 | nan | 0.9653 | 0.3484 | 0.0 | 0.9552 | 0.4295 | 0.8081 | 0.4345 | 0.9560 |
| 0.4608 | 36.36 | 400 | 0.6580 | nan | 0.9637 | 0.3507 | 0.0 | 0.9538 | 0.4434 | 0.8109 | 0.4348 | 0.9547 |
| 0.4836 | 37.27 | 410 | 0.6522 | nan | 0.9662 | 0.3588 | 0.0 | 0.9561 | 0.4328 | 0.8092 | 0.4383 | 0.9569 |
| 0.459 | 38.18 | 420 | 0.6477 | nan | 0.9691 | 0.3632 | 0.0 | 0.9588 | 0.4211 | 0.8084 | 0.4407 | 0.9596 |
| 0.4528 | 39.09 | 430 | 0.6593 | nan | 0.9668 | 0.3574 | 0.0 | 0.9569 | 0.4239 | 0.8131 | 0.4381 | 0.9577 |
| 0.4202 | 40.0 | 440 | 0.6572 | nan | 0.9689 | 0.3650 | 0.0 | 0.9590 | 0.4141 | 0.8130 | 0.4413 | 0.9597 |
| 0.4805 | 40.91 | 450 | 0.6470 | nan | 0.9724 | 0.3754 | 0.0 | 0.9621 | 0.4012 | 0.8097 | 0.4458 | 0.9628 |
| 0.4611 | 41.82 | 460 | 0.6525 | nan | 0.9718 | 0.3716 | 0.0 | 0.9617 | 0.4025 | 0.8122 | 0.4444 | 0.9624 |
| 0.4339 | 42.73 | 470 | 0.6487 | nan | 0.9726 | 0.3744 | 0.0 | 0.9624 | 0.3951 | 0.8107 | 0.4456 | 0.9631 |
| 0.4361 | 43.64 | 480 | 0.6448 | nan | 0.9740 | 0.3769 | 0.0 | 0.9636 | 0.3946 | 0.8094 | 0.4468 | 0.9643 |
| 0.4416 | 44.55 | 490 | 0.6447 | nan | 0.9746 | 0.3783 | 0.0 | 0.9642 | 0.3871 | 0.8097 | 0.4475 | 0.9649 |
| 0.4524 | 45.45 | 500 | 0.6589 | nan | 0.9712 | 0.3701 | 0.0 | 0.9612 | 0.4025 | 0.8151 | 0.4438 | 0.9620 |
| 0.4319 | 46.36 | 510 | 0.6730 | nan | 0.9673 | 0.3594 | 0.0 | 0.9578 | 0.4169 | 0.8202 | 0.4391 | 0.9586 |
| 0.4224 | 47.27 | 520 | 0.6603 | nan | 0.9712 | 0.3716 | 0.0 | 0.9613 | 0.3986 | 0.8158 | 0.4443 | 0.9620 |
| 0.4333 | 48.18 | 530 | 0.6650 | nan | 0.9703 | 0.3724 | 0.0 | 0.9605 | 0.4038 | 0.8176 | 0.4443 | 0.9612 |
| 0.3916 | 49.09 | 540 | 0.6624 | nan | 0.9724 | 0.3781 | 0.0 | 0.9626 | 0.3968 | 0.8174 | 0.4469 | 0.9633 |
| 0.4803 | 50.0 | 550 | 0.6680 | nan | 0.9726 | 0.3809 | 0.0 | 0.9629 | 0.3942 | 0.8203 | 0.4479 | 0.9636 |
| 0.3543 | 50.91 | 560 | 0.6473 | nan | 0.9777 | 0.3952 | 0.0 | 0.9673 | 0.3697 | 0.8125 | 0.4542 | 0.9680 |
| 0.3684 | 51.82 | 570 | 0.6515 | nan | 0.9772 | 0.3951 | 0.0 | 0.9670 | 0.3708 | 0.8143 | 0.4540 | 0.9676 |
| 0.4004 | 52.73 | 580 | 0.6437 | nan | 0.9793 | 0.4014 | 0.0 | 0.9688 | 0.3585 | 0.8115 | 0.4567 | 0.9694 |
| 0.3656 | 53.64 | 590 | 0.6559 | nan | 0.9780 | 0.4010 | 0.0 | 0.9679 | 0.3654 | 0.8169 | 0.4563 | 0.9685 |
| 0.3918 | 54.55 | 600 | 0.6432 | nan | 0.9809 | 0.4115 | 0.0 | 0.9704 | 0.3527 | 0.8121 | 0.4606 | 0.9709 |
| 0.3741 | 55.45 | 610 | 0.6393 | nan | 0.9827 | 0.4185 | 0.0 | 0.9720 | 0.3361 | 0.8110 | 0.4635 | 0.9726 |
| 0.3656 | 56.36 | 620 | 0.6540 | nan | 0.9807 | 0.4147 | 0.0 | 0.9705 | 0.3473 | 0.8174 | 0.4617 | 0.9710 |
| 0.3341 | 57.27 | 630 | 0.6258 | nan | 0.9845 | 0.4247 | 0.0 | 0.9734 | 0.3335 | 0.8052 | 0.4660 | 0.9739 |
| 0.3669 | 58.18 | 640 | 0.6495 | nan | 0.9815 | 0.4190 | 0.0 | 0.9712 | 0.3395 | 0.8155 | 0.4634 | 0.9717 |
| 0.3347 | 59.09 | 650 | 0.6612 | nan | 0.9800 | 0.4174 | 0.0 | 0.9700 | 0.3416 | 0.8206 | 0.4625 | 0.9706 |
| 0.4287 | 60.0 | 660 | 0.6673 | nan | 0.9797 | 0.4185 | 0.0 | 0.9699 | 0.3419 | 0.8235 | 0.4628 | 0.9705 |
| 0.3838 | 60.91 | 670 | 0.6611 | nan | 0.9812 | 0.4227 | 0.0 | 0.9712 | 0.3381 | 0.8211 | 0.4646 | 0.9718 |
| 0.352 | 61.82 | 680 | 0.6407 | nan | 0.9845 | 0.4318 | 0.0 | 0.9738 | 0.3216 | 0.8126 | 0.4685 | 0.9743 |
| 0.3343 | 62.73 | 690 | 0.6499 | nan | 0.9837 | 0.4311 | 0.0 | 0.9733 | 0.3275 | 0.8168 | 0.4681 | 0.9738 |
| 0.3443 | 63.64 | 700 | 0.6528 | nan | 0.9836 | 0.4324 | 0.0 | 0.9733 | 0.3273 | 0.8182 | 0.4686 | 0.9738 |
| 0.3183 | 64.55 | 710 | 0.6456 | nan | 0.9848 | 0.4367 | 0.0 | 0.9743 | 0.3155 | 0.8152 | 0.4703 | 0.9748 |
| 0.3346 | 65.45 | 720 | 0.6517 | nan | 0.9841 | 0.4356 | 0.0 | 0.9738 | 0.3212 | 0.8179 | 0.4698 | 0.9743 |
| 0.3225 | 66.36 | 730 | 0.6367 | nan | 0.9863 | 0.4432 | 0.0 | 0.9755 | 0.3052 | 0.8115 | 0.4729 | 0.9759 |
| 0.3792 | 67.27 | 740 | 0.6381 | nan | 0.9861 | 0.4429 | 0.0 | 0.9753 | 0.3037 | 0.8121 | 0.4728 | 0.9758 |
| 0.3177 | 68.18 | 750 | 0.6345 | nan | 0.9865 | 0.4446 | 0.0 | 0.9756 | 0.2989 | 0.8105 | 0.4734 | 0.9761 |
| 0.3295 | 69.09 | 760 | 0.6404 | nan | 0.9859 | 0.4426 | 0.0 | 0.9752 | 0.3064 | 0.8131 | 0.4726 | 0.9757 |
| 0.3847 | 70.0 | 770 | 0.6429 | nan | 0.9857 | 0.4439 | 0.0 | 0.9751 | 0.3054 | 0.8143 | 0.4730 | 0.9756 |
| 0.3406 | 70.91 | 780 | 0.6443 | nan | 0.9862 | 0.4476 | 0.0 | 0.9756 | 0.3075 | 0.8152 | 0.4744 | 0.9761 |
| 0.3847 | 71.82 | 790 | 0.6343 | nan | 0.9877 | 0.4546 | 0.0 | 0.9769 | 0.2911 | 0.8110 | 0.4772 | 0.9773 |
| 0.3292 | 72.73 | 800 | 0.6328 | nan | 0.9881 | 0.4567 | 0.0 | 0.9771 | 0.2905 | 0.8105 | 0.4779 | 0.9776 |
| 0.3156 | 73.64 | 810 | 0.6318 | nan | 0.9882 | 0.4579 | 0.0 | 0.9773 | 0.2865 | 0.8100 | 0.4784 | 0.9777 |
| 0.3106 | 74.55 | 820 | 0.6333 | nan | 0.9884 | 0.4600 | 0.0 | 0.9775 | 0.2812 | 0.8109 | 0.4792 | 0.9779 |
| 0.3004 | 75.45 | 830 | 0.6232 | nan | 0.9893 | 0.4632 | 0.0 | 0.9781 | 0.2798 | 0.8063 | 0.4804 | 0.9785 |
| 0.3336 | 76.36 | 840 | 0.6485 | nan | 0.9872 | 0.4593 | 0.0 | 0.9768 | 0.2954 | 0.8178 | 0.4787 | 0.9772 |
| 0.299 | 77.27 | 850 | 0.6490 | nan | 0.9874 | 0.4613 | 0.0 | 0.9769 | 0.2909 | 0.8182 | 0.4794 | 0.9774 |
| 0.292 | 78.18 | 860 | 0.6497 | nan | 0.9875 | 0.4629 | 0.0 | 0.9771 | 0.2853 | 0.8186 | 0.4800 | 0.9775 |
| 0.2922 | 79.09 | 870 | 0.6586 | nan | 0.9866 | 0.4601 | 0.0 | 0.9765 | 0.2917 | 0.8226 | 0.4789 | 0.9770 |
| 0.3583 | 80.0 | 880 | 0.6515 | nan | 0.9876 | 0.4644 | 0.0 | 0.9772 | 0.2876 | 0.8195 | 0.4805 | 0.9776 |
| 0.293 | 80.91 | 890 | 0.6465 | nan | 0.9882 | 0.4674 | 0.0 | 0.9777 | 0.2767 | 0.8173 | 0.4817 | 0.9781 |
| 0.3287 | 81.82 | 900 | 0.6518 | nan | 0.9876 | 0.4652 | 0.0 | 0.9773 | 0.2858 | 0.8197 | 0.4808 | 0.9777 |
| 0.3067 | 82.73 | 910 | 0.6528 | nan | 0.9875 | 0.4654 | 0.0 | 0.9772 | 0.2861 | 0.8202 | 0.4809 | 0.9776 |
| 0.3374 | 83.64 | 920 | 0.6577 | nan | 0.9870 | 0.4631 | 0.0 | 0.9768 | 0.2869 | 0.8224 | 0.4800 | 0.9773 |
| 0.3171 | 84.55 | 930 | 0.6442 | nan | 0.9887 | 0.4714 | 0.0 | 0.9781 | 0.2706 | 0.8164 | 0.4832 | 0.9785 |
| 0.3156 | 85.45 | 940 | 0.6321 | nan | 0.9899 | 0.4757 | 0.0 | 0.9789 | 0.2708 | 0.8110 | 0.4849 | 0.9793 |
| 0.2749 | 86.36 | 950 | 0.6518 | nan | 0.9887 | 0.4765 | 0.0 | 0.9783 | 0.2760 | 0.8202 | 0.4850 | 0.9787 |
| 0.2725 | 87.27 | 960 | 0.6681 | nan | 0.9876 | 0.4765 | 0.0 | 0.9777 | 0.2780 | 0.8279 | 0.4847 | 0.9782 |
| 0.2948 | 88.18 | 970 | 0.6565 | nan | 0.9891 | 0.4845 | 0.0 | 0.9788 | 0.2636 | 0.8228 | 0.4878 | 0.9793 |
| 0.2972 | 89.09 | 980 | 0.6722 | nan | 0.9879 | 0.4829 | 0.0 | 0.9782 | 0.2770 | 0.8301 | 0.4870 | 0.9786 |
| 0.3101 | 90.0 | 990 | 0.6711 | nan | 0.9882 | 0.4859 | 0.0 | 0.9784 | 0.2765 | 0.8297 | 0.4881 | 0.9788 |
| 0.2874 | 90.91 | 1000 | 0.6689 | nan | 0.9888 | 0.4899 | 0.0 | 0.9789 | 0.2690 | 0.8288 | 0.4896 | 0.9793 |
| 0.275 | 91.82 | 1010 | 0.6542 | nan | 0.9901 | 0.4947 | 0.0 | 0.9798 | 0.2593 | 0.8221 | 0.4915 | 0.9802 |
| 0.2711 | 92.73 | 1020 | 0.6673 | nan | 0.9893 | 0.4957 | 0.0 | 0.9794 | 0.2608 | 0.8283 | 0.4917 | 0.9798 |
| 0.2691 | 93.64 | 1030 | 0.6819 | nan | 0.9884 | 0.4954 | 0.0 | 0.9789 | 0.2609 | 0.8352 | 0.4915 | 0.9794 |
| 0.274 | 94.55 | 1040 | 0.6722 | nan | 0.9895 | 0.5007 | 0.0 | 0.9797 | 0.2542 | 0.8309 | 0.4935 | 0.9801 |
| 0.27 | 95.45 | 1050 | 0.6436 | nan | 0.9919 | 0.5087 | 0.0 | 0.9812 | 0.2357 | 0.8177 | 0.4966 | 0.9816 |
| 0.255 | 96.36 | 1060 | 0.6671 | nan | 0.9903 | 0.5074 | 0.0 | 0.9804 | 0.2460 | 0.8287 | 0.4959 | 0.9808 |
| 0.2756 | 97.27 | 1070 | 0.6634 | nan | 0.9909 | 0.5113 | 0.0 | 0.9809 | 0.2411 | 0.8271 | 0.4974 | 0.9812 |
| 0.2473 | 98.18 | 1080 | 0.6724 | nan | 0.9904 | 0.5116 | 0.0 | 0.9806 | 0.2447 | 0.8314 | 0.4974 | 0.9810 |
| 0.266 | 99.09 | 1090 | 0.6778 | nan | 0.9901 | 0.5118 | 0.0 | 0.9804 | 0.2455 | 0.8339 | 0.4974 | 0.9808 |
| 0.2682 | 100.0 | 1100 | 0.6651 | nan | 0.9911 | 0.5156 | 0.0 | 0.9811 | 0.2359 | 0.8281 | 0.4989 | 0.9815 |
| 0.2607 | 100.91 | 1110 | 0.6739 | nan | 0.9905 | 0.5144 | 0.0 | 0.9808 | 0.2376 | 0.8322 | 0.4984 | 0.9812 |
| 0.2506 | 101.82 | 1120 | 0.6727 | nan | 0.9907 | 0.5153 | 0.0 | 0.9809 | 0.2380 | 0.8317 | 0.4987 | 0.9813 |
| 0.2729 | 102.73 | 1130 | 0.6802 | nan | 0.9902 | 0.5146 | 0.0 | 0.9806 | 0.2375 | 0.8352 | 0.4984 | 0.9810 |
| 0.2348 | 103.64 | 1140 | 0.6731 | nan | 0.9908 | 0.5172 | 0.0 | 0.9810 | 0.2335 | 0.8319 | 0.4994 | 0.9814 |
| 0.2409 | 104.55 | 1150 | 0.6781 | nan | 0.9904 | 0.5164 | 0.0 | 0.9808 | 0.2385 | 0.8343 | 0.4991 | 0.9812 |
| 0.2737 | 105.45 | 1160 | 0.6774 | nan | 0.9906 | 0.5177 | 0.0 | 0.9809 | 0.2357 | 0.8340 | 0.4995 | 0.9813 |
| 0.2857 | 106.36 | 1170 | 0.6659 | nan | 0.9915 | 0.5203 | 0.0 | 0.9815 | 0.2304 | 0.8287 | 0.5006 | 0.9818 |
| 0.2503 | 107.27 | 1180 | 0.6822 | nan | 0.9902 | 0.5172 | 0.0 | 0.9807 | 0.2397 | 0.8362 | 0.4993 | 0.9811 |
| 0.2524 | 108.18 | 1190 | 0.6741 | nan | 0.9909 | 0.5199 | 0.0 | 0.9812 | 0.2294 | 0.8325 | 0.5004 | 0.9816 |
| 0.2722 | 109.09 | 1200 | 0.6928 | nan | 0.9894 | 0.5145 | 0.0 | 0.9802 | 0.2461 | 0.8411 | 0.4982 | 0.9806 |
| 0.2777 | 110.0 | 1210 | 0.6980 | nan | 0.9888 | 0.5118 | 0.0 | 0.9798 | 0.2537 | 0.8434 | 0.4972 | 0.9802 |
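For reference, the aggregate columns are plain means of the per-class columns: NaN classes are excluded from the accuracy mean, while the zero-IoU `unlabeled` class still counts in the IoU mean. Checking the final evaluation row (epoch 110.0) by hand:

```python
# Per-class values from the final evaluation row (epoch 110.0).
acc_safe, acc_unsafe = 0.6980, 0.9888          # accuracy_unlabeled is nan -> skipped
iou_unlabeled, iou_safe, iou_unsafe = 0.0, 0.5118, 0.9798

mean_accuracy = (acc_safe + acc_unsafe) / 2
mean_iou = (iou_unlabeled + iou_safe + iou_unsafe) / 3

print(round(mean_accuracy, 4))  # 0.8434
print(round(mean_iou, 4))       # 0.4972
```

The permanently zero `unlabeled` IoU is why the reported Mean Iou (0.4972) sits well below the IoUs of the two real classes.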

### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3