---
license: other
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: safety-utcustom-train-SF-RGB-b5
  results: []
---

# safety-utcustom-train-SF-RGB-b5

This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the [sam1120/safety-utcustom-TRAIN](https://huggingface.co/datasets/sam1120/safety-utcustom-TRAIN) dataset. It achieves the following results on the evaluation set:

- Loss: 0.3064
- Mean Iou: 0.4721
- Mean Accuracy: 0.8144
- Overall Accuracy: 0.9753
- Accuracy Unlabeled: nan
- Accuracy Safe: 0.6433
- Accuracy Unsafe: 0.9854
- Iou Unlabeled: 0.0
- Iou Safe: 0.4415
- Iou Unsafe: 0.9748

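As a quick start, here is a minimal inference sketch using Transformers. The checkpoint id below is assumed from this card's name, and the class indices follow the unlabeled/safe/unsafe metrics above; adjust both to the actual repository if they differ.

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Assumed Hub id, inferred from the model name on this card.
checkpoint = "sam1120/safety-utcustom-train-SF-RGB-b5"

processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # any RGB input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 of the input resolution; upsample before the argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]  # per-pixel class ids
```
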
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-06
- train_batch_size: 15
- eval_batch_size: 15
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 70

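For orientation, these settings map roughly onto `transformers.TrainingArguments` as sketched below. This is illustrative only: the output directory and any options not listed above are assumptions, not taken from the original training script.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="safety-utcustom-train-SF-RGB-b5",  # illustrative
    learning_rate=3e-06,
    per_device_train_batch_size=15,
    per_device_eval_batch_size=15,
    seed=42,
    adam_beta1=0.9,           # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,        # lr_scheduler_warmup_ratio
    num_train_epochs=70,
)
```
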
### Training results

| Training Loss | Epoch | Step | Accuracy Safe | Accuracy Unlabeled | Accuracy Unsafe | Iou Safe | Iou Unlabeled | Iou Unsafe | Validation Loss | Mean Accuracy | Mean Iou | Overall Accuracy |
|:-------------:|:-----:|:----:|:-------------:|:------------------:|:---------------:|:--------:|:-------------:|:----------:|:---------------:|:-------------:|:--------:|:----------------:|
| 1.2239        | 0.91  | 10   | 0.3992        | nan                | 0.2951          | 0.0314   | 0.0           | 0.2939     | 1.1103          | 0.3472        | 0.1084   | 0.2982           |
| 1.1948        | 1.82  | 20   | 0.5219        | nan                | 0.3705          | 0.0440   | 0.0           | 0.3689     | 1.0963          | 0.4462        | 0.1376   | 0.3750           |
| 1.1661        | 2.73  | 30   | 0.5863        | nan                | 0.4988          | 0.0647   | 0.0           | 0.4961     | 1.0516          | 0.5426        | 0.1870   | 0.5014           |
| 1.1112        | 3.64  | 40   | 0.5459        | nan                | 0.5794          | 0.0900   | 0.0           | 0.5754     | 1.0048          | 0.5626        | 0.2218   | 0.5784           |
| 1.0907        | 4.55  | 50   | 0.5993        | nan                | 0.6367          | 0.1094   | 0.0           | 0.6321     | 0.9690          | 0.6180        | 0.2472   | 0.6356           |
| 1.047         | 5.45  | 60   | 0.6692        | nan                | 0.6699          | 0.1159   | 0.0           | 0.6656     | 0.9437          | 0.6695        | 0.2605   | 0.6699           |
| 1.0112        | 6.36  | 70   | 0.6673        | nan                | 0.7189          | 0.1349   | 0.0           | 0.7137     | 0.9084          | 0.6931        | 0.2829   | 0.7173           |
| 0.9925        | 7.27  | 80   | 0.6842        | nan                | 0.7665          | 0.1452   | 0.0           | 0.7605     | 0.8647          | 0.7254        | 0.3019   | 0.7641           |
| 0.9395        | 8.18  | 90   | 0.6818        | nan                | 0.7921          | 0.1620   | 0.0           | 0.7856     | 0.8319          | 0.7369        | 0.3159   | 0.7888           |
| 0.8902        | 9.09  | 100  | 0.6806        | nan                | 0.8142          | 0.1770   | 0.0           | 0.8072     | 0.8014          | 0.7474        | 0.3281   | 0.8102           |
| 0.9057        | 10.0  | 110  | 0.6984        | nan                | 0.8179          | 0.1733   | 0.0           | 0.8109     | 0.7867          | 0.7581        | 0.3281   | 0.8143           |
| 0.8321        | 10.91 | 120  | 0.6744        | nan                | 0.8494          | 0.1862   | 0.0           | 0.8413     | 0.7440          | 0.7619        | 0.3425   | 0.8442           |
| 0.8152        | 11.82 | 130  | 0.6688        | nan                | 0.8590          | 0.2006   | 0.0           | 0.8507     | 0.7270          | 0.7639        | 0.3504   | 0.8534           |
| 0.7929        | 12.73 | 140  | 0.6660        | nan                | 0.8657          | 0.2085   | 0.0           | 0.8572     | 0.7045          | 0.7658        | 0.3553   | 0.8598           |
| 0.7568        | 13.64 | 150  | 0.6571        | nan                | 0.8838          | 0.2185   | 0.0           | 0.8748     | 0.6744          | 0.7704        | 0.3644   | 0.8771           |
| 0.7085        | 14.55 | 160  | 0.6519        | nan                | 0.8934          | 0.2260   | 0.0           | 0.8842     | 0.6556          | 0.7727        | 0.3701   | 0.8863           |
| 0.7147        | 15.45 | 170  | 0.6561        | nan                | 0.8964          | 0.2283   | 0.0           | 0.8872     | 0.6509          | 0.7762        | 0.3718   | 0.8893           |
| 0.6991        | 16.36 | 180  | 0.6620        | nan                | 0.8964          | 0.2267   | 0.0           | 0.8874     | 0.6502          | 0.7792        | 0.3714   | 0.8895           |
| 0.6357        | 17.27 | 190  | 0.6612        | nan                | 0.9051          | 0.2411   | 0.0           | 0.8960     | 0.6230          | 0.7831        | 0.3790   | 0.8979           |
| 0.6815        | 18.18 | 200  | 0.6484        | nan                | 0.9178          | 0.2594   | 0.0           | 0.9082     | 0.5993          | 0.7831        | 0.3892   | 0.9098           |
| 0.6398        | 19.09 | 210  | 0.6414        | nan                | 0.9258          | 0.2682   | 0.0           | 0.9159     | 0.5785          | 0.7836        | 0.3947   | 0.9174           |
| 0.5845        | 20.0  | 220  | 0.6426        | nan                | 0.9286          | 0.2698   | 0.0           | 0.9187     | 0.5641          | 0.7856        | 0.3962   | 0.9202           |
| 0.6062        | 20.91 | 230  | 0.6520        | nan                | 0.9252          | 0.2641   | 0.0           | 0.9156     | 0.5693          | 0.7886        | 0.3932   | 0.9171           |
| 0.6071        | 21.82 | 240  | 0.6592        | nan                | 0.9283          | 0.2675   | 0.0           | 0.9188     | 0.5627          | 0.7937        | 0.3955   | 0.9203           |
| 0.6209        | 22.73 | 250  | 0.6619        | nan                | 0.9300          | 0.2724   | 0.0           | 0.9205     | 0.5632          | 0.7959        | 0.3977   | 0.9220           |
| 0.5609        | 23.64 | 260  | 0.6505        | nan                | 0.9379          | 0.2868   | 0.0           | 0.9281     | 0.5416          | 0.7942        | 0.4050   | 0.9294           |
| 0.5752        | 24.55 | 270  | 0.6412        | nan                | 0.9451          | 0.2983   | 0.0           | 0.9350     | 0.5141          | 0.7932        | 0.4111   | 0.9362           |
| 0.6004        | 25.45 | 280  | 0.6492        | nan                | 0.9412          | 0.2907   | 0.0           | 0.9313     | 0.5255          | 0.7952        | 0.4073   | 0.9326           |
| 0.5524        | 26.36 | 290  | 0.6588        | nan                | 0.9387          | 0.2868   | 0.0           | 0.9291     | 0.5314          | 0.7987        | 0.4053   | 0.9304           |
| 0.5758        | 27.27 | 300  | 0.6544        | nan                | 0.9423          | 0.2913   | 0.0           | 0.9326     | 0.5268          | 0.7984        | 0.4080   | 0.9338           |
| 0.5598        | 28.18 | 310  | 0.6605        | nan                | 0.9408          | 0.2897   | 0.0           | 0.9312     | 0.5240          | 0.8006        | 0.4070   | 0.9325           |
| 0.5505        | 29.09 | 320  | 0.6582        | nan                | 0.9421          | 0.2959   | 0.0           | 0.9324     | 0.5165          | 0.8002        | 0.4094   | 0.9337           |
| 0.5754        | 30.0  | 330  | 0.6578        | nan                | 0.9433          | 0.2959   | 0.0           | 0.9336     | 0.5145          | 0.8005        | 0.4098   | 0.9348           |
| 0.5284        | 30.91 | 340  | 0.6719        | nan                | 0.9411          | 0.2941   | 0.0           | 0.9318     | 0.5175          | 0.8065        | 0.4086   | 0.9331           |
| 0.5463        | 31.82 | 350  | 0.6684        | nan                | 0.9448          | 0.3020   | 0.0           | 0.9354     | 0.5016          | 0.8066        | 0.4125   | 0.9367           |
| 0.4923        | 32.73 | 360  | 0.6688        | nan                | 0.9463          | 0.3066   | 0.0           | 0.9369     | 0.4947          | 0.8075        | 0.4145   | 0.9381           |
| 0.4922        | 33.64 | 370  | 0.6685        | nan                | 0.9504          | 0.3165   | 0.0           | 0.9409     | 0.4738          | 0.8094        | 0.4191   | 0.9420           |
| 0.4976        | 34.55 | 380  | 0.6748        | nan                | 0.9535          | 0.3233   | 0.0           | 0.9443     | 0.4663          | 0.8142        | 0.4225   | 0.9453           |
| 0.4922        | 35.45 | 390  | 0.6509        | nan                | 0.9653          | 0.3484   | 0.0           | 0.9552     | 0.4295          | 0.8081        | 0.4345   | 0.9560           |
| 0.4608        | 36.36 | 400  | 0.6580        | nan                | 0.9637          | 0.3507   | 0.0           | 0.9538     | 0.4434          | 0.8109        | 0.4348   | 0.9547           |
| 0.4836        | 37.27 | 410  | 0.6522        | nan                | 0.9662          | 0.3588   | 0.0           | 0.9561     | 0.4328          | 0.8092        | 0.4383   | 0.9569           |
| 0.459         | 38.18 | 420  | 0.6477        | nan                | 0.9691          | 0.3632   | 0.0           | 0.9588     | 0.4211          | 0.8084        | 0.4407   | 0.9596           |
| 0.4528        | 39.09 | 430  | 0.6593        | nan                | 0.9668          | 0.3574   | 0.0           | 0.9569     | 0.4239          | 0.8131        | 0.4381   | 0.9577           |
| 0.4202        | 40.0  | 440  | 0.6572        | nan                | 0.9689          | 0.3650   | 0.0           | 0.9590     | 0.4141          | 0.8130        | 0.4413   | 0.9597           |
| 0.4805        | 40.91 | 450  | 0.6470        | nan                | 0.9724          | 0.3754   | 0.0           | 0.9621     | 0.4012          | 0.8097        | 0.4458   | 0.9628           |
| 0.4611        | 41.82 | 460  | 0.6525        | nan                | 0.9718          | 0.3716   | 0.0           | 0.9617     | 0.4025          | 0.8122        | 0.4444   | 0.9624           |
| 0.4339        | 42.73 | 470  | 0.6487        | nan                | 0.9726          | 0.3744   | 0.0           | 0.9624     | 0.3951          | 0.8107        | 0.4456   | 0.9631           |
| 0.4361        | 43.64 | 480  | 0.6448        | nan                | 0.9740          | 0.3769   | 0.0           | 0.9636     | 0.3946          | 0.8094        | 0.4468   | 0.9643           |
| 0.4416        | 44.55 | 490  | 0.6447        | nan                | 0.9746          | 0.3783   | 0.0           | 0.9642     | 0.3871          | 0.8097        | 0.4475   | 0.9649           |
| 0.4524        | 45.45 | 500  | 0.6589        | nan                | 0.9712          | 0.3701   | 0.0           | 0.9612     | 0.4025          | 0.8151        | 0.4438   | 0.9620           |
| 0.4319        | 46.36 | 510  | 0.6730        | nan                | 0.9673          | 0.3594   | 0.0           | 0.9578     | 0.4169          | 0.8202        | 0.4391   | 0.9586           |
| 0.4224        | 47.27 | 520  | 0.6603        | nan                | 0.9712          | 0.3716   | 0.0           | 0.9613     | 0.3986          | 0.8158        | 0.4443   | 0.9620           |
| 0.4333        | 48.18 | 530  | 0.6650        | nan                | 0.9703          | 0.3724   | 0.0           | 0.9605     | 0.4038          | 0.8176        | 0.4443   | 0.9612           |
| 0.3916        | 49.09 | 540  | 0.6624        | nan                | 0.9724          | 0.3781   | 0.0           | 0.9626     | 0.3968          | 0.8174        | 0.4469   | 0.9633           |
| 0.4803        | 50.0  | 550  | 0.6680        | nan                | 0.9726          | 0.3809   | 0.0           | 0.9629     | 0.3942          | 0.8203        | 0.4479   | 0.9636           |
| 0.3543        | 50.91 | 560  | 0.6473        | nan                | 0.9777          | 0.3952   | 0.0           | 0.9673     | 0.3697          | 0.8125        | 0.4542   | 0.9680           |
| 0.3684        | 51.82 | 570  | 0.6515        | nan                | 0.9772          | 0.3951   | 0.0           | 0.9670     | 0.3708          | 0.8143        | 0.4540   | 0.9676           |
| 0.4004        | 52.73 | 580  | 0.6437        | nan                | 0.9793          | 0.4014   | 0.0           | 0.9688     | 0.3585          | 0.8115        | 0.4567   | 0.9694           |
| 0.3656        | 53.64 | 590  | 0.6559        | nan                | 0.9780          | 0.4010   | 0.0           | 0.9679     | 0.3654          | 0.8169        | 0.4563   | 0.9685           |
| 0.3918        | 54.55 | 600  | 0.6432        | nan                | 0.9809          | 0.4115   | 0.0           | 0.9704     | 0.3527          | 0.8121        | 0.4606   | 0.9709           |
| 0.3741        | 55.45 | 610  | 0.6393        | nan                | 0.9827          | 0.4185   | 0.0           | 0.9720     | 0.3361          | 0.8110        | 0.4635   | 0.9726           |
| 0.3656        | 56.36 | 620  | 0.6540        | nan                | 0.9807          | 0.4147   | 0.0           | 0.9705     | 0.3473          | 0.8174        | 0.4617   | 0.9710           |
| 0.3341        | 57.27 | 630  | 0.6258        | nan                | 0.9845          | 0.4247   | 0.0           | 0.9734     | 0.3335          | 0.8052        | 0.4660   | 0.9739           |
| 0.3669        | 58.18 | 640  | 0.6495        | nan                | 0.9815          | 0.4190   | 0.0           | 0.9712     | 0.3395          | 0.8155        | 0.4634   | 0.9717           |
| 0.3347        | 59.09 | 650  | 0.6612        | nan                | 0.9800          | 0.4174   | 0.0           | 0.9700     | 0.3416          | 0.8206        | 0.4625   | 0.9706           |
| 0.4287        | 60.0  | 660  | 0.6673        | nan                | 0.9797          | 0.4185   | 0.0           | 0.9699     | 0.3419          | 0.8235        | 0.4628   | 0.9705           |
| 0.3838        | 60.91 | 670  | 0.6611        | nan                | 0.9812          | 0.4227   | 0.0           | 0.9712     | 0.3381          | 0.8211        | 0.4646   | 0.9718           |
| 0.352         | 61.82 | 680  | 0.6407        | nan                | 0.9845          | 0.4318   | 0.0           | 0.9738     | 0.3216          | 0.8126        | 0.4685   | 0.9743           |
| 0.3343        | 62.73 | 690  | 0.6499        | nan                | 0.9837          | 0.4311   | 0.0           | 0.9733     | 0.3275          | 0.8168        | 0.4681   | 0.9738           |
| 0.3443        | 63.64 | 700  | 0.6528        | nan                | 0.9836          | 0.4324   | 0.0           | 0.9733     | 0.3273          | 0.8182        | 0.4686   | 0.9738           |
| 0.3183        | 64.55 | 710  | 0.6456        | nan                | 0.9848          | 0.4367   | 0.0           | 0.9743     | 0.3155          | 0.8152        | 0.4703   | 0.9748           |
| 0.3346        | 65.45 | 720  | 0.6517        | nan                | 0.9841          | 0.4356   | 0.0           | 0.9738     | 0.3212          | 0.8179        | 0.4698   | 0.9743           |
| 0.3225        | 66.36 | 730  | 0.6367        | nan                | 0.9863          | 0.4432   | 0.0           | 0.9755     | 0.3052          | 0.8115        | 0.4729   | 0.9759           |
| 0.3792        | 67.27 | 740  | 0.6381        | nan                | 0.9861          | 0.4429   | 0.0           | 0.9753     | 0.3037          | 0.8121        | 0.4728   | 0.9758           |
| 0.3177        | 68.18 | 750  | 0.6345        | nan                | 0.9865          | 0.4446   | 0.0           | 0.9756     | 0.2989          | 0.8105        | 0.4734   | 0.9761           |
| 0.3295        | 69.09 | 760  | 0.6404        | nan                | 0.9859          | 0.4426   | 0.0           | 0.9752     | 0.3064          | 0.8131        | 0.4726   | 0.9757           |
| 0.3856        | 70.0  | 770  | 0.6433        | nan                | 0.9854          | 0.4415   | 0.0           | 0.9748     | 0.3064          | 0.8144        | 0.4721   | 0.9753           |

### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
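
To approximate this environment, requirements.txt-style pins like the following should work; note that the `+cu117` PyTorch build is typically installed from the PyTorch package index rather than plain PyPI.

```
transformers==4.30.2
torch==2.0.1        # card lists the CUDA 11.7 build (2.0.1+cu117)
datasets==2.13.1
tokenizers==0.13.3
```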