---
license: other
tags:
  - vision
  - image-segmentation
  - generated_from_trainer
model-index:
  - name: safety-utcustom-train-SF-RGB-b5
    results: []
---

# safety-utcustom-train-SF-RGB-b5

This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the sam1120/safety-utcustom-TRAIN dataset. It achieves the following results on the evaluation set:

- Loss: 0.4046
- Mean Iou: 0.4437
- Mean Accuracy: 0.8171
- Overall Accuracy: 0.9614
- Accuracy Unlabeled: nan
- Accuracy Safe: 0.6638
- Accuracy Unsafe: 0.9704
- Iou Unlabeled: 0.0
- Iou Safe: 0.3704
- Iou Unsafe: 0.9606
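For orientation, here is a minimal inference sketch for a SegFormer safe/unsafe segmentation model like this one. In practice you would load the fine-tuned weights with `SegformerForSemanticSegmentation.from_pretrained("sam1120/safety-utcustom-train-SF-RGB-b5")`; below, a randomly initialized model with MiT-b5 backbone dimensions stands in so the sketch runs offline, and the label names are assumed from the unlabeled/safe/unsafe classes reported above.

```python
import torch
from transformers import SegformerConfig, SegformerForSemanticSegmentation

# Assumed label mapping, following the card's three reported classes.
id2label = {0: "unlabeled", 1: "safe", 2: "unsafe"}
config = SegformerConfig(  # MiT-b5 backbone dimensions
    depths=[3, 6, 40, 3],
    hidden_sizes=[64, 128, 320, 512],
    decoder_hidden_size=768,
    num_labels=3,
    id2label=id2label,
    label2id={v: k for k, v in id2label.items()},
)
# Stand-in for from_pretrained("sam1120/safety-utcustom-train-SF-RGB-b5").
model = SegformerForSemanticSegmentation(config).eval()

pixel_values = torch.rand(1, 3, 128, 128)  # stand-in for a preprocessed RGB image
with torch.no_grad():
    logits = model(pixel_values=pixel_values).logits  # (1, 3, H/4, W/4)

# SegFormer predicts at 1/4 of the input resolution; upsample the logits
# back to the input size before taking the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=(128, 128), mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (128, 128) tensor of class ids
```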

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-06
- train_batch_size: 15
- eval_batch_size: 15
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Safe | Accuracy Unsafe | Iou Unlabeled | Iou Safe | Iou Unsafe |
|---------------|-------|------|-----------------|----------|---------------|------------------|--------------------|---------------|-----------------|---------------|----------|------------|
| 1.2239 | 0.91 | 10 | 1.1103 | 0.1084 | 0.3472 | 0.2982 | nan | 0.3992 | 0.2951 | 0.0 | 0.0314 | 0.2939 |
| 1.1948 | 1.82 | 20 | 1.0963 | 0.1376 | 0.4462 | 0.3750 | nan | 0.5219 | 0.3705 | 0.0 | 0.0440 | 0.3689 |
| 1.1661 | 2.73 | 30 | 1.0516 | 0.1870 | 0.5426 | 0.5014 | nan | 0.5863 | 0.4988 | 0.0 | 0.0647 | 0.4961 |
| 1.1112 | 3.64 | 40 | 1.0048 | 0.2218 | 0.5626 | 0.5784 | nan | 0.5459 | 0.5794 | 0.0 | 0.0900 | 0.5754 |
| 1.0907 | 4.55 | 50 | 0.9690 | 0.2472 | 0.6180 | 0.6356 | nan | 0.5993 | 0.6367 | 0.0 | 0.1094 | 0.6321 |
| 1.047 | 5.45 | 60 | 0.9437 | 0.2605 | 0.6695 | 0.6699 | nan | 0.6692 | 0.6699 | 0.0 | 0.1159 | 0.6656 |
| 1.0112 | 6.36 | 70 | 0.9084 | 0.2829 | 0.6931 | 0.7173 | nan | 0.6673 | 0.7189 | 0.0 | 0.1349 | 0.7137 |
| 0.9925 | 7.27 | 80 | 0.8647 | 0.3019 | 0.7254 | 0.7641 | nan | 0.6842 | 0.7665 | 0.0 | 0.1452 | 0.7605 |
| 0.9395 | 8.18 | 90 | 0.8319 | 0.3159 | 0.7369 | 0.7888 | nan | 0.6818 | 0.7921 | 0.0 | 0.1620 | 0.7856 |
| 0.8902 | 9.09 | 100 | 0.8014 | 0.3281 | 0.7474 | 0.8102 | nan | 0.6806 | 0.8142 | 0.0 | 0.1770 | 0.8072 |
| 0.9057 | 10.0 | 110 | 0.7867 | 0.3281 | 0.7581 | 0.8143 | nan | 0.6984 | 0.8179 | 0.0 | 0.1733 | 0.8109 |
| 0.8321 | 10.91 | 120 | 0.7440 | 0.3425 | 0.7619 | 0.8442 | nan | 0.6744 | 0.8494 | 0.0 | 0.1862 | 0.8413 |
| 0.8152 | 11.82 | 130 | 0.7270 | 0.3504 | 0.7639 | 0.8534 | nan | 0.6688 | 0.8590 | 0.0 | 0.2006 | 0.8507 |
| 0.7929 | 12.73 | 140 | 0.7045 | 0.3553 | 0.7658 | 0.8598 | nan | 0.6660 | 0.8657 | 0.0 | 0.2085 | 0.8572 |
| 0.7568 | 13.64 | 150 | 0.6744 | 0.3644 | 0.7704 | 0.8771 | nan | 0.6571 | 0.8838 | 0.0 | 0.2185 | 0.8748 |
| 0.7085 | 14.55 | 160 | 0.6556 | 0.3701 | 0.7727 | 0.8863 | nan | 0.6519 | 0.8934 | 0.0 | 0.2260 | 0.8842 |
| 0.7147 | 15.45 | 170 | 0.6509 | 0.3718 | 0.7762 | 0.8893 | nan | 0.6561 | 0.8964 | 0.0 | 0.2283 | 0.8872 |
| 0.6991 | 16.36 | 180 | 0.6502 | 0.3714 | 0.7792 | 0.8895 | nan | 0.6620 | 0.8964 | 0.0 | 0.2267 | 0.8874 |
| 0.6357 | 17.27 | 190 | 0.6230 | 0.3790 | 0.7831 | 0.8979 | nan | 0.6612 | 0.9051 | 0.0 | 0.2411 | 0.8960 |
| 0.6815 | 18.18 | 200 | 0.5993 | 0.3892 | 0.7831 | 0.9098 | nan | 0.6484 | 0.9178 | 0.0 | 0.2594 | 0.9082 |
| 0.6398 | 19.09 | 210 | 0.5785 | 0.3947 | 0.7836 | 0.9174 | nan | 0.6414 | 0.9258 | 0.0 | 0.2682 | 0.9159 |
| 0.5845 | 20.0 | 220 | 0.5641 | 0.3962 | 0.7856 | 0.9202 | nan | 0.6426 | 0.9286 | 0.0 | 0.2698 | 0.9187 |
| 0.6062 | 20.91 | 230 | 0.5693 | 0.3932 | 0.7886 | 0.9171 | nan | 0.6520 | 0.9252 | 0.0 | 0.2641 | 0.9156 |
| 0.6071 | 21.82 | 240 | 0.5627 | 0.3955 | 0.7937 | 0.9203 | nan | 0.6592 | 0.9283 | 0.0 | 0.2675 | 0.9188 |
| 0.6209 | 22.73 | 250 | 0.5632 | 0.3977 | 0.7959 | 0.9220 | nan | 0.6619 | 0.9300 | 0.0 | 0.2724 | 0.9205 |
| 0.5609 | 23.64 | 260 | 0.5416 | 0.4050 | 0.7942 | 0.9294 | nan | 0.6505 | 0.9379 | 0.0 | 0.2868 | 0.9281 |
| 0.5752 | 24.55 | 270 | 0.5141 | 0.4111 | 0.7932 | 0.9362 | nan | 0.6412 | 0.9451 | 0.0 | 0.2983 | 0.9350 |
| 0.6004 | 25.45 | 280 | 0.5255 | 0.4073 | 0.7952 | 0.9326 | nan | 0.6492 | 0.9412 | 0.0 | 0.2907 | 0.9313 |
| 0.5524 | 26.36 | 290 | 0.5314 | 0.4053 | 0.7987 | 0.9304 | nan | 0.6588 | 0.9387 | 0.0 | 0.2868 | 0.9291 |
| 0.5758 | 27.27 | 300 | 0.5268 | 0.4080 | 0.7984 | 0.9338 | nan | 0.6544 | 0.9423 | 0.0 | 0.2913 | 0.9326 |
| 0.5598 | 28.18 | 310 | 0.5240 | 0.4070 | 0.8006 | 0.9325 | nan | 0.6605 | 0.9408 | 0.0 | 0.2897 | 0.9312 |
| 0.5505 | 29.09 | 320 | 0.5165 | 0.4094 | 0.8002 | 0.9337 | nan | 0.6582 | 0.9421 | 0.0 | 0.2959 | 0.9324 |
| 0.5754 | 30.0 | 330 | 0.5145 | 0.4098 | 0.8005 | 0.9348 | nan | 0.6578 | 0.9433 | 0.0 | 0.2959 | 0.9336 |
| 0.5284 | 30.91 | 340 | 0.5175 | 0.4086 | 0.8065 | 0.9331 | nan | 0.6719 | 0.9411 | 0.0 | 0.2941 | 0.9318 |
| 0.5463 | 31.82 | 350 | 0.5016 | 0.4125 | 0.8066 | 0.9367 | nan | 0.6684 | 0.9448 | 0.0 | 0.3020 | 0.9354 |
| 0.4923 | 32.73 | 360 | 0.4947 | 0.4145 | 0.8075 | 0.9381 | nan | 0.6688 | 0.9463 | 0.0 | 0.3066 | 0.9369 |
| 0.4922 | 33.64 | 370 | 0.4738 | 0.4191 | 0.8094 | 0.9420 | nan | 0.6685 | 0.9504 | 0.0 | 0.3165 | 0.9409 |
| 0.4976 | 34.55 | 380 | 0.4663 | 0.4225 | 0.8142 | 0.9453 | nan | 0.6748 | 0.9535 | 0.0 | 0.3233 | 0.9443 |
| 0.4922 | 35.45 | 390 | 0.4295 | 0.4345 | 0.8081 | 0.9560 | nan | 0.6509 | 0.9653 | 0.0 | 0.3484 | 0.9552 |
| 0.4608 | 36.36 | 400 | 0.4434 | 0.4348 | 0.8109 | 0.9547 | nan | 0.6580 | 0.9637 | 0.0 | 0.3507 | 0.9538 |
| 0.4836 | 37.27 | 410 | 0.4328 | 0.4383 | 0.8092 | 0.9569 | nan | 0.6522 | 0.9662 | 0.0 | 0.3588 | 0.9561 |
| 0.459 | 38.18 | 420 | 0.4211 | 0.4407 | 0.8084 | 0.9596 | nan | 0.6477 | 0.9691 | 0.0 | 0.3632 | 0.9588 |
| 0.4528 | 39.09 | 430 | 0.4239 | 0.4381 | 0.8131 | 0.9577 | nan | 0.6593 | 0.9668 | 0.0 | 0.3574 | 0.9569 |
| 0.4202 | 40.0 | 440 | 0.4141 | 0.4413 | 0.8130 | 0.9597 | nan | 0.6572 | 0.9689 | 0.0 | 0.3650 | 0.9590 |
| 0.4805 | 40.91 | 450 | 0.4012 | 0.4458 | 0.8097 | 0.9628 | nan | 0.6470 | 0.9724 | 0.0 | 0.3754 | 0.9621 |
| 0.4611 | 41.82 | 460 | 0.4025 | 0.4444 | 0.8122 | 0.9624 | nan | 0.6525 | 0.9718 | 0.0 | 0.3716 | 0.9617 |
| 0.4339 | 42.73 | 470 | 0.3951 | 0.4456 | 0.8107 | 0.9631 | nan | 0.6487 | 0.9726 | 0.0 | 0.3744 | 0.9624 |
| 0.4361 | 43.64 | 480 | 0.3946 | 0.4468 | 0.8094 | 0.9643 | nan | 0.6448 | 0.9740 | 0.0 | 0.3769 | 0.9636 |
| 0.4416 | 44.55 | 490 | 0.3871 | 0.4475 | 0.8097 | 0.9649 | nan | 0.6447 | 0.9746 | 0.0 | 0.3783 | 0.9642 |
| 0.4524 | 45.45 | 500 | 0.4025 | 0.4438 | 0.8151 | 0.9620 | nan | 0.6589 | 0.9712 | 0.0 | 0.3701 | 0.9612 |
| 0.4319 | 46.36 | 510 | 0.4169 | 0.4391 | 0.8202 | 0.9586 | nan | 0.6730 | 0.9673 | 0.0 | 0.3594 | 0.9578 |
| 0.4224 | 47.27 | 520 | 0.3986 | 0.4443 | 0.8158 | 0.9620 | nan | 0.6603 | 0.9712 | 0.0 | 0.3716 | 0.9613 |
| 0.4409 | 48.18 | 530 | 0.4073 | 0.4419 | 0.8179 | 0.9601 | nan | 0.6667 | 0.9691 | 0.0 | 0.3664 | 0.9594 |
| 0.4228 | 49.09 | 540 | 0.4031 | 0.4431 | 0.8168 | 0.9614 | nan | 0.6631 | 0.9705 | 0.0 | 0.3685 | 0.9607 |
| 0.4605 | 50.0 | 550 | 0.4046 | 0.4437 | 0.8171 | 0.9614 | nan | 0.6638 | 0.9704 | 0.0 | 0.3704 | 0.9606 |

### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3