---
license: other
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: safety-utcustom-train-SF-RGB-b5
  results: []
---

# safety-utcustom-train-SF-RGB-b5

This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the sam1120/safety-utcustom-TRAIN dataset. It achieves the following results on the evaluation set:

- Loss: 0.5268
- Mean Iou: 0.4062
- Mean Accuracy: 0.8013
- Overall Accuracy: 0.9320
- Accuracy Unlabeled: nan
- Accuracy Safe: 0.6624
- Accuracy Unsafe: 0.9402
- Iou Unlabeled: 0.0
- Iou Safe: 0.2880
- Iou Unsafe: 0.9307
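A minimal inference sketch (not part of the generated card): running a SegFormer segmentation head and turning its logits into a per-pixel label map. To keep the snippet runnable offline it builds a small randomly initialised model with `num_labels=3` (unlabeled / safe / unsafe, per the metrics above); in practice you would load the fine-tuned weights with `SegformerForSemanticSegmentation.from_pretrained("sam1120/safety-utcustom-train-SF-RGB-b5")` and preprocess a real image instead of the random tensor.

```python
import torch
from transformers import SegformerConfig, SegformerForSemanticSegmentation

# Tiny randomly initialised SegFormer so the sketch runs offline; swap this for
# from_pretrained("sam1120/safety-utcustom-train-SF-RGB-b5") to use the checkpoint.
config = SegformerConfig(num_labels=3)  # unlabeled / safe / unsafe, per the metrics above
model = SegformerForSemanticSegmentation(config).eval()

pixel_values = torch.randn(1, 3, 128, 128)  # stand-in for a preprocessed RGB image
with torch.no_grad():
    logits = model(pixel_values=pixel_values).logits  # (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample and argmax to get a label map
upsampled = torch.nn.functional.interpolate(
    logits, size=(128, 128), mode="bilinear", align_corners=False
)
mask = upsampled.argmax(dim=1)[0]  # (128, 128) tensor of class ids
```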

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-06
- train_batch_size: 15
- eval_batch_size: 15
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 30
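A sketch of the optimizer and schedule above in plain PyTorch (the `transformers` Trainer sets this up internally). The 11 steps per epoch are read off the results table below (330 steps over 30 epochs), and the `Linear` layer is a stand-in for the SegFormer model.

```python
import torch

model = torch.nn.Linear(4, 2)           # stand-in for the SegFormer model
total_steps = 30 * 11                   # num_epochs * steps-per-epoch = 330
warmup_steps = int(0.05 * total_steps)  # lr_scheduler_warmup_ratio: 0.05

optimizer = torch.optim.Adam(
    model.parameters(), lr=3e-6, betas=(0.9, 0.999), eps=1e-8
)

def lr_lambda(step: int) -> float:
    # linear warmup to the peak lr, then linear decay to zero
    if step < warmup_steps:
        return step / max(1, warmup_steps)
    return max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda)
```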

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Safe | Accuracy Unsafe | Iou Unlabeled | Iou Safe | Iou Unsafe |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:-------------:|:---------------:|:-------------:|:--------:|:----------:|
| 1.2239        | 0.91  | 10   | 1.1103          | 0.1084   | 0.3472        | 0.2982           | nan                | 0.3992        | 0.2951          | 0.0           | 0.0314   | 0.2939     |
| 1.1948        | 1.82  | 20   | 1.0963          | 0.1376   | 0.4462        | 0.3750           | nan                | 0.5219        | 0.3705          | 0.0           | 0.0440   | 0.3689     |
| 1.1661        | 2.73  | 30   | 1.0516          | 0.1870   | 0.5426        | 0.5014           | nan                | 0.5863        | 0.4988          | 0.0           | 0.0647   | 0.4961     |
| 1.1112        | 3.64  | 40   | 1.0048          | 0.2218   | 0.5626        | 0.5784           | nan                | 0.5459        | 0.5794          | 0.0           | 0.0900   | 0.5754     |
| 1.0907        | 4.55  | 50   | 0.9690          | 0.2472   | 0.6180        | 0.6356           | nan                | 0.5993        | 0.6367          | 0.0           | 0.1094   | 0.6321     |
| 1.047         | 5.45  | 60   | 0.9437          | 0.2605   | 0.6695        | 0.6699           | nan                | 0.6692        | 0.6699          | 0.0           | 0.1159   | 0.6656     |
| 1.0112        | 6.36  | 70   | 0.9084          | 0.2829   | 0.6931        | 0.7173           | nan                | 0.6673        | 0.7189          | 0.0           | 0.1349   | 0.7137     |
| 0.9925        | 7.27  | 80   | 0.8647          | 0.3019   | 0.7254        | 0.7641           | nan                | 0.6842        | 0.7665          | 0.0           | 0.1452   | 0.7605     |
| 0.9395        | 8.18  | 90   | 0.8319          | 0.3159   | 0.7369        | 0.7888           | nan                | 0.6818        | 0.7921          | 0.0           | 0.1620   | 0.7856     |
| 0.8902        | 9.09  | 100  | 0.8014          | 0.3281   | 0.7474        | 0.8102           | nan                | 0.6806        | 0.8142          | 0.0           | 0.1770   | 0.8072     |
| 0.9057        | 10.0  | 110  | 0.7867          | 0.3281   | 0.7581        | 0.8143           | nan                | 0.6984        | 0.8179          | 0.0           | 0.1733   | 0.8109     |
| 0.8321        | 10.91 | 120  | 0.7440          | 0.3425   | 0.7619        | 0.8442           | nan                | 0.6744        | 0.8494          | 0.0           | 0.1862   | 0.8413     |
| 0.8152        | 11.82 | 130  | 0.7270          | 0.3504   | 0.7639        | 0.8534           | nan                | 0.6688        | 0.8590          | 0.0           | 0.2006   | 0.8507     |
| 0.7929        | 12.73 | 140  | 0.7045          | 0.3553   | 0.7658        | 0.8598           | nan                | 0.6660        | 0.8657          | 0.0           | 0.2085   | 0.8572     |
| 0.7568        | 13.64 | 150  | 0.6744          | 0.3644   | 0.7704        | 0.8771           | nan                | 0.6571        | 0.8838          | 0.0           | 0.2185   | 0.8748     |
| 0.7085        | 14.55 | 160  | 0.6556          | 0.3701   | 0.7727        | 0.8863           | nan                | 0.6519        | 0.8934          | 0.0           | 0.2260   | 0.8842     |
| 0.7147        | 15.45 | 170  | 0.6509          | 0.3718   | 0.7762        | 0.8893           | nan                | 0.6561        | 0.8964          | 0.0           | 0.2283   | 0.8872     |
| 0.6991        | 16.36 | 180  | 0.6502          | 0.3714   | 0.7792        | 0.8895           | nan                | 0.6620        | 0.8964          | 0.0           | 0.2267   | 0.8874     |
| 0.6357        | 17.27 | 190  | 0.6230          | 0.3790   | 0.7831        | 0.8979           | nan                | 0.6612        | 0.9051          | 0.0           | 0.2411   | 0.8960     |
| 0.6815        | 18.18 | 200  | 0.5993          | 0.3892   | 0.7831        | 0.9098           | nan                | 0.6484        | 0.9178          | 0.0           | 0.2594   | 0.9082     |
| 0.6398        | 19.09 | 210  | 0.5785          | 0.3947   | 0.7836        | 0.9174           | nan                | 0.6414        | 0.9258          | 0.0           | 0.2682   | 0.9159     |
| 0.5845        | 20.0  | 220  | 0.5641          | 0.3962   | 0.7856        | 0.9202           | nan                | 0.6426        | 0.9286          | 0.0           | 0.2698   | 0.9187     |
| 0.6062        | 20.91 | 230  | 0.5693          | 0.3932   | 0.7886        | 0.9171           | nan                | 0.6520        | 0.9252          | 0.0           | 0.2641   | 0.9156     |
| 0.6071        | 21.82 | 240  | 0.5627          | 0.3955   | 0.7937        | 0.9203           | nan                | 0.6592        | 0.9283          | 0.0           | 0.2675   | 0.9188     |
| 0.6209        | 22.73 | 250  | 0.5632          | 0.3977   | 0.7959        | 0.9220           | nan                | 0.6619        | 0.9300          | 0.0           | 0.2724   | 0.9205     |
| 0.5609        | 23.64 | 260  | 0.5416          | 0.4050   | 0.7942        | 0.9294           | nan                | 0.6505        | 0.9379          | 0.0           | 0.2868   | 0.9281     |
| 0.5752        | 24.55 | 270  | 0.5141          | 0.4111   | 0.7932        | 0.9362           | nan                | 0.6412        | 0.9451          | 0.0           | 0.2983   | 0.9350     |
| 0.6004        | 25.45 | 280  | 0.5255          | 0.4073   | 0.7952        | 0.9326           | nan                | 0.6492        | 0.9412          | 0.0           | 0.2907   | 0.9313     |
| 0.5524        | 26.36 | 290  | 0.5314          | 0.4053   | 0.7987        | 0.9304           | nan                | 0.6588        | 0.9387          | 0.0           | 0.2868   | 0.9291     |
| 0.5758        | 27.27 | 300  | 0.5268          | 0.4080   | 0.7984        | 0.9338           | nan                | 0.6544        | 0.9423          | 0.0           | 0.2913   | 0.9326     |
| 0.5598        | 28.18 | 310  | 0.5240          | 0.4070   | 0.8006        | 0.9325           | nan                | 0.6605        | 0.9408          | 0.0           | 0.2897   | 0.9312     |
| 0.5505        | 29.09 | 320  | 0.5165          | 0.4094   | 0.8002        | 0.9337           | nan                | 0.6582        | 0.9421          | 0.0           | 0.2959   | 0.9324     |
| 0.5763        | 30.0  | 330  | 0.5268          | 0.4062   | 0.8013        | 0.9320           | nan                | 0.6624        | 0.9402          | 0.0           | 0.2880   | 0.9307     |

### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
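One way to reproduce this environment (an assumption, not from the card) is to pin the versions above; the cu117 wheel index is the standard PyTorch download index for the CUDA 11.7 build.

```shell
# Pin the library versions listed above
pip install transformers==4.30.2 datasets==2.13.1 tokenizers==0.13.3
# CUDA 11.7 build of PyTorch 2.0.1 from the official wheel index
pip install torch==2.0.1+cu117 --index-url https://download.pytorch.org/whl/cu117
```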