dropoff-utcustom-train-SF-RGB-b0_2

This model is a fine-tuned version of nvidia/mit-b0 on the sam1120/dropoff-utcustom-TRAIN dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6222
  • Mean Iou: 0.4086
  • Mean Accuracy: 0.6638
  • Overall Accuracy: 0.9583
  • Accuracy Unlabeled: nan
  • Accuracy Dropoff: 0.3408
  • Accuracy Undropoff: 0.9869
  • Iou Unlabeled: 0.0
  • Iou Dropoff: 0.2682
  • Iou Undropoff: 0.9576
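A note on how these numbers relate: the headline Mean Iou is the plain average over all three classes, including the constant 0.0 IoU for the unlabeled class, while Mean Accuracy averages only the two classes with valid (non-nan) accuracy. The sketch below, a generic per-class IoU computation (not the card's actual evaluation code), shows the relationship:

```python
import numpy as np

def per_class_iou(pred, target, num_classes):
    """Per-class IoU from integer label maps; a class absent from both
    prediction and ground truth yields nan instead of a spurious 0."""
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, target == c).sum()
        union = np.logical_or(pred == c, target == c).sum()
        ious.append(inter / union if union > 0 else float("nan"))
    return ious

# Reproducing the headline Mean Iou from the per-class values above:
# the 0.0 IoU for "unlabeled" is included in the average.
ious = [0.0, 0.2682, 0.9576]
mean_iou = sum(ious) / len(ious)
print(round(mean_iou, 4))  # 0.4086
```

Likewise, Mean Accuracy is the average of the two valid per-class accuracies: (0.3408 + 0.9869) / 2 ≈ 0.6638, with the nan for unlabeled excluded. Dragging the unlabeled class's 0.0 IoU into the mean is why Mean Iou sits so far below Overall Accuracy.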

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.05
  • num_epochs: 120
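To make the schedule concrete, here is a minimal sketch of the learning-rate curve these hyperparameters imply (linear warmup then linear decay, as in transformers' `get_linear_schedule_with_warmup`). The 360-step total is read off the training log below (3 optimizer steps per epoch × 120 epochs); it is an inference from the table, not something stated explicitly:

```python
def linear_schedule_with_warmup(step, total_steps=360, base_lr=1e-5,
                                warmup_ratio=0.05):
    """Learning rate at a given optimizer step for a linear schedule
    with warmup. total_steps=360 is inferred from the training log."""
    warmup_steps = int(total_steps * warmup_ratio)  # 18 steps here
    if step < warmup_steps:
        # Linear ramp from 0 up to base_lr over the warmup phase.
        return base_lr * step / max(1, warmup_steps)
    # Linear decay from base_lr down to 0 over the remaining steps.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_schedule_with_warmup(9))    # mid-warmup: 5e-06
print(linear_schedule_with_warmup(18))   # peak: 1e-05
print(linear_schedule_with_warmup(360))  # end of training: 0.0
```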

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|---------------|-------|------|-----------------|----------|---------------|------------------|--------------------|------------------|--------------------|---------------|-------------|---------------|
| 1.1857 | 3.33 | 10 | 1.1215 | 0.0852 | 0.2039 | 0.0995 | nan | 0.3183 | 0.0894 | 0.0 | 0.1663 | 0.0893 |
| 1.1597 | 6.67 | 20 | 1.1165 | 0.1437 | 0.3568 | 0.2326 | nan | 0.4930 | 0.2206 | 0.0 | 0.2108 | 0.2203 |
| 1.1528 | 10.0 | 30 | 1.1040 | 0.1938 | 0.5140 | 0.3464 | nan | 0.6978 | 0.3301 | 0.0 | 0.2517 | 0.3297 |
| 1.0852 | 13.33 | 40 | 1.0896 | 0.2243 | 0.6289 | 0.4219 | nan | 0.8561 | 0.4018 | 0.0 | 0.2717 | 0.4011 |
| 1.0388 | 16.67 | 50 | 1.0511 | 0.2700 | 0.6748 | 0.5498 | nan | 0.8120 | 0.5377 | 0.0 | 0.2744 | 0.5357 |
| 1.0426 | 20.0 | 60 | 1.0089 | 0.3147 | 0.6787 | 0.6640 | nan | 0.6949 | 0.6625 | 0.0 | 0.2857 | 0.6583 |
| 0.9621 | 23.33 | 70 | 0.9921 | 0.3374 | 0.7060 | 0.7392 | nan | 0.6695 | 0.7424 | 0.0 | 0.2760 | 0.7361 |
| 0.925 | 26.67 | 80 | 0.9464 | 0.3591 | 0.7031 | 0.7964 | nan | 0.6007 | 0.8054 | 0.0 | 0.2807 | 0.7965 |
| 0.8872 | 30.0 | 90 | 0.8993 | 0.3858 | 0.7074 | 0.8676 | nan | 0.5316 | 0.8831 | 0.0 | 0.2888 | 0.8686 |
| 0.8751 | 33.33 | 100 | 0.8974 | 0.3896 | 0.7177 | 0.8817 | nan | 0.5379 | 0.8976 | 0.0 | 0.2866 | 0.8822 |
| 0.8571 | 36.67 | 110 | 0.8501 | 0.4028 | 0.7162 | 0.9122 | nan | 0.5011 | 0.9312 | 0.0 | 0.2953 | 0.9131 |
| 0.8866 | 40.0 | 120 | 0.8434 | 0.4072 | 0.7240 | 0.9252 | nan | 0.5032 | 0.9448 | 0.0 | 0.2963 | 0.9254 |
| 0.8127 | 43.33 | 130 | 0.7922 | 0.4142 | 0.7089 | 0.9404 | nan | 0.4548 | 0.9629 | 0.0 | 0.3025 | 0.9402 |
| 0.8062 | 46.67 | 140 | 0.7917 | 0.4123 | 0.7103 | 0.9432 | nan | 0.4548 | 0.9658 | 0.0 | 0.2943 | 0.9425 |
| 0.7512 | 50.0 | 150 | 0.7646 | 0.4142 | 0.7059 | 0.9478 | nan | 0.4404 | 0.9713 | 0.0 | 0.2955 | 0.9470 |
| 0.7554 | 53.33 | 160 | 0.7497 | 0.4161 | 0.7001 | 0.9510 | nan | 0.4248 | 0.9754 | 0.0 | 0.2981 | 0.9502 |
| 0.7468 | 56.67 | 170 | 0.7326 | 0.4177 | 0.6989 | 0.9535 | nan | 0.4195 | 0.9782 | 0.0 | 0.3005 | 0.9527 |
| 0.6506 | 60.0 | 180 | 0.7184 | 0.4173 | 0.6992 | 0.9541 | nan | 0.4196 | 0.9789 | 0.0 | 0.2987 | 0.9533 |
| 0.6761 | 63.33 | 190 | 0.7037 | 0.4142 | 0.6884 | 0.9546 | nan | 0.3964 | 0.9805 | 0.0 | 0.2886 | 0.9539 |
| 0.7245 | 66.67 | 200 | 0.6960 | 0.4122 | 0.6821 | 0.9553 | nan | 0.3824 | 0.9818 | 0.0 | 0.2820 | 0.9545 |
| 0.6514 | 70.0 | 210 | 0.6755 | 0.4104 | 0.6705 | 0.9573 | nan | 0.3559 | 0.9852 | 0.0 | 0.2746 | 0.9566 |
| 0.6433 | 73.33 | 220 | 0.6804 | 0.4180 | 0.6954 | 0.9556 | nan | 0.4100 | 0.9809 | 0.0 | 0.2991 | 0.9548 |
| 0.6686 | 76.67 | 230 | 0.6608 | 0.4107 | 0.6694 | 0.9578 | nan | 0.3531 | 0.9858 | 0.0 | 0.2749 | 0.9571 |
| 0.9091 | 80.0 | 240 | 0.6701 | 0.4160 | 0.6922 | 0.9557 | nan | 0.4031 | 0.9813 | 0.0 | 0.2930 | 0.9549 |
| 0.6346 | 83.33 | 250 | 0.6725 | 0.4166 | 0.6904 | 0.9563 | nan | 0.3987 | 0.9821 | 0.0 | 0.2944 | 0.9555 |
| 0.6303 | 86.67 | 260 | 0.6460 | 0.4090 | 0.6670 | 0.9576 | nan | 0.3481 | 0.9858 | 0.0 | 0.2702 | 0.9569 |
| 0.8923 | 90.0 | 270 | 0.6550 | 0.4131 | 0.6799 | 0.9568 | nan | 0.3760 | 0.9837 | 0.0 | 0.2832 | 0.9561 |
| 0.6334 | 93.33 | 280 | 0.6468 | 0.4100 | 0.6708 | 0.9572 | nan | 0.3566 | 0.9851 | 0.0 | 0.2734 | 0.9565 |
| 0.6242 | 96.67 | 290 | 0.6483 | 0.4106 | 0.6728 | 0.9572 | nan | 0.3607 | 0.9848 | 0.0 | 0.2754 | 0.9565 |
| 0.7401 | 100.0 | 300 | 0.6470 | 0.4129 | 0.6796 | 0.9569 | nan | 0.3755 | 0.9838 | 0.0 | 0.2825 | 0.9561 |
| 0.6148 | 103.33 | 310 | 0.6242 | 0.4081 | 0.6633 | 0.9582 | nan | 0.3397 | 0.9868 | 0.0 | 0.2668 | 0.9575 |
| 0.6345 | 106.67 | 320 | 0.6287 | 0.4093 | 0.6670 | 0.9579 | nan | 0.3478 | 0.9862 | 0.0 | 0.2708 | 0.9573 |
| 0.8711 | 110.0 | 330 | 0.6396 | 0.4130 | 0.6782 | 0.9572 | nan | 0.3720 | 0.9843 | 0.0 | 0.2826 | 0.9565 |
| 0.5812 | 113.33 | 340 | 0.6266 | 0.4101 | 0.6689 | 0.9580 | nan | 0.3517 | 0.9861 | 0.0 | 0.2731 | 0.9573 |
| 0.6503 | 116.67 | 350 | 0.6384 | 0.4130 | 0.6775 | 0.9573 | nan | 0.3706 | 0.9845 | 0.0 | 0.2824 | 0.9566 |
| 0.5923 | 120.0 | 360 | 0.6222 | 0.4086 | 0.6638 | 0.9583 | nan | 0.3408 | 0.9869 | 0.0 | 0.2682 | 0.9576 |

Framework versions

  • Transformers 4.30.2
  • Pytorch 2.0.1+cu117
  • Datasets 2.13.1
  • Tokenizers 0.13.3