---
license: other
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: dropoff-utcustom-train-SF-RGB-b0_5
results: []
---
# dropoff-utcustom-train-SF-RGB-b0_5
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2543
- Mean Iou: 0.6541
- Mean Accuracy: 0.6937
- Overall Accuracy: 0.9665
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.3944
- Accuracy Undropoff: 0.9930
- Iou Unlabeled: nan
- Iou Dropoff: 0.3424
- Iou Undropoff: 0.9659
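
As a sanity check on the summary numbers: the mean metrics appear to be the unweighted average over the two labeled classes, with the nan `unlabeled` class excluded. A small verification, assuming that averaging convention:

```python
# Per-class metrics reported above; the "unlabeled" class is nan and excluded.
iou = {"dropoff": 0.3424, "undropoff": 0.9659}
acc = {"dropoff": 0.3944, "undropoff": 0.9930}

mean_iou = sum(iou.values()) / len(iou)  # matches the reported Mean Iou of 0.6541
mean_acc = sum(acc.values()) / len(acc)  # matches the reported Mean Accuracy of 0.6937
```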
## Model description
SegFormer-style semantic segmentation model built on the lightweight MiT-b0 encoder and fine-tuned to separate *dropoff* from *undropoff* regions; an *unlabeled* class is ignored in the metrics above. Further architectural and label-set details have not been provided.
## Intended uses & limitations
More information needed
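
No usage snippet is included in the card yet. A minimal inference sketch, assuming the checkpoint is published under `sam1120/dropoff-utcustom-train-SF-RGB-b0_5` and loads with the standard SegFormer classes (the repo id and image path are assumptions, not confirmed by the card):

```python
import torch
from torch.nn.functional import interpolate

def upsample_logits(logits: torch.Tensor, size: tuple) -> torch.Tensor:
    """Resize raw SegFormer logits (B, C, H/4, W/4) back to the input resolution."""
    return interpolate(logits, size=size, mode="bilinear", align_corners=False)

if __name__ == "__main__":
    from PIL import Image
    from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

    # Repo id assumed from this card's title; adjust if it differs.
    ckpt = "sam1120/dropoff-utcustom-train-SF-RGB-b0_5"
    processor = SegformerImageProcessor.from_pretrained(ckpt)
    model = SegformerForSemanticSegmentation.from_pretrained(ckpt)

    image = Image.open("example.jpg")  # placeholder input image
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

    # PIL size is (W, H); interpolate expects (H, W).
    mask = upsample_logits(logits, image.size[::-1]).argmax(dim=1)[0]
```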
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
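
The values above map directly onto `transformers.TrainingArguments` fields. A hedged reconstruction of the configuration (the `output_dir` is a placeholder, not from the card; the listed Adam settings are the Trainer defaults and need no explicit flag):

```python
from transformers import TrainingArguments

# Reconstruction of the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="dropoff-utcustom-train-SF-RGB-b0_5",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=120,
)
```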
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.2123 | 3.33 | 10 | 1.1206 | 0.0793 | 0.1898 | 0.1888 | nan | 0.1908 | 0.1887 | 0.0 | 0.0494 | 0.1886 |
| 1.0927 | 6.67 | 20 | 1.0985 | 0.2196 | 0.5875 | 0.5351 | nan | 0.6450 | 0.5300 | 0.0 | 0.1290 | 0.5298 |
| 1.0578 | 10.0 | 30 | 0.9786 | 0.3662 | 0.7562 | 0.8622 | nan | 0.6400 | 0.8725 | 0.0 | 0.2367 | 0.8621 |
| 0.788 | 13.33 | 40 | 0.7940 | 0.4289 | 0.7505 | 0.9456 | nan | 0.5365 | 0.9646 | 0.0 | 0.3398 | 0.9468 |
| 0.6353 | 16.67 | 50 | 0.6206 | 0.4182 | 0.6840 | 0.9583 | nan | 0.3830 | 0.9850 | 0.0 | 0.2966 | 0.9581 |
| 0.6944 | 20.0 | 60 | 0.5213 | 0.4211 | 0.6766 | 0.9623 | nan | 0.3631 | 0.9901 | 0.0 | 0.3014 | 0.9620 |
| 0.5046 | 23.33 | 70 | 0.4765 | 0.4239 | 0.6796 | 0.9634 | nan | 0.3683 | 0.9910 | 0.0 | 0.3090 | 0.9628 |
| 0.4684 | 26.67 | 80 | 0.4643 | 0.3982 | 0.6347 | 0.9598 | nan | 0.2779 | 0.9914 | 0.0 | 0.2352 | 0.9593 |
| 0.4401 | 30.0 | 90 | 0.4483 | 0.4110 | 0.6507 | 0.9632 | nan | 0.3077 | 0.9936 | 0.0 | 0.2703 | 0.9627 |
| 0.4268 | 33.33 | 100 | 0.4366 | 0.6489 | 0.7001 | 0.9638 | nan | 0.4108 | 0.9895 | nan | 0.3347 | 0.9632 |
| 0.3939 | 36.67 | 110 | 0.4027 | 0.4272 | 0.6798 | 0.9650 | nan | 0.3670 | 0.9927 | 0.0 | 0.3171 | 0.9644 |
| 0.4472 | 40.0 | 120 | 0.4159 | 0.6428 | 0.6896 | 0.9638 | nan | 0.3887 | 0.9905 | nan | 0.3225 | 0.9632 |
| 0.3618 | 43.33 | 130 | 0.3765 | 0.6325 | 0.6671 | 0.9650 | nan | 0.3402 | 0.9939 | nan | 0.3006 | 0.9644 |
| 0.3456 | 46.67 | 140 | 0.3671 | 0.6395 | 0.6816 | 0.9643 | nan | 0.3715 | 0.9917 | nan | 0.3153 | 0.9637 |
| 0.3352 | 50.0 | 150 | 0.3572 | 0.6431 | 0.6839 | 0.9650 | nan | 0.3755 | 0.9923 | nan | 0.3218 | 0.9644 |
| 0.3143 | 53.33 | 160 | 0.3451 | 0.6351 | 0.6702 | 0.9651 | nan | 0.3467 | 0.9938 | nan | 0.3056 | 0.9646 |
| 0.3009 | 56.67 | 170 | 0.3357 | 0.6449 | 0.6941 | 0.9636 | nan | 0.3984 | 0.9898 | nan | 0.3267 | 0.9630 |
| 0.2765 | 60.0 | 180 | 0.3188 | 0.6458 | 0.6934 | 0.9641 | nan | 0.3965 | 0.9903 | nan | 0.3282 | 0.9634 |
| 0.2703 | 63.33 | 190 | 0.3179 | 0.6385 | 0.6732 | 0.9656 | nan | 0.3525 | 0.9940 | nan | 0.3119 | 0.9650 |
| 0.2746 | 66.67 | 200 | 0.3067 | 0.6385 | 0.6702 | 0.9662 | nan | 0.3456 | 0.9949 | nan | 0.3113 | 0.9656 |
| 0.2516 | 70.0 | 210 | 0.2992 | 0.6569 | 0.6968 | 0.9667 | nan | 0.4008 | 0.9929 | nan | 0.3477 | 0.9661 |
| 0.2503 | 73.33 | 220 | 0.2999 | 0.6671 | 0.7198 | 0.9659 | nan | 0.4497 | 0.9899 | nan | 0.3689 | 0.9652 |
| 0.2443 | 76.67 | 230 | 0.2816 | 0.6439 | 0.6750 | 0.9668 | nan | 0.3547 | 0.9952 | nan | 0.3215 | 0.9663 |
| 0.3757 | 80.0 | 240 | 0.2907 | 0.6593 | 0.7063 | 0.9659 | nan | 0.4215 | 0.9911 | nan | 0.3535 | 0.9652 |
| 0.2306 | 83.33 | 250 | 0.2767 | 0.6439 | 0.6807 | 0.9658 | nan | 0.3680 | 0.9935 | nan | 0.3226 | 0.9652 |
| 0.2216 | 86.67 | 260 | 0.2792 | 0.6583 | 0.7018 | 0.9663 | nan | 0.4115 | 0.9920 | nan | 0.3509 | 0.9657 |
| 0.3202 | 90.0 | 270 | 0.2681 | 0.6425 | 0.6789 | 0.9657 | nan | 0.3642 | 0.9936 | nan | 0.3199 | 0.9652 |
| 0.2174 | 93.33 | 280 | 0.2633 | 0.6467 | 0.6860 | 0.9657 | nan | 0.3791 | 0.9928 | nan | 0.3284 | 0.9651 |
| 0.2086 | 96.67 | 290 | 0.2658 | 0.6476 | 0.6900 | 0.9652 | nan | 0.3880 | 0.9920 | nan | 0.3306 | 0.9646 |
| 0.2042 | 100.0 | 300 | 0.2651 | 0.6486 | 0.6898 | 0.9655 | nan | 0.3873 | 0.9923 | nan | 0.3322 | 0.9649 |
| 0.2071 | 103.33 | 310 | 0.2597 | 0.6445 | 0.6792 | 0.9662 | nan | 0.3643 | 0.9941 | nan | 0.3233 | 0.9657 |
| 0.2097 | 106.67 | 320 | 0.2596 | 0.6615 | 0.7062 | 0.9665 | nan | 0.4206 | 0.9918 | nan | 0.3571 | 0.9658 |
| 0.3118 | 110.0 | 330 | 0.2557 | 0.6516 | 0.6928 | 0.9659 | nan | 0.3931 | 0.9924 | nan | 0.3380 | 0.9653 |
| 0.1956 | 113.33 | 340 | 0.2517 | 0.6494 | 0.6865 | 0.9664 | nan | 0.3794 | 0.9936 | nan | 0.3331 | 0.9658 |
| 0.201 | 116.67 | 350 | 0.2570 | 0.6573 | 0.7032 | 0.9658 | nan | 0.4151 | 0.9913 | nan | 0.3494 | 0.9651 |
| 0.1952 | 120.0 | 360 | 0.2543 | 0.6541 | 0.6937 | 0.9665 | nan | 0.3944 | 0.9930 | nan | 0.3424 | 0.9659 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3