---
license: other
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: dropoff-utcustom-train-SF-RGB-b5_5
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# dropoff-utcustom-train-SF-RGB-b5_5

This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set (a minimal inference sketch follows the list):
- Loss: 0.1911
- Mean Iou: 0.4677
- Mean Accuracy: 0.7472
- Overall Accuracy: 0.9719
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.5020
- Accuracy Undropoff: 0.9923
- Iou Unlabeled: 0.0
- Iou Dropoff: 0.4318
- Iou Undropoff: 0.9713
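
For reference, here is a minimal inference sketch for this checkpoint, following the standard SegFormer post-processing recipe from the Transformers docs. The repo id is an assumption based on the dataset owner (sam1120) and the model name; adjust it to wherever this checkpoint is actually hosted.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Assumed repo id (dataset owner + model name); adjust if hosted elsewhere.
checkpoint = "sam1120/dropoff-utcustom-train-SF-RGB-b5_5"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # any RGB input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# SegFormer emits logits at 1/4 resolution; upsample to the original
# image size before taking the argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # (H, W) map of predicted class ids
```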

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a code sketch reproducing them follows the list):
- learning_rate: 9e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
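
The training script itself is not included in this card; the sketch below is one plausible way the listed values map onto `TrainingArguments`, assuming the Hugging Face `Trainer` was used. The `output_dir` is an assumption taken from the model name.

```python
from transformers import TrainingArguments

# A sketch mapping the hyperparameters listed above onto TrainingArguments;
# the actual training script is not part of this card.
training_args = TrainingArguments(
    output_dir="dropoff-utcustom-train-SF-RGB-b5_5",  # assumed
    learning_rate=9e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=120,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    # Adam settings below restate the values listed above
    # (they are also the Trainer defaults).
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```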

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.0685        | 5.0   | 10   | 1.0222          | 0.2189   | 0.3725        | 0.5989           | nan                | 0.1256           | 0.6194             | 0.0           | 0.0497      | 0.6070        |
| 0.9481        | 10.0  | 20   | 0.8419          | 0.3703   | 0.6398        | 0.8451           | nan                | 0.4159           | 0.8637             | 0.0           | 0.2633      | 0.8476        |
| 0.8268        | 15.0  | 30   | 0.7165          | 0.3949   | 0.6938        | 0.8694           | nan                | 0.5023           | 0.8853             | 0.0           | 0.3136      | 0.8711        |
| 0.7573        | 20.0  | 40   | 0.6206          | 0.4084   | 0.7186        | 0.8994           | nan                | 0.5214           | 0.9158             | 0.0           | 0.3243      | 0.9010        |
| 0.636         | 25.0  | 50   | 0.5194          | 0.4239   | 0.7253        | 0.9300           | nan                | 0.5020           | 0.9485             | 0.0           | 0.3401      | 0.9316        |
| 0.5238        | 30.0  | 60   | 0.4507          | 0.4365   | 0.7368        | 0.9461           | nan                | 0.5085           | 0.9651             | 0.0           | 0.3618      | 0.9476        |
| 0.4296        | 35.0  | 70   | 0.4064          | 0.4410   | 0.7422        | 0.9530           | nan                | 0.5123           | 0.9721             | 0.0           | 0.3683      | 0.9546        |
| 0.4105        | 40.0  | 80   | 0.3547          | 0.4502   | 0.7467        | 0.9619           | nan                | 0.5120           | 0.9814             | 0.0           | 0.3880      | 0.9627        |
| 0.3436        | 45.0  | 90   | 0.3304          | 0.4571   | 0.7596        | 0.9644           | nan                | 0.5361           | 0.9830             | 0.0           | 0.4066      | 0.9647        |
| 0.2729        | 50.0  | 100  | 0.2953          | 0.4614   | 0.7552        | 0.9680           | nan                | 0.5232           | 0.9873             | 0.0           | 0.4163      | 0.9678        |
| 0.2546        | 55.0  | 110  | 0.2770          | 0.4629   | 0.7579        | 0.9691           | nan                | 0.5276           | 0.9882             | 0.0           | 0.4201      | 0.9686        |
| 0.2281        | 60.0  | 120  | 0.2591          | 0.4647   | 0.7566        | 0.9702           | nan                | 0.5235           | 0.9896             | 0.0           | 0.4245      | 0.9696        |
| 0.2041        | 65.0  | 130  | 0.2453          | 0.4657   | 0.7556        | 0.9708           | nan                | 0.5209           | 0.9903             | 0.0           | 0.4269      | 0.9701        |
| 0.1772        | 70.0  | 140  | 0.2292          | 0.4676   | 0.7542        | 0.9717           | nan                | 0.5171           | 0.9914             | 0.0           | 0.4317      | 0.9711        |
| 0.169         | 75.0  | 150  | 0.2161          | 0.4681   | 0.7520        | 0.9719           | nan                | 0.5122           | 0.9919             | 0.0           | 0.4331      | 0.9713        |
| 0.1543        | 80.0  | 160  | 0.2111          | 0.4682   | 0.7530        | 0.9715           | nan                | 0.5147           | 0.9913             | 0.0           | 0.4336      | 0.9709        |
| 0.1374        | 85.0  | 170  | 0.1973          | 0.4659   | 0.7450        | 0.9715           | nan                | 0.4980           | 0.9921             | 0.0           | 0.4268      | 0.9709        |
| 0.1523        | 90.0  | 180  | 0.1974          | 0.4681   | 0.7501        | 0.9717           | nan                | 0.5085           | 0.9918             | 0.0           | 0.4332      | 0.9711        |
| 0.1323        | 95.0  | 190  | 0.1928          | 0.4658   | 0.7434        | 0.9717           | nan                | 0.4944           | 0.9924             | 0.0           | 0.4263      | 0.9711        |
| 0.1254        | 100.0 | 200  | 0.1923          | 0.4671   | 0.7467        | 0.9717           | nan                | 0.5013           | 0.9921             | 0.0           | 0.4301      | 0.9711        |
| 0.125         | 105.0 | 210  | 0.1867          | 0.4637   | 0.7380        | 0.9717           | nan                | 0.4831           | 0.9929             | 0.0           | 0.4201      | 0.9711        |
| 0.1239        | 110.0 | 220  | 0.1912          | 0.4694   | 0.7520        | 0.9719           | nan                | 0.5121           | 0.9919             | 0.0           | 0.4369      | 0.9713        |
| 0.1252        | 115.0 | 230  | 0.1913          | 0.4689   | 0.7503        | 0.9720           | nan                | 0.5085           | 0.9921             | 0.0           | 0.4354      | 0.9714        |
| 0.1357        | 120.0 | 240  | 0.1911          | 0.4677   | 0.7472        | 0.9719           | nan                | 0.5020           | 0.9923             | 0.0           | 0.4318      | 0.9713        |
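
The metric columns above (including the nan `Accuracy Unlabeled` and 0.0 `Iou Unlabeled`, which typically indicate that the unlabeled class contributes no ground-truth pixels at evaluation time) are consistent with the output of the `evaluate` library's `mean_iou` metric. The sketch below shows that computation; the label layout (0 = unlabeled, 1 = dropoff, 2 = undropoff), the `ignore_index`, and `reduce_labels` are assumptions, since the card does not state them.

```python
import numpy as np
import evaluate

# Assumed label layout: 0 = unlabeled, 1 = dropoff, 2 = undropoff.
metric = evaluate.load("mean_iou")

# predictions / references: lists of (H, W) integer label maps.
predictions = [np.zeros((64, 64), dtype=np.int64)]  # placeholder
references = [np.zeros((64, 64), dtype=np.int64)]   # placeholder

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=3,
    ignore_index=255,     # assumed ignore id; not stated in the card
    reduce_labels=False,  # assumed; not stated in the card
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"])       # [unlabeled, dropoff, undropoff]
print(results["per_category_accuracy"])
```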


### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3