---
license: other
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: dropoff-utcustom-train-SF-RGB-b0_3
  results: []
---


# dropoff-utcustom-train-SF-RGB-b0_3

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3958
- Mean Iou: 0.6134
- Mean Accuracy: 0.6480
- Overall Accuracy: 0.9627
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.3026
- Accuracy Undropoff: 0.9933
- Iou Unlabeled: nan
- Iou Dropoff: 0.2645
- Iou Undropoff: 0.9622

## Model description

This checkpoint fine-tunes the lightweight SegFormer MiT-b0 encoder ([nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0)) for semantic segmentation of RGB images into `dropoff` and `undropoff` regions, with an `unlabeled` index that is excluded from most of the evaluations above (hence the `nan` entries).
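
A minimal inference sketch using the Transformers `Segformer` classes is shown below. The repo id is an assumption based on the model name and the dataset author's namespace; substitute the path where this checkpoint is actually hosted.

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Assumed repo id (author namespace inferred from the training dataset).
repo_id = "sam1120/dropoff-utcustom-train-SF-RGB-b0_3"

processor = SegformerImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)

image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (batch, num_labels, H/4, W/4)

# Upsample logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # (H, W) map of class indices
```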

## Intended uses & limitations

The model is presumably intended for detecting drop-off hazards in RGB imagery; the dataset name suggests a custom UT dataset built for that task. The results above also expose a clear limitation: overall accuracy is high (0.96) because the `undropoff` class dominates, while the `dropoff` class itself reaches only about 0.30 accuracy and 0.26 IoU, so the model misses most drop-off pixels and should not be used as a sole safety mechanism.

## Training and evaluation data

The model was fine-tuned on the [sam1120/dropoff-utcustom-TRAIN](https://huggingface.co/datasets/sam1120/dropoff-utcustom-TRAIN) dataset. The card does not document the evaluation split.
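
A loading sketch with the `datasets` library; the dataset id comes from the card, but the split name and column layout are assumptions.

```python
from datasets import load_dataset

# Hedged sketch: the split name ("train") and the feature/column layout
# (image + segmentation-mask columns) are assumptions, not documented here.
ds = load_dataset("sam1120/dropoff-utcustom-TRAIN", split="train")
print(ds)  # inspect the features to find the image and label columns
```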

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
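
These settings map directly onto `transformers.TrainingArguments` under the Framework versions listed at the end of this card. The sketch below is a hedged reconstruction; `output_dir` and the evaluation/save cadence are assumptions the card does not record.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the hyperparameters listed above.
args = TrainingArguments(
    output_dir="dropoff-utcustom-train-SF-RGB-b0_3",  # assumption
    learning_rate=2e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,      # "Adam with betas=(0.9,0.999)"
    adam_beta2=0.999,
    adam_epsilon=1e-08,  # "and epsilon=1e-08"
    lr_scheduler_type="linear",
    warmup_ratio=0.05,   # lr_scheduler_warmup_ratio
    num_train_epochs=120,
)
```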

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.1015        | 3.33   | 10   | 1.0990          | 0.1184   | 0.4572        | 0.3294           | nan                | 0.5975           | 0.3170             | 0.0           | 0.0427      | 0.3124        |
| 1.0478        | 6.67   | 20   | 1.0756          | 0.2121   | 0.7082        | 0.5654           | nan                | 0.8648           | 0.5515             | 0.0           | 0.0879      | 0.5482        |
| 1.0451        | 10.0   | 30   | 1.0269          | 0.2846   | 0.8053        | 0.7334           | nan                | 0.8842           | 0.7264             | 0.0           | 0.1313      | 0.7226        |
| 0.9095        | 13.33  | 40   | 0.9476          | 0.3360   | 0.7905        | 0.8411           | nan                | 0.7349           | 0.8460             | 0.0           | 0.1723      | 0.8358        |
| 0.8091        | 16.67  | 50   | 0.8425          | 0.3858   | 0.7645        | 0.9167           | nan                | 0.5975           | 0.9315             | 0.0           | 0.2429      | 0.9145        |
| 0.8094        | 20.0   | 60   | 0.7489          | 0.4090   | 0.7445        | 0.9417           | nan                | 0.5281           | 0.9608             | 0.0           | 0.2866      | 0.9403        |
| 0.6945        | 23.33  | 70   | 0.7005          | 0.4148   | 0.7472        | 0.9453           | nan                | 0.5298           | 0.9646             | 0.0           | 0.3004      | 0.9440        |
| 0.6337        | 26.67  | 80   | 0.6331          | 0.6267   | 0.7334        | 0.9499           | nan                | 0.4958           | 0.9709             | nan           | 0.3047      | 0.9488        |
| 0.603         | 30.0   | 90   | 0.5726          | 0.6222   | 0.6935        | 0.9559           | nan                | 0.4057           | 0.9814             | nan           | 0.2894      | 0.9551        |
| 0.5903        | 33.33  | 100  | 0.5841          | 0.6248   | 0.7151        | 0.9526           | nan                | 0.4546           | 0.9757             | nan           | 0.2980      | 0.9516        |
| 0.5514        | 36.67  | 110  | 0.5157          | 0.6227   | 0.6818        | 0.9585           | nan                | 0.3781           | 0.9854             | nan           | 0.2875      | 0.9578        |
| 0.6464        | 40.0   | 120  | 0.5141          | 0.6240   | 0.6889        | 0.9575           | nan                | 0.3941           | 0.9836             | nan           | 0.2912      | 0.9568        |
| 0.5198        | 43.33  | 130  | 0.4890          | 0.4141   | 0.6762        | 0.9591           | nan                | 0.3657           | 0.9866             | 0.0           | 0.2838      | 0.9585        |
| 0.5077        | 46.67  | 140  | 0.4855          | 0.4118   | 0.6719        | 0.9588           | nan                | 0.3572           | 0.9866             | 0.0           | 0.2773      | 0.9581        |
| 0.4817        | 50.0   | 150  | 0.4710          | 0.6182   | 0.6733        | 0.9587           | nan                | 0.3602           | 0.9864             | nan           | 0.2784      | 0.9580        |
| 0.4713        | 53.33  | 160  | 0.4669          | 0.6196   | 0.6683        | 0.9603           | nan                | 0.3479           | 0.9887             | nan           | 0.2795      | 0.9597        |
| 0.4516        | 56.67  | 170  | 0.4486          | 0.4107   | 0.6586        | 0.9612           | nan                | 0.3265           | 0.9906             | 0.0           | 0.2715      | 0.9606        |
| 0.4059        | 60.0   | 180  | 0.4361          | 0.6136   | 0.6548        | 0.9612           | nan                | 0.3187           | 0.9909             | nan           | 0.2665      | 0.9606        |
| 0.4142        | 63.33  | 190  | 0.4267          | 0.6115   | 0.6503        | 0.9615           | nan                | 0.3089           | 0.9917             | nan           | 0.2621      | 0.9610        |
| 0.4393        | 66.67  | 200  | 0.4188          | 0.6035   | 0.6354        | 0.9623           | nan                | 0.2768           | 0.9940             | nan           | 0.2452      | 0.9618        |
| 0.4071        | 70.0   | 210  | 0.4224          | 0.6137   | 0.6528        | 0.9617           | nan                | 0.3138           | 0.9917             | nan           | 0.2663      | 0.9612        |
| 0.4009        | 73.33  | 220  | 0.4205          | 0.6136   | 0.6540        | 0.9614           | nan                | 0.3167           | 0.9912             | nan           | 0.2664      | 0.9608        |
| 0.4043        | 76.67  | 230  | 0.4148          | 0.6132   | 0.6514        | 0.9619           | nan                | 0.3108           | 0.9920             | nan           | 0.2651      | 0.9613        |
| 0.6302        | 80.0   | 240  | 0.4116          | 0.6133   | 0.6513        | 0.9619           | nan                | 0.3105           | 0.9921             | nan           | 0.2653      | 0.9614        |
| 0.3859        | 83.33  | 250  | 0.4113          | 0.6141   | 0.6543        | 0.9615           | nan                | 0.3174           | 0.9913             | nan           | 0.2673      | 0.9609        |
| 0.3791        | 86.67  | 260  | 0.4033          | 0.6042   | 0.6361        | 0.9623           | nan                | 0.2782           | 0.9940             | nan           | 0.2465      | 0.9619        |
| 0.5716        | 90.0   | 270  | 0.4088          | 0.6168   | 0.6575        | 0.9617           | nan                | 0.3237           | 0.9913             | nan           | 0.2724      | 0.9612        |
| 0.3803        | 93.33  | 280  | 0.4024          | 0.6171   | 0.6565        | 0.9621           | nan                | 0.3211           | 0.9918             | nan           | 0.2727      | 0.9615        |
| 0.371         | 96.67  | 290  | 0.3979          | 0.6166   | 0.6539        | 0.9625           | nan                | 0.3154           | 0.9925             | nan           | 0.2713      | 0.9620        |
| 0.3656        | 100.0  | 300  | 0.3992          | 0.6204   | 0.6615        | 0.9621           | nan                | 0.3316           | 0.9913             | nan           | 0.2793      | 0.9615        |
| 0.3674        | 103.33 | 310  | 0.3930          | 0.6110   | 0.6433        | 0.9630           | nan                | 0.2925           | 0.9941             | nan           | 0.2594      | 0.9625        |
| 0.378         | 106.67 | 320  | 0.3925          | 0.6124   | 0.6459        | 0.9629           | nan                | 0.2981           | 0.9937             | nan           | 0.2623      | 0.9624        |
| 0.5766        | 110.0  | 330  | 0.3965          | 0.6192   | 0.6594        | 0.9621           | nan                | 0.3272           | 0.9916             | nan           | 0.2768      | 0.9616        |
| 0.3513        | 113.33 | 340  | 0.3927          | 0.6161   | 0.6523        | 0.9627           | nan                | 0.3118           | 0.9928             | nan           | 0.2701      | 0.9622        |
| 0.3731        | 116.67 | 350  | 0.3975          | 0.6200   | 0.6613        | 0.9620           | nan                | 0.3315           | 0.9912             | nan           | 0.2785      | 0.9614        |
| 0.3489        | 120.0  | 360  | 0.3958          | 0.6134   | 0.6480        | 0.9627           | nan                | 0.3026           | 0.9933             | nan           | 0.2645      | 0.9622        |
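
The metric columns above can be reproduced with the `evaluate` library's `mean_iou` metric, which returns the mean/overall accuracy and the per-category accuracy and IoU reported here. A minimal sketch with stand-in arrays; the label map (0 = unlabeled, 1 = dropoff, 2 = undropoff) is inferred from the column names rather than documented in the card:

```python
import evaluate
import numpy as np

metric = evaluate.load("mean_iou")

# Stand-in predictions and ground truth; real use would pass the argmaxed
# model output and the reference segmentation maps.
pred = np.random.randint(0, 3, size=(2, 128, 128))
gt = np.random.randint(0, 3, size=(2, 128, 128))

results = metric.compute(
    predictions=pred,
    references=gt,
    num_labels=3,
    ignore_index=255,  # assumption: 255 marks pixels excluded from scoring
    reduce_labels=False,
)
print(results["mean_iou"], results["per_category_iou"])
```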


### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3