---
license: other
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: dropoff-utcustom-train-SF-RGB-b0_1
  results: []
---

# dropoff-utcustom-train-SF-RGB-b0_1

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5626
- Mean Iou: 0.4261
- Mean Accuracy: 0.7046
- Overall Accuracy: 0.9598
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.4247
- Accuracy Undropoff: 0.9846
- Iou Unlabeled: 0.0
- Iou Dropoff: 0.3192
- Iou Undropoff: 0.9590
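
These are the standard semantic-segmentation metrics: per-class accuracy and IoU plus their means (`nan` appears where a class has no ground-truth pixels). As a hedged sketch, the same set can be recomputed with the 🤗 `evaluate` library; the label mapping below (0 = unlabeled, 1 = dropoff, 2 = undropoff) is an assumption, since the card does not state it:

```python
# Hedged sketch: recomputing the reported metric set with `evaluate`.
# Assumed label mapping (not stated in this card): 0=unlabeled, 1=dropoff, 2=undropoff.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Toy prediction/ground-truth maps standing in for real model output.
references  = [np.array([[0, 1, 1], [2, 2, 1], [2, 2, 2]])]
predictions = [np.array([[0, 1, 1], [2, 1, 1], [2, 2, 2]])]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=3,
    ignore_index=255,  # common convention for pixels to skip; an assumption here
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"])  # one IoU per class, in label-id order
```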

## Model description

This is a SegFormer semantic-segmentation model built on the MiT-b0 encoder ([nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0)). Judging from the evaluation metrics, it segments RGB images into three classes: `unlabeled`, `dropoff`, and `undropoff`.

## Intended uses & limitations

The model targets pixel-level detection of dropoff regions in RGB images. Note the imbalance visible in the metrics: overall accuracy is high (0.96) but dropoff IoU is low (0.32), so predictions for the `dropoff` class should be validated before any safety-relevant use.
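
As a starting point, a minimal inference sketch with 🤗 Transformers follows. The checkpoint id is an assumption inferred from the card title and the dataset owner (`sam1120`); adjust it to wherever the weights are actually hosted:

```python
# Minimal inference sketch; the checkpoint id below is an assumption.
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

checkpoint = "sam1120/dropoff-utcustom-train-SF-RGB-b0_1"  # assumed repo id
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before taking the argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]  # (H, W) map of label ids
```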

## Training and evaluation data

Training and evaluation used the [sam1120/dropoff-utcustom-TRAIN](https://huggingface.co/datasets/sam1120/dropoff-utcustom-TRAIN) dataset referenced above; no further dataset details were provided.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 9e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
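
As a hedged sketch (not the author's actual training script), these values map onto `TrainingArguments` in Transformers 4.30 roughly as follows; `output_dir` is a placeholder, and the evaluation cadence is inferred from the results table below (one evaluation every 10 steps):

```python
# Hedged sketch of the hyperparameters above as TrainingArguments
# (Transformers 4.30 argument names); not the author's actual script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dropoff-utcustom-train-SF-RGB-b0_1",  # placeholder
    learning_rate=9e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=120,
    evaluation_strategy="steps",  # inferred from the eval cadence in the table
    eval_steps=10,
    # Adam settings below are the stated values (also the Trainer defaults).
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```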

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.1029        | 3.33   | 10   | 1.0852          | 0.1637   | 0.3955        | 0.4522           | nan                | 0.3333           | 0.4577             | 0.0           | 0.0410      | 0.4501        |
| 1.0856        | 6.67   | 20   | 1.0764          | 0.1911   | 0.5086        | 0.5025           | nan                | 0.5153           | 0.5019             | 0.0           | 0.0761      | 0.4972        |
| 1.0755        | 10.0   | 30   | 1.0611          | 0.2252   | 0.6367        | 0.5749           | nan                | 0.7045           | 0.5688             | 0.0           | 0.1104      | 0.5652        |
| 1.0285        | 13.33  | 40   | 1.0382          | 0.2622   | 0.7487        | 0.6568           | nan                | 0.8494           | 0.6479             | 0.0           | 0.1420      | 0.6445        |
| 0.9935        | 16.67  | 50   | 1.0151          | 0.2893   | 0.7814        | 0.7201           | nan                | 0.8486           | 0.7141             | 0.0           | 0.1580      | 0.7099        |
| 0.9927        | 20.0   | 60   | 0.9834          | 0.3160   | 0.7963        | 0.7816           | nan                | 0.8124           | 0.7801             | 0.0           | 0.1735      | 0.7744        |
| 0.938         | 23.33  | 70   | 0.9585          | 0.3308   | 0.8084        | 0.8127           | nan                | 0.8036           | 0.8131             | 0.0           | 0.1860      | 0.8065        |
| 0.9169        | 26.67  | 80   | 0.9376          | 0.3457   | 0.8169        | 0.8376           | nan                | 0.7943           | 0.8396             | 0.0           | 0.2048      | 0.8324        |
| 0.8814        | 30.0   | 90   | 0.9003          | 0.3624   | 0.8086        | 0.8691           | nan                | 0.7421           | 0.8750             | 0.0           | 0.2220      | 0.8651        |
| 0.8618        | 33.33  | 100  | 0.8894          | 0.3669   | 0.8184        | 0.8761           | nan                | 0.7550           | 0.8817             | 0.0           | 0.2287      | 0.8720        |
| 0.8388        | 36.67  | 110  | 0.8618          | 0.3774   | 0.8096        | 0.8926           | nan                | 0.7187           | 0.9006             | 0.0           | 0.2431      | 0.8892        |
| 0.8878        | 40.0   | 120  | 0.8269          | 0.3929   | 0.7937        | 0.9140           | nan                | 0.6618           | 0.9257             | 0.0           | 0.2671      | 0.9116        |
| 0.8066        | 43.33  | 130  | 0.8074          | 0.4014   | 0.7955        | 0.9225           | nan                | 0.6562           | 0.9348             | 0.0           | 0.2839      | 0.9202        |
| 0.8084        | 46.67  | 140  | 0.7919          | 0.4023   | 0.7932        | 0.9248           | nan                | 0.6487           | 0.9376             | 0.0           | 0.2844      | 0.9226        |
| 0.7415        | 50.0   | 150  | 0.7707          | 0.4068   | 0.7850        | 0.9309           | nan                | 0.6249           | 0.9451             | 0.0           | 0.2913      | 0.9290        |
| 0.7508        | 53.33  | 160  | 0.7326          | 0.4154   | 0.7660        | 0.9415           | nan                | 0.5735           | 0.9585             | 0.0           | 0.3063      | 0.9400        |
| 0.7312        | 56.67  | 170  | 0.7126          | 0.4196   | 0.7636        | 0.9449           | nan                | 0.5646           | 0.9625             | 0.0           | 0.3155      | 0.9435        |
| 0.6442        | 60.0   | 180  | 0.6869          | 0.4255   | 0.7500        | 0.9509           | nan                | 0.5296           | 0.9704             | 0.0           | 0.3268      | 0.9497        |
| 0.6633        | 63.33  | 190  | 0.6765          | 0.4286   | 0.7524        | 0.9525           | nan                | 0.5328           | 0.9719             | 0.0           | 0.3343      | 0.9513        |
| 0.7247        | 66.67  | 200  | 0.6557          | 0.4307   | 0.7335        | 0.9568           | nan                | 0.4886           | 0.9785             | 0.0           | 0.3364      | 0.9558        |
| 0.6133        | 70.0   | 210  | 0.6369          | 0.4298   | 0.7279        | 0.9573           | nan                | 0.4761           | 0.9796             | 0.0           | 0.3330      | 0.9564        |
| 0.6309        | 73.33  | 220  | 0.6309          | 0.4298   | 0.7437        | 0.9547           | nan                | 0.5123           | 0.9752             | 0.0           | 0.3356      | 0.9536        |
| 0.6373        | 76.67  | 230  | 0.6094          | 0.4276   | 0.7197        | 0.9577           | nan                | 0.4585           | 0.9808             | 0.0           | 0.3262      | 0.9568        |
| 0.8436        | 80.0   | 240  | 0.6195          | 0.4341   | 0.7438        | 0.9569           | nan                | 0.5101           | 0.9776             | 0.0           | 0.3463      | 0.9559        |
| 0.6172        | 83.33  | 250  | 0.6207          | 0.4323   | 0.7384        | 0.9570           | nan                | 0.4987           | 0.9782             | 0.0           | 0.3409      | 0.9560        |
| 0.6048        | 86.67  | 260  | 0.5949          | 0.4272   | 0.7136        | 0.9586           | nan                | 0.4449           | 0.9824             | 0.0           | 0.3237      | 0.9578        |
| 0.7887        | 90.0   | 270  | 0.6007          | 0.4308   | 0.7282        | 0.9580           | nan                | 0.4760           | 0.9803             | 0.0           | 0.3353      | 0.9571        |
| 0.605         | 93.33  | 280  | 0.5883          | 0.4284   | 0.7157        | 0.9589           | nan                | 0.4489           | 0.9825             | 0.0           | 0.3271      | 0.9581        |
| 0.5964        | 96.67  | 290  | 0.5872          | 0.4277   | 0.7134        | 0.9590           | nan                | 0.4439           | 0.9828             | 0.0           | 0.3251      | 0.9581        |
| 0.6097        | 100.0  | 300  | 0.5903          | 0.4300   | 0.7240        | 0.9582           | nan                | 0.4669           | 0.9810             | 0.0           | 0.3325      | 0.9573        |
| 0.5886        | 103.33 | 310  | 0.5710          | 0.4250   | 0.7035        | 0.9594           | nan                | 0.4227           | 0.9843             | 0.0           | 0.3162      | 0.9586        |
| 0.6079        | 106.67 | 320  | 0.5695          | 0.4277   | 0.7112        | 0.9594           | nan                | 0.4390           | 0.9835             | 0.0           | 0.3245      | 0.9586        |
| 0.8054        | 110.0  | 330  | 0.5746          | 0.4308   | 0.7237        | 0.9588           | nan                | 0.4657           | 0.9816             | 0.0           | 0.3344      | 0.9579        |
| 0.5496        | 113.33 | 340  | 0.5631          | 0.4285   | 0.7129        | 0.9595           | nan                | 0.4424           | 0.9835             | 0.0           | 0.3269      | 0.9587        |
| 0.6271        | 116.67 | 350  | 0.5761          | 0.4302   | 0.7214        | 0.9589           | nan                | 0.4608           | 0.9819             | 0.0           | 0.3326      | 0.9580        |
| 0.5511        | 120.0  | 360  | 0.5626          | 0.4261   | 0.7046        | 0.9598           | nan                | 0.4247           | 0.9846             | 0.0           | 0.3192      | 0.9590        |


### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3