---
license: other
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: dropoff-utcustom-train-SF-RGBD-b0_5
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# dropoff-utcustom-train-SF-RGBD-b0_5

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2608
- Mean Iou: 0.6161
- Mean Accuracy: 0.6630
- Overall Accuracy: 0.9623
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.3365
- Accuracy Undropoff: 0.9894
- Iou Unlabeled: nan
- Iou Dropoff: 0.2705
- Iou Undropoff: 0.9617
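
The mean metrics above are the unweighted average over the two labeled classes, with the `unlabeled` class excluded (reported as `nan`): Mean IoU = (0.2705 + 0.9617) / 2 ≈ 0.6161 and Mean Accuracy = (0.3365 + 0.9894) / 2 ≈ 0.6630.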

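For reference, the sketch below loads the checkpoint for inference with the standard `transformers` SegFormer classes. The repository id and input image are hypothetical, and the plain-RGB preprocessing is an assumption: the `RGBD` in the model name suggests depth-augmented input, in which case the actual pipeline would differ.

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

repo_id = "sam1120/dropoff-utcustom-train-SF-RGBD-b0_5"  # hypothetical repo id
processor = SegformerImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)

image = Image.open("frame.png").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample the logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # per-pixel class index (dropoff vs. undropoff)
```
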
## Model description

This checkpoint is a SegFormer semantic-segmentation model with the MiT-B0 encoder, fine-tuned to label each pixel as `dropoff` or `undropoff`; pixels in the `unlabeled` class are ignored in the metrics above. The `RGBD` in the repository name suggests depth-augmented input, but this is not documented in the card.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch reproducing them follows the list):
- learning_rate: 6e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
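
As a rough illustration, these settings map onto `transformers` `TrainingArguments` as sketched below; the output directory is hypothetical, and note that the `Trainer`'s default optimizer is AdamW with the betas and epsilon listed above (the card lists them as Adam).

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dropoff-utcustom-train-SF-RGBD-b0_5",  # hypothetical output dir
    learning_rate=6e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=120,
)
```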

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 0.9263        | 5.0   | 10   | 1.0370          | 0.2869   | 0.7147        | 0.7632           | nan                | 0.6618           | 0.7675             | 0.0           | 0.1042      | 0.7565        |
| 0.8069        | 10.0  | 20   | 0.8622          | 0.4857   | 0.5062        | 0.9589           | nan                | 0.0125           | 0.9999             | nan           | 0.0124      | 0.9589        |
| 0.6851        | 15.0  | 30   | 0.6490          | 0.4876   | 0.5081        | 0.9586           | nan                | 0.0167           | 0.9995             | nan           | 0.0165      | 0.9586        |
| 0.5882        | 20.0  | 40   | 0.4739          | 0.3253   | 0.5085        | 0.9586           | nan                | 0.0177           | 0.9994             | 0.0           | 0.0174      | 0.9585        |
| 0.53          | 25.0  | 50   | 0.4153          | 0.3375   | 0.5274        | 0.9584           | nan                | 0.0573           | 0.9975             | 0.0           | 0.0542      | 0.9583        |
| 0.5009        | 30.0  | 60   | 0.4275          | 0.3835   | 0.6488        | 0.9475           | nan                | 0.3230           | 0.9746             | 0.0           | 0.2037      | 0.9468        |
| 0.4699        | 35.0  | 70   | 0.3819          | 0.4158   | 0.6985        | 0.9578           | nan                | 0.4157           | 0.9813             | 0.0           | 0.2904      | 0.9570        |
| 0.3946        | 40.0  | 80   | 0.3563          | 0.6183   | 0.6844        | 0.9585           | nan                | 0.3854           | 0.9834             | nan           | 0.2787      | 0.9579        |
| 0.3788        | 45.0  | 90   | 0.3259          | 0.6292   | 0.7011        | 0.9593           | nan                | 0.4196           | 0.9827             | nan           | 0.2998      | 0.9585        |
| 0.3412        | 50.0  | 100  | 0.3392          | 0.6170   | 0.6933        | 0.9562           | nan                | 0.4066           | 0.9801             | nan           | 0.2785      | 0.9555        |
| 0.3326        | 55.0  | 110  | 0.3214          | 0.6279   | 0.6914        | 0.9606           | nan                | 0.3977           | 0.9851             | nan           | 0.2958      | 0.9600        |
| 0.2954        | 60.0  | 120  | 0.3119          | 0.6261   | 0.6847        | 0.9613           | nan                | 0.3831           | 0.9864             | nan           | 0.2915      | 0.9607        |
| 0.3006        | 65.0  | 130  | 0.2853          | 0.5900   | 0.6223        | 0.9625           | nan                | 0.2513           | 0.9934             | nan           | 0.2180      | 0.9621        |
| 0.2715        | 70.0  | 140  | 0.3021          | 0.6314   | 0.6903        | 0.9620           | nan                | 0.3938           | 0.9867             | nan           | 0.3014      | 0.9614        |
| 0.276         | 75.0  | 150  | 0.2950          | 0.6243   | 0.6783        | 0.9619           | nan                | 0.3690           | 0.9877             | nan           | 0.2873      | 0.9613        |
| 0.2622        | 80.0  | 160  | 0.2843          | 0.6134   | 0.6651        | 0.9608           | nan                | 0.3426           | 0.9876             | nan           | 0.2665      | 0.9602        |
| 0.2395        | 85.0  | 170  | 0.2752          | 0.6050   | 0.6495        | 0.9613           | nan                | 0.3094           | 0.9895             | nan           | 0.2493      | 0.9608        |
| 0.2597        | 90.0  | 180  | 0.2813          | 0.6296   | 0.6874        | 0.9620           | nan                | 0.3879           | 0.9869             | nan           | 0.2979      | 0.9614        |
| 0.2294        | 95.0  | 190  | 0.2747          | 0.6106   | 0.6575        | 0.9615           | nan                | 0.3259           | 0.9890             | nan           | 0.2602      | 0.9609        |
| 0.2303        | 100.0 | 200  | 0.2606          | 0.6040   | 0.6462        | 0.9616           | nan                | 0.3023           | 0.9902             | nan           | 0.2468      | 0.9611        |
| 0.2335        | 105.0 | 210  | 0.2606          | 0.6080   | 0.6515        | 0.9619           | nan                | 0.3130           | 0.9901             | nan           | 0.2547      | 0.9614        |
| 0.2322        | 110.0 | 220  | 0.2619          | 0.6167   | 0.6631        | 0.9624           | nan                | 0.3366           | 0.9896             | nan           | 0.2715      | 0.9619        |
| 0.2116        | 115.0 | 230  | 0.2618          | 0.6183   | 0.6660        | 0.9624           | nan                | 0.3427           | 0.9893             | nan           | 0.2747      | 0.9618        |
| 0.2099        | 120.0 | 240  | 0.2608          | 0.6161   | 0.6630        | 0.9623           | nan                | 0.3365           | 0.9894             | nan           | 0.2705      | 0.9617        |


### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
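
To approximately reproduce this environment, pin `transformers==4.30.2`, `datasets==2.13.1`, and `tokenizers==0.13.3`, together with a PyTorch 2.0.1 build compiled against CUDA 11.7.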