---
license: other
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: dropoff-utcustom-train-SF-RGBD-b5_4
  results: []
---

# dropoff-utcustom-train-SF-RGBD-b5_4

This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2351
- Mean Iou: 0.4792
- Mean Accuracy: 0.5
- Overall Accuracy: 0.9584
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.0
- Accuracy Undropoff: 1.0
- Iou Unlabeled: nan
- Iou Dropoff: 0.0
- Iou Undropoff: 0.9584
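
Since Accuracy Unlabeled and Iou Unlabeled are `nan`, the unlabeled class is excluded from the averages: Mean Iou = (0.0 + 0.9584) / 2 = 0.4792 and Mean Accuracy = (0.0 + 1.0) / 2 = 0.5. In other words, the headline Mean Iou is driven entirely by the majority `undropoff` class.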

## Model description

A SegFormer semantic-segmentation model built on the MiT-b5 encoder, fine-tuned for a three-class scheme (`unlabeled`, `dropoff`, `undropoff`); the `SF-RGBD` tag in the name suggests the training inputs combined RGB and depth.

## Intended uses & limitations

The evaluation results above indicate that this checkpoint has collapsed onto the majority class: Accuracy Dropoff and Iou Dropoff are both 0.0, meaning essentially every pixel is labeled `undropoff`. As trained, it should not be relied on for actual dropoff detection without further work (for example, re-balancing the heavily skewed classes). A minimal inference sketch follows.
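
A minimal sketch of loading the checkpoint for inference with `transformers`, assuming it is published under a Hub id like `sam1120/dropoff-utcustom-train-SF-RGBD-b5_4` (the repo id is inferred from the card name and may differ; the `SF-RGBD` tag suggests depth-augmented inputs, whose exact preprocessing is not documented here):

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Hypothetical Hub id inferred from the model name; replace with the real repo or a local path.
MODEL_ID = "sam1120/dropoff-utcustom-train-SF-RGBD-b5_4"

processor = SegformerImageProcessor.from_pretrained(MODEL_ID)
model = SegformerForSemanticSegmentation.from_pretrained(MODEL_ID)
model.eval()

# Plain RGB shown here; the RGB-D training inputs may require custom channel handling.
image = Image.open("example.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, height/4, width/4)

# Upsample to the original resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_map = upsampled.argmax(dim=1)[0]  # per-pixel class indices
```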

## Training and evaluation data

Fine-tuned on the [sam1120/dropoff-utcustom-TRAIN](https://huggingface.co/datasets/sam1120/dropoff-utcustom-TRAIN) dataset, with the metrics above computed on the associated evaluation set. How the data was collected and annotated is not documented here.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 7e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
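
A sketch of how the values above map onto `transformers.TrainingArguments`, assuming single-device training so the reported batch sizes are per-device; `output_dir` is a placeholder, and the Adam betas/epsilon listed above are the optimizer defaults, so they need no explicit arguments:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dropoff-utcustom-train-SF-RGBD-b5_4",  # placeholder
    learning_rate=7e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,      # lr_scheduler_warmup_ratio: 0.05
    num_train_epochs=120,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the default optimizer,
    # matching the configuration reported above.
)
```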

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.0114        | 5.0   | 10   | 1.0037          | 0.2459   | 0.4345        | 0.7074           | nan                | 0.1368           | 0.7322             | 0.0           | 0.0286      | 0.7089        |
| 0.9088        | 10.0  | 20   | 0.8245          | 0.3119   | 0.5046        | 0.8887           | nan                | 0.0857           | 0.9235             | 0.0           | 0.0460      | 0.8897        |
| 0.8029        | 15.0  | 30   | 0.6620          | 0.3157   | 0.4998        | 0.9214           | nan                | 0.0399           | 0.9596             | 0.0           | 0.0253      | 0.9219        |
| 0.6935        | 20.0  | 40   | 0.5662          | 0.3154   | 0.4959        | 0.9309           | nan                | 0.0214           | 0.9704             | 0.0           | 0.0151      | 0.9311        |
| 0.635         | 25.0  | 50   | 0.5018          | 0.3175   | 0.4978        | 0.9401           | nan                | 0.0153           | 0.9803             | 0.0           | 0.0121      | 0.9404        |
| 0.5579        | 30.0  | 60   | 0.4701          | 0.3178   | 0.4978        | 0.9422           | nan                | 0.0131           | 0.9825             | 0.0           | 0.0111      | 0.9423        |
| 0.5086        | 35.0  | 70   | 0.4403          | 0.3181   | 0.4977        | 0.9459           | nan                | 0.0088           | 0.9866             | 0.0           | 0.0080      | 0.9461        |
| 0.472         | 40.0  | 80   | 0.4328          | 0.3177   | 0.4971        | 0.9471           | nan                | 0.0063           | 0.9879             | 0.0           | 0.0059      | 0.9473        |
| 0.4484        | 45.0  | 90   | 0.4136          | 0.3184   | 0.4981        | 0.9506           | nan                | 0.0046           | 0.9916             | 0.0           | 0.0044      | 0.9508        |
| 0.4026        | 50.0  | 100  | 0.4013          | 0.3186   | 0.4985        | 0.9516           | nan                | 0.0043           | 0.9926             | 0.0           | 0.0042      | 0.9517        |
| 0.3873        | 55.0  | 110  | 0.3621          | 0.3189   | 0.4991        | 0.9557           | nan                | 0.0010           | 0.9971             | 0.0           | 0.0009      | 0.9557        |
| 0.3549        | 60.0  | 120  | 0.3479          | 0.3189   | 0.4992        | 0.9564           | nan                | 0.0004           | 0.9979             | 0.0           | 0.0004      | 0.9564        |
| 0.3358        | 65.0  | 130  | 0.3282          | 0.3191   | 0.4994        | 0.9571           | nan                | 0.0001           | 0.9986             | 0.0           | 0.0001      | 0.9571        |
| 0.3146        | 70.0  | 140  | 0.3141          | 0.3193   | 0.4996        | 0.9577           | nan                | 0.0000           | 0.9993             | 0.0           | 0.0000      | 0.9577        |
| 0.3116        | 75.0  | 150  | 0.2941          | 0.3194   | 0.4999        | 0.9582           | nan                | 0.0              | 0.9998             | 0.0           | 0.0         | 0.9582        |
| 0.3151        | 80.0  | 160  | 0.2809          | 0.3195   | 0.5000        | 0.9584           | nan                | 0.0              | 0.9999             | 0.0           | 0.0         | 0.9584        |
| 0.2778        | 85.0  | 170  | 0.2750          | 0.3195   | 0.5000        | 0.9584           | nan                | 0.0              | 1.0000             | 0.0           | 0.0         | 0.9584        |
| 0.2753        | 90.0  | 180  | 0.2615          | 0.3195   | 0.5000        | 0.9584           | nan                | 0.0              | 1.0000             | 0.0           | 0.0         | 0.9584        |
| 0.2809        | 95.0  | 190  | 0.2547          | 0.4792   | 0.5           | 0.9584           | nan                | 0.0              | 1.0                | nan           | 0.0         | 0.9584        |
| 0.2606        | 100.0 | 200  | 0.2464          | 0.4792   | 0.5           | 0.9584           | nan                | 0.0              | 1.0                | nan           | 0.0         | 0.9584        |
| 0.2563        | 105.0 | 210  | 0.2459          | 0.4792   | 0.5           | 0.9584           | nan                | 0.0              | 1.0                | nan           | 0.0         | 0.9584        |
| 0.2454        | 110.0 | 220  | 0.2393          | 0.4792   | 0.5           | 0.9584           | nan                | 0.0              | 1.0                | nan           | 0.0         | 0.9584        |
| 0.2707        | 115.0 | 230  | 0.2368          | 0.4792   | 0.5           | 0.9584           | nan                | 0.0              | 1.0                | nan           | 0.0         | 0.9584        |
| 0.2433        | 120.0 | 240  | 0.2351          | 0.4792   | 0.5           | 0.9584           | nan                | 0.0              | 1.0                | nan           | 0.0         | 0.9584        |
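
Two things stand out in this table. First, Iou Dropoff decays toward zero while Iou Undropoff tracks Overall Accuracy, i.e. the network converges to predicting `undropoff` everywhere. Second, the jump in Mean Iou from roughly 0.32 to 0.4792 at epoch 95 reflects bookkeeping, not improvement: once Iou Unlabeled becomes `nan` it is excluded from the average, which then runs over two classes instead of three ((0.0 + 0.9584) / 2 = 0.4792 versus (0.0 + 0.0 + 0.9584) / 3 ≈ 0.3195).

The column names match the output of the `mean_iou` metric from the `evaluate` library, which was likely what produced them. A small self-contained sketch of computing it directly (the toy arrays are illustrative only):

```python
import numpy as np
import evaluate

# Toy 2x2 example with labels: 0 = unlabeled (ignored), 1 = dropoff, 2 = undropoff.
metric = evaluate.load("mean_iou")
predictions = [np.array([[2, 2], [2, 2]])]  # model predicts undropoff everywhere
references  = [np.array([[0, 1], [2, 2]])]  # ground truth with one dropoff pixel

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=3,
    ignore_index=0,       # mask out the unlabeled class, as in the table
    reduce_labels=False,
)
print(results["mean_iou"], results["per_category_iou"])
```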


### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3