---
license: other
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: dropoff-utcustom-train-SF-RGBD-b5_6
  results: []
---


# dropoff-utcustom-train-SF-RGBD-b5_6

This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the sam1120/dropoff-utcustom-TRAIN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1429
- Mean Iou: 0.6443
- Mean Accuracy: 0.6853
- Overall Accuracy: 0.9669
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.3782
- Accuracy Undropoff: 0.9925
- Iou Unlabeled: nan
- Iou Dropoff: 0.3223
- Iou Undropoff: 0.9664

## Model description

The card was generated automatically, so the description is limited to what the metadata implies: this is a SegFormer semantic-segmentation model built on the [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) encoder and fine-tuned to distinguish drop-off hazards from traversable ground (a `dropoff` class, an `undropoff` class, and an `unlabeled` ignore class). The "SF-RGBD" in the model name suggests the training pipeline used RGB-D input, though the card does not confirm this.

## Intended uses & limitations

No usage notes were provided. Judging by the dataset and class names, the model targets drop-off detection in ground-level imagery, e.g. for robot navigation safety. Note the gap between overall accuracy (0.9669) and dropoff IoU (0.3223): the dropoff class is rare and hard, so per-pixel dropoff predictions should be treated with caution. A minimal inference sketch follows.
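The sketch below assumes the standard `transformers` SegFormer API and a hypothetical repo id matching the model name; it also assumes plain three-channel RGB preprocessing, even though the model name hints the original pipeline may have used RGB-D input.

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Hypothetical repo id; point this at wherever the checkpoint actually lives.
ckpt = "sam1120/dropoff-utcustom-train-SF-RGBD-b5_6"

processor = SegformerImageProcessor.from_pretrained(ckpt)
model = SegformerForSemanticSegmentation.from_pretrained(ckpt)
model.eval()

image = Image.open("example.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # (H, W) integer label map
```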

## Training and evaluation data

The model was fine-tuned on the [sam1120/dropoff-utcustom-TRAIN](https://huggingface.co/datasets/sam1120/dropoff-utcustom-TRAIN) dataset; the card gives no further description of the data or of the evaluation split.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
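
These settings map directly onto `transformers.TrainingArguments`. A sketch of the equivalent configuration, assuming single-device training (so `train_batch_size` equals `per_device_train_batch_size`); the `output_dir` and anything not listed above are placeholders:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="dropoff-utcustom-train-SF-RGBD-b5_6",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=120,
)
```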

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.159         | 5.0   | 10   | 1.0040          | 0.2283   | 0.5676        | 0.6267           | nan                | 0.5031           | 0.6321             | 0.0           | 0.0644      | 0.6203        |
| 0.8345        | 10.0  | 20   | 0.7480          | 0.3236   | 0.5320        | 0.9158           | nan                | 0.1134           | 0.9506             | 0.0           | 0.0555      | 0.9154        |
| 0.5406        | 15.0  | 30   | 0.5477          | 0.3223   | 0.5049        | 0.9513           | nan                | 0.0179           | 0.9918             | 0.0           | 0.0157      | 0.9513        |
| 0.3695        | 20.0  | 40   | 0.4590          | 0.3215   | 0.5036        | 0.9519           | nan                | 0.0146           | 0.9926             | 0.0           | 0.0125      | 0.9519        |
| 0.3053        | 25.0  | 50   | 0.3790          | 0.3196   | 0.5001        | 0.9565           | nan                | 0.0023           | 0.9979             | 0.0           | 0.0022      | 0.9565        |
| 0.2436        | 30.0  | 60   | 0.3303          | 0.4812   | 0.5020        | 0.9568           | nan                | 0.0059           | 0.9981             | nan           | 0.0056      | 0.9568        |
| 0.2148        | 35.0  | 70   | 0.2739          | 0.4794   | 0.5002        | 0.9580           | nan                | 0.0008           | 0.9996             | nan           | 0.0008      | 0.9580        |
| 0.1983        | 40.0  | 80   | 0.2348          | 0.5079   | 0.5284        | 0.9595           | nan                | 0.0582           | 0.9986             | nan           | 0.0564      | 0.9594        |
| 0.1784        | 45.0  | 90   | 0.2178          | 0.6064   | 0.6440        | 0.9631           | nan                | 0.2960           | 0.9920             | nan           | 0.2501      | 0.9626        |
| 0.1631        | 50.0  | 100  | 0.1943          | 0.6223   | 0.6811        | 0.9607           | nan                | 0.3760           | 0.9861             | nan           | 0.2846      | 0.9601        |
| 0.1468        | 55.0  | 110  | 0.1759          | 0.6206   | 0.6731        | 0.9617           | nan                | 0.3583           | 0.9879             | nan           | 0.2801      | 0.9611        |
| 0.1353        | 60.0  | 120  | 0.1657          | 0.6014   | 0.6335        | 0.9639           | nan                | 0.2731           | 0.9939             | nan           | 0.2393      | 0.9635        |
| 0.1474        | 65.0  | 130  | 0.1590          | 0.5943   | 0.6228        | 0.9641           | nan                | 0.2505           | 0.9951             | nan           | 0.2249      | 0.9637        |
| 0.1172        | 70.0  | 140  | 0.1562          | 0.6272   | 0.6662        | 0.9653           | nan                | 0.3400           | 0.9924             | nan           | 0.2896      | 0.9648        |
| 0.1169        | 75.0  | 150  | 0.1538          | 0.6302   | 0.6696        | 0.9656           | nan                | 0.3467           | 0.9925             | nan           | 0.2954      | 0.9651        |
| 0.1263        | 80.0  | 160  | 0.1540          | 0.6372   | 0.6784        | 0.9661           | nan                | 0.3645           | 0.9922             | nan           | 0.3089      | 0.9656        |
| 0.1028        | 85.0  | 170  | 0.1512          | 0.6462   | 0.6948        | 0.9659           | nan                | 0.3992           | 0.9904             | nan           | 0.3271      | 0.9653        |
| 0.1163        | 90.0  | 180  | 0.1493          | 0.6469   | 0.6932        | 0.9663           | nan                | 0.3953           | 0.9911             | nan           | 0.3280      | 0.9658        |
| 0.0998        | 95.0  | 190  | 0.1481          | 0.6457   | 0.6894        | 0.9666           | nan                | 0.3869           | 0.9918             | nan           | 0.3253      | 0.9661        |
| 0.0997        | 100.0 | 200  | 0.1465          | 0.6454   | 0.6893        | 0.9665           | nan                | 0.3869           | 0.9917             | nan           | 0.3247      | 0.9660        |
| 0.0998        | 105.0 | 210  | 0.1473          | 0.6488   | 0.6937        | 0.9668           | nan                | 0.3958           | 0.9916             | nan           | 0.3313      | 0.9662        |
| 0.1003        | 110.0 | 220  | 0.1437          | 0.6401   | 0.6774        | 0.9671           | nan                | 0.3614           | 0.9934             | nan           | 0.3136      | 0.9666        |
| 0.0932        | 115.0 | 230  | 0.1434          | 0.6469   | 0.6898        | 0.9669           | nan                | 0.3876           | 0.9920             | nan           | 0.3275      | 0.9664        |
| 0.0942        | 120.0 | 240  | 0.1429          | 0.6443   | 0.6853        | 0.9669           | nan                | 0.3782           | 0.9925             | nan           | 0.3223      | 0.9664        |
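
The metric columns match the output format of the `mean_iou` metric from Hugging Face's `evaluate` library. Assuming that metric was used (it is not listed under framework versions) and assuming label ids 0/1/2 for unlabeled/dropoff/undropoff, a sketch of computing the same numbers from label maps:

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# pred and label are (H, W) integer maps; the 0/1/2 id assignment is an assumption.
pred = np.random.randint(0, 3, size=(512, 512))
label = np.random.randint(0, 3, size=(512, 512))

results = metric.compute(
    predictions=[pred],
    references=[label],
    num_labels=3,
    ignore_index=255,  # assumption: pixels marked 255 are ignored
)
print(results["mean_iou"], results["per_category_iou"])
```

A `nan` in the per-category columns above likely means that class did not occur in that evaluation pass, as with `unlabeled` from epoch 30 onward.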


### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3