---
license: other
tags:
- generated_from_trainer
model-index:
- name: dropoff-utcustom-train-SF-RGBD-b0_6
  results: []
---

# dropoff-utcustom-train-SF-RGBD-b0_6

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2353
- Mean Iou: 0.6539
- Mean Accuracy: 0.7065
- Overall Accuracy: 0.9662
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.4233
- Accuracy Undropoff: 0.9897
- Iou Unlabeled: nan
- Iou Dropoff: 0.3423
- Iou Undropoff: 0.9656
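
The aggregate figures are unweighted means over the two annotated classes, with the unlabeled class excluded from the average (hence the `nan` entries): Mean Accuracy = (0.4233 + 0.9897) / 2 = 0.7065, and Mean IoU = (0.3423 + 0.9656) / 2 ≈ 0.6539, up to rounding of the per-class values.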

## Model description

More information needed

## Intended uses & limitations

More information needed
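
Usage is not documented yet; below is a minimal inference sketch. The hub repo id is a placeholder (the card does not state the owning namespace), and standard 3-channel RGB preprocessing is assumed, even though "RGBD" in the model name suggests a depth channel may be part of the real input pipeline.

```python
# Minimal inference sketch, not documented usage: the repo id below is a
# placeholder, and 3-channel RGB preprocessing is an assumption.
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

repo_id = "<namespace>/dropoff-utcustom-train-SF-RGBD-b0_6"  # placeholder namespace

processor = SegformerImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("frame.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_map = upsampled.argmax(dim=1)[0]  # per-pixel class ids
```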

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 7e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
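
For reference, a sketch of the equivalent `TrainingArguments`; only the values listed above come from this card, and the model, dataset, and `Trainer` wiring are omitted because they are not documented here.

```python
# Sketch of TrainingArguments mirroring the hyperparameters listed above.
# output_dir reuses the model name as a plausible default; it is an assumption.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dropoff-utcustom-train-SF-RGBD-b0_6",
    learning_rate=7e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=120,
)
```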

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 0.9975        | 5.0   | 10   | 1.0470          | 0.2819   | 0.6747        | 0.7186           | nan                | 0.6267           | 0.7226             | 0.0           | 0.1290      | 0.7167        |
| 0.8329        | 10.0  | 20   | 0.8435          | 0.3211   | 0.5026        | 0.9526           | nan                | 0.0117           | 0.9934             | 0.0           | 0.0106      | 0.9526        |
| 0.6857        | 15.0  | 30   | 0.6184          | 0.3191   | 0.4994        | 0.9567           | nan                | 0.0006           | 0.9981             | 0.0           | 0.0006      | 0.9567        |
| 0.5913        | 20.0  | 40   | 0.4793          | 0.3193   | 0.4997        | 0.9573           | nan                | 0.0005           | 0.9988             | 0.0           | 0.0005      | 0.9573        |
| 0.5299        | 25.0  | 50   | 0.4529          | 0.3488   | 0.5442        | 0.9596           | nan                | 0.0911           | 0.9973             | 0.0           | 0.0869      | 0.9595        |
| 0.4922        | 30.0  | 60   | 0.4037          | 0.4352   | 0.6983        | 0.9671           | nan                | 0.4051           | 0.9915             | 0.0           | 0.3390      | 0.9666        |
| 0.4769        | 35.0  | 70   | 0.4161          | 0.4090   | 0.7560        | 0.9426           | nan                | 0.5524           | 0.9595             | 0.0           | 0.2858      | 0.9412        |
| 0.3916        | 40.0  | 80   | 0.3343          | 0.6320   | 0.6946        | 0.9614           | nan                | 0.4036           | 0.9856             | nan           | 0.3033      | 0.9608        |
| 0.3567        | 45.0  | 90   | 0.3372          | 0.6374   | 0.7140        | 0.9598           | nan                | 0.4458           | 0.9821             | nan           | 0.3157      | 0.9591        |
| 0.3234        | 50.0  | 100  | 0.3074          | 0.6402   | 0.6883        | 0.9652           | nan                | 0.3863           | 0.9903             | nan           | 0.3157      | 0.9646        |
| 0.3181        | 55.0  | 110  | 0.3043          | 0.6396   | 0.7138        | 0.9606           | nan                | 0.4446           | 0.9830             | nan           | 0.3194      | 0.9599        |
| 0.2584        | 60.0  | 120  | 0.3069          | 0.6450   | 0.7204        | 0.9613           | nan                | 0.4576           | 0.9831             | nan           | 0.3294      | 0.9605        |
| 0.2566        | 65.0  | 130  | 0.2824          | 0.6431   | 0.7063        | 0.9630           | nan                | 0.4263           | 0.9863             | nan           | 0.3239      | 0.9623        |
| 0.2353        | 70.0  | 140  | 0.2763          | 0.6470   | 0.7046        | 0.9645           | nan                | 0.4212           | 0.9880             | nan           | 0.3301      | 0.9638        |
| 0.2368        | 75.0  | 150  | 0.2644          | 0.6474   | 0.6973        | 0.9658           | nan                | 0.4044           | 0.9902             | nan           | 0.3296      | 0.9652        |
| 0.2225        | 80.0  | 160  | 0.2673          | 0.6462   | 0.7089        | 0.9635           | nan                | 0.4313           | 0.9866             | nan           | 0.3296      | 0.9629        |
| 0.1976        | 85.0  | 170  | 0.2568          | 0.6449   | 0.7057        | 0.9637           | nan                | 0.4244           | 0.9870             | nan           | 0.3268      | 0.9630        |
| 0.1981        | 90.0  | 180  | 0.2572          | 0.6444   | 0.7110        | 0.9626           | nan                | 0.4365           | 0.9855             | nan           | 0.3269      | 0.9619        |
| 0.1857        | 95.0  | 190  | 0.2503          | 0.6504   | 0.7027        | 0.9658           | nan                | 0.4157           | 0.9897             | nan           | 0.3356      | 0.9652        |
| 0.1826        | 100.0 | 200  | 0.2345          | 0.6509   | 0.6984        | 0.9666           | nan                | 0.4059           | 0.9909             | nan           | 0.3357      | 0.9660        |
| 0.1818        | 105.0 | 210  | 0.2484          | 0.6506   | 0.7160        | 0.9637           | nan                | 0.4458           | 0.9862             | nan           | 0.3381      | 0.9630        |
| 0.1919        | 110.0 | 220  | 0.2343          | 0.6526   | 0.6996        | 0.9669           | nan                | 0.4080           | 0.9912             | nan           | 0.3389      | 0.9663        |
| 0.17          | 115.0 | 230  | 0.2377          | 0.6535   | 0.7065        | 0.9661           | nan                | 0.4235           | 0.9896             | nan           | 0.3416      | 0.9655        |
| 0.1739        | 120.0 | 240  | 0.2353          | 0.6539   | 0.7065        | 0.9662           | nan                | 0.4233           | 0.9897             | nan           | 0.3423      | 0.9656        |
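
The `nan` entries for the unlabeled class indicate it is excluded from the averages. Below is a sketch of how such metrics are commonly computed for SegFormer fine-tunes with the Hugging Face `evaluate` library; the label mapping (0 = unlabeled, 1 = dropoff, 2 = undropoff) and the ignore index are assumptions, not values stated by this card.

```python
# Sketch of the metric computation with Hugging Face `evaluate`.
# Label mapping (0 = unlabeled, 1 = dropoff, 2 = undropoff) and
# ignore_index=0 are assumptions; the card does not state them.
import numpy as np
import evaluate

mean_iou = evaluate.load("mean_iou")

# (H, W) integer class-id maps; random stand-ins for real predictions/labels.
preds = [np.random.randint(0, 3, size=(64, 64))]
labels = [np.random.randint(0, 3, size=(64, 64))]

results = mean_iou.compute(
    predictions=preds,
    references=labels,
    num_labels=3,
    ignore_index=0,  # "unlabeled" pixels excluded, hence the nan columns above
)
print(results["mean_iou"], results["per_category_iou"])
```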


### Framework versions

- Transformers 4.30.2
- PyTorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3