---
license: other
base_model: nvidia/mit-b0
tags:
- image-segmentation
- vision
- generated_from_trainer
model-index:
- name: segformer-finetuned-rwymarkings-3k-steps
  results: []
---

# segformer-finetuned-rwymarkings-3k-steps

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the Spatiallysaying/rwymarkings dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0182
- Mean Iou: 0.0441
- Mean Accuracy: 0.0510
- Overall Accuracy: 0.0800
- Accuracy Background: nan
- Accuracy Tdz: 0.0908
- Accuracy Aim: 0.2203
- Accuracy Desig: 0.0
- Accuracy Rwythr: 0.0971
- Accuracy Thrbar: 0.0
- Accuracy Disp: 0.0
- Accuracy Chevron: 0.0
- Accuracy Arrow: 0.0
- Iou Background: 0.0
- Iou Tdz: 0.0818
- Iou Aim: 0.2189
- Iou Desig: 0.0
- Iou Rwythr: 0.0958
- Iou Thrbar: 0.0
- Iou Disp: 0.0
- Iou Chevron: 0.0
- Iou Arrow: 0.0

## Model description

This model pairs the SegFormer MiT-b0 hierarchical Transformer encoder ([nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0)) with SegFormer's lightweight all-MLP decode head and fine-tunes it for semantic segmentation of runway markings. Besides a background class, the label set contains eight marking classes: Tdz, Aim, Desig, Rwythr, Thrbar, Disp, Chevron, and Arrow (the names suggest touchdown zone, aiming point, runway designation, runway threshold, threshold bar, displaced threshold, chevron, and arrow markings).

## Intended uses & limitations

The model is intended for segmenting runway markings in overhead imagery of runways; a minimal inference sketch is given below. Note that this 3,000-step checkpoint is still weak: mean IoU on the evaluation set is 0.0441, and the Desig, Thrbar, Disp, Chevron, and Arrow classes all have an IoU of 0.0, i.e. they are effectively never predicted. Treat it as a training baseline rather than a deployable segmenter.
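
Below is a minimal inference sketch using the standard Transformers SegFormer classes. The repo id is an assumption derived from this card's title, and the image-processor settings are loaded from the repo rather than hard-coded:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Assumed repo id; replace with the actual model id if it differs.
repo_id = "Spatiallysaying/segformer-finetuned-rwymarkings-3k-steps"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("runway.jpg").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]   # (H, W) tensor of class indices
print(model.config.id2label)             # maps indices to the marking classes above
```

Given the per-class IoU scores at this checkpoint, expect the predicted mask to be dominated by the background, Tdz, Aim, and Rwythr classes.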

## Training and evaluation data

The model was fine-tuned on the Spatiallysaying/rwymarkings dataset, and the per-class accuracy and IoU figures above and in the training results table below are computed on its evaluation split. A minimal loading sketch follows.
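
The sketch below loads the dataset with the `datasets` library; the split and feature names are not documented in this card, so they are left to inspection rather than assumed:

```python
from datasets import load_dataset

# Dataset id is taken from this card; splits and feature names are not
# documented here, so inspect the returned DatasetDict before training.
ds = load_dataset("Spatiallysaying/rwymarkings")
print(ds)  # lists the available splits and their features
```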

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 1337
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: polynomial
- training_steps: 3000
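
A sketch of how the listed settings map onto `TrainingArguments`; the output directory and the evaluation/saving cadence are assumptions (the results table suggests per-epoch evaluation), not values taken from the original run:

```python
from transformers import TrainingArguments

# Values mirror the list above; fields not listed there (output_dir,
# eval/save strategy) are placeholders, not taken from the original run.
training_args = TrainingArguments(
    output_dir="segformer-finetuned-rwymarkings-3k-steps",
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=1337,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="polynomial",
    max_steps=3000,
    eval_strategy="epoch",  # assumption: the results table reports per-epoch eval
    save_strategy="epoch",  # assumption
)
```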

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Tdz | Accuracy Aim | Accuracy Desig | Accuracy Rwythr | Accuracy Thrbar | Accuracy Disp | Accuracy Chevron | Accuracy Arrow | Iou Background | Iou Tdz | Iou Aim | Iou Desig | Iou Rwythr | Iou Thrbar | Iou Disp | Iou Chevron | Iou Arrow |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:--------------------:|:------------:|:------------:|:--------------:|:---------------:|:---------------:|:-------------:|:----------------:|:--------------:|:---------------:|:-------:|:-------:|:---------:|:----------:|:----------:|:--------:|:-----------:|:---------:|
| 1.6294        | 1.0     | 173  | 0.5448          | 0.0      | 0.0           | 0.0              | nan                  | 0.0          | 0.0          | 0.0            | 0.0             | 0.0             | 0.0           | 0.0              | 0.0            | 0.0             | 0.0     | 0.0     | 0.0       | 0.0        | 0.0        | 0.0      | 0.0         | 0.0       |
| 0.3371        | 2.0     | 346  | 0.1107          | 0.0      | 0.0           | 0.0              | nan                  | 0.0          | 0.0          | 0.0            | 0.0             | 0.0             | 0.0           | 0.0              | 0.0            | 0.0             | 0.0     | 0.0     | 0.0       | 0.0        | 0.0        | 0.0      | 0.0         | 0.0       |
| 0.0724        | 3.0     | 519  | 0.0483          | 0.0      | 0.0           | 0.0              | nan                  | 0.0          | 0.0          | 0.0            | 0.0             | 0.0             | 0.0           | 0.0              | 0.0            | 0.0             | 0.0     | 0.0     | 0.0       | 0.0        | 0.0        | 0.0      | 0.0         | 0.0       |
| 0.0508        | 4.0     | 692  | 0.0331          | 0.0      | 0.0           | 0.0              | nan                  | 0.0          | 0.0          | 0.0            | 0.0             | 0.0             | 0.0           | 0.0              | 0.0            | 0.0             | 0.0     | 0.0     | 0.0       | 0.0        | 0.0        | 0.0      | 0.0         | 0.0       |
| 0.0369        | 5.0     | 865  | 0.0289          | 0.0002   | 0.0002        | 0.0004           | nan                  | 0.0          | 0.0019       | 0.0            | 0.0             | 0.0             | 0.0           | 0.0              | 0.0            | 0.0             | 0.0     | 0.0019  | 0.0       | 0.0        | 0.0        | 0.0      | 0.0         | 0.0       |
| 0.0272        | 6.0     | 1038 | 0.0276          | 0.0106   | 0.0120        | 0.0195           | nan                  | 0.0107       | 0.0853       | 0.0            | 0.0             | 0.0             | 0.0           | 0.0              | 0.0            | 0.0             | 0.0105  | 0.0845  | 0.0       | 0.0        | 0.0        | 0.0      | 0.0         | 0.0       |
| 0.0258        | 7.0     | 1211 | 0.0233          | 0.0066   | 0.0075        | 0.0122           | nan                  | 0.0118       | 0.0480       | 0.0            | 0.0             | 0.0             | 0.0           | 0.0              | 0.0            | 0.0             | 0.0117  | 0.0480  | 0.0       | 0.0        | 0.0        | 0.0      | 0.0         | 0.0       |
| 0.0235        | 8.0     | 1384 | 0.0221          | 0.0150   | 0.0171        | 0.0277           | nan                  | 0.0233       | 0.1108       | 0.0            | 0.0024          | 0.0             | 0.0           | 0.0              | 0.0            | 0.0             | 0.0224  | 0.1107  | 0.0       | 0.0024     | 0.0        | 0.0      | 0.0         | 0.0       |
| 0.0213        | 9.0     | 1557 | 0.0209          | 0.0177   | 0.0200        | 0.0326           | nan                  | 0.0237       | 0.1351       | 0.0            | 0.0016          | 0.0             | 0.0           | 0.0              | 0.0            | 0.0             | 0.0231  | 0.1346  | 0.0       | 0.0016     | 0.0        | 0.0      | 0.0         | 0.0       |
| 0.0201        | 10.0    | 1730 | 0.0206          | 0.0277   | 0.0318        | 0.0512           | nan                  | 0.0595       | 0.1734       | 0.0            | 0.0211          | 0.0             | 0.0           | 0.0              | 0.0            | 0.0             | 0.0559  | 0.1726  | 0.0       | 0.0211     | 0.0        | 0.0      | 0.0         | 0.0       |
| 0.0203        | 11.0    | 1903 | 0.0198          | 0.0246   | 0.0281        | 0.0450           | nan                  | 0.0463       | 0.1512       | 0.0            | 0.0277          | 0.0             | 0.0           | 0.0              | 0.0            | 0.0             | 0.0432  | 0.1505  | 0.0       | 0.0277     | 0.0        | 0.0      | 0.0         | 0.0       |
| 0.0172        | 12.0    | 2076 | 0.0192          | 0.0377   | 0.0435        | 0.0690           | nan                  | 0.0744       | 0.2145       | 0.0            | 0.0592          | 0.0             | 0.0           | 0.0              | 0.0            | 0.0             | 0.0680  | 0.2119  | 0.0       | 0.0589     | 0.0        | 0.0      | 0.0         | 0.0       |
| 0.0168        | 13.0    | 2249 | 0.0189          | 0.0331   | 0.0381        | 0.0607           | nan                  | 0.0704       | 0.1884       | 0.0            | 0.0462          | 0.0             | 0.0           | 0.0              | 0.0            | 0.0             | 0.0645  | 0.1876  | 0.0       | 0.0461     | 0.0        | 0.0      | 0.0         | 0.0       |
| 0.0169        | 14.0    | 2422 | 0.0185          | 0.0383   | 0.0442        | 0.0701           | nan                  | 0.0786       | 0.2124       | 0.0            | 0.0628          | 0.0             | 0.0           | 0.0              | 0.0            | 0.0             | 0.0716  | 0.2112  | 0.0       | 0.0623     | 0.0        | 0.0      | 0.0         | 0.0       |
| 0.0172        | 15.0    | 2595 | 0.0184          | 0.0476   | 0.0551        | 0.0864           | nan                  | 0.0917       | 0.2463       | 0.0            | 0.1028          | 0.0             | 0.0           | 0.0              | 0.0            | 0.0             | 0.0830  | 0.2443  | 0.0       | 0.1013     | 0.0        | 0.0      | 0.0         | 0.0       |
| 0.0159        | 16.0    | 2768 | 0.0182          | 0.0523   | 0.0615        | 0.0964           | nan                  | 0.1202       | 0.2493       | 0.0            | 0.1225          | 0.0             | 0.0           | 0.0              | 0.0            | 0.0             | 0.1044  | 0.2468  | 0.0       | 0.1199     | 0.0        | 0.0      | 0.0         | 0.0       |
| 0.0163        | 17.0    | 2941 | 0.0181          | 0.0492   | 0.0571        | 0.0892           | nan                  | 0.0987       | 0.2414       | 0.0            | 0.1167          | 0.0             | 0.0           | 0.0              | 0.0            | 0.0             | 0.0885  | 0.2397  | 0.0       | 0.1146     | 0.0        | 0.0      | 0.0         | 0.0       |
| 0.0152        | 17.3410 | 3000 | 0.0182          | 0.0441   | 0.0510        | 0.0800           | nan                  | 0.0908       | 0.2203       | 0.0            | 0.0971          | 0.0             | 0.0           | 0.0              | 0.0            | 0.0             | 0.0818  | 0.2189  | 0.0       | 0.0958     | 0.0        | 0.0      | 0.0         | 0.0       |


### Framework versions

- Transformers 4.43.0.dev0
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1