---
license: other
tags:
- generated_from_trainer
model-index:
- name: segformer-b0-finetuned-segments-toolwear
  results: []
---

# segformer-b0-finetuned-segments-toolwear

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0223
- Mean IoU: 0.4979
- Mean Accuracy: 0.9957
- Overall Accuracy: 0.9957
- Accuracy Unlabeled: nan
- Accuracy Tool: 0.9957
- IoU Unlabeled: 0.0
- IoU Tool: 0.9957
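
Note that Mean IoU is the unweighted average of the per-class IoUs, so the unlabeled class's IoU of 0.0 (whose accuracy is nan, suggesting it is essentially absent from the evaluation masks) pulls the mean down to roughly half the tool-class IoU. A quick check of that arithmetic:

```python
# Mean IoU over the two classes reported in this card (unlabeled, tool).
iou_unlabeled = 0.0
iou_tool = 0.9957
mean_iou = (iou_unlabeled + iou_tool) / 2
print(mean_iou)  # 0.49785, which the card reports rounded to 0.4979
```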

## Model description

SegFormer is a Transformer-based semantic segmentation architecture that pairs a hierarchical Transformer encoder (MiT) with a lightweight all-MLP decode head. This checkpoint fine-tunes the smallest encoder, MiT-b0, for tool-wear segmentation; judging from the evaluation metrics, it predicts two labels: an unlabeled/background class and a tool class.

## Intended uses & limitations

The model is intended for segmenting the tool region in machining images, e.g. for tool-wear inspection. Two limitations stand out from the evaluation results: the training dataset is unspecified, so transfer to other imaging setups is unverified; and the unlabeled class scores an IoU of 0.0 with an accuracy of nan, which suggests the evaluation masks contain essentially no unlabeled pixels, so behaviour on images dominated by background has not been validated.
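
As a minimal usage sketch for inference (hedged: the checkpoint id/path, the input image, and the label order are assumptions, since the card does not specify them):

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Hypothetical repo id or local path of this fine-tuned checkpoint.
checkpoint = "segformer-b0-finetuned-segments-toolwear"

processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("tool.jpg").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # (H, W) label map; 1 = tool is an assumption
```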

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
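
A minimal sketch of how these hyperparameters map onto `TrainingArguments` (the output directory, the `model`, the datasets, and `compute_metrics` are hypothetical placeholders, not taken from this card; the listed Adam betas and epsilon are the `Trainer` defaults):

```python
from transformers import TrainingArguments, Trainer

args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-toolwear",  # assumed
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    evaluation_strategy="steps",  # the results table evaluates every 20 steps
    eval_steps=20,
)

trainer = Trainer(
    model=model,                      # SegformerForSemanticSegmentation (assumed)
    args=args,
    train_dataset=train_ds,           # hypothetical preprocessed dataset
    eval_dataset=eval_ds,             # hypothetical preprocessed dataset
    compute_metrics=compute_metrics,  # hypothetical mean-IoU metric function
)
trainer.train()
```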

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean IoU | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Tool | IoU Unlabeled | IoU Tool |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:-------------:|:-------------:|:--------:|
| 0.1534        | 1.18  | 20   | 0.3425          | 0.4977   | 0.9955        | 0.9955           | nan                | 0.9955        | 0.0           | 0.9955   |
| 0.091         | 2.35  | 40   | 0.1076          | 0.4948   | 0.9897        | 0.9897           | nan                | 0.9897        | 0.0           | 0.9897   |
| 0.0827        | 3.53  | 60   | 0.0828          | 0.4965   | 0.9931        | 0.9931           | nan                | 0.9931        | 0.0           | 0.9931   |
| 0.0729        | 4.71  | 80   | 0.0795          | 0.4967   | 0.9934        | 0.9934           | nan                | 0.9934        | 0.0           | 0.9934   |
| 0.0825        | 5.88  | 100  | 0.0606          | 0.4910   | 0.9819        | 0.9819           | nan                | 0.9819        | 0.0           | 0.9819   |
| 0.0604        | 7.06  | 120  | 0.0546          | 0.4910   | 0.9820        | 0.9820           | nan                | 0.9820        | 0.0           | 0.9820   |
| 0.0575        | 8.24  | 140  | 0.0460          | 0.4942   | 0.9884        | 0.9884           | nan                | 0.9884        | 0.0           | 0.9884   |
| 0.0592        | 9.41  | 160  | 0.0450          | 0.4906   | 0.9813        | 0.9813           | nan                | 0.9813        | 0.0           | 0.9813   |
| 0.0478        | 10.59 | 180  | 0.0400          | 0.4981   | 0.9962        | 0.9962           | nan                | 0.9962        | 0.0           | 0.9962   |
| 0.046         | 11.76 | 200  | 0.0403          | 0.4982   | 0.9964        | 0.9964           | nan                | 0.9964        | 0.0           | 0.9964   |
| 0.0535        | 12.94 | 220  | 0.0340          | 0.4971   | 0.9941        | 0.9941           | nan                | 0.9941        | 0.0           | 0.9941   |
| 0.0317        | 14.12 | 240  | 0.0332          | 0.4975   | 0.9949        | 0.9949           | nan                | 0.9949        | 0.0           | 0.9949   |
| 0.0352        | 15.29 | 260  | 0.0328          | 0.4982   | 0.9964        | 0.9964           | nan                | 0.9964        | 0.0           | 0.9964   |
| 0.0258        | 16.47 | 280  | 0.0295          | 0.4963   | 0.9926        | 0.9926           | nan                | 0.9926        | 0.0           | 0.9926   |
| 0.0218        | 17.65 | 300  | 0.0265          | 0.4968   | 0.9935        | 0.9935           | nan                | 0.9935        | 0.0           | 0.9935   |
| 0.026         | 18.82 | 320  | 0.0284          | 0.4979   | 0.9958        | 0.9958           | nan                | 0.9958        | 0.0           | 0.9958   |
| 0.026         | 20.0  | 340  | 0.0267          | 0.4971   | 0.9941        | 0.9941           | nan                | 0.9941        | 0.0           | 0.9941   |
| 0.02          | 21.18 | 360  | 0.0242          | 0.4967   | 0.9935        | 0.9935           | nan                | 0.9935        | 0.0           | 0.9935   |
| 0.0255        | 22.35 | 380  | 0.0270          | 0.4975   | 0.9949        | 0.9949           | nan                | 0.9949        | 0.0           | 0.9949   |
| 0.0282        | 23.53 | 400  | 0.0240          | 0.4973   | 0.9946        | 0.9946           | nan                | 0.9946        | 0.0           | 0.9946   |
| 0.0188        | 24.71 | 420  | 0.0244          | 0.4972   | 0.9944        | 0.9944           | nan                | 0.9944        | 0.0           | 0.9944   |
| 0.0196        | 25.88 | 440  | 0.0226          | 0.4961   | 0.9922        | 0.9922           | nan                | 0.9922        | 0.0           | 0.9922   |
| 0.0165        | 27.06 | 460  | 0.0235          | 0.4968   | 0.9937        | 0.9937           | nan                | 0.9937        | 0.0           | 0.9937   |
| 0.02          | 28.24 | 480  | 0.0245          | 0.4981   | 0.9962        | 0.9962           | nan                | 0.9962        | 0.0           | 0.9962   |
| 0.0213        | 29.41 | 500  | 0.0225          | 0.4972   | 0.9944        | 0.9944           | nan                | 0.9944        | 0.0           | 0.9944   |
| 0.0174        | 30.59 | 520  | 0.0221          | 0.4970   | 0.9940        | 0.9940           | nan                | 0.9940        | 0.0           | 0.9940   |
| 0.0163        | 31.76 | 540  | 0.0226          | 0.4975   | 0.9951        | 0.9951           | nan                | 0.9951        | 0.0           | 0.9951   |
| 0.0242        | 32.94 | 560  | 0.0236          | 0.4978   | 0.9956        | 0.9956           | nan                | 0.9956        | 0.0           | 0.9956   |
| 0.0195        | 34.12 | 580  | 0.0217          | 0.4976   | 0.9953        | 0.9953           | nan                | 0.9953        | 0.0           | 0.9953   |
| 0.0134        | 35.29 | 600  | 0.0220          | 0.4974   | 0.9948        | 0.9948           | nan                | 0.9948        | 0.0           | 0.9948   |
| 0.0192        | 36.47 | 620  | 0.0216          | 0.4974   | 0.9947        | 0.9947           | nan                | 0.9947        | 0.0           | 0.9947   |
| 0.0138        | 37.65 | 640  | 0.0219          | 0.4974   | 0.9948        | 0.9948           | nan                | 0.9948        | 0.0           | 0.9948   |
| 0.0147        | 38.82 | 660  | 0.0215          | 0.4973   | 0.9945        | 0.9945           | nan                | 0.9945        | 0.0           | 0.9945   |
| 0.0208        | 40.0  | 680  | 0.0219          | 0.4979   | 0.9958        | 0.9958           | nan                | 0.9958        | 0.0           | 0.9958   |
| 0.0152        | 41.18 | 700  | 0.0211          | 0.4974   | 0.9948        | 0.9948           | nan                | 0.9948        | 0.0           | 0.9948   |
| 0.0145        | 42.35 | 720  | 0.0214          | 0.4977   | 0.9954        | 0.9954           | nan                | 0.9954        | 0.0           | 0.9954   |
| 0.0138        | 43.53 | 740  | 0.0217          | 0.4977   | 0.9954        | 0.9954           | nan                | 0.9954        | 0.0           | 0.9954   |
| 0.0122        | 44.71 | 760  | 0.0218          | 0.4977   | 0.9954        | 0.9954           | nan                | 0.9954        | 0.0           | 0.9954   |
| 0.0201        | 45.88 | 780  | 0.0220          | 0.4976   | 0.9953        | 0.9953           | nan                | 0.9953        | 0.0           | 0.9953   |
| 0.0147        | 47.06 | 800  | 0.0219          | 0.4977   | 0.9954        | 0.9954           | nan                | 0.9954        | 0.0           | 0.9954   |
| 0.0131        | 48.24 | 820  | 0.0213          | 0.4975   | 0.9950        | 0.9950           | nan                | 0.9950        | 0.0           | 0.9950   |
| 0.016         | 49.41 | 840  | 0.0223          | 0.4979   | 0.9957        | 0.9957           | nan                | 0.9957        | 0.0           | 0.9957   |


### Framework versions

- Transformers 4.28.0
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.13.3
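
To reproduce the environment, these versions can be pinned and sanity-checked (a minimal sketch; it only assumes the packages above are importable under their usual module names):

```python
# Compare installed versions against the card's "Framework versions".
import datasets, tokenizers, torch, transformers

expected = {
    "transformers": (transformers.__version__, "4.28.0"),
    "torch": (torch.__version__, "2.1.0+cu118"),
    "datasets": (datasets.__version__, "2.15.0"),
    "tokenizers": (tokenizers.__version__, "0.13.3"),
}
for name, (have, want) in expected.items():
    status = "OK" if have == want else f"mismatch (installed {have})"
    print(f"{name}=={want}: {status}")
```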