---
license: other
base_model: nvidia/mit-b0
tags:
- generated_from_trainer
model-index:
- name: segformer-b0-finetuned-segments-toolwear
  results: []
---

# segformer-b0-finetuned-segments-toolwear

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) for semantic segmentation of tool wear; the exact training dataset is not documented.
It achieves the following results on the evaluation set:
- Loss: 0.0491
- Mean IoU: 0.3531
- Mean Accuracy: 0.7062
- Overall Accuracy: 0.7062
- Accuracy Unlabeled: nan
- Accuracy Mass: 0.7062
- IoU Unlabeled: 0.0
- IoU Mass: 0.7062

Note that the mean IoU averages over both classes: the unlabeled class scores an IoU of 0.0 (its accuracy is undefined, hence nan), so the mean IoU is exactly half the mass-class IoU.

## Model description

SegFormer is a Transformer-based semantic segmentation architecture; this checkpoint uses the lightweight MiT-b0 encoder and predicts two classes, `unlabeled` and `mass` (presumably the tool-wear region). No further details about the model were provided.

## Intended uses & limitations

More information needed
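
In the absence of further documentation, the following is a minimal inference sketch using the standard `transformers` SegFormer API. The checkpoint id, input image path, and label interpretation are illustrative assumptions, not confirmed by this card:

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Hypothetical checkpoint id; replace with wherever this model is hosted.
checkpoint = "segformer-b0-finetuned-segments-toolwear"

processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("tool_image.png").convert("RGB")  # illustrative input
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before taking the argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
mask = upsampled.argmax(dim=1)[0]  # per-pixel class ids (0 = unlabeled, 1 = mass, assumed)
```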

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 45
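
A sketch of a `Trainer` configuration consistent with the hyperparameters above (the datasets and `output_dir` are placeholders; the Adam betas and epsilon listed are the `Trainer` optimizer defaults, so they need no extra arguments):

```python
from transformers import SegformerForSemanticSegmentation, TrainingArguments, Trainer

# Standard SegFormer fine-tuning setup: load the MiT-b0 encoder and attach
# a freshly initialized segmentation head for the two classes.
model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b0",
    num_labels=2,
    id2label={0: "unlabeled", 1: "mass"},
    label2id={"unlabeled": 0, "mass": 1},
)

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-toolwear",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=45,
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_ds,  # placeholder training split
    eval_dataset=eval_ds,    # placeholder evaluation split
)
trainer.train()
```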

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean IoU | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Mass | IoU Unlabeled | IoU Mass |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:-------------:|:-------------:|:--------:|
| 0.3512        | 1.25  | 20   | 0.3893          | 0.0773   | 0.1545        | 0.1545           | nan                | 0.1545        | 0.0           | 0.1545   |
| 0.2286        | 2.5   | 40   | 0.2047          | 0.1937   | 0.3874        | 0.3874           | nan                | 0.3874        | 0.0           | 0.3874   |
| 0.1657        | 3.75  | 60   | 0.1423          | 0.2491   | 0.4982        | 0.4982           | nan                | 0.4982        | 0.0           | 0.4982   |
| 0.1581        | 5.0   | 80   | 0.1117          | 0.2649   | 0.5299        | 0.5299           | nan                | 0.5299        | 0.0           | 0.5299   |
| 0.1063        | 6.25  | 100  | 0.0943          | 0.3327   | 0.6653        | 0.6653           | nan                | 0.6653        | 0.0           | 0.6653   |
| 0.0829        | 7.5   | 120  | 0.0782          | 0.2983   | 0.5966        | 0.5966           | nan                | 0.5966        | 0.0           | 0.5966   |
| 0.0808        | 8.75  | 140  | 0.0740          | 0.3257   | 0.6515        | 0.6515           | nan                | 0.6515        | 0.0           | 0.6515   |
| 0.0694        | 10.0  | 160  | 0.0725          | 0.3503   | 0.7005        | 0.7005           | nan                | 0.7005        | 0.0           | 0.7005   |
| 0.0589        | 11.25 | 180  | 0.0663          | 0.2629   | 0.5259        | 0.5259           | nan                | 0.5259        | 0.0           | 0.5259   |
| 0.0473        | 12.5  | 200  | 0.0604          | 0.3685   | 0.7369        | 0.7369           | nan                | 0.7369        | 0.0           | 0.7369   |
| 0.0433        | 13.75 | 220  | 0.0569          | 0.3055   | 0.6109        | 0.6109           | nan                | 0.6109        | 0.0           | 0.6109   |
| 0.0511        | 15.0  | 240  | 0.0546          | 0.3572   | 0.7145        | 0.7145           | nan                | 0.7145        | 0.0           | 0.7145   |
| 0.04          | 16.25 | 260  | 0.0536          | 0.3234   | 0.6467        | 0.6467           | nan                | 0.6467        | 0.0           | 0.6467   |
| 0.0365        | 17.5  | 280  | 0.0555          | 0.3086   | 0.6171        | 0.6171           | nan                | 0.6171        | 0.0           | 0.6171   |
| 0.0314        | 18.75 | 300  | 0.0505          | 0.3595   | 0.7191        | 0.7191           | nan                | 0.7191        | 0.0           | 0.7191   |
| 0.0295        | 20.0  | 320  | 0.0536          | 0.3079   | 0.6159        | 0.6159           | nan                | 0.6159        | 0.0           | 0.6159   |
| 0.0337        | 21.25 | 340  | 0.0490          | 0.3446   | 0.6891        | 0.6891           | nan                | 0.6891        | 0.0           | 0.6891   |
| 0.0325        | 22.5  | 360  | 0.0489          | 0.3946   | 0.7891        | 0.7891           | nan                | 0.7891        | 0.0           | 0.7891   |
| 0.0314        | 23.75 | 380  | 0.0514          | 0.3184   | 0.6368        | 0.6368           | nan                | 0.6368        | 0.0           | 0.6368   |
| 0.0267        | 25.0  | 400  | 0.0485          | 0.3572   | 0.7144        | 0.7144           | nan                | 0.7144        | 0.0           | 0.7144   |
| 0.0321        | 26.25 | 420  | 0.0490          | 0.3787   | 0.7573        | 0.7573           | nan                | 0.7573        | 0.0           | 0.7573   |
| 0.025         | 27.5  | 440  | 0.0474          | 0.3615   | 0.7230        | 0.7230           | nan                | 0.7230        | 0.0           | 0.7230   |
| 0.0225        | 28.75 | 460  | 0.0472          | 0.3660   | 0.7319        | 0.7319           | nan                | 0.7319        | 0.0           | 0.7319   |
| 0.0247        | 30.0  | 480  | 0.0502          | 0.3488   | 0.6976        | 0.6976           | nan                | 0.6976        | 0.0           | 0.6976   |
| 0.0216        | 31.25 | 500  | 0.0483          | 0.3536   | 0.7072        | 0.7072           | nan                | 0.7072        | 0.0           | 0.7072   |
| 0.0195        | 32.5  | 520  | 0.0508          | 0.3289   | 0.6578        | 0.6578           | nan                | 0.6578        | 0.0           | 0.6578   |
| 0.0259        | 33.75 | 540  | 0.0496          | 0.3846   | 0.7692        | 0.7692           | nan                | 0.7692        | 0.0           | 0.7692   |
| 0.0242        | 35.0  | 560  | 0.0487          | 0.3464   | 0.6928        | 0.6928           | nan                | 0.6928        | 0.0           | 0.6928   |
| 0.0217        | 36.25 | 580  | 0.0503          | 0.3325   | 0.6650        | 0.6650           | nan                | 0.6650        | 0.0           | 0.6650   |
| 0.0204        | 37.5  | 600  | 0.0502          | 0.3429   | 0.6858        | 0.6858           | nan                | 0.6858        | 0.0           | 0.6858   |
| 0.0204        | 38.75 | 620  | 0.0507          | 0.3457   | 0.6913        | 0.6913           | nan                | 0.6913        | 0.0           | 0.6913   |
| 0.0191        | 40.0  | 640  | 0.0494          | 0.3494   | 0.6988        | 0.6988           | nan                | 0.6988        | 0.0           | 0.6988   |
| 0.0204        | 41.25 | 660  | 0.0503          | 0.3426   | 0.6852        | 0.6852           | nan                | 0.6852        | 0.0           | 0.6852   |
| 0.019         | 42.5  | 680  | 0.0485          | 0.3616   | 0.7232        | 0.7232           | nan                | 0.7232        | 0.0           | 0.7232   |
| 0.0198        | 43.75 | 700  | 0.0494          | 0.3504   | 0.7008        | 0.7008           | nan                | 0.7008        | 0.0           | 0.7008   |
| 0.0212        | 45.0  | 720  | 0.0491          | 0.3531   | 0.7062        | 0.7062           | nan                | 0.7062        | 0.0           | 0.7062   |
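
The metric columns above match the output format of the `mean_iou` metric from the `evaluate` library; a small self-contained sketch of how such numbers are computed (toy masks, not the actual evaluation data):

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Toy 4x4 masks with two classes: 0 = unlabeled, 1 = mass.
prediction = np.array([[0, 0, 1, 1]] * 4)
reference = np.array([[0, 1, 1, 1]] * 4)

results = metric.compute(
    predictions=[prediction],
    references=[reference],
    num_labels=2,
    ignore_index=255,  # no pixels are ignored in this toy example
)
print(results["mean_iou"], results["per_category_iou"])
```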


### Framework versions

- Transformers 4.38.2
- PyTorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2