---
license: other
tags:
- generated_from_trainer
model-index:
- name: segformer-finetuned-4ss1st3r_s3gs3m_24Jan_all-10k-steps
  results: []
---

# segformer-finetuned-4ss1st3r_s3gs3m_24Jan_all-10k-steps

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3095
- Mean Iou: 0.5513
- Mean Accuracy: 0.7874
- Overall Accuracy: 0.9260
- Accuracy Bg: nan
- Accuracy Fallo cohesivo: 0.9668
- Accuracy Fallo malla: 0.6808
- Accuracy Fallo adhesivo: 0.9727
- Accuracy Fallo burbuja: 0.5291
- Iou Bg: 0.0
- Iou Fallo cohesivo: 0.9167
- Iou Fallo malla: 0.6189
- Iou Fallo adhesivo: 0.7307
- Iou Fallo burbuja: 0.4903
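
The card does not include usage code, so here is a minimal inference sketch. It assumes the standard `transformers` SegFormer API; the repository id is a placeholder to replace with the actual Hub path of this checkpoint.

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Hypothetical repo id; substitute the real Hub path for this checkpoint.
checkpoint = "your-username/segformer-finetuned-4ss1st3r_s3gs3m_24Jan_all-10k-steps"

processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("sample.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before taking the argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
```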

## Model description

Judging from the base checkpoint and the evaluation metrics above, this is a SegFormer semantic-segmentation model (MiT-b0 backbone) covering five classes: `bg`, `fallo cohesivo` (cohesive failure), `fallo malla` (mesh failure), `fallo adhesivo` (adhesive failure), and `fallo burbuja` (bubble failure). No further details were provided.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: polynomial
- training_steps: 10000
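
For reference, these settings map onto `transformers.TrainingArguments` roughly as sketched below; `output_dir` and any argument not listed above are assumptions, not taken from the card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-finetuned-4ss1st3r_s3gs3m_24Jan_all-10k-steps",
    learning_rate=6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=1337,
    adam_beta1=0.9,            # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="polynomial",
    max_steps=10_000,          # training_steps: 10000
)
```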

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Bg | Accuracy Fallo cohesivo | Accuracy Fallo malla | Accuracy Fallo adhesivo | Accuracy Fallo burbuja | Iou Bg | Iou Fallo cohesivo | Iou Fallo malla | Iou Fallo adhesivo | Iou Fallo burbuja |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:-----------:|:-----------------------:|:--------------------:|:-----------------------:|:----------------------:|:------:|:------------------:|:---------------:|:------------------:|:-----------------:|
| 0.1378        | 1.0   | 783   | 0.2677          | 0.4895   | 0.7143        | 0.9122           | nan         | 0.9724                  | 0.5531               | 0.9663                  | 0.3654                 | 0.0    | 0.9038             | 0.5327          | 0.6757             | 0.3351            |
| 0.1117        | 2.0   | 1566  | 0.2305          | 0.5289   | 0.7978        | 0.9246           | nan         | 0.9507                  | 0.7727               | 0.9705                  | 0.4974                 | 0.0    | 0.9214             | 0.6808          | 0.5876             | 0.4549            |
| 0.0881        | 3.0   | 2349  | 0.2041          | 0.5556   | 0.7867        | 0.9354           | nan         | 0.9712                  | 0.7391               | 0.9389                  | 0.4975                 | 0.0    | 0.9273             | 0.6790          | 0.7323             | 0.4394            |
| 0.0878        | 4.0   | 3132  | 0.1984          | 0.5584   | 0.8003        | 0.9346           | nan         | 0.9556                  | 0.8247               | 0.9602                  | 0.4606                 | 0.0    | 0.9261             | 0.6935          | 0.7373             | 0.4352            |
| 0.0895        | 5.0   | 3915  | 0.2841          | 0.5246   | 0.8086        | 0.9088           | nan         | 0.9137                  | 0.8834               | 0.9719                  | 0.4652                 | 0.0    | 0.8964             | 0.6309          | 0.6593             | 0.4365            |
| 0.0773        | 6.0   | 4698  | 0.2547          | 0.5652   | 0.7823        | 0.9336           | nan         | 0.9775                  | 0.6843               | 0.9384                  | 0.5291                 | 0.0    | 0.9251             | 0.6378          | 0.7820             | 0.4813            |
| 0.0667        | 7.0   | 5481  | 0.2726          | 0.5609   | 0.7932        | 0.9295           | nan         | 0.9741                  | 0.6609               | 0.9689                  | 0.5689                 | 0.0    | 0.9203             | 0.6202          | 0.7548             | 0.5093            |
| 0.0678        | 8.0   | 6264  | 0.2950          | 0.5276   | 0.8002        | 0.9175           | nan         | 0.9443                  | 0.7561               | 0.9713                  | 0.5292                 | 0.0    | 0.9089             | 0.6570          | 0.5900             | 0.4822            |
| 0.0653        | 9.0   | 7047  | 0.2712          | 0.5467   | 0.7682        | 0.9288           | nan         | 0.9690                  | 0.6971               | 0.9641                  | 0.4425                 | 0.0    | 0.9189             | 0.6330          | 0.7588             | 0.4228            |
| 0.0646        | 10.0  | 7830  | 0.2841          | 0.5499   | 0.7819        | 0.9272           | nan         | 0.9681                  | 0.6840               | 0.9688                  | 0.5068                 | 0.0    | 0.9178             | 0.6243          | 0.7345             | 0.4728            |
| 0.057         | 11.0  | 8613  | 0.3373          | 0.5257   | 0.7782        | 0.9166           | nan         | 0.9593                  | 0.6555               | 0.9739                  | 0.5242                 | 0.0    | 0.9075             | 0.6040          | 0.6319             | 0.4848            |
| 0.0591        | 12.0  | 9396  | 0.3082          | 0.5504   | 0.7900        | 0.9247           | nan         | 0.9656                  | 0.6776               | 0.9705                  | 0.5463                 | 0.0    | 0.9148             | 0.6172          | 0.7182             | 0.5019            |
| 0.053         | 12.77 | 10000 | 0.3095          | 0.5513   | 0.7874        | 0.9260           | nan         | 0.9668                  | 0.6808               | 0.9727                  | 0.5291                 | 0.0    | 0.9167             | 0.6189          | 0.7307             | 0.4903            |
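
The reported metrics match the output format of the `evaluate` library's `mean_iou` metric; treating that as an assumption, here is a minimal sketch of computing the same kind of numbers from predicted and reference masks:

```python
import evaluate
import numpy as np

# Hypothetical 5-class label set inferred from the metrics above:
# 0 = bg, 1 = fallo cohesivo (cohesive failure), 2 = fallo malla (mesh failure),
# 3 = fallo adhesivo (adhesive failure), 4 = fallo burbuja (bubble failure)
metric = evaluate.load("mean_iou")

# preds and refs are lists of (H, W) integer masks; random data for illustration.
preds = [np.random.randint(0, 5, (64, 64))]
refs = [np.random.randint(0, 5, (64, 64))]

results = metric.compute(
    predictions=preds,
    references=refs,
    num_labels=5,
    ignore_index=255,      # assumption: 255 marks ignored pixels
    reduce_labels=False,
)
print(results["mean_iou"], results["per_category_iou"])
```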


### Framework versions

- Transformers 4.31.0.dev0
- Pytorch 2.0.1+cpu
- Datasets 2.13.1
- Tokenizers 0.13.3