selvaa committed on
Commit
f9eb593
1 Parent(s): 92bae41

End of training

README.md ADDED
@@ -0,0 +1,119 @@
---
license: other
base_model: nvidia/segformer-b1-finetuned-cityscapes-1024-1024
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: segformer-b1-finetuned-cityscapes-1024-1024-full-ds
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# segformer-b1-finetuned-cityscapes-1024-1024-full-ds

This model is a fine-tuned version of [nvidia/segformer-b1-finetuned-cityscapes-1024-1024](https://huggingface.co/nvidia/segformer-b1-finetuned-cityscapes-1024-1024) on the selvaa/final_iteration dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0537
- Mean Iou: 0.9119
- Mean Accuracy: 0.9519
- Overall Accuracy: 0.9830
- Accuracy Default: 1e-06
- Accuracy Pipe: 0.8919
- Accuracy Floor: 0.9695
- Accuracy Background: 0.9942
- Iou Default: 1e-06
- Iou Pipe: 0.7943
- Iou Floor: 0.9593
- Iou Background: 0.9822

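A minimal inference sketch for a checkpoint like this one. Only the post-processing helper is load-bearing here; the hub-loading part is kept under `__main__` and assumes the repo id `selvaa/segformer-b1-finetuned-cityscapes-1024-1024-full-ds` (inferred from the model name above, not confirmed) and a hypothetical input file `example.jpg`.

```python
import numpy as np

# Label map taken from this repo's config.json (see below).
ID2LABEL = {0: "default", 1: "pipe", 2: "floor", 3: "background"}


def logits_to_labels(logits: np.ndarray) -> np.ndarray:
    """Convert (num_classes, H, W) logits to an (H, W) array of class ids."""
    return logits.argmax(axis=0)


if __name__ == "__main__":
    # Assumed repo id and input image; adjust to your setup.
    from PIL import Image
    import torch
    from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

    repo = "selvaa/segformer-b1-finetuned-cityscapes-1024-1024-full-ds"
    processor = AutoImageProcessor.from_pretrained(repo)
    model = SegformerForSemanticSegmentation.from_pretrained(repo)

    image = Image.open("example.jpg")
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits[0]  # (num_classes, H/4, W/4)
    mask = logits_to_labels(logits.numpy())
    print({ID2LABEL[int(i)] for i in np.unique(mask)})
```

Note that SegFormer emits logits at 1/4 of the input resolution; upsample the mask (or the logits) back to the image size before overlaying it.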
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 3
- eval_batch_size: 3
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 60
- mixed_precision_training: Native AMP

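The hyperparameters above map onto the `transformers` Trainer API roughly as follows. This is a sketch, not the author's training script: the `output_dir` is a placeholder, and the Adam betas/epsilon listed above are already the Trainer defaults, so they are not set explicitly.

```python
# Hyperparameters from the model card, keyed by TrainingArguments field names.
HPARAMS = {
    "learning_rate": 6e-05,
    "per_device_train_batch_size": 3,
    "per_device_eval_batch_size": 3,
    "seed": 42,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 60,
    "fp16": True,  # "Native AMP" mixed precision
}

if __name__ == "__main__":
    # Constructing the actual TrainingArguments (placeholder output_dir).
    from transformers import TrainingArguments

    args = TrainingArguments(output_dir="segformer-b1-full-ds", **HPARAMS)
    print(args.learning_rate, args.num_train_epochs)
```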
### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Default | Accuracy Pipe | Accuracy Floor | Accuracy Background | Iou Default | Iou Pipe | Iou Floor | Iou Background |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:----------------:|:-------------:|:--------------:|:-------------------:|:-----------:|:--------:|:---------:|:--------------:|
| 0.6857        | 1.0   | 52   | 0.3047          | 0.7456   | 0.8104        | 0.9494           | 1e-06            | 0.4976        | 0.9560         | 0.9776              | 1e-06       | 0.3799   | 0.9109    | 0.9460         |
| 0.2657        | 2.0   | 104  | 0.1869          | 0.8168   | 0.8664        | 0.9656           | 1e-06            | 0.6511        | 0.9583         | 0.9897              | 1e-06       | 0.5513   | 0.9373    | 0.9619         |
| 0.1674        | 3.0   | 156  | 0.1333          | 0.8510   | 0.9041        | 0.9717           | 1e-06            | 0.7620        | 0.9601         | 0.9903              | 1e-06       | 0.6405   | 0.9424    | 0.9699         |
| 0.127         | 4.0   | 208  | 0.1039          | 0.8678   | 0.9158        | 0.9743           | 1e-06            | 0.7938        | 0.9625         | 0.9910              | 1e-06       | 0.6861   | 0.9455    | 0.9719         |
| 0.1047        | 5.0   | 260  | 0.0968          | 0.8756   | 0.9343        | 0.9761           | 1e-06            | 0.8516        | 0.9609         | 0.9903              | 1e-06       | 0.7024   | 0.9490    | 0.9753         |
| 0.0924        | 6.0   | 312  | 0.0843          | 0.8839   | 0.9355        | 0.9775           | 1e-06            | 0.8512        | 0.9641         | 0.9912              | 1e-06       | 0.7244   | 0.9513    | 0.9760         |
| 0.083         | 7.0   | 364  | 0.0749          | 0.8879   | 0.9422        | 0.9786           | 1e-06            | 0.8713        | 0.9637         | 0.9914              | 1e-06       | 0.7320   | 0.9541    | 0.9775         |
| 0.0761        | 8.0   | 416  | 0.0717          | 0.8895   | 0.9412        | 0.9789           | 1e-06            | 0.8678        | 0.9634         | 0.9923              | 1e-06       | 0.7364   | 0.9541    | 0.9779         |
| 0.0709        | 9.0   | 468  | 0.0723          | 0.8891   | 0.9289        | 0.9789           | 1e-06            | 0.8282        | 0.9635         | 0.9949              | 1e-06       | 0.7361   | 0.9543    | 0.9769         |
| 0.0664        | 10.0  | 520  | 0.0653          | 0.8952   | 0.9385        | 0.9800           | 1e-06            | 0.8554        | 0.9663         | 0.9936              | 1e-06       | 0.7507   | 0.9568    | 0.9783         |
| 0.0628        | 11.0  | 572  | 0.0668          | 0.8934   | 0.9317        | 0.9797           | 1e-06            | 0.8345        | 0.9658         | 0.9948              | 1e-06       | 0.7460   | 0.9566    | 0.9776         |
| 0.0599        | 12.0  | 624  | 0.0612          | 0.9000   | 0.9526        | 0.9808           | 1e-06            | 0.8987        | 0.9675         | 0.9914              | 1e-06       | 0.7624   | 0.9574    | 0.9801         |
| 0.0578        | 13.0  | 676  | 0.0604          | 0.8982   | 0.9458        | 0.9803           | 1e-06            | 0.8770        | 0.9686         | 0.9918              | 1e-06       | 0.7602   | 0.9549    | 0.9795         |
| 0.0542        | 14.0  | 728  | 0.0609          | 0.9003   | 0.9435        | 0.9809           | 1e-06            | 0.8698        | 0.9673         | 0.9936              | 1e-06       | 0.7636   | 0.9578    | 0.9795         |
| 0.0528        | 15.0  | 780  | 0.0562          | 0.9054   | 0.9461        | 0.9818           | 1e-06            | 0.8767        | 0.9672         | 0.9945              | 1e-06       | 0.7771   | 0.9586    | 0.9806         |
| 0.0505        | 16.0  | 832  | 0.0550          | 0.9039   | 0.9546        | 0.9815           | 1e-06            | 0.9044        | 0.9672         | 0.9921              | 1e-06       | 0.7734   | 0.9576    | 0.9808         |
| 0.0496        | 17.0  | 884  | 0.0594          | 0.9016   | 0.9447        | 0.9811           | 1e-06            | 0.8753        | 0.9638         | 0.9949              | 1e-06       | 0.7673   | 0.9578    | 0.9799         |
| 0.0478        | 18.0  | 936  | 0.0543          | 0.9042   | 0.9554        | 0.9816           | 1e-06            | 0.9056        | 0.9695         | 0.9913              | 1e-06       | 0.7732   | 0.9586    | 0.9808         |
| 0.0472        | 19.0  | 988  | 0.0554          | 0.9046   | 0.9510        | 0.9816           | 1e-06            | 0.8921        | 0.9683         | 0.9927              | 1e-06       | 0.7751   | 0.9582    | 0.9806         |
| 0.0457        | 20.0  | 1040 | 0.0590          | 0.9010   | 0.9407        | 0.9810           | 1e-06            | 0.8585        | 0.9703         | 0.9933              | 1e-06       | 0.7661   | 0.9572    | 0.9795         |
| 0.0439        | 21.0  | 1092 | 0.0558          | 0.9045   | 0.9484        | 0.9817           | 1e-06            | 0.8841        | 0.9674         | 0.9937              | 1e-06       | 0.7747   | 0.9579    | 0.9807         |
| 0.0432        | 22.0  | 1144 | 0.0564          | 0.9060   | 0.9469        | 0.9818           | 1e-06            | 0.8783        | 0.9683         | 0.9940              | 1e-06       | 0.7791   | 0.9582    | 0.9806         |
| 0.0425        | 23.0  | 1196 | 0.0541          | 0.9064   | 0.9531        | 0.9820           | 1e-06            | 0.8980        | 0.9686         | 0.9927              | 1e-06       | 0.7798   | 0.9580    | 0.9813         |
| 0.0424        | 24.0  | 1248 | 0.0562          | 0.9059   | 0.9405        | 0.9819           | 1e-06            | 0.8584        | 0.9675         | 0.9957              | 1e-06       | 0.7784   | 0.9592    | 0.9800         |
| 0.0412        | 25.0  | 1300 | 0.0553          | 0.9032   | 0.9537        | 0.9816           | 1e-06            | 0.9002        | 0.9693         | 0.9918              | 1e-06       | 0.7700   | 0.9584    | 0.9811         |
| 0.0404        | 26.0  | 1352 | 0.0533          | 0.9075   | 0.9514        | 0.9822           | 1e-06            | 0.8927        | 0.9678         | 0.9937              | 1e-06       | 0.7823   | 0.9586    | 0.9815         |
| 0.0397        | 27.0  | 1404 | 0.0526          | 0.9073   | 0.9525        | 0.9821           | 1e-06            | 0.8950        | 0.9697         | 0.9928              | 1e-06       | 0.7819   | 0.9584    | 0.9814         |
| 0.0394        | 28.0  | 1456 | 0.0523          | 0.9082   | 0.9563        | 0.9825           | 1e-06            | 0.9078        | 0.9681         | 0.9930              | 1e-06       | 0.7835   | 0.9590    | 0.9822         |
| 0.0388        | 29.0  | 1508 | 0.0526          | 0.9078   | 0.9541        | 0.9823           | 1e-06            | 0.8999        | 0.9701         | 0.9924              | 1e-06       | 0.7834   | 0.9584    | 0.9817         |
| 0.0384        | 30.0  | 1560 | 0.0531          | 0.9087   | 0.9512        | 0.9825           | 1e-06            | 0.8903        | 0.9695         | 0.9936              | 1e-06       | 0.7852   | 0.9593    | 0.9817         |
| 0.0379        | 31.0  | 1612 | 0.0534          | 0.9084   | 0.9525        | 0.9825           | 1e-06            | 0.8962        | 0.9674         | 0.9940              | 1e-06       | 0.7846   | 0.9585    | 0.9820         |
| 0.0371        | 32.0  | 1664 | 0.0530          | 0.9104   | 0.9513        | 0.9827           | 1e-06            | 0.8919        | 0.9675         | 0.9945              | 1e-06       | 0.7904   | 0.9590    | 0.9818         |
| 0.0365        | 33.0  | 1716 | 0.0522          | 0.9096   | 0.9535        | 0.9826           | 1e-06            | 0.8980        | 0.9690         | 0.9935              | 1e-06       | 0.7877   | 0.9593    | 0.9819         |
| 0.0362        | 34.0  | 1768 | 0.0523          | 0.9106   | 0.9528        | 0.9827           | 1e-06            | 0.8948        | 0.9702         | 0.9934              | 1e-06       | 0.7909   | 0.9590    | 0.9819         |
| 0.0368        | 35.0  | 1820 | 0.0532          | 0.9099   | 0.9501        | 0.9826           | 1e-06            | 0.8858        | 0.9710         | 0.9935              | 1e-06       | 0.7892   | 0.9590    | 0.9816         |
| 0.0365        | 36.0  | 1872 | 0.0513          | 0.9106   | 0.9556        | 0.9828           | 1e-06            | 0.9043        | 0.9695         | 0.9932              | 1e-06       | 0.7901   | 0.9594    | 0.9823         |
| 0.0357        | 37.0  | 1924 | 0.0535          | 0.9093   | 0.9511        | 0.9826           | 1e-06            | 0.8907        | 0.9685         | 0.9941              | 1e-06       | 0.7867   | 0.9596    | 0.9817         |
| 0.0359        | 38.0  | 1976 | 0.0518          | 0.9091   | 0.9571        | 0.9826           | 1e-06            | 0.9105        | 0.9678         | 0.9930              | 1e-06       | 0.7861   | 0.9590    | 0.9822         |
| 0.0344        | 39.0  | 2028 | 0.0535          | 0.9102   | 0.9505        | 0.9827           | 1e-06            | 0.8882        | 0.9689         | 0.9943              | 1e-06       | 0.7894   | 0.9593    | 0.9818         |
| 0.034         | 40.0  | 2080 | 0.0519          | 0.9115   | 0.9547        | 0.9830           | 1e-06            | 0.9009        | 0.9697         | 0.9936              | 1e-06       | 0.7923   | 0.9597    | 0.9824         |
| 0.0339        | 41.0  | 2132 | 0.0528          | 0.9120   | 0.9525        | 0.9830           | 1e-06            | 0.8935        | 0.9698         | 0.9940              | 1e-06       | 0.7946   | 0.9592    | 0.9823         |
| 0.0338        | 42.0  | 2184 | 0.0531          | 0.9118   | 0.9529        | 0.9830           | 1e-06            | 0.8957        | 0.9687         | 0.9944              | 1e-06       | 0.7934   | 0.9595    | 0.9824         |
| 0.0339        | 43.0  | 2236 | 0.0542          | 0.9113   | 0.9518        | 0.9829           | 1e-06            | 0.8917        | 0.9697         | 0.9941              | 1e-06       | 0.7923   | 0.9594    | 0.9823         |
| 0.0337        | 44.0  | 2288 | 0.0541          | 0.9092   | 0.9535        | 0.9825           | 1e-06            | 0.8982        | 0.9688         | 0.9935              | 1e-06       | 0.7865   | 0.9594    | 0.9818         |
| 0.0332        | 45.0  | 2340 | 0.0535          | 0.9111   | 0.9533        | 0.9829           | 1e-06            | 0.8960        | 0.9702         | 0.9936              | 1e-06       | 0.7916   | 0.9595    | 0.9822         |
| 0.0328        | 46.0  | 2392 | 0.0535          | 0.9112   | 0.9519        | 0.9828           | 1e-06            | 0.8914        | 0.9704         | 0.9937              | 1e-06       | 0.7924   | 0.9590    | 0.9822         |
| 0.0342        | 47.0  | 2444 | 0.0533          | 0.9118   | 0.9533        | 0.9829           | 1e-06            | 0.8963        | 0.9696         | 0.9939              | 1e-06       | 0.7938   | 0.9594    | 0.9822         |
| 0.0331        | 48.0  | 2496 | 0.0552          | 0.9101   | 0.9509        | 0.9827           | 1e-06            | 0.8891        | 0.9696         | 0.9940              | 1e-06       | 0.7889   | 0.9593    | 0.9820         |
| 0.0325        | 49.0  | 2548 | 0.0537          | 0.9119   | 0.9519        | 0.9830           | 1e-06            | 0.8919        | 0.9695         | 0.9942              | 1e-06       | 0.7943   | 0.9593    | 0.9822         |

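A note on reading the metrics: the per-class columns for "default" report the placeholder value 1e-06, and the reported Mean Iou / Mean Accuracy match the average over the remaining three classes only. This suggests (an inference, not stated in the card) that the "default" class does not occur in the evaluation set and is excluded from the means. A quick check using the final evaluation row:

```python
# Per-class metrics from the final evaluation (epoch 49 row above).
iou = {"default": 1e-06, "pipe": 0.7943, "floor": 0.9593, "background": 0.9822}
acc = {"default": 1e-06, "pipe": 0.8919, "floor": 0.9695, "background": 0.9942}


def mean_over_present(per_class, absent=("default",)):
    """Average per-class scores, skipping classes assumed absent from the data."""
    vals = [v for k, v in per_class.items() if k not in absent]
    return sum(vals) / len(vals)


print(round(mean_over_present(iou), 4))  # 0.9119, the reported Mean Iou
print(round(mean_over_present(acc), 4))  # 0.9519, the reported Mean Accuracy
```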
### Framework versions

- Transformers 4.35.2
- Pytorch 2.0.1
- Datasets 2.15.0
- Tokenizers 0.15.0
config.json ADDED
@@ -0,0 +1,82 @@
{
  "_name_or_path": "nvidia/segformer-b1-finetuned-cityscapes-1024-1024",
  "architectures": [
    "SegformerForSemanticSegmentation"
  ],
  "attention_probs_dropout_prob": 0.0,
  "classifier_dropout_prob": 0.1,
  "decoder_hidden_size": 256,
  "depths": [
    2,
    2,
    2,
    2
  ],
  "downsampling_rates": [
    1,
    4,
    8,
    16
  ],
  "drop_path_rate": 0.1,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.0,
  "hidden_sizes": [
    64,
    128,
    320,
    512
  ],
  "id2label": {
    "0": "default",
    "1": "pipe",
    "2": "floor",
    "3": "background"
  },
  "image_size": 224,
  "initializer_range": 0.02,
  "label2id": {
    "background": 3,
    "default": 0,
    "floor": 2,
    "pipe": 1
  },
  "layer_norm_eps": 1e-06,
  "mlp_ratios": [
    4,
    4,
    4,
    4
  ],
  "model_type": "segformer",
  "num_attention_heads": [
    1,
    2,
    5,
    8
  ],
  "num_channels": 3,
  "num_encoder_blocks": 4,
  "patch_sizes": [
    7,
    3,
    3,
    3
  ],
  "reshape_last_stage": true,
  "semantic_loss_ignore_index": 255,
  "sr_ratios": [
    8,
    4,
    2,
    1
  ],
  "strides": [
    4,
    2,
    2,
    2
  ],
  "torch_dtype": "float32",
  "transformers_version": "4.35.2"
}
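The `id2label` and `label2id` maps in the config must be inverses of each other, or segmentation outputs will be mislabeled. A small consistency check over the values in this config (note the JSON keys of `id2label` are strings):

```python
# Label maps copied from config.json above (id2label keys are strings in JSON).
id2label = {"0": "default", "1": "pipe", "2": "floor", "3": "background"}
label2id = {"background": 3, "default": 0, "floor": 2, "pipe": 1}

# Invert id2label (converting keys to int) and compare with label2id.
inverted = {name: int(idx) for idx, name in id2label.items()}
assert inverted == label2id
print("label maps are consistent; num_labels =", len(id2label))
```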
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d7969cd0e11c9a2253b499f42fa5abdc7e91e218b6e4ee3aca970bb299b5dff6
size 54739432
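The three lines above are a Git LFS pointer, not the weights themselves: the real `model.safetensors` is stored out-of-band and identified by its SHA-256 and byte size. A minimal sketch of parsing this key-value pointer format (the real git-lfs client does more validation):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its version, oid hex digest, and size."""
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    return {
        "version": fields["version"],
        "oid": fields["oid"].removeprefix("sha256:"),
        "size": int(fields["size"]),
    }


pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:d7969cd0e11c9a2253b499f42fa5abdc7e91e218b6e4ee3aca970bb299b5dff6
size 54739432
"""
info = parse_lfs_pointer(pointer)
print(info["size"])  # 54739432
```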
runs/Apr25_12-09-02_antd/events.out.tfevents.1714039754.antd.55498.0 ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:311c552c2298a90dd20b4a732d04371b95a6537baa1df6ea246e6e5e4f9d17fb
size 56853
training_args.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:5c8ed4e8a27a1a5fe68e7701005880eb8145cac87d97b36223a25972afd99889
size 4283