sumeyya committed on
Commit e499b96 · verified · 1 Parent(s): 65bc1d0

End of training
README.md CHANGED
@@ -1,199 +1,172 @@
  ---
  library_name: transformers
- tags: []
  ---

- # Model Card for Model ID
-
- <!-- Provide a quick summary of what the model is/does. -->
-
-
-
- ## Model Details
-
- ### Model Description
-
- <!-- Provide a longer summary of what this model is. -->
-
- This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
-
- - **Developed by:** [More Information Needed]
- - **Funded by [optional]:** [More Information Needed]
- - **Shared by [optional]:** [More Information Needed]
- - **Model type:** [More Information Needed]
- - **Language(s) (NLP):** [More Information Needed]
- - **License:** [More Information Needed]
- - **Finetuned from model [optional]:** [More Information Needed]
-
- ### Model Sources [optional]
-
- <!-- Provide the basic links for the model. -->
-
- - **Repository:** [More Information Needed]
- - **Paper [optional]:** [More Information Needed]
- - **Demo [optional]:** [More Information Needed]
-
- ## Uses
-
- <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
-
- ### Direct Use
-
- <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
-
- [More Information Needed]
-
- ### Downstream Use [optional]
-
- <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
-
- [More Information Needed]
-
- ### Out-of-Scope Use
-
- <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
-
- [More Information Needed]
-
- ## Bias, Risks, and Limitations
-
- <!-- This section is meant to convey both technical and sociotechnical limitations. -->
-
- [More Information Needed]
-
- ### Recommendations
-
- <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
-
- Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
-
- ## How to Get Started with the Model
-
- Use the code below to get started with the model.
-
- [More Information Needed]
-
- ## Training Details
-
- ### Training Data
-
- <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
-
- [More Information Needed]
-
- ### Training Procedure
-
- <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
-
- #### Preprocessing [optional]
-
- [More Information Needed]
-
-
- #### Training Hyperparameters
-
- - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
-
- #### Speeds, Sizes, Times [optional]
-
- <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
-
- [More Information Needed]
-
- ## Evaluation
-
- <!-- This section describes the evaluation protocols and provides the results. -->
-
- ### Testing Data, Factors & Metrics
-
- #### Testing Data
-
- <!-- This should link to a Dataset Card if possible. -->
-
- [More Information Needed]
-
- #### Factors
-
- <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
-
- [More Information Needed]
-
- #### Metrics
-
- <!-- These are the evaluation metrics being used, ideally with a description of why. -->
-
- [More Information Needed]
-
- ### Results
-
- [More Information Needed]
-
- #### Summary
-
-
-
- ## Model Examination [optional]
-
- <!-- Relevant interpretability work for the model goes here -->
-
- [More Information Needed]
-
- ## Environmental Impact
-
- <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
-
- Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
-
- - **Hardware Type:** [More Information Needed]
- - **Hours used:** [More Information Needed]
- - **Cloud Provider:** [More Information Needed]
- - **Compute Region:** [More Information Needed]
- - **Carbon Emitted:** [More Information Needed]
-
- ## Technical Specifications [optional]
-
- ### Model Architecture and Objective
-
- [More Information Needed]
-
- ### Compute Infrastructure
-
- [More Information Needed]
-
- #### Hardware
-
- [More Information Needed]
-
- #### Software
-
- [More Information Needed]
-
- ## Citation [optional]
-
- <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
-
- **BibTeX:**
-
- [More Information Needed]
-
- **APA:**
-
- [More Information Needed]
-
- ## Glossary [optional]
-
- <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
-
- [More Information Needed]
-
- ## More Information [optional]
-
- [More Information Needed]
-
- ## Model Card Authors [optional]
-
- [More Information Needed]
-
- ## Model Card Contact
-
- [More Information Needed]
  ---
  library_name: transformers
+ license: other
+ base_model: nvidia/mit-b0
+ tags:
+ - vision
+ - image-segmentation
+ - generated_from_trainer
+ model-index:
+ - name: segformer-b0-finetuned-segments-sidewalk-oct-22
+ results: []
  ---

+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # segformer-b0-finetuned-segments-sidewalk-oct-22
+
+ This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the segments/sidewalk-semantic dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.7351
+ - Mean Iou: 0.2097
+ - Mean Accuracy: 0.2588
+ - Overall Accuracy: 0.7962
+ - Accuracy Unlabeled: nan
+ - Accuracy Flat-road: 0.7781
+ - Accuracy Flat-sidewalk: 0.9422
+ - Accuracy Flat-crosswalk: 0.0
+ - Accuracy Flat-cyclinglane: 0.6678
+ - Accuracy Flat-parkingdriveway: 0.5103
+ - Accuracy Flat-railtrack: nan
+ - Accuracy Flat-curb: 0.3970
+ - Accuracy Human-person: 0.0
+ - Accuracy Human-rider: 0.0
+ - Accuracy Vehicle-car: 0.9361
+ - Accuracy Vehicle-truck: 0.0
+ - Accuracy Vehicle-bus: 0.0
+ - Accuracy Vehicle-tramtrain: 0.0
+ - Accuracy Vehicle-motorcycle: 0.0
+ - Accuracy Vehicle-bicycle: 0.0
+ - Accuracy Vehicle-caravan: 0.0
+ - Accuracy Vehicle-cartrailer: 0.0
+ - Accuracy Construction-building: 0.9283
+ - Accuracy Construction-door: 0.0
+ - Accuracy Construction-wall: 0.2268
+ - Accuracy Construction-fenceguardrail: 0.0000
+ - Accuracy Construction-bridge: 0.0
+ - Accuracy Construction-tunnel: nan
+ - Accuracy Construction-stairs: 0.0
+ - Accuracy Object-pole: 0.0873
+ - Accuracy Object-trafficsign: 0.0
+ - Accuracy Object-trafficlight: 0.0
+ - Accuracy Nature-vegetation: 0.9230
+ - Accuracy Nature-terrain: 0.9084
+ - Accuracy Sky: 0.9461
+ - Accuracy Void-ground: 0.0
+ - Accuracy Void-dynamic: 0.0
+ - Accuracy Void-static: 0.0306
+ - Accuracy Void-unclear: 0.0
+ - Iou Unlabeled: nan
+ - Iou Flat-road: 0.6262
+ - Iou Flat-sidewalk: 0.8234
+ - Iou Flat-crosswalk: 0.0
+ - Iou Flat-cyclinglane: 0.5624
+ - Iou Flat-parkingdriveway: 0.3620
+ - Iou Flat-railtrack: nan
+ - Iou Flat-curb: 0.3029
+ - Iou Human-person: 0.0
+ - Iou Human-rider: 0.0
+ - Iou Vehicle-car: 0.7060
+ - Iou Vehicle-truck: 0.0
+ - Iou Vehicle-bus: 0.0
+ - Iou Vehicle-tramtrain: 0.0
+ - Iou Vehicle-motorcycle: 0.0
+ - Iou Vehicle-bicycle: 0.0
+ - Iou Vehicle-caravan: 0.0
+ - Iou Vehicle-cartrailer: 0.0
+ - Iou Construction-building: 0.6436
+ - Iou Construction-door: 0.0
+ - Iou Construction-wall: 0.1895
+ - Iou Construction-fenceguardrail: 0.0000
+ - Iou Construction-bridge: 0.0
+ - Iou Construction-tunnel: nan
+ - Iou Construction-stairs: 0.0
+ - Iou Object-pole: 0.0808
+ - Iou Object-trafficsign: 0.0
+ - Iou Object-trafficlight: 0.0
+ - Iou Nature-vegetation: 0.7906
+ - Iou Nature-terrain: 0.7024
+ - Iou Sky: 0.8933
+ - Iou Void-ground: 0.0
+ - Iou Void-dynamic: 0.0
+ - Iou Void-static: 0.0273
+ - Iou Void-unclear: 0.0
+
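The per-category accuracy and IoU values above are the standard output of a mean-IoU evaluation. As a reference point, here is a minimal sketch of how such numbers are typically computed with the `evaluate` library's `mean_iou` metric; the tiny prediction and reference maps are toy data, not taken from this run.

```python
# Minimal mean-IoU sketch with toy data; class ids follow the 35-label scheme in config.json below.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

predictions = [np.array([[1, 2], [2, 30]])]  # toy 2x2 predicted label map
references = [np.array([[1, 2], [3, 30]])]   # toy 2x2 ground-truth label map

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=35,
    ignore_index=255,      # matches semantic_loss_ignore_index in the model config
    reduce_labels=False,
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"])  # one entry per class id; nan for classes absent from both maps
```

Classes that never appear in the evaluation images come out as `nan`, which is why rows such as Flat-railtrack and Construction-tunnel show `nan` above.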
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
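The model description and intended-uses sections above are still placeholders, so here is a minimal, hypothetical inference sketch. The repo id `sumeyya/segformer-b0-finetuned-segments-sidewalk-oct-22` is inferred from the commit rather than stated in it, and the image processor is borrowed from the base `nvidia/mit-b0` checkpoint because this commit does not add a preprocessor config.

```python
# Hypothetical inference sketch; the repo id and the input image are assumptions.
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

repo_id = "sumeyya/segformer-b0-finetuned-segments-sidewalk-oct-22"   # assumed repo id
processor = SegformerImageProcessor.from_pretrained("nvidia/mit-b0")  # base model's processor
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)

image = Image.open("street_scene.jpg").convert("RGB")  # any street-level photo
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, 35, h/4, w/4) on the processed resolution

# Upsample to the original image size and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]              # (H, W) map of class ids
print(model.config.id2label[int(segmentation[0, 0])])  # label name of the top-left pixel
```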
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 0.0002
+ - train_batch_size: 2
+ - eval_batch_size: 2
+ - seed: 42
+ - optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
+ - lr_scheduler_type: linear
+ - num_epochs: 2
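Below is a sketch, under stated assumptions, of how the hyperparameters above map onto a `TrainingArguments` / `Trainer` run. The dataset split, the preprocessing function, and the output directory are illustrative; only the listed hyperparameter values come from the card.

```python
# A sketch of mapping the listed hyperparameters onto TrainingArguments / Trainer.
# The dataset split, preprocessing, and output_dir are illustrative assumptions.
from datasets import load_dataset
from transformers import (
    SegformerForSemanticSegmentation,
    SegformerImageProcessor,
    Trainer,
    TrainingArguments,
)

processor = SegformerImageProcessor.from_pretrained("nvidia/mit-b0")
model = SegformerForSemanticSegmentation.from_pretrained("nvidia/mit-b0", num_labels=35)

ds = load_dataset("segments/sidewalk-semantic")            # images + per-pixel label maps
ds = ds["train"].train_test_split(test_size=0.2, seed=42)  # assumed split, not from the commit

def transform(batch):
    # Resize/normalize images and segmentation maps into pixel_values / labels tensors.
    return processor(batch["pixel_values"], batch["label"], return_tensors="pt")

train_ds = ds["train"].with_transform(transform)
eval_ds = ds["test"].with_transform(transform)

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-sidewalk-oct-22",
    learning_rate=2e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2,
    optim="adamw_torch",  # AdamW; betas=(0.9, 0.999) and eps=1e-08 are the defaults
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
)
trainer.train()
```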
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Flat-road | Accuracy Flat-sidewalk | Accuracy Flat-crosswalk | Accuracy Flat-cyclinglane | Accuracy Flat-parkingdriveway | Accuracy Flat-railtrack | Accuracy Flat-curb | Accuracy Human-person | Accuracy Human-rider | Accuracy Vehicle-car | Accuracy Vehicle-truck | Accuracy Vehicle-bus | Accuracy Vehicle-tramtrain | Accuracy Vehicle-motorcycle | Accuracy Vehicle-bicycle | Accuracy Vehicle-caravan | Accuracy Vehicle-cartrailer | Accuracy Construction-building | Accuracy Construction-door | Accuracy Construction-wall | Accuracy Construction-fenceguardrail | Accuracy Construction-bridge | Accuracy Construction-tunnel | Accuracy Construction-stairs | Accuracy Object-pole | Accuracy Object-trafficsign | Accuracy Object-trafficlight | Accuracy Nature-vegetation | Accuracy Nature-terrain | Accuracy Sky | Accuracy Void-ground | Accuracy Void-dynamic | Accuracy Void-static | Accuracy Void-unclear | Iou Unlabeled | Iou Flat-road | Iou Flat-sidewalk | Iou Flat-crosswalk | Iou Flat-cyclinglane | Iou Flat-parkingdriveway | Iou Flat-railtrack | Iou Flat-curb | Iou Human-person | Iou Human-rider | Iou Vehicle-car | Iou Vehicle-truck | Iou Vehicle-bus | Iou Vehicle-tramtrain | Iou Vehicle-motorcycle | Iou Vehicle-bicycle | Iou Vehicle-caravan | Iou Vehicle-cartrailer | Iou Construction-building | Iou Construction-door | Iou Construction-wall | Iou Construction-fenceguardrail | Iou Construction-bridge | Iou Construction-tunnel | Iou Construction-stairs | Iou Object-pole | Iou Object-trafficsign | Iou Object-trafficlight | Iou Nature-vegetation | Iou Nature-terrain | Iou Sky | Iou Void-ground | Iou Void-dynamic | Iou Void-static | Iou Void-unclear |
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:------------------:|:----------------------:|:-----------------------:|:-------------------------:|:-----------------------------:|:-----------------------:|:------------------:|:---------------------:|:--------------------:|:--------------------:|:----------------------:|:--------------------:|:--------------------------:|:---------------------------:|:------------------------:|:------------------------:|:---------------------------:|:------------------------------:|:--------------------------:|:--------------------------:|:------------------------------------:|:----------------------------:|:----------------------------:|:----------------------------:|:--------------------:|:---------------------------:|:----------------------------:|:--------------------------:|:-----------------------:|:------------:|:--------------------:|:---------------------:|:--------------------:|:---------------------:|:-------------:|:-------------:|:-----------------:|:------------------:|:--------------------:|:------------------------:|:------------------:|:-------------:|:----------------:|:---------------:|:---------------:|:-----------------:|:---------------:|:---------------------:|:----------------------:|:-------------------:|:-------------------:|:----------------------:|:-------------------------:|:---------------------:|:---------------------:|:-------------------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:---------------:|:----------------------:|:-----------------------:|:---------------------:|:------------------:|:-------:|:---------------:|:----------------:|:---------------:|:----------------:|
+ | 2.6052 | 0.05 | 20 | 2.6583 | 0.0781 | 0.1236 | 0.5809 | nan | 0.1191 | 0.9803 | 0.0 | 0.0 | 0.0014 | nan | 0.0011 | 0.0 | 0.0 | 0.9544 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6683 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8576 | 0.0002 | 0.3741 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.1069 | 0.5811 | 0.0 | 0.0 | 0.0014 | nan | 0.0011 | 0.0 | 0.0 | 0.4003 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4146 | 0.0 | 0.0000 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6237 | 0.0002 | 0.3710 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | 1.9158 | 0.1 | 40 | 1.8276 | 0.0852 | 0.1341 | 0.6115 | nan | 0.8028 | 0.8197 | 0.0 | 0.0 | 0.0043 | nan | 0.0013 | 0.0 | 0.0 | 0.8105 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8498 | 0.0 | 0.0002 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9232 | 0.0006 | 0.0783 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.3391 | 0.6942 | 0.0 | 0.0 | 0.0042 | nan | 0.0013 | 0.0 | 0.0 | 0.5540 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4445 | 0.0 | 0.0002 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6091 | 0.0006 | 0.0781 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | 1.4713 | 0.15 | 60 | 1.5056 | 0.1027 | 0.1504 | 0.6575 | nan | 0.6471 | 0.9460 | 0.0 | 0.0 | 0.0031 | nan | 0.0043 | 0.0 | 0.0 | 0.9179 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8383 | 0.0 | 0.0002 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9344 | 0.0030 | 0.5194 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.3964 | 0.7171 | 0.0 | 0.0 | 0.0031 | nan | 0.0043 | 0.0 | 0.0 | 0.5233 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5092 | 0.0 | 0.0002 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6168 | 0.0030 | 0.5139 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | 1.1025 | 0.2 | 80 | 1.3972 | 0.1167 | 0.1598 | 0.6714 | nan | 0.6796 | 0.9507 | 0.0 | 0.0001 | 0.0002 | nan | 0.0017 | 0.0 | 0.0 | 0.7917 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8734 | 0.0 | 0.0090 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9286 | 0.0247 | 0.8540 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4167 | 0.7149 | 0.0 | 0.0001 | 0.0002 | nan | 0.0017 | 0.0 | 0.0 | 0.6355 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5041 | 0.0 | 0.0087 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6289 | 0.0244 | 0.7983 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | 1.9869 | 0.25 | 100 | 1.3690 | 0.1082 | 0.1580 | 0.6565 | nan | 0.5658 | 0.9480 | 0.0 | 0.0005 | 0.0005 | nan | 0.0005 | 0.0 | 0.0 | 0.9066 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7864 | 0.0 | 0.0055 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9378 | 0.0041 | 0.9016 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.3963 | 0.6944 | 0.0 | 0.0005 | 0.0005 | nan | 0.0005 | 0.0 | 0.0 | 0.4568 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5445 | 0.0 | 0.0053 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.5857 | 0.0041 | 0.7738 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | 1.126 | 0.3 | 120 | 1.2700 | 0.1288 | 0.1754 | 0.6867 | nan | 0.8098 | 0.8984 | 0.0 | 0.0093 | 0.0002 | nan | 0.0001 | 0.0 | 0.0 | 0.8207 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8592 | 0.0 | 0.0043 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9337 | 0.3991 | 0.8793 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4231 | 0.7335 | 0.0 | 0.0093 | 0.0002 | nan | 0.0001 | 0.0 | 0.0 | 0.6281 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5333 | 0.0 | 0.0042 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6647 | 0.3577 | 0.7679 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | 1.3711 | 0.35 | 140 | 1.1517 | 0.1452 | 0.1931 | 0.7186 | nan | 0.7518 | 0.9370 | 0.0 | 0.2270 | 0.0011 | nan | 0.0001 | 0.0 | 0.0 | 0.9359 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8504 | 0.0 | 0.0036 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9066 | 0.6993 | 0.8660 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4932 | 0.7384 | 0.0 | 0.2135 | 0.0011 | nan | 0.0001 | 0.0 | 0.0 | 0.5199 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5779 | 0.0 | 0.0035 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7254 | 0.5490 | 0.8244 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | 1.3994 | 0.4 | 160 | 1.1185 | 0.1508 | 0.1970 | 0.7199 | nan | 0.8502 | 0.9062 | 0.0 | 0.2392 | 0.0004 | nan | 0.0001 | 0.0 | 0.0 | 0.8281 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8655 | 0.0 | 0.0017 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8850 | 0.7953 | 0.9333 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4463 | 0.7613 | 0.0 | 0.2293 | 0.0004 | nan | 0.0001 | 0.0 | 0.0 | 0.6460 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5373 | 0.0 | 0.0017 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7474 | 0.6127 | 0.8426 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | 0.911 | 0.45 | 180 | 1.0558 | 0.1550 | 0.2026 | 0.7305 | nan | 0.7464 | 0.9479 | 0.0 | 0.4110 | 0.0128 | nan | 0.0013 | 0.0 | 0.0 | 0.9009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9054 | 0.0 | 0.0008 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8071 | 0.8876 | 0.8631 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5120 | 0.7481 | 0.0 | 0.3866 | 0.0127 | nan | 0.0013 | 0.0 | 0.0 | 0.6344 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5574 | 0.0 | 0.0007 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7060 | 0.5653 | 0.8353 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | 1.8468 | 0.5 | 200 | 1.0063 | 0.1573 | 0.2034 | 0.7369 | nan | 0.8325 | 0.9268 | 0.0 | 0.3866 | 0.0021 | nan | 0.0029 | 0.0 | 0.0 | 0.9098 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8590 | 0.0 | 0.0001 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9134 | 0.7506 | 0.9239 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.4926 | 0.7629 | 0.0 | 0.3684 | 0.0021 | nan | 0.0029 | 0.0 | 0.0 | 0.6087 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5709 | 0.0 | 0.0001 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7512 | 0.6387 | 0.8367 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | 0.8309 | 0.55 | 220 | 0.9882 | 0.1604 | 0.2073 | 0.7393 | nan | 0.7298 | 0.9406 | 0.0 | 0.3782 | 0.0239 | nan | 0.0704 | 0.0 | 0.0 | 0.9040 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8975 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9062 | 0.8872 | 0.8960 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5191 | 0.7437 | 0.0 | 0.3403 | 0.0235 | nan | 0.0654 | 0.0 | 0.0 | 0.6150 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5780 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7599 | 0.6547 | 0.8333 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | 1.5283 | 0.6 | 240 | 0.9686 | 0.1657 | 0.2120 | 0.7435 | nan | 0.8319 | 0.9088 | 0.0 | 0.5127 | 0.0254 | nan | 0.1306 | 0.0 | 0.0 | 0.8869 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8993 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9219 | 0.7303 | 0.9347 | 0.0 | 0.0 | 0.0000 | 0.0 | nan | 0.5035 | 0.7652 | 0.0 | 0.4476 | 0.0250 | nan | 0.1151 | 0.0 | 0.0 | 0.6605 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5813 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7522 | 0.6158 | 0.8355 | 0.0 | 0.0 | 0.0000 | 0.0 |
+ | 0.5838 | 0.65 | 260 | 0.9858 | 0.1667 | 0.2164 | 0.7450 | nan | 0.7064 | 0.9452 | 0.0 | 0.4848 | 0.1076 | nan | 0.1731 | 0.0 | 0.0 | 0.9220 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8139 | 0.0 | 0.0006 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9085 | 0.9244 | 0.9371 | 0.0 | 0.0 | 0.0006 | 0.0 | nan | 0.5492 | 0.7671 | 0.0 | 0.4391 | 0.1003 | nan | 0.1451 | 0.0 | 0.0 | 0.6131 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5788 | 0.0 | 0.0006 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.6824 | 0.5950 | 0.8630 | 0.0 | 0.0 | 0.0006 | 0.0 |
+ | 1.0103 | 0.7 | 280 | 0.9187 | 0.1717 | 0.2174 | 0.7573 | nan | 0.7729 | 0.9468 | 0.0 | 0.5562 | 0.1364 | nan | 0.1327 | 0.0 | 0.0 | 0.9128 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8647 | 0.0 | 0.0024 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9379 | 0.7704 | 0.9241 | 0.0 | 0.0 | 0.0001 | 0.0 | nan | 0.5545 | 0.7857 | 0.0 | 0.4650 | 0.1235 | nan | 0.1163 | 0.0 | 0.0 | 0.6424 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5814 | 0.0 | 0.0024 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7353 | 0.6449 | 0.8444 | 0.0 | 0.0 | 0.0001 | 0.0 |
+ | 0.7643 | 0.75 | 300 | 0.9432 | 0.1678 | 0.2079 | 0.7494 | nan | 0.7337 | 0.9651 | 0.0 | 0.5139 | 0.0532 | nan | 0.0619 | 0.0 | 0.0 | 0.8188 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8693 | 0.0 | 0.0020 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9373 | 0.7835 | 0.9152 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5522 | 0.7514 | 0.0 | 0.4670 | 0.0507 | nan | 0.0576 | 0.0 | 0.0 | 0.6797 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5663 | 0.0 | 0.0019 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7301 | 0.6469 | 0.8644 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | 0.6425 | 0.8 | 320 | 0.9611 | 0.1722 | 0.2218 | 0.7449 | nan | 0.6811 | 0.9292 | 0.0 | 0.5486 | 0.1368 | nan | 0.2891 | 0.0 | 0.0 | 0.8899 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9512 | 0.0 | 0.0013 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8459 | 0.9144 | 0.9087 | 0.0 | 0.0 | 0.0 | 0.0 | nan | 0.5319 | 0.7647 | 0.0 | 0.4224 | 0.1236 | nan | 0.2104 | 0.0 | 0.0 | 0.6475 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5370 | 0.0 | 0.0013 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7571 | 0.6566 | 0.8577 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | 0.9292 | 0.85 | 340 | 0.8890 | 0.1759 | 0.2213 | 0.7594 | nan | 0.6721 | 0.9524 | 0.0 | 0.6827 | 0.1609 | nan | 0.1927 | 0.0 | 0.0 | 0.8767 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8994 | 0.0 | 0.0025 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.9567 | 0.7380 | 0.9474 | 0.0 | 0.0 | 0.0005 | 0.0 | nan | 0.5323 | 0.7863 | 0.0 | 0.4764 | 0.1441 | nan | 0.1562 | 0.0 | 0.0 | 0.6933 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5952 | 0.0 | 0.0025 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.7359 | 0.6399 | 0.8669 | 0.0 | 0.0 | 0.0005 | 0.0 |
+ | 0.6201 | 0.9 | 360 | 0.8572 | 0.1871 | 0.2354 | 0.7712 | nan | 0.6941 | 0.9488 | 0.0 | 0.6405 | 0.4223 | nan | 0.2294 | 0.0 | 0.0 | 0.9290 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8868 | 0.0 | 0.0532 | 0.0 | 0.0 | nan | 0.0 | 0.0002 | 0.0 | 0.0 | 0.9245 | 0.8871 | 0.9148 | 0.0 | 0.0 | 0.0014 | 0.0 | nan | 0.5658 | 0.7930 | 0.0 | 0.5446 | 0.2794 | nan | 0.1875 | 0.0 | 0.0 | 0.6262 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6075 | 0.0 | 0.0508 | 0.0 | 0.0 | nan | 0.0 | 0.0002 | 0.0 | 0.0 | 0.7662 | 0.6993 | 0.8658 | 0.0 | 0.0 | 0.0014 | 0.0 |
+ | 1.0028 | 0.95 | 380 | 0.8642 | 0.1790 | 0.2248 | 0.7648 | nan | 0.7265 | 0.9510 | 0.0 | 0.6286 | 0.1598 | nan | 0.1719 | 0.0 | 0.0 | 0.9213 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9119 | 0.0 | 0.0156 | 0.0 | 0.0 | nan | 0.0 | 0.0007 | 0.0 | 0.0 | 0.9108 | 0.8756 | 0.9200 | 0.0 | 0.0 | 0.0011 | 0.0 | nan | 0.5445 | 0.7817 | 0.0 | 0.4995 | 0.1473 | nan | 0.1486 | 0.0 | 0.0 | 0.6385 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5978 | 0.0 | 0.0153 | 0.0 | 0.0 | nan | 0.0 | 0.0007 | 0.0 | 0.0 | 0.7795 | 0.6997 | 0.8726 | 0.0 | 0.0 | 0.0011 | 0.0 |
+ | 0.8125 | 1.0 | 400 | 0.8575 | 0.1836 | 0.2336 | 0.7707 | nan | 0.8044 | 0.9387 | 0.0 | 0.5612 | 0.2633 | nan | 0.2876 | 0.0 | 0.0 | 0.9481 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8694 | 0.0 | 0.0216 | 0.0 | 0.0 | nan | 0.0 | 0.0002 | 0.0 | 0.0 | 0.8947 | 0.9314 | 0.9469 | 0.0 | 0.0 | 0.0072 | 0.0 | nan | 0.5789 | 0.7979 | 0.0 | 0.4938 | 0.2217 | nan | 0.2231 | 0.0 | 0.0 | 0.6085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6185 | 0.0 | 0.0214 | 0.0 | 0.0 | nan | 0.0 | 0.0002 | 0.0 | 0.0 | 0.7762 | 0.6458 | 0.8823 | 0.0 | 0.0 | 0.0069 | 0.0 |
+ | 0.7077 | 1.05 | 420 | 0.8753 | 0.1832 | 0.2253 | 0.7673 | nan | 0.7668 | 0.9672 | 0.0 | 0.4280 | 0.2171 | nan | 0.2647 | 0.0 | 0.0 | 0.8745 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9391 | 0.0 | 0.0469 | 0.0 | 0.0 | nan | 0.0 | 0.0010 | 0.0 | 0.0 | 0.8847 | 0.8916 | 0.9224 | 0.0 | 0.0 | 0.0064 | 0.0 | nan | 0.5851 | 0.7829 | 0.0 | 0.4063 | 0.1772 | nan | 0.2167 | 0.0 | 0.0 | 0.6977 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5724 | 0.0 | 0.0444 | 0.0 | 0.0 | nan | 0.0 | 0.0010 | 0.0 | 0.0 | 0.7869 | 0.7071 | 0.8785 | 0.0 | 0.0 | 0.0060 | 0.0 |
+ | 0.5107 | 1.1 | 440 | 0.8497 | 0.1901 | 0.2344 | 0.7732 | nan | 0.7275 | 0.9648 | 0.0 | 0.6019 | 0.1918 | nan | 0.2115 | 0.0 | 0.0 | 0.8908 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8728 | 0.0 | 0.2648 | 0.0 | 0.0 | nan | 0.0 | 0.0023 | 0.0 | 0.0 | 0.9263 | 0.9003 | 0.9358 | 0.0 | 0.0 | 0.0106 | 0.0 | nan | 0.5929 | 0.7820 | 0.0 | 0.4854 | 0.1647 | nan | 0.1832 | 0.0 | 0.0 | 0.7128 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6123 | 0.0 | 0.1969 | 0.0 | 0.0 | nan | 0.0 | 0.0023 | 0.0 | 0.0 | 0.7653 | 0.6963 | 0.8804 | 0.0 | 0.0 | 0.0098 | 0.0 |
+ | 0.864 | 1.15 | 460 | 0.8285 | 0.1871 | 0.2330 | 0.7743 | nan | 0.8138 | 0.9395 | 0.0 | 0.6150 | 0.2174 | nan | 0.2799 | 0.0 | 0.0 | 0.9398 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8939 | 0.0 | 0.0380 | 0.0 | 0.0 | nan | 0.0 | 0.0071 | 0.0 | 0.0 | 0.9182 | 0.8546 | 0.9326 | 0.0 | 0.0 | 0.0072 | 0.0 | nan | 0.5785 | 0.7945 | 0.0 | 0.4986 | 0.1889 | nan | 0.2285 | 0.0 | 0.0 | 0.6345 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6203 | 0.0 | 0.0356 | 0.0 | 0.0 | nan | 0.0 | 0.0071 | 0.0 | 0.0 | 0.7858 | 0.7263 | 0.8831 | 0.0 | 0.0 | 0.0070 | 0.0 |
+ | 1.1038 | 1.2 | 480 | 0.8424 | 0.1934 | 0.2389 | 0.7756 | nan | 0.6686 | 0.9679 | 0.0 | 0.6309 | 0.3414 | nan | 0.3363 | 0.0 | 0.0 | 0.8914 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9260 | 0.0 | 0.0435 | 0.0 | 0.0 | nan | 0.0 | 0.0439 | 0.0 | 0.0 | 0.8998 | 0.9057 | 0.9432 | 0.0 | 0.0 | 0.0451 | 0.0 | nan | 0.5839 | 0.7944 | 0.0 | 0.5166 | 0.2481 | nan | 0.2274 | 0.0 | 0.0 | 0.7174 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5978 | 0.0 | 0.0388 | 0.0 | 0.0 | nan | 0.0 | 0.0429 | 0.0 | 0.0 | 0.7874 | 0.7051 | 0.8895 | 0.0 | 0.0 | 0.0383 | 0.0 |
+ | 0.7316 | 1.25 | 500 | 0.8044 | 0.2013 | 0.2515 | 0.7831 | nan | 0.7636 | 0.9435 | 0.0 | 0.5808 | 0.4689 | nan | 0.4001 | 0.0 | 0.0 | 0.9282 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8854 | 0.0 | 0.2230 | 0.0 | 0.0 | nan | 0.0 | 0.0374 | 0.0 | 0.0 | 0.9060 | 0.9175 | 0.9477 | 0.0 | 0.0 | 0.0447 | 0.0 | nan | 0.5973 | 0.8132 | 0.0 | 0.5180 | 0.3150 | nan | 0.2651 | 0.0 | 0.0 | 0.6912 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6288 | 0.0 | 0.1796 | 0.0 | 0.0 | nan | 0.0 | 0.0365 | 0.0 | 0.0 | 0.7830 | 0.6928 | 0.8835 | 0.0 | 0.0 | 0.0364 | 0.0 |
+ | 0.8354 | 1.3 | 520 | 0.8133 | 0.1914 | 0.2358 | 0.7725 | nan | 0.8282 | 0.9275 | 0.0 | 0.4710 | 0.3078 | nan | 0.2966 | 0.0 | 0.0 | 0.9124 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9318 | 0.0 | 0.1146 | 0.0 | 0.0 | nan | 0.0 | 0.0343 | 0.0 | 0.0 | 0.9260 | 0.8574 | 0.9263 | 0.0 | 0.0 | 0.0108 | 0.0 | nan | 0.5677 | 0.7906 | 0.0 | 0.4449 | 0.2616 | nan | 0.2338 | 0.0 | 0.0 | 0.7037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6056 | 0.0 | 0.1018 | 0.0 | 0.0 | nan | 0.0 | 0.0340 | 0.0 | 0.0 | 0.7789 | 0.7060 | 0.8866 | 0.0 | 0.0 | 0.0101 | 0.0 |
+ | 0.393 | 1.35 | 540 | 0.7798 | 0.2001 | 0.2497 | 0.7829 | nan | 0.7838 | 0.9291 | 0.0 | 0.6871 | 0.3828 | nan | 0.3642 | 0.0 | 0.0 | 0.9158 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8658 | 0.0 | 0.2278 | 0.0 | 0.0 | nan | 0.0 | 0.0377 | 0.0 | 0.0 | 0.9394 | 0.8976 | 0.9466 | 0.0 | 0.0 | 0.0123 | 0.0 | nan | 0.6044 | 0.8051 | 0.0 | 0.5457 | 0.2971 | nan | 0.2586 | 0.0 | 0.0 | 0.6959 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6314 | 0.0 | 0.1836 | 0.0 | 0.0 | nan | 0.0 | 0.0368 | 0.0 | 0.0 | 0.7648 | 0.6827 | 0.8847 | 0.0 | 0.0 | 0.0115 | 0.0 |
+ | 0.8005 | 1.4 | 560 | 0.7905 | 0.1999 | 0.2438 | 0.7834 | nan | 0.7293 | 0.9711 | 0.0 | 0.6267 | 0.2994 | nan | 0.2955 | 0.0 | 0.0 | 0.9193 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9042 | 0.0 | 0.3408 | 0.0 | 0.0 | nan | 0.0 | 0.0460 | 0.0 | 0.0 | 0.9256 | 0.8096 | 0.9264 | 0.0 | 0.0 | 0.0067 | 0.0 | nan | 0.6192 | 0.7916 | 0.0 | 0.5595 | 0.2382 | nan | 0.2245 | 0.0 | 0.0 | 0.6994 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6221 | 0.0 | 0.2395 | 0.0 | 0.0 | nan | 0.0 | 0.0452 | 0.0 | 0.0 | 0.7818 | 0.6788 | 0.8901 | 0.0 | 0.0 | 0.0065 | 0.0 |
+ | 1.1368 | 1.45 | 580 | 0.8022 | 0.1974 | 0.2481 | 0.7734 | nan | 0.8556 | 0.8998 | 0.0 | 0.5845 | 0.3805 | nan | 0.2213 | 0.0 | 0.0 | 0.9201 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8873 | 0.0 | 0.3416 | 0.0 | 0.0 | nan | 0.0 | 0.0725 | 0.0 | 0.0 | 0.9076 | 0.9167 | 0.9397 | 0.0 | 0.0 | 0.0117 | 0.0 | nan | 0.5673 | 0.7996 | 0.0 | 0.5118 | 0.2996 | nan | 0.1588 | 0.0 | 0.0 | 0.6774 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6371 | 0.0 | 0.2584 | 0.0 | 0.0 | nan | 0.0 | 0.0693 | 0.0 | 0.0 | 0.7568 | 0.6834 | 0.8876 | 0.0 | 0.0 | 0.0112 | 0.0 |
+ | 0.6787 | 1.5 | 600 | 0.7612 | 0.2084 | 0.2588 | 0.7929 | nan | 0.7932 | 0.9457 | 0.0 | 0.6633 | 0.4719 | nan | 0.3431 | 0.0 | 0.0 | 0.9169 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8916 | 0.0 | 0.3567 | 0.0 | 0.0 | nan | 0.0 | 0.0888 | 0.0 | 0.0 | 0.9000 | 0.9375 | 0.9516 | 0.0 | 0.0 | 0.0214 | 0.0 | nan | 0.6295 | 0.8204 | 0.0 | 0.5823 | 0.3363 | nan | 0.2593 | 0.0 | 0.0 | 0.7102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6424 | 0.0 | 0.2590 | 0.0 | 0.0 | nan | 0.0 | 0.0830 | 0.0 | 0.0 | 0.7745 | 0.6670 | 0.8865 | 0.0 | 0.0 | 0.0198 | 0.0 |
+ | 1.1082 | 1.55 | 620 | 0.7512 | 0.2110 | 0.2590 | 0.7960 | nan | 0.7743 | 0.9544 | 0.0 | 0.6595 | 0.5236 | nan | 0.3373 | 0.0 | 0.0 | 0.9282 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8915 | 0.0 | 0.3487 | 0.0 | 0.0 | nan | 0.0 | 0.0861 | 0.0 | 0.0 | 0.9251 | 0.8795 | 0.9459 | 0.0 | 0.0 | 0.0328 | 0.0 | nan | 0.6434 | 0.8168 | 0.0 | 0.5709 | 0.3471 | nan | 0.2607 | 0.0 | 0.0 | 0.7052 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6444 | 0.0 | 0.2489 | 0.0 | 0.0 | nan | 0.0 | 0.0802 | 0.0 | 0.0 | 0.7869 | 0.7279 | 0.8912 | 0.0 | 0.0 | 0.0287 | 0.0 |
+ | 0.4478 | 1.6 | 640 | 0.7734 | 0.2059 | 0.2576 | 0.7835 | nan | 0.7156 | 0.9429 | 0.0 | 0.6800 | 0.4737 | nan | 0.3374 | 0.0 | 0.0 | 0.9302 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8480 | 0.0 | 0.4120 | 0.0 | 0.0 | nan | 0.0 | 0.0924 | 0.0 | 0.0 | 0.9317 | 0.8875 | 0.9497 | 0.0 | 0.0 | 0.0435 | 0.0 | nan | 0.5951 | 0.8091 | 0.0 | 0.4900 | 0.3152 | nan | 0.2683 | 0.0 | 0.0 | 0.7103 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6391 | 0.0 | 0.2632 | 0.0 | 0.0 | nan | 0.0 | 0.0853 | 0.0 | 0.0 | 0.7759 | 0.7108 | 0.8898 | 0.0 | 0.0 | 0.0372 | 0.0 |
+ | 0.4418 | 1.65 | 660 | 0.7509 | 0.2086 | 0.2561 | 0.7908 | nan | 0.7932 | 0.9408 | 0.0 | 0.6657 | 0.4764 | nan | 0.3407 | 0.0 | 0.0 | 0.8987 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9229 | 0.0 | 0.3130 | 0.0 | 0.0 | nan | 0.0 | 0.0713 | 0.0 | 0.0 | 0.9002 | 0.8990 | 0.9457 | 0.0 | 0.0 | 0.0273 | 0.0 | nan | 0.6236 | 0.8125 | 0.0 | 0.5500 | 0.3411 | nan | 0.2735 | 0.0 | 0.0 | 0.7468 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6110 | 0.0 | 0.2254 | 0.0 | 0.0 | nan | 0.0 | 0.0675 | 0.0 | 0.0 | 0.7911 | 0.7172 | 0.8902 | 0.0 | 0.0 | 0.0245 | 0.0 |
+ | 0.5509 | 1.7 | 680 | 0.7444 | 0.2086 | 0.2558 | 0.7954 | nan | 0.7851 | 0.9546 | 0.0 | 0.6568 | 0.4434 | nan | 0.3502 | 0.0 | 0.0 | 0.9222 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9121 | 0.0 | 0.2909 | 0.0 | 0.0 | nan | 0.0 | 0.0788 | 0.0 | 0.0 | 0.9158 | 0.9053 | 0.9467 | 0.0 | 0.0 | 0.0242 | 0.0 | nan | 0.6368 | 0.8195 | 0.0 | 0.5580 | 0.3363 | nan | 0.2812 | 0.0 | 0.0 | 0.7142 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6319 | 0.0 | 0.2284 | 0.0 | 0.0 | nan | 0.0 | 0.0741 | 0.0 | 0.0 | 0.7875 | 0.6937 | 0.8910 | 0.0 | 0.0 | 0.0220 | 0.0 |
+ | 0.4418 | 1.75 | 700 | 0.7429 | 0.2102 | 0.2584 | 0.7963 | nan | 0.7949 | 0.9470 | 0.0 | 0.6460 | 0.5140 | nan | 0.4080 | 0.0 | 0.0 | 0.9187 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9320 | 0.0 | 0.2289 | 0.0 | 0.0 | nan | 0.0 | 0.0749 | 0.0 | 0.0 | 0.9000 | 0.9236 | 0.9450 | 0.0 | 0.0 | 0.0354 | 0.0 | nan | 0.6329 | 0.8229 | 0.0 | 0.5702 | 0.3704 | nan | 0.3002 | 0.0 | 0.0 | 0.7263 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6290 | 0.0 | 0.1860 | 0.0 | 0.0 | nan | 0.0 | 0.0707 | 0.0 | 0.0 | 0.7934 | 0.6993 | 0.8924 | 0.0 | 0.0 | 0.0311 | 0.0 |
+ | 1.1377 | 1.8 | 720 | 0.7557 | 0.2069 | 0.2538 | 0.7947 | nan | 0.7764 | 0.9604 | 0.0 | 0.6152 | 0.4478 | nan | 0.3852 | 0.0 | 0.0 | 0.9378 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9359 | 0.0 | 0.1908 | 0.0 | 0.0 | nan | 0.0 | 0.0709 | 0.0 | 0.0 | 0.8934 | 0.9274 | 0.9453 | 0.0 | 0.0 | 0.0354 | 0.0 | nan | 0.6402 | 0.8153 | 0.0 | 0.5580 | 0.3400 | nan | 0.2904 | 0.0 | 0.0 | 0.7131 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6356 | 0.0 | 0.1647 | 0.0 | 0.0 | nan | 0.0 | 0.0675 | 0.0 | 0.0 | 0.7903 | 0.6823 | 0.8930 | 0.0 | 0.0 | 0.0310 | 0.0 |
+ | 0.6711 | 1.85 | 740 | 0.7428 | 0.2102 | 0.2585 | 0.7962 | nan | 0.8056 | 0.9498 | 0.0 | 0.6294 | 0.4972 | nan | 0.3879 | 0.0 | 0.0 | 0.9314 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9297 | 0.0 | 0.2605 | 0.0 | 0.0 | nan | 0.0 | 0.0755 | 0.0 | 0.0 | 0.8838 | 0.9305 | 0.9521 | 0.0 | 0.0 | 0.0399 | 0.0 | nan | 0.6366 | 0.8208 | 0.0 | 0.5724 | 0.3578 | nan | 0.2960 | 0.0 | 0.0 | 0.7290 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6394 | 0.0 | 0.2086 | 0.0 | 0.0 | nan | 0.0 | 0.0708 | 0.0 | 0.0 | 0.7883 | 0.6796 | 0.8913 | 0.0 | 0.0 | 0.0345 | 0.0 |
+ | 0.8084 | 1.9 | 760 | 0.7372 | 0.2089 | 0.2568 | 0.7952 | nan | 0.7489 | 0.9522 | 0.0 | 0.6611 | 0.5194 | nan | 0.3998 | 0.0 | 0.0 | 0.9210 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9349 | 0.0 | 0.2080 | 0.0 | 0.0 | nan | 0.0 | 0.0750 | 0.0 | 0.0 | 0.9224 | 0.8935 | 0.9472 | 0.0 | 0.0 | 0.0341 | 0.0 | nan | 0.6278 | 0.8211 | 0.0 | 0.5544 | 0.3528 | nan | 0.3032 | 0.0 | 0.0 | 0.7320 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6346 | 0.0 | 0.1770 | 0.0 | 0.0 | nan | 0.0 | 0.0703 | 0.0 | 0.0 | 0.7895 | 0.6990 | 0.8923 | 0.0 | 0.0 | 0.0302 | 0.0 |
+ | 0.4363 | 1.95 | 780 | 0.7384 | 0.2087 | 0.2566 | 0.7950 | nan | 0.7507 | 0.9530 | 0.0 | 0.6661 | 0.4853 | nan | 0.4066 | 0.0 | 0.0 | 0.9299 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9426 | 0.0 | 0.2098 | 0.0 | 0.0 | nan | 0.0 | 0.0822 | 0.0 | 0.0 | 0.9113 | 0.9011 | 0.9429 | 0.0 | 0.0 | 0.0301 | 0.0 | nan | 0.6230 | 0.8204 | 0.0 | 0.5522 | 0.3507 | nan | 0.3033 | 0.0 | 0.0 | 0.7166 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6316 | 0.0 | 0.1761 | 0.0 | 0.0 | nan | 0.0 | 0.0764 | 0.0 | 0.0 | 0.7975 | 0.7106 | 0.8946 | 0.0 | 0.0 | 0.0269 | 0.0 |
+ | 0.8956 | 2.0 | 800 | 0.7351 | 0.2097 | 0.2588 | 0.7962 | nan | 0.7781 | 0.9422 | 0.0 | 0.6678 | 0.5103 | nan | 0.3970 | 0.0 | 0.0 | 0.9361 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9283 | 0.0 | 0.2268 | 0.0000 | 0.0 | nan | 0.0 | 0.0873 | 0.0 | 0.0 | 0.9230 | 0.9084 | 0.9461 | 0.0 | 0.0 | 0.0306 | 0.0 | nan | 0.6262 | 0.8234 | 0.0 | 0.5624 | 0.3620 | nan | 0.3029 | 0.0 | 0.0 | 0.7060 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6436 | 0.0 | 0.1895 | 0.0000 | 0.0 | nan | 0.0 | 0.0808 | 0.0 | 0.0 | 0.7906 | 0.7024 | 0.8933 | 0.0 | 0.0 | 0.0273 | 0.0 |
+
+
+ ### Framework versions
+
+ - Transformers 4.46.2
+ - PyTorch 2.5.1+cu121
+ - Datasets 3.1.0
+ - Tokenizers 0.20.3
config.json ADDED
@@ -0,0 +1,144 @@
+ {
+ "_name_or_path": "nvidia/mit-b0",
+ "architectures": [
+ "SegformerForSemanticSegmentation"
+ ],
+ "attention_probs_dropout_prob": 0.0,
+ "classifier_dropout_prob": 0.1,
+ "decoder_hidden_size": 256,
+ "depths": [
+ 2,
+ 2,
+ 2,
+ 2
+ ],
+ "downsampling_rates": [
+ 1,
+ 4,
+ 8,
+ 16
+ ],
+ "drop_path_rate": 0.1,
+ "hidden_act": "gelu",
+ "hidden_dropout_prob": 0.0,
+ "hidden_sizes": [
+ 32,
+ 64,
+ 160,
+ 256
+ ],
+ "id2label": {
+ "0": "unlabeled",
+ "1": "flat-road",
+ "2": "flat-sidewalk",
+ "3": "flat-crosswalk",
+ "4": "flat-cyclinglane",
+ "5": "flat-parkingdriveway",
+ "6": "flat-railtrack",
+ "7": "flat-curb",
+ "8": "human-person",
+ "9": "human-rider",
+ "10": "vehicle-car",
+ "11": "vehicle-truck",
+ "12": "vehicle-bus",
+ "13": "vehicle-tramtrain",
+ "14": "vehicle-motorcycle",
+ "15": "vehicle-bicycle",
+ "16": "vehicle-caravan",
+ "17": "vehicle-cartrailer",
+ "18": "construction-building",
+ "19": "construction-door",
+ "20": "construction-wall",
+ "21": "construction-fenceguardrail",
+ "22": "construction-bridge",
+ "23": "construction-tunnel",
+ "24": "construction-stairs",
+ "25": "object-pole",
+ "26": "object-trafficsign",
+ "27": "object-trafficlight",
+ "28": "nature-vegetation",
+ "29": "nature-terrain",
+ "30": "sky",
+ "31": "void-ground",
+ "32": "void-dynamic",
+ "33": "void-static",
+ "34": "void-unclear"
+ },
+ "image_size": 224,
+ "initializer_range": 0.02,
+ "label2id": {
+ "construction-bridge": 22,
+ "construction-building": 18,
+ "construction-door": 19,
+ "construction-fenceguardrail": 21,
+ "construction-stairs": 24,
+ "construction-tunnel": 23,
+ "construction-wall": 20,
+ "flat-crosswalk": 3,
+ "flat-curb": 7,
+ "flat-cyclinglane": 4,
+ "flat-parkingdriveway": 5,
+ "flat-railtrack": 6,
+ "flat-road": 1,
+ "flat-sidewalk": 2,
+ "human-person": 8,
+ "human-rider": 9,
+ "nature-terrain": 29,
+ "nature-vegetation": 28,
+ "object-pole": 25,
+ "object-trafficlight": 27,
+ "object-trafficsign": 26,
+ "sky": 30,
+ "unlabeled": 0,
+ "vehicle-bicycle": 15,
+ "vehicle-bus": 12,
+ "vehicle-car": 10,
+ "vehicle-caravan": 16,
+ "vehicle-cartrailer": 17,
+ "vehicle-motorcycle": 14,
+ "vehicle-tramtrain": 13,
+ "vehicle-truck": 11,
+ "void-dynamic": 32,
+ "void-ground": 31,
+ "void-static": 33,
+ "void-unclear": 34
+ },
+ "layer_norm_eps": 1e-06,
+ "mlp_ratios": [
+ 4,
+ 4,
+ 4,
+ 4
+ ],
+ "model_type": "segformer",
+ "num_attention_heads": [
+ 1,
+ 2,
+ 5,
+ 8
+ ],
+ "num_channels": 3,
+ "num_encoder_blocks": 4,
+ "patch_sizes": [
+ 7,
+ 3,
+ 3,
+ 3
+ ],
+ "reshape_last_stage": true,
+ "semantic_loss_ignore_index": 255,
+ "sr_ratios": [
+ 8,
+ 4,
+ 2,
+ 1
+ ],
+ "strides": [
+ 4,
+ 2,
+ 2,
+ 2
+ ],
+ "torch_dtype": "float32",
+ "transformers_version": "4.46.2"
+ }
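For reference, a small sketch of how this configuration behaves once loaded with 🤗 Transformers; it assumes the JSON above has been saved locally as `config.json`.

```python
# Assumes the config above is saved as ./config.json in the current directory.
from transformers import SegformerConfig

config = SegformerConfig.from_pretrained(".")
print(config.num_labels)       # 35, derived from id2label
print(config.id2label[2])      # "flat-sidewalk"
print(config.label2id["sky"])  # 30
```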
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b3f0fd212fc1e15bcde393f5e4b77219c04083d2b7415316729f3af78aa611bc
+ size 14918708
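The three `+` lines above are a Git LFS pointer, not the weights themselves; the actual safetensors file (about 15 MB, per the `size` field) lives on the Hub's LFS storage. A minimal sketch of fetching it with `huggingface_hub`, reusing the assumed repo id from the inference sketch:

```python
# Downloads the real model.safetensors that the LFS pointer above refers to (repo id is assumed).
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="sumeyya/segformer-b0-finetuned-segments-sidewalk-oct-22",
    filename="model.safetensors",
)
print(path)  # local cache path; the file size should match the 14918708 bytes in the pointer
```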
runs/Dec05_02-26-43_1e955daa0c6f/events.out.tfevents.1733365690.1e955daa0c6f.282.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c90ee995988a31b0c275c9e34df83510865137e3707e7b25de41a28b9db50e4e
+ size 1153880
runs/Dec05_04-25-27_1e955daa0c6f/events.out.tfevents.1733372747.1e955daa0c6f.29407.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:aa2ae34a3be71cde0184262b715548248ac7b8718b34b1b1d3e5659619054c91
+ size 375615
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:dbf7e5f85e34453b980272a2efe8b502fd06fe82e6b0655b04082f6f5fdc6845
+ size 5368