unreal-hug committed on
Commit
1a45394
1 Parent(s): 110e26b

End of training

This view is limited to 50 files because the commit contains too many changes. See the raw diff for the full list.
Files changed (50)
  1. README.md +282 -198
  2. config.json +98 -0
  3. model.safetensors +3 -0
  4. tmp-checkpoint-100/config.json +98 -0
  5. tmp-checkpoint-100/model.safetensors +3 -0
  6. tmp-checkpoint-100/optimizer.pt +3 -0
  7. tmp-checkpoint-100/rng_state.pth +3 -0
  8. tmp-checkpoint-100/scheduler.pt +3 -0
  9. tmp-checkpoint-100/trainer_state.json +796 -0
  10. tmp-checkpoint-100/training_args.bin +3 -0
  11. tmp-checkpoint-1000/config.json +98 -0
  12. tmp-checkpoint-1000/model.safetensors +3 -0
  13. tmp-checkpoint-1000/optimizer.pt +3 -0
  14. tmp-checkpoint-1000/rng_state.pth +3 -0
  15. tmp-checkpoint-1000/scheduler.pt +3 -0
  16. tmp-checkpoint-1000/trainer_state.json +0 -0
  17. tmp-checkpoint-1000/training_args.bin +3 -0
  18. tmp-checkpoint-1020/config.json +98 -0
  19. tmp-checkpoint-1020/model.safetensors +3 -0
  20. tmp-checkpoint-1020/optimizer.pt +3 -0
  21. tmp-checkpoint-1020/rng_state.pth +3 -0
  22. tmp-checkpoint-1020/scheduler.pt +3 -0
  23. tmp-checkpoint-1020/trainer_state.json +0 -0
  24. tmp-checkpoint-1020/training_args.bin +3 -0
  25. tmp-checkpoint-1040/config.json +98 -0
  26. tmp-checkpoint-1040/model.safetensors +3 -0
  27. tmp-checkpoint-1040/optimizer.pt +3 -0
  28. tmp-checkpoint-1040/rng_state.pth +3 -0
  29. tmp-checkpoint-1040/scheduler.pt +3 -0
  30. tmp-checkpoint-1040/trainer_state.json +0 -0
  31. tmp-checkpoint-1040/training_args.bin +3 -0
  32. tmp-checkpoint-1060/config.json +98 -0
  33. tmp-checkpoint-1060/model.safetensors +3 -0
  34. tmp-checkpoint-1060/optimizer.pt +3 -0
  35. tmp-checkpoint-1060/rng_state.pth +3 -0
  36. tmp-checkpoint-1060/scheduler.pt +3 -0
  37. tmp-checkpoint-1060/trainer_state.json +0 -0
  38. tmp-checkpoint-1060/training_args.bin +3 -0
  39. tmp-checkpoint-1080/config.json +98 -0
  40. tmp-checkpoint-1080/model.safetensors +3 -0
  41. tmp-checkpoint-1080/optimizer.pt +3 -0
  42. tmp-checkpoint-1080/rng_state.pth +3 -0
  43. tmp-checkpoint-1080/scheduler.pt +3 -0
  44. tmp-checkpoint-1080/trainer_state.json +0 -0
  45. tmp-checkpoint-1080/training_args.bin +3 -0
  46. tmp-checkpoint-1100/config.json +98 -0
  47. tmp-checkpoint-1100/model.safetensors +3 -0
  48. tmp-checkpoint-1100/optimizer.pt +3 -0
  49. tmp-checkpoint-1100/rng_state.pth +3 -0
  50. tmp-checkpoint-1100/scheduler.pt +3 -0
README.md CHANGED
@@ -1,201 +1,285 @@
  ---
- library_name: transformers
- tags: []
  ---

- # Model Card for Model ID
-
- <!-- Provide a quick summary of what the model is/does. -->
-
-
-
- ## Model Details
-
- ### Model Description
-
- <!-- Provide a longer summary of what this model is. -->
-
- This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
-
- - **Developed by:** [More Information Needed]
- - **Funded by [optional]:** [More Information Needed]
- - **Shared by [optional]:** [More Information Needed]
- - **Model type:** [More Information Needed]
- - **Language(s) (NLP):** [More Information Needed]
- - **License:** [More Information Needed]
- - **Finetuned from model [optional]:** [More Information Needed]
-
- ### Model Sources [optional]
-
- <!-- Provide the basic links for the model. -->
-
- - **Repository:** [More Information Needed]
- - **Paper [optional]:** [More Information Needed]
- - **Demo [optional]:** [More Information Needed]
-
- ## Uses
-
- <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
-
- ### Direct Use
-
- <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
-
- [More Information Needed]
-
- ### Downstream Use [optional]
-
- <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
-
- [More Information Needed]
-
- ### Out-of-Scope Use
-
- <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
-
- [More Information Needed]
-
- ## Bias, Risks, and Limitations
-
- <!-- This section is meant to convey both technical and sociotechnical limitations. -->
-
- [More Information Needed]
-
- ### Recommendations
-
- <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
-
- Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
-
- ## How to Get Started with the Model
-
- Use the code below to get started with the model.
-
- [More Information Needed]
-
- ## Training Details
-
- ### Training Data
-
- <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
-
- [More Information Needed]
-
- ### Training Procedure
-
- <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
-
- #### Preprocessing [optional]
-
- [More Information Needed]
-
-
- #### Training Hyperparameters
-
- - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
-
- #### Speeds, Sizes, Times [optional]
-
- <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
-
- [More Information Needed]
-
- ## Evaluation
-
- <!-- This section describes the evaluation protocols and provides the results. -->
-
- ### Testing Data, Factors & Metrics
-
- #### Testing Data
-
- <!-- This should link to a Dataset Card if possible. -->
-
- [More Information Needed]
-
- #### Factors
-
- <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
-
- [More Information Needed]
-
- #### Metrics
-
- <!-- These are the evaluation metrics being used, ideally with a description of why. -->
-
- [More Information Needed]
-
- ### Results
-
- [More Information Needed]
-
- #### Summary
-
-
-
- ## Model Examination [optional]
-
- <!-- Relevant interpretability work for the model goes here -->
-
- [More Information Needed]
-
- ## Environmental Impact
-
- <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
-
- Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
-
- - **Hardware Type:** [More Information Needed]
- - **Hours used:** [More Information Needed]
- - **Cloud Provider:** [More Information Needed]
- - **Compute Region:** [More Information Needed]
- - **Carbon Emitted:** [More Information Needed]
-
- ## Technical Specifications [optional]
-
- ### Model Architecture and Objective
-
- [More Information Needed]
-
- ### Compute Infrastructure
-
- [More Information Needed]
-
- #### Hardware
-
- [More Information Needed]
-
- #### Software
-
- [More Information Needed]
-
- ## Citation [optional]
-
- <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
-
- **BibTeX:**
-
- [More Information Needed]
-
- **APA:**
-
- [More Information Needed]
-
- ## Glossary [optional]
-
- <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
-
- [More Information Needed]
-
- ## More Information [optional]
-
- [More Information Needed]
-
- ## Model Card Authors [optional]
-
- [More Information Needed]
-
- ## Model Card Contact
-
- [More Information Needed]
-
-
  ---
+ license: other
+ base_model: nvidia/mit-b0
+ tags:
+ - vision
+ - image-segmentation
+ - generated_from_trainer
+ model-index:
+ - name: segformer-b0-finetuned-segments-ECHO-jan-25-v2
+   results: []
  ---

+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # segformer-b0-finetuned-segments-ECHO-jan-25-v2
+
+ This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the unreal-hug/REAL_DATASET_SEG_401_6_lbls dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.4155
+ - Mean Iou: 0.3349
+ - Mean Accuracy: 0.3935
+ - Overall Accuracy: 0.5591
+ - Accuracy Unlabeled: nan
+ - Accuracy Lv: 0.6815
+ - Accuracy Rv: 0.3865
+ - Accuracy Ra: 0.5805
+ - Accuracy La: 0.6544
+ - Accuracy Vs: 0.1155
+ - Accuracy As: nan
+ - Accuracy Mk: 0.0497
+ - Accuracy Tk: nan
+ - Accuracy Asd: 0.2779
+ - Accuracy Vsd: 0.3995
+ - Accuracy Ak: 0.3959
+ - Iou Unlabeled: 0.0
+ - Iou Lv: 0.6626
+ - Iou Rv: 0.3764
+ - Iou Ra: 0.5699
+ - Iou La: 0.6056
+ - Iou Vs: 0.1108
+ - Iou As: nan
+ - Iou Mk: 0.0485
+ - Iou Tk: nan
+ - Iou Asd: 0.2565
+ - Iou Vsd: 0.3465
+ - Iou Ak: 0.3718
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 0.0001
+ - train_batch_size: 2
+ - eval_batch_size: 2
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 25
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Lv | Accuracy Rv | Accuracy Ra | Accuracy La | Accuracy Vs | Accuracy As | Accuracy Mk | Accuracy Tk | Accuracy Asd | Accuracy Vsd | Accuracy Ak | Iou Unlabeled | Iou Lv | Iou Rv | Iou Ra | Iou La | Iou Vs | Iou As | Iou Mk | Iou Tk | Iou Asd | Iou Vsd | Iou Ak |
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:------------:|:------------:|:-----------:|:-------------:|:------:|:------:|:------:|:------:|:------:|:------:|:------:|:------:|:-------:|:-------:|:------:|
+ | 2.0322 | 0.12 | 20 | 2.2124 | 0.0954 | 0.1885 | 0.3033 | nan | 0.3903 | 0.4680 | 0.0850 | 0.0173 | 0.0 | nan | 0.0011 | nan | 0.0 | 0.1087 | 0.6263 | 0.0 | 0.2970 | 0.2085 | 0.0782 | 0.0172 | 0.0 | nan | 0.0011 | 0.0 | 0.0 | 0.0823 | 0.3647 |
+ | 1.6027 | 0.25 | 40 | 1.5649 | 0.0789 | 0.1168 | 0.2640 | nan | 0.5149 | 0.0061 | 0.0264 | 0.0839 | 0.0 | nan | 0.0 | nan | 0.0001 | 0.0014 | 0.4180 | 0.0 | 0.3418 | 0.0061 | 0.0262 | 0.0787 | 0.0 | nan | 0.0 | nan | 0.0001 | 0.0014 | 0.3342 |
+ | 1.2877 | 0.38 | 60 | 1.2616 | 0.0943 | 0.1296 | 0.2685 | nan | 0.4665 | 0.0053 | 0.0547 | 0.2421 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0050 | 0.3930 | 0.0 | 0.3612 | 0.0053 | 0.0529 | 0.1877 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0050 | 0.3312 |
+ | 1.0981 | 0.5 | 80 | 1.2208 | 0.0967 | 0.1552 | 0.3898 | nan | 0.8151 | 0.0079 | 0.0082 | 0.0794 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.4863 | 0.0 | 0.4737 | 0.0079 | 0.0082 | 0.0750 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.4020 |
+ | 1.0235 | 0.62 | 100 | 0.9343 | 0.1218 | 0.1888 | 0.4419 | nan | 0.8508 | 0.0102 | 0.0423 | 0.3015 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.4947 | 0.0 | 0.5319 | 0.0101 | 0.0418 | 0.2283 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.4059 |
+ | 0.8977 | 0.75 | 120 | 0.7806 | 0.1592 | 0.2227 | 0.4764 | nan | 0.8124 | 0.1787 | 0.1188 | 0.4178 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.4763 | 0.0 | 0.6151 | 0.1741 | 0.1124 | 0.2995 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.3906 |
+ | 0.6932 | 0.88 | 140 | 0.6246 | 0.1262 | 0.1590 | 0.3766 | nan | 0.6794 | 0.2019 | 0.1415 | 0.2810 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.1276 | 0.0 | 0.5674 | 0.1941 | 0.1372 | 0.2414 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.1217 |
+ | 0.6168 | 1.0 | 160 | 0.6124 | 0.1752 | 0.2277 | 0.4717 | nan | 0.7500 | 0.3261 | 0.1491 | 0.4375 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.3864 | 0.0 | 0.6106 | 0.2973 | 0.1475 | 0.3522 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.3439 |
+ | 0.5758 | 1.12 | 180 | 0.5658 | 0.2037 | 0.2520 | 0.4750 | nan | 0.6646 | 0.3955 | 0.3596 | 0.4433 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.4048 | 0.0 | 0.6048 | 0.3705 | 0.3133 | 0.3865 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.3623 |
+ | 0.5081 | 1.25 | 200 | 0.5116 | 0.2316 | 0.2993 | 0.5280 | nan | 0.6460 | 0.4867 | 0.4741 | 0.6477 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.4396 | 0.0 | 0.6098 | 0.4523 | 0.3961 | 0.4611 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.3965 |
+ | 0.6351 | 1.38 | 220 | 0.4879 | 0.1127 | 0.1324 | 0.2609 | nan | 0.3749 | 0.0902 | 0.2601 | 0.3883 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.0783 | 0.0 | 0.3623 | 0.0897 | 0.2510 | 0.3466 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.0774 |
+ | 0.6241 | 1.5 | 240 | 0.4593 | 0.2439 | 0.3090 | 0.5686 | nan | 0.7439 | 0.4492 | 0.5367 | 0.6916 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.3597 | 0.0 | 0.6995 | 0.4322 | 0.4400 | 0.5265 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.3410 |
+ | 0.4315 | 1.62 | 260 | 0.4082 | 0.2175 | 0.2611 | 0.4948 | nan | 0.6811 | 0.3535 | 0.4253 | 0.5871 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.3025 | 0.0 | 0.6398 | 0.3459 | 0.3952 | 0.5052 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.2886 |
+ | 0.5236 | 1.75 | 280 | 0.4651 | 0.1063 | 0.1353 | 0.2191 | nan | 0.2161 | 0.0885 | 0.3687 | 0.4434 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.1015 | 0.0 | 0.2138 | 0.0884 | 0.3282 | 0.3313 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0 | 0.1012 |
+ | 0.3688 | 1.88 | 300 | 0.4279 | 0.2796 | 0.3459 | 0.6382 | nan | 0.8529 | 0.5705 | 0.5493 | 0.6449 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0051 | 0.4903 | 0.0 | 0.7546 | 0.5277 | 0.5044 | 0.5537 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0051 | 0.4500 |
+ | 0.3659 | 2.0 | 320 | 0.3907 | 0.1881 | 0.2192 | 0.4461 | nan | 0.7156 | 0.1476 | 0.3144 | 0.4135 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0006 | 0.3810 | 0.0 | 0.6851 | 0.1461 | 0.3012 | 0.3919 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0006 | 0.3560 |
+ | 0.3243 | 2.12 | 340 | 0.3846 | 0.2737 | 0.3272 | 0.5846 | nan | 0.7313 | 0.4747 | 0.6435 | 0.7038 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0010 | 0.3904 | 0.0 | 0.7045 | 0.4610 | 0.5733 | 0.6223 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0010 | 0.3752 |
+ | 0.4169 | 2.25 | 360 | 0.4099 | 0.1292 | 0.1475 | 0.2563 | nan | 0.3286 | 0.0968 | 0.3184 | 0.3088 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0005 | 0.2741 | 0.0 | 0.3241 | 0.0965 | 0.3061 | 0.2960 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0005 | 0.2685 |
+ | 0.2951 | 2.38 | 380 | 0.3583 | 0.2277 | 0.2701 | 0.4962 | nan | 0.6695 | 0.2136 | 0.5730 | 0.6784 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0442 | 0.2519 | 0.0 | 0.6409 | 0.2125 | 0.5347 | 0.5967 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0435 | 0.2488 |
+ | 0.3847 | 2.5 | 400 | 0.3565 | 0.2410 | 0.2843 | 0.5032 | nan | 0.6544 | 0.3067 | 0.5888 | 0.6409 | 0.0 | nan | 0.0 | nan | 0.0 | 0.1089 | 0.2594 | 0.0 | 0.6304 | 0.3023 | 0.5347 | 0.5853 | 0.0 | nan | 0.0 | nan | 0.0 | 0.1033 | 0.2535 |
+ | 0.339 | 2.62 | 420 | 0.3715 | 0.3085 | 0.3697 | 0.6227 | nan | 0.7530 | 0.5620 | 0.6411 | 0.6900 | 0.0 | nan | 0.0 | nan | 0.0015 | 0.1337 | 0.5460 | 0.0 | 0.7083 | 0.5347 | 0.5722 | 0.6160 | 0.0 | nan | 0.0 | nan | 0.0015 | 0.1261 | 0.5260 |
+ | 0.7318 | 2.75 | 440 | 0.3574 | 0.2478 | 0.2950 | 0.4525 | nan | 0.5247 | 0.2338 | 0.5171 | 0.6926 | 0.0 | nan | 0.0 | nan | 0.0097 | 0.3424 | 0.3350 | 0.0 | 0.5100 | 0.2322 | 0.4803 | 0.6174 | 0.0 | nan | 0.0 | nan | 0.0097 | 0.3048 | 0.3235 |
+ | 0.2905 | 2.88 | 460 | 0.3609 | 0.1903 | 0.2262 | 0.3935 | nan | 0.4734 | 0.1841 | 0.5925 | 0.5863 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0201 | 0.1799 | 0.0 | 0.4671 | 0.1834 | 0.5348 | 0.5192 | 0.0 | nan | 0.0 | nan | 0.0 | 0.0199 | 0.1786 |
+ | 0.3793 | 3.0 | 480 | 0.3452 | 0.2734 | 0.3213 | 0.5973 | nan | 0.8327 | 0.4635 | 0.5374 | 0.6168 | 0.0 | nan | 0.0 | nan | 0.0263 | 0.0746 | 0.3404 | 0.0 | 0.7723 | 0.4512 | 0.5139 | 0.5700 | 0.0 | nan | 0.0 | nan | 0.0260 | 0.0734 | 0.3270 |
+ | 0.3922 | 3.12 | 500 | 0.3695 | 0.2151 | 0.2604 | 0.3659 | nan | 0.2751 | 0.2847 | 0.6318 | 0.7206 | 0.0 | nan | 0.0 | nan | 0.0045 | 0.1409 | 0.2863 | 0.0 | 0.2726 | 0.2824 | 0.5652 | 0.6103 | 0.0 | nan | 0.0 | nan | 0.0045 | 0.1339 | 0.2824 |
+ | 0.3299 | 3.25 | 520 | 0.3326 | 0.3063 | 0.3610 | 0.6202 | nan | 0.8029 | 0.5001 | 0.5866 | 0.6558 | 0.0 | nan | 0.0 | nan | 0.0422 | 0.1575 | 0.5038 | 0.0 | 0.7639 | 0.4932 | 0.5461 | 0.5948 | 0.0 | nan | 0.0 | nan | 0.0416 | 0.1472 | 0.4762 |
+ | 0.2547 | 3.38 | 540 | 0.3323 | 0.2650 | 0.3121 | 0.5052 | nan | 0.6042 | 0.4311 | 0.6044 | 0.5282 | 0.0 | nan | 0.0 | nan | 0.0813 | 0.1438 | 0.4164 | 0.0 | 0.5882 | 0.4189 | 0.5254 | 0.5006 | 0.0 | nan | 0.0 | nan | 0.0801 | 0.1359 | 0.4010 |
+ | 0.2154 | 3.5 | 560 | 0.3211 | 0.2903 | 0.3397 | 0.5796 | nan | 0.7327 | 0.4341 | 0.6265 | 0.6269 | 0.0 | nan | 0.0 | nan | 0.0340 | 0.1079 | 0.4955 | 0.0 | 0.7034 | 0.4304 | 0.5828 | 0.5874 | 0.0 | nan | 0.0 | nan | 0.0337 | 0.1022 | 0.4634 |
+ | 0.3146 | 3.62 | 580 | 0.3642 | 0.3096 | 0.3854 | 0.5967 | nan | 0.6732 | 0.4518 | 0.7254 | 0.8100 | 0.0 | nan | 0.0 | nan | 0.1293 | 0.2673 | 0.4116 | 0.0 | 0.6557 | 0.4444 | 0.5843 | 0.6517 | 0.0 | nan | 0.0 | nan | 0.1212 | 0.2434 | 0.3957 |
+ | 0.2216 | 3.75 | 600 | 0.3178 | 0.3241 | 0.3818 | 0.5998 | nan | 0.7614 | 0.4294 | 0.5415 | 0.7168 | 0.0 | nan | 0.0 | nan | 0.1378 | 0.4248 | 0.4242 | 0.0 | 0.7254 | 0.4212 | 0.5274 | 0.6520 | 0.0 | nan | 0.0 | nan | 0.1338 | 0.3687 | 0.4125 |
+ | 0.2973 | 3.88 | 620 | 0.3199 | 0.3486 | 0.4127 | 0.6217 | nan | 0.7369 | 0.5178 | 0.5558 | 0.7739 | 0.0 | nan | 0.0 | nan | 0.1965 | 0.4456 | 0.4876 | 0.0 | 0.7072 | 0.4974 | 0.5407 | 0.7010 | 0.0 | nan | 0.0 | nan | 0.1859 | 0.3845 | 0.4692 |
+ | 0.2434 | 4.0 | 640 | 0.3179 | 0.3415 | 0.4057 | 0.6154 | nan | 0.7161 | 0.4582 | 0.6827 | 0.7445 | 0.0 | nan | 0.0 | nan | 0.1295 | 0.3827 | 0.5376 | 0.0 | 0.6869 | 0.4483 | 0.6280 | 0.6769 | 0.0 | nan | 0.0 | nan | 0.1254 | 0.3360 | 0.5134 |
+ | 0.2283 | 4.12 | 660 | 0.3310 | 0.2584 | 0.3073 | 0.5076 | nan | 0.6237 | 0.2267 | 0.6673 | 0.7014 | 0.0 | nan | 0.0 | nan | 0.0718 | 0.1288 | 0.3464 | 0.0 | 0.6078 | 0.2260 | 0.5912 | 0.6270 | 0.0 | nan | 0.0 | nan | 0.0707 | 0.1212 | 0.3401 |
+ | 0.6263 | 4.25 | 680 | 0.3153 | 0.2947 | 0.3436 | 0.5198 | nan | 0.6461 | 0.2824 | 0.5694 | 0.6236 | 0.0 | nan | 0.0 | nan | 0.1512 | 0.3950 | 0.4248 | 0.0 | 0.6244 | 0.2811 | 0.5498 | 0.5852 | 0.0 | nan | 0.0 | nan | 0.1449 | 0.3479 | 0.4140 |
+ | 0.1708 | 4.38 | 700 | 0.3173 | 0.2957 | 0.3435 | 0.5834 | nan | 0.7531 | 0.3902 | 0.5853 | 0.7153 | 0.0 | nan | 0.0 | nan | 0.1331 | 0.1239 | 0.3908 | 0.0 | 0.7188 | 0.3869 | 0.5656 | 0.6612 | 0.0 | nan | 0.0 | nan | 0.1298 | 0.1159 | 0.3788 |
+ | 0.246 | 4.5 | 720 | 0.3138 | 0.2570 | 0.2955 | 0.5052 | nan | 0.6686 | 0.3447 | 0.4552 | 0.5989 | 0.0 | nan | 0.0 | nan | 0.1237 | 0.1230 | 0.3459 | 0.0 | 0.6469 | 0.3416 | 0.4442 | 0.5640 | 0.0 | nan | 0.0 | nan | 0.1213 | 0.1159 | 0.3366 |
+ | 0.3876 | 4.62 | 740 | 0.3084 | 0.3646 | 0.4336 | 0.6321 | nan | 0.7367 | 0.4776 | 0.6536 | 0.7478 | 0.0 | nan | 0.0 | nan | 0.2351 | 0.4973 | 0.5539 | 0.0 | 0.7108 | 0.4700 | 0.6166 | 0.6824 | 0.0 | nan | 0.0 | nan | 0.2208 | 0.4179 | 0.5274 |
+ | 0.4766 | 4.75 | 760 | 0.3251 | 0.2509 | 0.2894 | 0.4716 | nan | 0.6095 | 0.3498 | 0.4348 | 0.4989 | 0.0 | nan | 0.0 | nan | 0.1119 | 0.2020 | 0.3972 | 0.0 | 0.5954 | 0.3434 | 0.4157 | 0.4754 | 0.0 | nan | 0.0 | nan | 0.1092 | 0.1836 | 0.3865 |
+ | 0.4431 | 4.88 | 780 | 0.3052 | 0.3104 | 0.3654 | 0.5781 | nan | 0.7024 | 0.4862 | 0.5150 | 0.7363 | 0.0 | nan | 0.0 | nan | 0.2163 | 0.2456 | 0.3866 | 0.0 | 0.6801 | 0.4736 | 0.5039 | 0.6461 | 0.0 | nan | 0.0 | nan | 0.2021 | 0.2209 | 0.3771 |
+ | 0.2319 | 5.0 | 800 | 0.3104 | 0.3316 | 0.3938 | 0.5875 | nan | 0.6790 | 0.5433 | 0.6711 | 0.6572 | 0.0 | nan | 0.0 | nan | 0.2908 | 0.3349 | 0.3675 | 0.0 | 0.6628 | 0.5265 | 0.5922 | 0.6113 | 0.0 | nan | 0.0 | nan | 0.2656 | 0.2987 | 0.3585 |
+ | 0.3361 | 5.12 | 820 | 0.3125 | 0.3219 | 0.3808 | 0.5905 | nan | 0.7234 | 0.3480 | 0.6026 | 0.7698 | 0.0 | nan | 0.0 | nan | 0.1558 | 0.3461 | 0.4818 | 0.0 | 0.7028 | 0.3453 | 0.5677 | 0.6877 | 0.0 | nan | 0.0 | nan | 0.1498 | 0.3077 | 0.4584 |
+ | 0.412 | 5.25 | 840 | 0.3477 | 0.2427 | 0.2810 | 0.4099 | nan | 0.3970 | 0.2768 | 0.5965 | 0.6918 | 0.0 | nan | 0.0 | nan | 0.1238 | 0.1614 | 0.2814 | 0.0 | 0.3899 | 0.2752 | 0.5703 | 0.6446 | 0.0 | nan | 0.0 | nan | 0.1208 | 0.1494 | 0.2767 |
+ | 0.1799 | 5.38 | 860 | 0.3132 | 0.3444 | 0.4035 | 0.6161 | nan | 0.7651 | 0.4804 | 0.6609 | 0.5953 | 0.0 | nan | 0.0 | nan | 0.2167 | 0.3901 | 0.5227 | 0.0 | 0.7389 | 0.4759 | 0.6240 | 0.5643 | 0.0 | nan | 0.0 | nan | 0.2033 | 0.3414 | 0.4965 |
+ | 0.1716 | 5.5 | 880 | 0.3186 | 0.2805 | 0.3289 | 0.4955 | nan | 0.5685 | 0.3185 | 0.5785 | 0.6888 | 0.0 | nan | 0.0 | nan | 0.1934 | 0.2548 | 0.3577 | 0.0 | 0.5592 | 0.3165 | 0.5487 | 0.6271 | 0.0 | nan | 0.0 | nan | 0.1798 | 0.2295 | 0.3441 |
+ | 0.4388 | 5.62 | 900 | 0.3171 | 0.4098 | 0.4914 | 0.7348 | nan | 0.8788 | 0.6109 | 0.7475 | 0.7979 | 0.0 | nan | 0.0 | nan | 0.2607 | 0.5234 | 0.6032 | 0.0 | 0.8320 | 0.5956 | 0.6842 | 0.7286 | 0.0 | nan | 0.0 | nan | 0.2445 | 0.4481 | 0.5648 |
+ | 0.2632 | 5.75 | 920 | 0.3163 | 0.2697 | 0.3130 | 0.4551 | nan | 0.5145 | 0.3445 | 0.4613 | 0.6042 | 0.0 | nan | 0.0 | nan | 0.1180 | 0.3896 | 0.3846 | 0.0 | 0.5045 | 0.3404 | 0.4568 | 0.5718 | 0.0 | nan | 0.0 | nan | 0.1141 | 0.3416 | 0.3677 |
+ | 0.3024 | 5.88 | 940 | 0.3063 | 0.3883 | 0.4645 | 0.6758 | nan | 0.7778 | 0.6624 | 0.7137 | 0.7013 | 0.0 | nan | 0.0 | nan | 0.3253 | 0.4950 | 0.5054 | 0.0 | 0.7479 | 0.6323 | 0.6434 | 0.6506 | 0.0 | nan | 0.0 | nan | 0.2919 | 0.4251 | 0.4919 |
+ | 0.2551 | 6.0 | 960 | 0.3489 | 0.2261 | 0.2625 | 0.4024 | nan | 0.5011 | 0.1084 | 0.4074 | 0.6709 | 0.0 | nan | 0.0 | nan | 0.1666 | 0.2592 | 0.2489 | 0.0 | 0.4903 | 0.1081 | 0.4042 | 0.6181 | 0.0 | nan | 0.0 | nan | 0.1562 | 0.2369 | 0.2470 |
+ | 0.3281 | 6.12 | 980 | 0.2939 | 0.3635 | 0.4275 | 0.6116 | nan | 0.6803 | 0.5665 | 0.6418 | 0.6806 | 0.0 | nan | 0.0 | nan | 0.2545 | 0.4660 | 0.5579 | 0.0 | 0.6659 | 0.5508 | 0.6049 | 0.6359 | 0.0 | nan | 0.0 | nan | 0.2372 | 0.4071 | 0.5330 |
+ | 0.1372 | 6.25 | 1000 | 0.2998 | 0.3755 | 0.4413 | 0.6450 | nan | 0.7530 | 0.5417 | 0.6673 | 0.7023 | 0.0 | nan | 0.0 | nan | 0.2979 | 0.4450 | 0.5648 | 0.0 | 0.7287 | 0.5300 | 0.6359 | 0.6582 | 0.0 | nan | 0.0 | nan | 0.2737 | 0.3899 | 0.5389 |
+ | 0.3485 | 6.38 | 1020 | 0.3398 | 0.2557 | 0.2941 | 0.4515 | nan | 0.5305 | 0.2957 | 0.5158 | 0.6263 | 0.0 | nan | 0.0 | nan | 0.1937 | 0.2043 | 0.2811 | 0.0 | 0.5206 | 0.2931 | 0.5070 | 0.5912 | 0.0 | nan | 0.0 | nan | 0.1833 | 0.1846 | 0.2769 |
+ | 0.3755 | 6.5 | 1040 | 0.3034 | 0.3526 | 0.4160 | 0.5795 | nan | 0.6346 | 0.4564 | 0.7050 | 0.6986 | 0.0 | nan | 0.0 | nan | 0.3010 | 0.4881 | 0.4598 | 0.0 | 0.6188 | 0.4509 | 0.6639 | 0.6554 | 0.0 | nan | 0.0 | nan | 0.2758 | 0.4166 | 0.4442 |
+ | 0.2617 | 6.62 | 1060 | 0.3166 | 0.2905 | 0.3384 | 0.4820 | nan | 0.5401 | 0.3225 | 0.5981 | 0.6153 | 0.0 | nan | 0.0 | nan | 0.1880 | 0.4060 | 0.3756 | 0.0 | 0.5337 | 0.3212 | 0.5770 | 0.5817 | 0.0 | nan | 0.0 | nan | 0.1774 | 0.3511 | 0.3627 |
+ | 0.2937 | 6.75 | 1080 | 0.3090 | 0.3864 | 0.4585 | 0.7031 | nan | 0.8093 | 0.6687 | 0.7189 | 0.7808 | 0.0 | nan | 0.0 | nan | 0.3271 | 0.2123 | 0.6092 | 0.0 | 0.7810 | 0.6460 | 0.6639 | 0.7094 | 0.0 | nan | 0.0 | nan | 0.2942 | 0.1957 | 0.5738 |
+ | 0.3588 | 6.88 | 1100 | 0.3011 | 0.3653 | 0.4310 | 0.6482 | nan | 0.8132 | 0.4360 | 0.6549 | 0.7523 | 0.0 | nan | 0.0 | nan | 0.3123 | 0.4840 | 0.4267 | 0.0 | 0.7784 | 0.4310 | 0.6235 | 0.6944 | 0.0 | nan | 0.0 | nan | 0.2884 | 0.4223 | 0.4149 |
+ | 0.1613 | 7.0 | 1120 | 0.3302 | 0.2838 | 0.3344 | 0.4622 | nan | 0.4687 | 0.3133 | 0.6402 | 0.6782 | 0.0 | nan | 0.0 | nan | 0.1601 | 0.4001 | 0.3490 | 0.0 | 0.4620 | 0.3100 | 0.5933 | 0.6247 | 0.0 | nan | 0.0 | nan | 0.1525 | 0.3558 | 0.3400 |
+ | 0.4217 | 7.12 | 1140 | 0.3087 | 0.3723 | 0.4451 | 0.6405 | nan | 0.7463 | 0.4361 | 0.7284 | 0.7778 | 0.0 | nan | 0.0 | nan | 0.3229 | 0.4981 | 0.4960 | 0.0 | 0.7258 | 0.4307 | 0.6602 | 0.7060 | 0.0 | nan | 0.0 | nan | 0.2946 | 0.4255 | 0.4800 |
+ | 0.1429 | 7.25 | 1160 | 0.3227 | 0.2794 | 0.3221 | 0.5335 | nan | 0.7381 | 0.3377 | 0.5059 | 0.5342 | 0.0 | nan | 0.0 | nan | 0.2719 | 0.1916 | 0.3198 | 0.0 | 0.7147 | 0.3361 | 0.4943 | 0.5100 | 0.0 | nan | 0.0 | nan | 0.2479 | 0.1761 | 0.3151 |
+ | 0.227 | 7.38 | 1180 | 0.3087 | 0.3749 | 0.4471 | 0.6221 | nan | 0.6645 | 0.5048 | 0.7103 | 0.7872 | 0.0 | nan | 0.0 | nan | 0.3502 | 0.4474 | 0.5594 | 0.0 | 0.6499 | 0.4966 | 0.6631 | 0.7065 | 0.0 | nan | 0.0 | nan | 0.3168 | 0.3915 | 0.5250 |
+ | 0.3733 | 7.5 | 1200 | 0.3304 | 0.2777 | 0.3229 | 0.4832 | nan | 0.5603 | 0.3886 | 0.5612 | 0.5532 | 0.0 | nan | 0.0 | nan | 0.1744 | 0.2915 | 0.3773 | 0.0 | 0.5501 | 0.3824 | 0.5434 | 0.5248 | 0.0 | nan | 0.0 | nan | 0.1655 | 0.2561 | 0.3542 |
+ | 0.3148 | 7.62 | 1220 | 0.3075 | 0.3787 | 0.4500 | 0.6531 | nan | 0.7425 | 0.5909 | 0.7189 | 0.7270 | 0.0 | nan | 0.0 | nan | 0.3257 | 0.4420 | 0.5030 | 0.0 | 0.7152 | 0.5687 | 0.6673 | 0.6700 | 0.0 | nan | 0.0 | nan | 0.2976 | 0.3823 | 0.4863 |
+ | 0.22 | 7.75 | 1240 | 0.3156 | 0.3340 | 0.3934 | 0.5589 | nan | 0.6127 | 0.4262 | 0.6387 | 0.7554 | 0.0 | nan | 0.0 | nan | 0.2170 | 0.4800 | 0.4108 | 0.0 | 0.6002 | 0.4203 | 0.6121 | 0.6848 | 0.0 | nan | 0.0 | nan | 0.2073 | 0.4168 | 0.3984 |
+ | 0.499 | 7.88 | 1260 | 0.3085 | 0.3454 | 0.4092 | 0.6278 | nan | 0.7534 | 0.4363 | 0.7109 | 0.7256 | 0.0 | nan | 0.0 | nan | 0.2525 | 0.2773 | 0.5267 | 0.0 | 0.7296 | 0.4311 | 0.6556 | 0.6668 | 0.0 | nan | 0.0 | nan | 0.2355 | 0.2512 | 0.4848 |
+ | 0.2604 | 8.0 | 1280 | 0.3123 | 0.3504 | 0.4118 | 0.6089 | nan | 0.7175 | 0.4779 | 0.6668 | 0.6668 | 0.0 | nan | 0.0 | nan | 0.2368 | 0.4184 | 0.5218 | 0.0 | 0.6949 | 0.4708 | 0.6402 | 0.6225 | 0.0 | nan | 0.0 | nan | 0.2192 | 0.3649 | 0.4915 |
+ | 0.146 | 8.12 | 1300 | 0.3274 | 0.3036 | 0.3526 | 0.5368 | nan | 0.6413 | 0.4378 | 0.6158 | 0.5674 | 0.0 | nan | 0.0 | nan | 0.1757 | 0.3165 | 0.4191 | 0.0 | 0.6216 | 0.4333 | 0.5946 | 0.5339 | 0.0 | nan | 0.0 | nan | 0.1697 | 0.2810 | 0.4015 |
+ | 0.1103 | 8.25 | 1320 | 0.3339 | 0.2738 | 0.3155 | 0.4762 | nan | 0.5556 | 0.4026 | 0.5015 | 0.5408 | 0.0 | nan | 0.0 | nan | 0.1370 | 0.2938 | 0.4084 | 0.0 | 0.5454 | 0.3969 | 0.4921 | 0.5111 | 0.0 | nan | 0.0 | nan | 0.1330 | 0.2658 | 0.3939 |
+ | 0.1323 | 8.38 | 1340 | 0.3179 | 0.3304 | 0.3865 | 0.5986 | nan | 0.7334 | 0.4769 | 0.6287 | 0.6839 | 0.0 | nan | 0.0 | nan | 0.2152 | 0.3473 | 0.3933 | 0.0 | 0.7022 | 0.4679 | 0.6091 | 0.6261 | 0.0 | nan | 0.0 | nan | 0.2049 | 0.3080 | 0.3863 |
+ | 0.1057 | 8.5 | 1360 | 0.4118 | 0.1949 | 0.2242 | 0.3076 | nan | 0.3007 | 0.1559 | 0.4492 | 0.5135 | 0.0 | nan | 0.0 | nan | 0.1282 | 0.2228 | 0.2472 | 0.0 | 0.2972 | 0.1552 | 0.4441 | 0.4845 | 0.0 | nan | 0.0 | nan | 0.1241 | 0.2028 | 0.2412 |
+ | 0.1248 | 8.62 | 1380 | 0.3228 | 0.4155 | 0.4991 | 0.7269 | nan | 0.8415 | 0.6462 | 0.7622 | 0.7778 | 0.0 | nan | 0.0 | nan | 0.3683 | 0.4851 | 0.6106 | 0.0 | 0.8100 | 0.6271 | 0.6994 | 0.7057 | 0.0 | nan | 0.0 | nan | 0.3280 | 0.4172 | 0.5674 |
+ | 0.1165 | 8.75 | 1400 | 0.3307 | 0.2995 | 0.3485 | 0.5194 | nan | 0.6149 | 0.3102 | 0.5592 | 0.6835 | 0.0 | nan | 0.0 | nan | 0.1857 | 0.3513 | 0.4318 | 0.0 | 0.5982 | 0.3074 | 0.5485 | 0.6349 | 0.0 | nan | 0.0 | nan | 0.1765 | 0.3142 | 0.4156 |
+ | 0.2999 | 8.88 | 1420 | 0.3766 | 0.2329 | 0.2673 | 0.3927 | nan | 0.4223 | 0.2954 | 0.4437 | 0.5620 | 0.0 | nan | 0.0 | nan | 0.1641 | 0.1857 | 0.3327 | 0.0 | 0.4163 | 0.2897 | 0.4411 | 0.5318 | 0.0 | nan | 0.0 | nan | 0.1570 | 0.1704 | 0.3232 |
+ | 0.2005 | 9.0 | 1440 | 0.3224 | 0.3457 | 0.4100 | 0.5800 | nan | 0.6510 | 0.4458 | 0.6701 | 0.6765 | 0.0 | nan | 0.0002 | nan | 0.2510 | 0.4869 | 0.5085 | 0.0 | 0.6327 | 0.4381 | 0.6441 | 0.6348 | 0.0 | nan | 0.0002 | nan | 0.2321 | 0.4092 | 0.4656 |
+ | 0.0952 | 9.12 | 1460 | 0.3368 | 0.2986 | 0.3475 | 0.5230 | nan | 0.5872 | 0.4686 | 0.6328 | 0.6070 | 0.0 | nan | 0.0 | nan | 0.2227 | 0.2252 | 0.3843 | 0.0 | 0.5748 | 0.4534 | 0.5986 | 0.5720 | 0.0 | nan | 0.0 | nan | 0.2114 | 0.2061 | 0.3692 |
+ | 0.3493 | 9.25 | 1480 | 0.3637 | 0.2527 | 0.2909 | 0.4285 | nan | 0.5072 | 0.3061 | 0.4793 | 0.5032 | 0.0 | nan | 0.0 | nan | 0.1539 | 0.3356 | 0.3325 | 0.0 | 0.4971 | 0.3025 | 0.4757 | 0.4801 | 0.0 | nan | 0.0 | nan | 0.1476 | 0.2993 | 0.3245 |
+ | 0.6102 | 9.38 | 1500 | 0.3302 | 0.3325 | 0.3885 | 0.5757 | nan | 0.6527 | 0.5490 | 0.6148 | 0.6472 | 0.0 | nan | 0.0 | nan | 0.2155 | 0.3649 | 0.4522 | 0.0 | 0.6350 | 0.5320 | 0.5993 | 0.6117 | 0.0 | nan | 0.0 | nan | 0.2015 | 0.3228 | 0.4231 |
+ | 0.1355 | 9.5 | 1520 | 0.3136 | 0.3397 | 0.3985 | 0.5917 | nan | 0.7078 | 0.4385 | 0.6197 | 0.6754 | 0.0 | nan | 0.0 | nan | 0.2411 | 0.4128 | 0.4908 | 0.0 | 0.6904 | 0.4342 | 0.6004 | 0.6272 | 0.0 | nan | 0.0 | nan | 0.2228 | 0.3615 | 0.4608 |
+ | 0.2828 | 9.62 | 1540 | 0.3214 | 0.3632 | 0.4329 | 0.6220 | nan | 0.7412 | 0.4203 | 0.6106 | 0.7371 | 0.0 | nan | 0.0004 | nan | 0.3513 | 0.4797 | 0.5558 | 0.0 | 0.7149 | 0.4099 | 0.5925 | 0.6724 | 0.0 | nan | 0.0004 | nan | 0.3130 | 0.4125 | 0.5160 |
+ | 0.2499 | 9.75 | 1560 | 0.3470 | 0.3178 | 0.3744 | 0.5370 | nan | 0.6135 | 0.4310 | 0.6613 | 0.5944 | 0.0 | nan | 0.0001 | nan | 0.3109 | 0.3698 | 0.3889 | 0.0 | 0.5963 | 0.4247 | 0.6161 | 0.5602 | 0.0 | nan | 0.0001 | nan | 0.2767 | 0.3282 | 0.3756 |
+ | 0.3973 | 9.88 | 1580 | 0.3292 | 0.3557 | 0.4222 | 0.6036 | nan | 0.6854 | 0.5253 | 0.6598 | 0.6929 | 0.0 | nan | 0.0 | nan | 0.2791 | 0.5040 | 0.4535 | 0.0 | 0.6632 | 0.5042 | 0.6184 | 0.6434 | 0.0 | nan | 0.0 | nan | 0.2550 | 0.4358 | 0.4374 |
+ | 0.1764 | 10.0 | 1600 | 0.3317 | 0.3493 | 0.4150 | 0.5984 | nan | 0.6897 | 0.4370 | 0.6896 | 0.7262 | 0.0 | nan | 0.0023 | nan | 0.2221 | 0.5208 | 0.4472 | 0.0 | 0.6702 | 0.4293 | 0.6340 | 0.6649 | 0.0 | nan | 0.0023 | nan | 0.2127 | 0.4505 | 0.4294 |
+ | 0.3667 | 10.12 | 1620 | 0.3224 | 0.3385 | 0.3978 | 0.5951 | nan | 0.7262 | 0.4129 | 0.5865 | 0.6864 | 0.0 | nan | 0.0041 | nan | 0.2466 | 0.4033 | 0.5143 | 0.0 | 0.7033 | 0.4053 | 0.5673 | 0.6383 | 0.0 | nan | 0.0041 | nan | 0.2340 | 0.3562 | 0.4768 |
+ | 0.2782 | 10.25 | 1640 | 0.3243 | 0.3675 | 0.4383 | 0.6355 | nan | 0.7304 | 0.5449 | 0.7001 | 0.7129 | 0.0 | nan | 0.0021 | nan | 0.2615 | 0.5006 | 0.4920 | 0.0 | 0.7054 | 0.5218 | 0.6534 | 0.6531 | 0.0 | nan | 0.0021 | nan | 0.2478 | 0.4247 | 0.4667 |
+ | 0.1716 | 10.38 | 1660 | 0.3199 | 0.3531 | 0.4131 | 0.6228 | nan | 0.7347 | 0.5687 | 0.6019 | 0.7168 | 0.0 | nan | 0.0001 | nan | 0.2756 | 0.3677 | 0.4526 | 0.0 | 0.7107 | 0.5562 | 0.5817 | 0.6609 | 0.0 | nan | 0.0001 | nan | 0.2583 | 0.3276 | 0.4354 |
+ | 0.1938 | 10.5 | 1680 | 0.3304 | 0.3369 | 0.4038 | 0.5768 | nan | 0.6403 | 0.3871 | 0.6906 | 0.7228 | 0.0 | nan | 0.0049 | nan | 0.2373 | 0.4005 | 0.5506 | 0.0 | 0.6248 | 0.3827 | 0.6360 | 0.6713 | 0.0 | nan | 0.0049 | nan | 0.2232 | 0.3537 | 0.4722 |
+ | 0.0939 | 10.62 | 1700 | 0.3178 | 0.3848 | 0.4610 | 0.6472 | nan | 0.7358 | 0.5401 | 0.7080 | 0.7452 | 0.0 | nan | 0.0062 | nan | 0.3044 | 0.6229 | 0.4866 | 0.0 | 0.7137 | 0.5308 | 0.6585 | 0.6909 | 0.0 | nan | 0.0062 | nan | 0.2798 | 0.5091 | 0.4593 |
+ | 0.1592 | 10.75 | 1720 | 0.3323 | 0.3312 | 0.3861 | 0.5834 | nan | 0.6894 | 0.6042 | 0.5845 | 0.5805 | 0.0 | nan | 0.0055 | nan | 0.2218 | 0.3658 | 0.4236 | 0.0 | 0.6704 | 0.5786 | 0.5710 | 0.5451 | 0.0 | nan | 0.0055 | nan | 0.2094 | 0.3236 | 0.4085 |
+ | 0.186 | 10.88 | 1740 | 0.3280 | 0.3838 | 0.4597 | 0.6407 | nan | 0.7285 | 0.4968 | 0.6695 | 0.7578 | 0.0 | nan | 0.0145 | nan | 0.4074 | 0.5172 | 0.5460 | 0.0 | 0.7056 | 0.4866 | 0.6377 | 0.6958 | 0.0 | nan | 0.0144 | nan | 0.3540 | 0.4358 | 0.5086 |
+ | 0.124 | 11.0 | 1760 | 0.4089 | 0.2396 | 0.2751 | 0.4128 | nan | 0.5165 | 0.2682 | 0.4594 | 0.4520 | 0.0 | nan | 0.0095 | nan | 0.2177 | 0.2646 | 0.2879 | 0.0 | 0.5080 | 0.2658 | 0.4487 | 0.4315 | 0.0 | nan | 0.0095 | nan | 0.2031 | 0.2473 | 0.2822 |
+ | 0.1084 | 11.12 | 1780 | 0.3512 | 0.3283 | 0.3864 | 0.5415 | nan | 0.5910 | 0.4464 | 0.6008 | 0.6724 | 0.0 | nan | 0.0105 | nan | 0.2345 | 0.4593 | 0.4626 | 0.0 | 0.5774 | 0.4313 | 0.5885 | 0.6218 | 0.0 | nan | 0.0105 | nan | 0.2220 | 0.3926 | 0.4390 |
+ | 0.3364 | 11.25 | 1800 | 0.3514 | 0.3245 | 0.3806 | 0.5425 | nan | 0.6046 | 0.4578 | 0.6162 | 0.6777 | 0.0 | nan | 0.0107 | nan | 0.2340 | 0.4572 | 0.3669 | 0.0 | 0.5876 | 0.4430 | 0.6018 | 0.6301 | 0.0 | nan | 0.0107 | nan | 0.2216 | 0.3951 | 0.3547 |
+ | 0.186 | 11.38 | 1820 | 0.3398 | 0.3337 | 0.3937 | 0.5743 | nan | 0.6718 | 0.3964 | 0.6547 | 0.6960 | 0.0 | nan | 0.0095 | nan | 0.2726 | 0.3967 | 0.4452 | 0.0 | 0.6555 | 0.3905 | 0.6240 | 0.6329 | 0.0 | nan | 0.0095 | nan | 0.2543 | 0.3500 | 0.4209 |
+ | 0.085 | 11.5 | 1840 | 0.3395 | 0.3541 | 0.4172 | 0.5969 | nan | 0.6834 | 0.4750 | 0.6591 | 0.6675 | 0.0029 | nan | 0.0163 | nan | 0.2791 | 0.4399 | 0.5321 | 0.0 | 0.6637 | 0.4648 | 0.6337 | 0.6201 | 0.0029 | nan | 0.0161 | nan | 0.2594 | 0.3801 | 0.5002 |
+ | 0.2861 | 11.62 | 1860 | 0.3575 | 0.3064 | 0.3548 | 0.5451 | nan | 0.6531 | 0.4069 | 0.6701 | 0.6340 | 0.0000 | nan | 0.0111 | nan | 0.2071 | 0.2802 | 0.3303 | 0.0 | 0.6370 | 0.4012 | 0.6428 | 0.5936 | 0.0000 | nan | 0.0110 | nan | 0.1952 | 0.2599 | 0.3230 |
+ | 0.2855 | 11.75 | 1880 | 0.3932 | 0.2415 | 0.2791 | 0.4276 | nan | 0.5475 | 0.2912 | 0.4359 | 0.4478 | 0.0 | nan | 0.0104 | nan | 0.1519 | 0.3073 | 0.3201 | 0.0 | 0.5356 | 0.2855 | 0.4203 | 0.4307 | 0.0 | nan | 0.0104 | nan | 0.1446 | 0.2782 | 0.3099 |
+ | 0.1548 | 11.88 | 1900 | 0.3398 | 0.3513 | 0.4167 | 0.6004 | nan | 0.7097 | 0.3884 | 0.6305 | 0.7354 | 0.0002 | nan | 0.0138 | nan | 0.2828 | 0.4755 | 0.5141 | 0.0 | 0.6920 | 0.3832 | 0.6074 | 0.6691 | 0.0002 | nan | 0.0138 | nan | 0.2669 | 0.4032 | 0.4775 |
+ | 0.123 | 12.0 | 1920 | 0.3279 | 0.3495 | 0.4113 | 0.6009 | nan | 0.7406 | 0.4218 | 0.5964 | 0.6630 | 0.0028 | nan | 0.0183 | nan | 0.2751 | 0.5053 | 0.4784 | 0.0 | 0.7190 | 0.4153 | 0.5867 | 0.6196 | 0.0028 | nan | 0.0181 | nan | 0.2542 | 0.4223 | 0.4568 |
+ | 0.1138 | 12.12 | 1940 | 0.3308 | 0.3983 | 0.4753 | 0.6792 | nan | 0.7682 | 0.5832 | 0.7082 | 0.8045 | 0.0028 | nan | 0.0209 | nan | 0.3537 | 0.4880 | 0.5477 | 0.0 | 0.7424 | 0.5667 | 0.6711 | 0.7307 | 0.0028 | nan | 0.0207 | nan | 0.3242 | 0.4171 | 0.5076 |
+ | 0.1582 | 12.25 | 1960 | 0.3342 | 0.3720 | 0.4410 | 0.6306 | nan | 0.7554 | 0.4633 | 0.6444 | 0.7011 | 0.0062 | nan | 0.0198 | nan | 0.3419 | 0.5214 | 0.5155 | 0.0 | 0.7309 | 0.4556 | 0.6269 | 0.6504 | 0.0062 | nan | 0.0196 | nan | 0.3087 | 0.4391 | 0.4823 |
+ | 0.3449 | 12.38 | 1980 | 0.3976 | 0.2429 | 0.2797 | 0.4225 | nan | 0.5187 | 0.2813 | 0.4234 | 0.5055 | 0.0 | nan | 0.0111 | nan | 0.1548 | 0.2818 | 0.3408 | 0.0 | 0.5069 | 0.2772 | 0.4172 | 0.4855 | 0.0 | nan | 0.0111 | nan | 0.1490 | 0.2573 | 0.3246 |
+ | 0.0296 | 12.5 | 2000 | 0.3332 | 0.3525 | 0.4166 | 0.5979 | nan | 0.6804 | 0.4709 | 0.6924 | 0.6988 | 0.0 | nan | 0.0172 | nan | 0.3403 | 0.3831 | 0.4662 | 0.0 | 0.6615 | 0.4562 | 0.6530 | 0.6470 | 0.0 | nan | 0.0171 | nan | 0.3043 | 0.3389 | 0.4466 |
+ | 0.2308 | 12.62 | 2020 | 0.3418 | 0.3574 | 0.4208 | 0.6123 | nan | 0.7446 | 0.3992 | 0.6491 | 0.7142 | 0.0 | nan | 0.0130 | nan | 0.3085 | 0.4940 | 0.4650 | 0.0 | 0.7205 | 0.3948 | 0.6289 | 0.6646 | 0.0 | nan | 0.0129 | nan | 0.2849 | 0.4238 | 0.4435 |
+ | 0.3632 | 12.75 | 2040 | 0.3847 | 0.2694 | 0.3118 | 0.4605 | nan | 0.5391 | 0.3475 | 0.5106 | 0.5736 | 0.0 | nan | 0.0090 | nan | 0.1930 | 0.3284 | 0.3053 | 0.0 | 0.5278 | 0.3411 | 0.5017 | 0.5453 | 0.0 | nan | 0.0089 | nan | 0.1846 | 0.2916 | 0.2930 |
+ | 0.284 | 12.88 | 2060 | 0.3425 | 0.3836 | 0.4597 | 0.6559 | nan | 0.7536 | 0.5838 | 0.6914 | 0.7272 | 0.0094 | nan | 0.0148 | nan | 0.3443 | 0.5142 | 0.4986 | 0.0 | 0.7278 | 0.5383 | 0.6459 | 0.6707 | 0.0094 | nan | 0.0147 | nan | 0.3187 | 0.4375 | 0.4732 |
+ | 0.1566 | 13.0 | 2080 | 0.3586 | 0.3093 | 0.3628 | 0.5439 | nan | 0.6643 | 0.4048 | 0.5981 | 0.6177 | 0.0075 | nan | 0.0124 | nan | 0.2639 | 0.3493 | 0.3475 | 0.0 | 0.6486 | 0.3917 | 0.5754 | 0.5727 | 0.0075 | nan | 0.0123 | nan | 0.2477 | 0.3058 | 0.3311 |
+ | 0.1545 | 13.12 | 2100 | 0.3630 | 0.3109 | 0.3644 | 0.5329 | nan | 0.6530 | 0.3620 | 0.5186 | 0.6261 | 0.0122 | nan | 0.0107 | nan | 0.1922 | 0.4837 | 0.4209 | 0.0 | 0.6339 | 0.3549 | 0.5053 | 0.5850 | 0.0122 | nan | 0.0106 | nan | 0.1847 | 0.4173 | 0.4049 |
+ | 0.1118 | 13.25 | 2120 | 0.3435 | 0.3579 | 0.4228 | 0.6077 | nan | 0.7231 | 0.4451 | 0.6276 | 0.7012 | 0.0175 | nan | 0.0249 | nan | 0.2535 | 0.5218 | 0.4904 | 0.0 | 0.7009 | 0.4350 | 0.6087 | 0.6502 | 0.0175 | nan | 0.0246 | nan | 0.2404 | 0.4449 | 0.4571 |
+ | 0.0828 | 13.38 | 2140 | 0.3544 | 0.3361 | 0.3956 | 0.5662 | nan | 0.6733 | 0.3762 | 0.6572 | 0.6600 | 0.0125 | nan | 0.0328 | nan | 0.3596 | 0.3828 | 0.4058 | 0.0 | 0.6520 | 0.3665 | 0.6258 | 0.6161 | 0.0125 | nan | 0.0321 | nan | 0.3236 | 0.3448 | 0.3878 |
+ | 0.2605 | 13.5 | 2160 | 0.3451 | 0.3732 | 0.4421 | 0.6309 | nan | 0.7398 | 0.4876 | 0.6322 | 0.7386 | 0.0182 | nan | 0.0378 | nan | 0.3453 | 0.4635 | 0.5161 | 0.0 | 0.7155 | 0.4705 | 0.6104 | 0.6734 | 0.0182 | nan | 0.0369 | nan | 0.3171 | 0.4016 | 0.4886 |
+ | 0.0129 | 13.62 | 2180 | 0.3919 | 0.2765 | 0.3196 | 0.4836 | nan | 0.5955 | 0.3326 | 0.5530 | 0.5408 | 0.0179 | nan | 0.0140 | nan | 0.1846 | 0.2991 | 0.3392 | 0.0 | 0.5800 | 0.3270 | 0.5417 | 0.5094 | 0.0179 | nan | 0.0139 | nan | 0.1754 | 0.2717 | 0.3283 |
+ | 0.1744 | 13.75 | 2200 | 0.3543 | 0.3287 | 0.3864 | 0.5730 | nan | 0.6835 | 0.3695 | 0.6680 | 0.7169 | 0.0132 | nan | 0.0120 | nan | 0.2276 | 0.4016 | 0.3853 | 0.0 | 0.6621 | 0.3663 | 0.6327 | 0.6620 | 0.0132 | nan | 0.0120 | nan | 0.2145 | 0.3531 | 0.3708 |
+ | 0.0863 | 13.88 | 2220 | 0.3536 | 0.3503 | 0.4130 | 0.6052 | nan | 0.7206 | 0.4645 | 0.5874 | 0.6916 | 0.0115 | nan | 0.0200 | nan | 0.3492 | 0.3373 | 0.5352 | 0.0 | 0.6962 | 0.4546 | 0.5755 | 0.6385 | 0.0115 | nan | 0.0198 | nan | 0.3158 | 0.2964 | 0.4949 |
+ | 0.2218 | 14.0 | 2240 | 0.3552 | 0.3527 | 0.4186 | 0.6185 | nan | 0.7443 | 0.4727 | 0.6913 | 0.6287 | 0.0118 | nan | 0.0181 | nan | 0.3010 | 0.3621 | 0.5372 | 0.0 | 0.7196 | 0.4609 | 0.6293 | 0.5846 | 0.0118 | nan | 0.0180 | nan | 0.2818 | 0.3178 | 0.5032 |
+ | 0.1603 | 14.12 | 2260 | 0.3853 | 0.2835 | 0.3305 | 0.4804 | nan | 0.5713 | 0.3329 | 0.5072 | 0.5800 | 0.0061 | nan | 0.0185 | nan | 0.2341 | 0.3506 | 0.3738 | 0.0 | 0.5570 | 0.3257 | 0.4960 | 0.5457 | 0.0061 | nan | 0.0183 | nan | 0.2195 | 0.3115 | 0.3554 |
+ | 0.1556 | 14.25 | 2280 | 0.3580 | 0.3469 | 0.4112 | 0.6041 | nan | 0.7349 | 0.4383 | 0.6388 | 0.6923 | 0.0131 | nan | 0.0196 | nan | 0.3261 | 0.4219 | 0.4161 | 0.0 | 0.7132 | 0.4194 | 0.6082 | 0.6401 | 0.0131 | nan | 0.0194 | nan | 0.2961 | 0.3680 | 0.3921 |
+ | 0.2714 | 14.38 | 2300 | 0.3716 | 0.3215 | 0.3763 | 0.5454 | nan | 0.6469 | 0.3780 | 0.5739 | 0.6474 | 0.0130 | nan | 0.0128 | nan | 0.2433 | 0.4166 | 0.4552 | 0.0 | 0.6285 | 0.3722 | 0.5628 | 0.6024 | 0.0130 | nan | 0.0127 | nan | 0.2288 | 0.3676 | 0.4273 |
+ | 0.2624 | 14.5 | 2320 | 0.3524 | 0.3357 | 0.3931 | 0.5833 | nan | 0.7198 | 0.4113 | 0.6229 | 0.6339 | 0.0164 | nan | 0.0153 | nan | 0.2559 | 0.4207 | 0.4414 | 0.0 | 0.6967 | 0.4056 | 0.6019 | 0.5973 | 0.0164 | nan | 0.0151 | nan | 0.2376 | 0.3674 | 0.4185 |
+ | 0.2223 | 14.62 | 2340 | 0.3570 | 0.3148 | 0.3663 | 0.5565 | nan | 0.6820 | 0.4156 | 0.5894 | 0.6286 | 0.0155 | nan | 0.0149 | nan | 0.2473 | 0.3015 | 0.4022 | 0.0 | 0.6633 | 0.4037 | 0.5782 | 0.5907 | 0.0155 | nan | 0.0148 | nan | 0.2266 | 0.2731 | 0.3817 |
+ | 0.1125 | 14.75 | 2360 | 0.3766 | 0.3027 | 0.3526 | 0.5064 | nan | 0.6058 | 0.3030 | 0.5083 | 0.6290 | 0.0217 | nan | 0.0107 | nan | 0.2239 | 0.4108 | 0.4599 | 0.0 | 0.5916 | 0.2977 | 0.5031 | 0.5910 | 0.0217 | nan | 0.0106 | nan | 0.2101 | 0.3637 | 0.4379 |
+ | 0.1139 | 14.88 | 2380 | 0.3541 | 0.3752 | 0.4445 | 0.6352 | nan | 0.7230 | 0.5186 | 0.7029 | 0.7593 | 0.0249 | nan | 0.0174 | nan | 0.2855 | 0.4914 | 0.4778 | 0.0 | 0.7011 | 0.4986 | 0.6740 | 0.6961 | 0.0249 | nan | 0.0172 | nan | 0.2652 | 0.4230 | 0.4514 |
+ | 0.1841 | 15.0 | 2400 | 0.3596 | 0.3337 | 0.3915 | 0.5678 | nan | 0.6749 | 0.3898 | 0.6395 | 0.6779 | 0.0267 | nan | 0.0162 | nan | 0.2609 | 0.4275 | 0.4104 | 0.0 | 0.6576 | 0.3815 | 0.6209 | 0.6334 | 0.0267 | nan | 0.0161 | nan | 0.2406 | 0.3728 | 0.3871 |
+ | 0.1828 | 15.12 | 2420 | 0.3641 | 0.3060 | 0.3564 | 0.5373 | nan | 0.6464 | 0.3564 | 0.5524 | 0.6620 | 0.0132 | nan | 0.0079 | nan | 0.1817 | 0.3260 | 0.4615 | 0.0 | 0.6299 | 0.3491 | 0.5459 | 0.6195 | 0.0132 | nan | 0.0078 | nan | 0.1742 | 0.2937 | 0.4271 |
+ | 0.0202 | 15.25 | 2440 | 0.3637 | 0.3310 | 0.3910 | 0.5513 | nan | 0.6071 | 0.4158 | 0.6488 | 0.7365 | 0.0339 | nan | 0.0181 | nan | 0.2819 | 0.3906 | 0.3863 | 0.0 | 0.5933 | 0.3991 | 0.6291 | 0.6687 | 0.0339 | nan | 0.0179 | nan | 0.2585 | 0.3465 | 0.3631 |
+ | 0.3244 | 15.38 | 2460 | 0.3752 | 0.3290 | 0.3885 | 0.5348 | nan | 0.5915 | 0.4283 | 0.6163 | 0.6207 | 0.0442 | nan | 0.0320 | nan | 0.2746 | 0.4170 | 0.4722 | 0.0 | 0.5762 | 0.4087 | 0.5914 | 0.5837 | 0.0442 | nan | 0.0315 | nan | 0.2498 | 0.3633 | 0.4410 |
+ | 0.0619 | 15.5 | 2480 | 0.3794 | 0.3052 | 0.3560 | 0.5207 | nan | 0.6266 | 0.3283 | 0.6050 | 0.6392 | 0.0387 | nan | 0.0303 | nan | 0.2255 | 0.3605 | 0.3501 | 0.0 | 0.6104 | 0.3232 | 0.5877 | 0.5951 | 0.0387 | nan | 0.0300 | nan | 0.2125 | 0.3189 | 0.3357 |
+ | 0.0788 | 15.62 | 2500 | 0.3641 | 0.3564 | 0.4204 | 0.6062 | nan | 0.6924 | 0.5027 | 0.6264 | 0.7243 | 0.0330 | nan | 0.0246 | nan | 0.2279 | 0.4379 | 0.5142 | 0.0 | 0.6721 | 0.4863 | 0.6116 | 0.6645 | 0.0330 | nan | 0.0244 | nan | 0.2181 | 0.3780 | 0.4764 |
+ | 0.1819 | 15.75 | 2520 | 0.3730 | 0.3330 | 0.3919 | 0.5496 | nan | 0.6366 | 0.3864 | 0.5794 | 0.6584 | 0.0296 | nan | 0.0193 | nan | 0.2730 | 0.4636 | 0.4804 | 0.0 | 0.6209 | 0.3793 | 0.5687 | 0.6158 | 0.0296 | nan | 0.0190 | nan | 0.2520 | 0.3970 | 0.4476 |
+ | 0.1583 | 15.88 | 2540 | 0.3707 | 0.3437 | 0.4053 | 0.5775 | nan | 0.6838 | 0.4042 | 0.6664 | 0.6669 | 0.0270 | nan | 0.0380 | nan | 0.2653 | 0.4832 | 0.4127 | 0.0 | 0.6658 | 0.3962 | 0.6401 | 0.6213 | 0.0269 | nan | 0.0374 | nan | 0.2491 | 0.4074 | 0.3930 |
+ | 0.0973 | 16.0 | 2560 | 0.3789 | 0.3102 | 0.3645 | 0.5251 | nan | 0.6202 | 0.3226 | 0.6377 | 0.6612 | 0.0240 | nan | 0.0420 | nan | 0.2656 | 0.3646 | 0.3432 | 0.0 | 0.6064 | 0.3155 | 0.6133 | 0.6167 | 0.0240 | nan | 0.0411 | nan | 0.2490 | 0.3193 | 0.3171 |
+ | 0.188 | 16.12 | 2580 | 0.3646 | 0.3456 | 0.4070 | 0.5929 | nan | 0.7021 | 0.4230 | 0.6325 | 0.7055 | 0.0310 | nan | 0.0434 | nan | 0.3221 | 0.3235 | 0.4801 | 0.0 | 0.6798 | 0.4127 | 0.6135 | 0.6518 | 0.0309 | nan | 0.0423 | nan | 0.2943 | 0.2847 | 0.4458 |
+ | 0.0581 | 16.25 | 2600 | 0.4030 | 0.2992 | 0.3506 | 0.4870 | nan | 0.5565 | 0.3320 | 0.6005 | 0.6297 | 0.0621 | nan | 0.0399 | nan | 0.2508 | 0.4016 | 0.2824 | 0.0 | 0.5438 | 0.3243 | 0.5851 | 0.5852 | 0.0610 | nan | 0.0391 | nan | 0.2336 | 0.3470 | 0.2729 |
+ | 0.1891 | 16.38 | 2620 | 0.3766 | 0.3408 | 0.4029 | 0.5618 | nan | 0.6308 | 0.4264 | 0.6309 | 0.6672 | 0.0537 | nan | 0.0358 | nan | 0.2444 | 0.4289 | 0.5077 | 0.0 | 0.6150 | 0.4147 | 0.6043 | 0.6202 | 0.0528 | nan | 0.0353 | nan | 0.2302 | 0.3673 | 0.4679 |
+ | 0.2495 | 16.5 | 2640 | 0.3758 | 0.3481 | 0.4106 | 0.5897 | nan | 0.7078 | 0.4004 | 0.6212 | 0.6981 | 0.0769 | nan | 0.0402 | nan | 0.2365 | 0.4410 | 0.4736 | 0.0 | 0.6873 | 0.3937 | 0.6026 | 0.6416 | 0.0749 | nan | 0.0391 | nan | 0.2227 | 0.3776 | 0.4418 |
+ | 0.153 | 16.62 | 2660 | 0.3899 | 0.2935 | 0.3435 | 0.5074 | nan | 0.6166 | 0.3835 | 0.5905 | 0.5641 | 0.0402 | nan | 0.0290 | nan | 0.2244 | 0.3407 | 0.3025 | 0.0 | 0.6027 | 0.3733 | 0.5639 | 0.5320 | 0.0402 | nan | 0.0286 | nan | 0.2092 | 0.2969 | 0.2883 |
+ | 0.083 | 16.75 | 2680 | 0.3758 | 0.3137 | 0.3669 | 0.5555 | nan | 0.6916 | 0.3837 | 0.5386 | 0.6336 | 0.0400 | nan | 0.0247 | nan | 0.2183 | 0.3063 | 0.4655 | 0.0 | 0.6742 | 0.3761 | 0.5322 | 0.5845 | 0.0400 | nan | 0.0243 | nan | 0.2066 | 0.2715 | 0.4276 |
+ | 0.136 | 16.88 | 2700 | 0.3570 | 0.3606 | 0.4264 | 0.6044 | nan | 0.7070 | 0.4583 | 0.6819 | 0.6820 | 0.0525 | nan | 0.0404 | nan | 0.3045 | 0.4473 | 0.4636 | 0.0 | 0.6892 | 0.4490 | 0.6517 | 0.6349 | 0.0524 | nan | 0.0396 | nan | 0.2758 | 0.3799 | 0.4333 |
+ | 0.0609 | 17.0 | 2720 | 0.3657 | 0.3182 | 0.3719 | 0.5547 | nan | 0.6883 | 0.3931 | 0.5851 | 0.6371 | 0.0482 | nan | 0.0290 | nan | 0.2633 | 0.3435 | 0.3598 | 0.0 | 0.6688 | 0.3841 | 0.5697 | 0.5975 | 0.0482 | nan | 0.0285 | nan | 0.2450 | 0.3019 | 0.3384 |
+ | 0.1483 | 17.12 | 2740 | 0.3847 | 0.3068 | 0.3620 | 0.5117 | nan | 0.5805 | 0.3386 | 0.5781 | 0.6593 | 0.0445 | nan | 0.0343 | nan | 0.2763 | 0.3060 | 0.4403 | 0.0 | 0.5686 | 0.3269 | 0.5604 | 0.6165 | 0.0439 | nan | 0.0335 | nan | 0.2507 | 0.2700 | 0.3978 |
+ | 0.2166 | 17.25 | 2760 | 0.3986 | 0.2853 | 0.3307 | 0.5154 | nan | 0.6550 | 0.3083 | 0.5705 | 0.6325 | 0.0442 | nan | 0.0205 | nan | 0.1784 | 0.2618 | 0.3047 | 0.0 | 0.6390 | 0.3038 | 0.5618 | 0.5883 | 0.0441 | nan | 0.0203 | nan | 0.1703 | 0.2369 | 0.2887 |
+ | 0.096 | 17.38 | 2780 | 0.4041 | 0.2884 | 0.3367 | 0.4939 | nan | 0.6043 | 0.2952 | 0.5538 | 0.6037 | 0.0470 | nan | 0.0231 | nan | 0.2257 | 0.3329 | 0.3444 | 0.0 | 0.5912 | 0.2905 | 0.5458 | 0.5617 | 0.0470 | nan | 0.0228 | nan | 0.2106 | 0.2930 | 0.3217 |
+ | 0.0839 | 17.5 | 2800 | 0.3773 | 0.3516 | 0.4152 | 0.6011 | nan | 0.7099 | 0.4639 | 0.6553 | 0.7115 | 0.0586 | nan | 0.0328 | nan | 0.3030 | 0.3879 | 0.4140 | 0.0 | 0.6884 | 0.4484 | 0.6341 | 0.6528 | 0.0586 | nan | 0.0324 | nan | 0.2784 | 0.3349 | 0.3877 |
+ | 0.1185 | 17.62 | 2820 | 0.3753 | 0.3255 | 0.3816 | 0.5517 | nan | 0.6575 | 0.4173 | 0.6167 | 0.6411 | 0.0711 | nan | 0.0270 | nan | 0.2554 | 0.3842 | 0.3643 | 0.0 | 0.6400 | 0.4044 | 0.6019 | 0.5946 | 0.0708 | nan | 0.0267 | nan | 0.2403 | 0.3319 | 0.3448 |
+ | 0.1155 | 17.75 | 2840 | 0.3742 | 0.3354 | 0.3939 | 0.5575 | nan | 0.6693 | 0.3661 | 0.6355 | 0.6422 | 0.0786 | nan | 0.0323 | nan | 0.2502 | 0.4593 | 0.4115 | 0.0 | 0.6510 | 0.3595 | 0.6193 | 0.5977 | 0.0776 | nan | 0.0320 | nan | 0.2347 | 0.3894 | 0.3928 |
+ | 0.1311 | 17.88 | 2860 | 0.3862 | 0.3306 | 0.3882 | 0.5436 | nan | 0.6268 | 0.3595 | 0.6226 | 0.6840 | 0.0823 | nan | 0.0400 | nan | 0.2510 | 0.4016 | 0.4261 | 0.0 | 0.6125 | 0.3525 | 0.6087 | 0.6301 | 0.0816 | nan | 0.0394 | nan | 0.2348 | 0.3472 | 0.3995 |
+ | 0.0059 | 18.0 | 2880 | 0.3997 | 0.2999 | 0.3509 | 0.5209 | nan | 0.6406 | 0.3559 | 0.6409 | 0.5987 | 0.0524 | nan | 0.0490 | nan | 0.2912 | 0.2650 | 0.2648 | 0.0 | 0.6221 | 0.3495 | 0.6051 | 0.5625 | 0.0523 | nan | 0.0474 | nan | 0.2684 | 0.2353 | 0.2563 |
+ | 0.2505 | 18.12 | 2900 | 0.3948 | 0.3107 | 0.3648 | 0.5266 | nan | 0.6174 | 0.3923 | 0.5960 | 0.6209 | 0.0640 | nan | 0.0383 | nan | 0.2507 | 0.3099 | 0.3937 | 0.0 | 0.6028 | 0.3777 | 0.5739 | 0.5786 | 0.0634 | nan | 0.0375 | nan | 0.2350 | 0.2722 | 0.3660 |
+ | 0.1181 | 18.25 | 2920 | 0.3764 | 0.3401 | 0.3999 | 0.5709 | nan | 0.6792 | 0.4368 | 0.6006 | 0.6382 | 0.0752 | nan | 0.0399 | nan | 0.2711 | 0.3991 | 0.4587 | 0.0 | 0.6604 | 0.4226 | 0.5815 | 0.5978 | 0.0736 | nan | 0.0392 | nan | 0.2526 | 0.3440 | 0.4290 |
+ | 0.1169 | 18.38 | 2940 | 0.3737 | 0.3522 | 0.4141 | 0.5853 | nan | 0.6818 | 0.4810 | 0.6037 | 0.6819 | 0.1025 | nan | 0.0410 | nan | 0.2644 | 0.4201 | 0.4503 | 0.0 | 0.6615 | 0.4636 | 0.5918 | 0.6288 | 0.1004 | nan | 0.0400 | nan | 0.2486 | 0.3615 | 0.4256 |
+ | 0.1075 | 18.5 | 2960 | 0.3985 | 0.3176 | 0.3711 | 0.5438 | nan | 0.6841 | 0.3154 | 0.5765 | 0.6472 | 0.0906 | nan | 0.0435 | nan | 0.2634 | 0.3396 | 0.3801 | 0.0 | 0.6643 | 0.3104 | 0.5673 | 0.6015 | 0.0897 | nan | 0.0423 | nan | 0.2444 | 0.3020 | 0.3537 |
+ | 0.1468 | 18.62 | 2980 | 0.3809 | 0.3592 | 0.4249 | 0.5883 | nan | 0.6884 | 0.4059 | 0.6539 | 0.6849 | 0.0905 | nan | 0.0482 | nan | 0.2846 | 0.4887 | 0.4791 | 0.0 | 0.6685 | 0.3996 | 0.6290 | 0.6350 | 0.0881 | nan | 0.0469 | nan | 0.2632 | 0.4144 | 0.4469 |
+ | 0.1438 | 18.75 | 3000 | 0.4059 | 0.3267 | 0.3847 | 0.5308 | nan | 0.6163 | 0.4193 | 0.5594 | 0.6195 | 0.1263 | nan | 0.0415 | nan | 0.2273 | 0.4443 | 0.4081 | 0.0 | 0.6014 | 0.4035 | 0.5470 | 0.5813 | 0.1224 | nan | 0.0406 | nan | 0.2127 | 0.3774 | 0.3810 |
+ | 0.1021 | 18.88 | 3020 | 0.3904 | 0.3525 | 0.4154 | 0.5957 | nan | 0.7113 | 0.4367 | 0.6382 | 0.7089 | 0.1015 | nan | 0.0432 | nan | 0.2595 | 0.4177 | 0.4213 | 0.0 | 0.6907 | 0.4264 | 0.6212 | 0.6528 | 0.0988 | nan | 0.0422 | nan | 0.2428 | 0.3582 | 0.3923 |
+ | 0.0308 | 19.0 | 3040 | 0.3790 | 0.3502 | 0.4129 | 0.5842 | nan | 0.6850 | 0.4325 | 0.6381 | 0.6733 | 0.0994 | nan | 0.0429 | nan | 0.2607 | 0.4038 | 0.4803 | 0.0 | 0.6662 | 0.4228 | 0.6189 | 0.6261 | 0.0967 | nan | 0.0420 | nan | 0.2443 | 0.3470 | 0.4380 |
+ | 0.2127 | 19.12 | 3060 | 0.3938 | 0.3268 | 0.3842 | 0.5413 | nan | 0.6469 | 0.3713 | 0.5834 | 0.6338 | 0.0856 | nan | 0.0419 | nan | 0.2657 | 0.4172 | 0.4118 | 0.0 | 0.6307 | 0.3641 | 0.5711 | 0.5914 | 0.0838 | nan | 0.0410 | nan | 0.2465 | 0.3570 | 0.3822 |
+ | 0.1228 | 19.25 | 3080 | 0.3930 | 0.3371 | 0.3961 | 0.5707 | nan | 0.6900 | 0.3839 | 0.5852 | 0.6874 | 0.0759 | nan | 0.0337 | nan | 0.2645 | 0.4055 | 0.4391 | 0.0 | 0.6701 | 0.3767 | 0.5733 | 0.6383 | 0.0745 | nan | 0.0331 | nan | 0.2462 | 0.3494 | 0.4094 |
+ | 0.0882 | 19.38 | 3100 | 0.3940 | 0.3372 | 0.3954 | 0.5712 | nan | 0.6829 | 0.4213 | 0.6223 | 0.6689 | 0.0740 | nan | 0.0338 | nan | 0.2622 | 0.3936 | 0.3997 | 0.0 | 0.6636 | 0.4108 | 0.6046 | 0.6226 | 0.0729 | nan | 0.0332 | nan | 0.2445 | 0.3419 | 0.3776 |
+ | 0.0798 | 19.5 | 3120 | 0.4141 | 0.3078 | 0.3595 | 0.5247 | nan | 0.6568 | 0.2977 | 0.5794 | 0.6253 | 0.0788 | nan | 0.0341 | nan | 0.2380 | 0.3701 | 0.3549 | 0.0 | 0.6388 | 0.2941 | 0.5651 | 0.5852 | 0.0775 | nan | 0.0332 | nan | 0.2237 | 0.3256 | 0.3351 |
+ | 0.2337 | 19.62 | 3140 | 0.3981 | 0.3068 | 0.3583 | 0.5271 | nan | 0.6597 | 0.3512 | 0.5748 | 0.5754 | 0.0715 | nan | 0.0304 | nan | 0.2115 | 0.3767 | 0.3733 | 0.0 | 0.6439 | 0.3439 | 0.5569 | 0.5412 | 0.0701 | nan | 0.0299 | nan | 0.2004 | 0.3301 | 0.3515 |
+ | 0.265 | 19.75 | 3160 | 0.3878 | 0.3441 | 0.4051 | 0.5883 | nan | 0.7216 | 0.4309 | 0.6112 | 0.6456 | 0.1042 | nan | 0.0316 | nan | 0.2539 | 0.3974 | 0.4499 | 0.0 | 0.6987 | 0.4186 | 0.5900 | 0.6016 | 0.1009 | nan | 0.0311 | nan | 0.2385 | 0.3425 | 0.4188 |
+ | 0.1612 | 19.88 | 3180 | 0.3921 | 0.3399 | 0.4015 | 0.5622 | nan | 0.6680 | 0.3758 | 0.6031 | 0.6745 | 0.1219 | nan | 0.0368 | nan | 0.2783 | 0.4113 | 0.4439 | 0.0 | 0.6509 | 0.3660 | 0.5870 | 0.6189 | 0.1176 | nan | 0.0361 | nan | 0.2584 | 0.3531 | 0.4113 |
+ | 0.119 | 20.0 | 3200 | 0.3930 | 0.3381 | 0.3973 | 0.5638 | nan | 0.6846 | 0.3830 | 0.6002 | 0.6528 | 0.1236 | nan | 0.0364 | nan | 0.2504 | 0.4254 | 0.4196 | 0.0 | 0.6659 | 0.3747 | 0.5862 | 0.6027 | 0.1201 | nan | 0.0357 | nan | 0.2355 | 0.3642 | 0.3958 |
+ | 0.2127 | 20.12 | 3220 | 0.4055 | 0.3262 | 0.3828 | 0.5438 | nan | 0.6687 | 0.3572 | 0.5735 | 0.6260 | 0.1227 | nan | 0.0452 | nan | 0.2759 | 0.3804 | 0.3954 | 0.0 | 0.6509 | 0.3480 | 0.5618 | 0.5802 | 0.1193 | nan | 0.0442 | nan | 0.2554 | 0.3293 | 0.3729 |
+ | 0.1082 | 20.25 | 3240 | 0.4076 | 0.3233 | 0.3790 | 0.5477 | nan | 0.6622 | 0.3678 | 0.6170 | 0.6725 | 0.1201 | nan | 0.0408 | nan | 0.2368 | 0.3502 | 0.3435 | 0.0 | 0.6445 | 0.3588 | 0.5987 | 0.6188 | 0.1167 | nan | 0.0400 | nan | 0.2242 | 0.3062 | 0.3253 |
+ | 0.1413 | 20.38 | 3260 | 0.3997 | 0.3315 | 0.3894 | 0.5515 | nan | 0.6595 | 0.3799 | 0.5992 | 0.6447 | 0.1206 | nan | 0.0406 | nan | 0.2610 | 0.3701 | 0.4294 | 0.0 | 0.6426 | 0.3715 | 0.5838 | 0.5996 | 0.1176 | nan | 0.0399 | nan | 0.2422 | 0.3200 | 0.3973 |
+ | 0.1308 | 20.5 | 3280 | 0.4210 | 0.3100 | 0.3624 | 0.5150 | nan | 0.6198 | 0.3069 | 0.5631 | 0.6500 | 0.1115 | nan | 0.0402 | nan | 0.2334 | 0.3535 | 0.3831 | 0.0 | 0.6049 | 0.3025 | 0.5540 | 0.6033 | 0.1102 | nan | 0.0393 | nan | 0.2189 | 0.3098 | 0.3574 |
+ | 0.0599 | 20.62 | 3300 | 0.3929 | 0.3457 | 0.4067 | 0.5822 | nan | 0.6963 | 0.4452 | 0.6235 | 0.6543 | 0.1024 | nan | 0.0453 | nan | 0.2641 | 0.3904 | 0.4386 | 0.0 | 0.6756 | 0.4337 | 0.6062 | 0.6087 | 0.1002 | nan | 0.0443 | nan | 0.2460 | 0.3365 | 0.4061 |
+ | 0.0948 | 20.75 | 3320 | 0.3978 | 0.3346 | 0.3933 | 0.5553 | nan | 0.6649 | 0.4108 | 0.6084 | 0.6221 | 0.1142 | nan | 0.0400 | nan | 0.2518 | 0.4197 | 0.4076 | 0.0 | 0.6479 | 0.3990 | 0.5918 | 0.5812 | 0.1109 | nan | 0.0393 | nan | 0.2337 | 0.3621 | 0.3800 |
+ | 0.0418 | 20.88 | 3340 | 0.3987 | 0.3413 | 0.4016 | 0.5691 | nan | 0.6820 | 0.4091 | 0.6041 | 0.6482 | 0.1110 | nan | 0.0344 | nan | 0.2501 | 0.4283 | 0.4476 | 0.0 | 0.6632 | 0.3996 | 0.5894 | 0.6026 | 0.1079 | nan | 0.0337 | nan | 0.2342 | 0.3677 | 0.4142 |
+ | 0.2251 | 21.0 | 3360 | 0.4091 | 0.3067 | 0.3576 | 0.5235 | nan | 0.6628 | 0.3239 | 0.5825 | 0.6036 | 0.1097 | nan | 0.0348 | nan | 0.2448 | 0.3554 | 0.3010 | 0.0 | 0.6446 | 0.3177 | 0.5663 | 0.5655 | 0.1064 | nan | 0.0340 | nan | 0.2288 | 0.3149 | 0.2890 |
+ | 0.0887 | 21.12 | 3380 | 0.3958 | 0.3502 | 0.4144 | 0.5830 | nan | 0.6798 | 0.4381 | 0.6436 | 0.7003 | 0.0985 | nan | 0.0553 | nan | 0.3298 | 0.3692 | 0.4148 | 0.0 | 0.6608 | 0.4252 | 0.6204 | 0.6454 | 0.0957 | nan | 0.0537 | nan | 0.2975 | 0.3220 | 0.3809 |
+ | 0.0981 | 21.25 | 3400 | 0.4053 | 0.3346 | 0.3938 | 0.5521 | nan | 0.6416 | 0.4158 | 0.6278 | 0.6327 | 0.1152 | nan | 0.0472 | nan | 0.2642 | 0.3607 | 0.4387 | 0.0 | 0.6267 | 0.4034 | 0.6096 | 0.5906 | 0.1116 | nan | 0.0460 | nan | 0.2430 | 0.3153 | 0.3998 |
+ | 0.129 | 21.38 | 3420 | 0.4044 | 0.3195 | 0.3724 | 0.5357 | nan | 0.6534 | 0.3715 | 0.5931 | 0.6088 | 0.1166 | nan | 0.0411 | nan | 0.2463 | 0.3438 | 0.3765 | 0.0 | 0.6364 | 0.3635 | 0.5791 | 0.5711 | 0.1137 | nan | 0.0403 | nan | 0.2292 | 0.3042 | 0.3575 |
+ | 0.0612 | 21.5 | 3440 | 0.3903 | 0.3486 | 0.4101 | 0.5821 | nan | 0.6849 | 0.4416 | 0.6240 | 0.6569 | 0.1031 | nan | 0.0424 | nan | 0.2770 | 0.3685 | 0.4925 | 0.0 | 0.6663 | 0.4307 | 0.6044 | 0.6109 | 0.0999 | nan | 0.0416 | nan | 0.2569 | 0.3219 | 0.4536 |
+ | 0.1272 | 21.62 | 3460 | 0.4190 | 0.3084 | 0.3600 | 0.5155 | nan | 0.6333 | 0.3347 | 0.5558 | 0.6046 | 0.0905 | nan | 0.0351 | nan | 0.2328 | 0.4021 | 0.3511 | 0.0 | 0.6178 | 0.3281 | 0.5458 | 0.5679 | 0.0895 | nan | 0.0345 | nan | 0.2190 | 0.3491 | 0.3325 |
+ | 0.0396 | 21.75 | 3480 | 0.4083 | 0.3243 | 0.3801 | 0.5463 | nan | 0.6695 | 0.3636 | 0.6050 | 0.6290 | 0.0988 | nan | 0.0423 | nan | 0.2516 | 0.3892 | 0.3716 | 0.0 | 0.6523 | 0.3566 | 0.5879 | 0.5871 | 0.0963 | nan | 0.0414 | nan | 0.2360 | 0.3372 | 0.3485 |
+ | 0.1612 | 21.88 | 3500 | 0.4034 | 0.3276 | 0.3836 | 0.5571 | nan | 0.6817 | 0.4096 | 0.5779 | 0.6470 | 0.1009 | nan | 0.0430 | nan | 0.2657 | 0.3577 | 0.3690 | 0.0 | 0.6632 | 0.3995 | 0.5630 | 0.6014 | 0.0982 | nan | 0.0420 | nan | 0.2475 | 0.3112 | 0.3502 |
+ | 0.168 | 22.0 | 3520 | 0.3960 | 0.3299 | 0.3866 | 0.5620 | nan | 0.6894 | 0.3956 | 0.5776 | 0.6484 | 0.0963 | nan | 0.0393 | nan | 0.2551 | 0.3656 | 0.4123 | 0.0 | 0.6689 | 0.3867 | 0.5646 | 0.6040 | 0.0935 | nan | 0.0384 | nan | 0.2394 | 0.3177 | 0.3853 |
+ | 0.1447 | 22.12 | 3540 | 0.4078 | 0.3373 | 0.3970 | 0.5641 | nan | 0.6879 | 0.3803 | 0.5834 | 0.6534 | 0.0952 | nan | 0.0500 | nan | 0.2945 | 0.4033 | 0.4248 | 0.0 | 0.6674 | 0.3724 | 0.5691 | 0.6079 | 0.0923 | nan | 0.0486 | nan | 0.2704 | 0.3478 | 0.3974 |
+ | 0.053 | 22.25 | 3560 | 0.4011 | 0.3369 | 0.3956 | 0.5696 | nan | 0.7026 | 0.4088 | 0.5707 | 0.6348 | 0.0921 | nan | 0.0507 | nan | 0.2935 | 0.3820 | 0.4256 | 0.0 | 0.6803 | 0.3990 | 0.5570 | 0.5941 | 0.0895 | nan | 0.0494 | nan | 0.2709 | 0.3313 | 0.3978 |
+ | 0.1187 | 22.38 | 3580 | 0.4053 | 0.3330 | 0.3913 | 0.5555 | nan | 0.6709 | 0.3798 | 0.5832 | 0.6476 | 0.0971 | nan | 0.0440 | nan | 0.2728 | 0.3996 | 0.4266 | 0.0 | 0.6529 | 0.3718 | 0.5682 | 0.6036 | 0.0940 | nan | 0.0429 | nan | 0.2524 | 0.3462 | 0.3977 |
+ | 0.1373 | 22.5 | 3600 | 0.4094 | 0.3264 | 0.3829 | 0.5411 | nan | 0.6504 | 0.3713 | 0.5818 | 0.6355 | 0.1050 | nan | 0.0402 | nan | 0.2590 | 0.4045 | 0.3984 | 0.0 | 0.6343 | 0.3624 | 0.5688 | 0.5951 | 0.1009 | nan | 0.0394 | nan | 0.2407 | 0.3515 | 0.3709 |
+ | 0.144 | 22.62 | 3620 | 0.4051 | 0.3243 | 0.3792 | 0.5460 | nan | 0.6617 | 0.3776 | 0.6009 | 0.6535 | 0.1049 | nan | 0.0384 | nan | 0.2643 | 0.3601 | 0.3512 | 0.0 | 0.6448 | 0.3688 | 0.5872 | 0.6082 | 0.1005 | nan | 0.0377 | nan | 0.2463 | 0.3173 | 0.3320 |
+ | 0.0716 | 22.75 | 3640 | 0.4071 | 0.3209 | 0.3750 | 0.5386 | nan | 0.6443 | 0.3817 | 0.5979 | 0.6464 | 0.1080 | nan | 0.0381 | nan | 0.2470 | 0.3450 | 0.3667 | 0.0 | 0.6288 | 0.3727 | 0.5835 | 0.6006 | 0.1032 | nan | 0.0374 | nan | 0.2315 | 0.3057 | 0.3453 |
+ | 0.0869 | 22.88 | 3660 | 0.4162 | 0.3118 | 0.3636 | 0.5269 | nan | 0.6438 | 0.3654 | 0.5745 | 0.6012 | 0.1058 | nan | 0.0359 | nan | 0.2423 | 0.3257 | 0.3781 | 0.0 | 0.6281 | 0.3557 | 0.5610 | 0.5643 | 0.1014 | nan | 0.0353 | nan | 0.2266 | 0.2902 | 0.3554 |
+ | 0.0846 | 23.0 | 3680 | 0.4079 | 0.3326 | 0.3902 | 0.5584 | nan | 0.6786 | 0.3830 | 0.5896 | 0.6438 | 0.1063 | nan | 0.0384 | nan | 0.2539 | 0.3928 | 0.4253 | 0.0 | 0.6605 | 0.3742 | 0.5765 | 0.5994 | 0.1019 | nan | 0.0376 | nan | 0.2376 | 0.3425 | 0.3957 |
+ | 0.1137 | 23.12 | 3700 | 0.4062 | 0.3270 | 0.3827 | 0.5498 | nan | 0.6647 | 0.3757 | 0.6069 | 0.6502 | 0.1083 | nan | 0.0380 | nan | 0.2465 | 0.3729 | 0.3814 | 0.0 | 0.6474 | 0.3675 | 0.5903 | 0.6042 | 0.1038 | nan | 0.0372 | nan | 0.2317 | 0.3290 | 0.3588 |
+ | 0.109 | 23.25 | 3720 | 0.4160 | 0.3217 | 0.3761 | 0.5445 | nan | 0.6679 | 0.3698 | 0.5816 | 0.6386 | 0.1013 | nan | 0.0424 | nan | 0.2568 | 0.3538 | 0.3723 | 0.0 | 0.6503 | 0.3618 | 0.5679 | 0.5943 | 0.0978 | nan | 0.0415 | nan | 0.2394 | 0.3129 | 0.3514 |
+ | 0.0314 | 23.38 | 3740 | 0.4166 | 0.3156 | 0.3685 | 0.5342 | nan | 0.6615 | 0.3651 | 0.5650 | 0.6126 | 0.1027 | nan | 0.0433 | nan | 0.2581 | 0.3479 | 0.3605 | 0.0 | 0.6445 | 0.3558 | 0.5521 | 0.5746 | 0.0988 | nan | 0.0424 | nan | 0.2393 | 0.3083 | 0.3401 |
+ | 0.1021 | 23.5 | 3760 | 0.4215 | 0.3218 | 0.3774 | 0.5350 | nan | 0.6494 | 0.3625 | 0.5961 | 0.6217 | 0.1038 | nan | 0.0448 | nan | 0.2675 | 0.3971 | 0.3539 | 0.0 | 0.6335 | 0.3532 | 0.5796 | 0.5818 | 0.0997 | nan | 0.0438 | nan | 0.2465 | 0.3460 | 0.3336 |
+ | 0.0086 | 23.62 | 3780 | 0.4121 | 0.3224 | 0.3779 | 0.5430 | nan | 0.6630 | 0.3837 | 0.5842 | 0.6257 | 0.1028 | nan | 0.0432 | nan | 0.2651 | 0.3702 | 0.3634 | 0.0 | 0.6461 | 0.3725 | 0.5697 | 0.5847 | 0.0990 | nan | 0.0423 | nan | 0.2451 | 0.3231 | 0.3413 |
+ | 0.1086 | 23.75 | 3800 | 0.4078 | 0.3309 | 0.3892 | 0.5521 | nan | 0.6630 | 0.3944 | 0.5991 | 0.6368 | 0.1092 | nan | 0.0444 | nan | 0.2721 | 0.3831 | 0.4007 | 0.0 | 0.6464 | 0.3824 | 0.5830 | 0.5932 | 0.1047 | nan | 0.0435 | nan | 0.2511 | 0.3331 | 0.3716 |
+ | 0.1121 | 23.88 | 3820 | 0.4119 | 0.3221 | 0.3780 | 0.5369 | nan | 0.6487 | 0.3695 | 0.5765 | 0.6271 | 0.1155 | nan | 0.0431 | nan | 0.2546 | 0.3730 | 0.3941 | 0.0 | 0.6329 | 0.3588 | 0.5643 | 0.5842 | 0.1105 | nan | 0.0422 | nan | 0.2368 | 0.3252 | 0.3666 |
+ | 0.1118 | 24.0 | 3840 | 0.4161 | 0.3239 | 0.3799 | 0.5430 | nan | 0.6645 | 0.3623 | 0.5649 | 0.6499 | 0.1168 | nan | 0.0422 | nan | 0.2492 | 0.3927 | 0.3769 | 0.0 | 0.6471 | 0.3532 | 0.5559 | 0.6011 | 0.1122 | nan | 0.0413 | nan | 0.2332 | 0.3407 | 0.3547 |
+ | 0.0399 | 24.12 | 3860 | 0.4107 | 0.3313 | 0.3888 | 0.5557 | nan | 0.6795 | 0.3847 | 0.5899 | 0.6415 | 0.1208 | nan | 0.0444 | nan | 0.2652 | 0.3843 | 0.3891 | 0.0 | 0.6607 | 0.3748 | 0.5767 | 0.5956 | 0.1152 | nan | 0.0435 | nan | 0.2462 | 0.3345 | 0.3660 |
+ | 0.0892 | 24.25 | 3880 | 0.4233 | 0.3200 | 0.3743 | 0.5371 | nan | 0.6571 | 0.3608 | 0.5803 | 0.6290 | 0.1144 | nan | 0.0414 | nan | 0.2556 | 0.3617 | 0.3685 | 0.0 | 0.6402 | 0.3525 | 0.5683 | 0.5850 | 0.1098 | nan | 0.0406 | nan | 0.2377 | 0.3184 | 0.3476 |
+ | 0.0504 | 24.38 | 3900 | 0.4126 | 0.3272 | 0.3838 | 0.5469 | nan | 0.6627 | 0.3726 | 0.5902 | 0.6472 | 0.1122 | nan | 0.0434 | nan | 0.2626 | 0.3820 | 0.3810 | 0.0 | 0.6457 | 0.3638 | 0.5776 | 0.5995 | 0.1075 | nan | 0.0425 | nan | 0.2436 | 0.3331 | 0.3583 |
+ | 0.1415 | 24.5 | 3920 | 0.4125 | 0.3313 | 0.3890 | 0.5544 | nan | 0.6739 | 0.3827 | 0.5877 | 0.6506 | 0.1154 | nan | 0.0461 | nan | 0.2685 | 0.3880 | 0.3885 | 0.0 | 0.6554 | 0.3725 | 0.5757 | 0.6023 | 0.1107 | nan | 0.0451 | nan | 0.2488 | 0.3378 | 0.3647 |
+ | 0.0919 | 24.62 | 3940 | 0.4155 | 0.3205 | 0.3752 | 0.5383 | nan | 0.6617 | 0.3538 | 0.5750 | 0.6351 | 0.1112 | nan | 0.0444 | nan | 0.2523 | 0.3764 | 0.3666 | 0.0 | 0.6444 | 0.3457 | 0.5639 | 0.5899 | 0.1069 | nan | 0.0434 | nan | 0.2354 | 0.3296 | 0.3458 |
+ | 0.0443 | 24.75 | 3960 | 0.4101 | 0.3270 | 0.3837 | 0.5447 | nan | 0.6616 | 0.3704 | 0.5699 | 0.6538 | 0.1158 | nan | 0.0460 | nan | 0.2670 | 0.3933 | 0.3753 | 0.0 | 0.6444 | 0.3607 | 0.5605 | 0.6039 | 0.1111 | nan | 0.0450 | nan | 0.2475 | 0.3427 | 0.3542 |
+ | 0.1815 | 24.88 | 3980 | 0.4182 | 0.3204 | 0.3750 | 0.5340 | nan | 0.6499 | 0.3529 | 0.5731 | 0.6316 | 0.1129 | nan | 0.0422 | nan | 0.2545 | 0.3768 | 0.3816 | 0.0 | 0.6337 | 0.3452 | 0.5621 | 0.5873 | 0.1083 | nan | 0.0413 | nan | 0.2370 | 0.3306 | 0.3586 |
+ | 0.005 | 25.0 | 4000 | 0.4155 | 0.3349 | 0.3935 | 0.5591 | nan | 0.6815 | 0.3865 | 0.5805 | 0.6544 | 0.1155 | nan | 0.0497 | nan | 0.2779 | 0.3995 | 0.3959 | 0.0 | 0.6626 | 0.3764 | 0.5699 | 0.6056 | 0.1108 | nan | 0.0485 | nan | 0.2565 | 0.3465 | 0.3718 |
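The per-category `Iou` columns in the log above contain `nan` entries for classes that never appear in the validation ground truth, and the reported `Mean Iou` averages only the non-nan categories. A minimal stdlib sketch of that relationship (the `mean_iou` helper is illustrative, not the exact `evaluate` implementation), checked against the last logged row (epoch 25.0):

```python
import math

def mean_iou(per_category_iou):
    # Average only the categories whose IoU is defined (skip nan entries).
    valid = [v for v in per_category_iou if not math.isnan(v)]
    return sum(valid) / len(valid)

# Per-category IoU values from the epoch-25.0 row above (two nan categories).
row = [0.0, 0.6626, 0.3764, 0.5699, 0.6056, 0.1108, math.nan,
       0.0485, math.nan, 0.2565, 0.3465, 0.3718]
print(round(mean_iou(row), 4))  # 0.3349, the reported Mean Iou for this row
```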
+
+
+ ### Framework versions
+
+ - Transformers 4.37.1
+ - Pytorch 2.1.2+cu121
+ - Datasets 2.16.1
+ - Tokenizers 0.15.0
config.json ADDED
@@ -0,0 +1,98 @@
1
+ {
2
+ "_name_or_path": "nvidia/mit-b0",
3
+ "architectures": [
4
+ "SegformerForSemanticSegmentation"
5
+ ],
6
+ "attention_probs_dropout_prob": 0.0,
7
+ "classifier_dropout_prob": 0.1,
8
+ "decoder_hidden_size": 256,
9
+ "depths": [
10
+ 2,
11
+ 2,
12
+ 2,
13
+ 2
14
+ ],
15
+ "downsampling_rates": [
16
+ 1,
17
+ 4,
18
+ 8,
19
+ 16
20
+ ],
21
+ "drop_path_rate": 0.1,
22
+ "hidden_act": "gelu",
23
+ "hidden_dropout_prob": 0.0,
24
+ "hidden_sizes": [
25
+ 32,
26
+ 64,
27
+ 160,
28
+ 256
29
+ ],
30
+ "id2label": {
31
+ "0": "unlabeled",
32
+ "1": "LV",
33
+ "2": "RV",
34
+ "3": "RA",
35
+ "4": "LA",
36
+ "5": "VS",
37
+ "6": "AS",
38
+ "7": "MK",
39
+ "8": "TK",
40
+ "9": "ASD",
41
+ "10": "VSD",
42
+ "11": "AK"
43
+ },
44
+ "image_size": 224,
45
+ "initializer_range": 0.02,
46
+ "label2id": {
47
+ "AK": 11,
48
+ "AS": 6,
49
+ "ASD": 9,
50
+ "LA": 4,
51
+ "LV": 1,
52
+ "MK": 7,
53
+ "RA": 3,
54
+ "RV": 2,
55
+ "TK": 8,
56
+ "VS": 5,
57
+ "VSD": 10,
58
+ "unlabeled": 0
59
+ },
60
+ "layer_norm_eps": 1e-06,
61
+ "mlp_ratios": [
62
+ 4,
63
+ 4,
64
+ 4,
65
+ 4
66
+ ],
67
+ "model_type": "segformer",
68
+ "num_attention_heads": [
69
+ 1,
70
+ 2,
71
+ 5,
72
+ 8
73
+ ],
74
+ "num_channels": 3,
75
+ "num_encoder_blocks": 4,
76
+ "patch_sizes": [
77
+ 7,
78
+ 3,
79
+ 3,
80
+ 3
81
+ ],
82
+ "reshape_last_stage": true,
83
+ "semantic_loss_ignore_index": 255,
84
+ "sr_ratios": [
85
+ 8,
86
+ 4,
87
+ 2,
88
+ 1
89
+ ],
90
+ "strides": [
91
+ 4,
92
+ 2,
93
+ 2,
94
+ 2
95
+ ],
96
+ "torch_dtype": "float32",
97
+ "transformers_version": "4.37.1"
98
+ }
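The `id2label` map in the config above is what ties the model's integer segmentation output back to cardiac structure names. A minimal decoding sketch (label ids taken from the config; the example mask values are hypothetical):

```python
# id2label as declared in config.json above.
id2label = {
    0: "unlabeled", 1: "LV", 2: "RV", 3: "RA", 4: "LA", 5: "VS",
    6: "AS", 7: "MK", 8: "TK", 9: "ASD", 10: "VSD", 11: "AK",
}

def decode_mask(mask):
    """Map a 2-D grid of predicted class ids to structure names."""
    return [[id2label[cls] for cls in row] for row in mask]

# Hypothetical 2x3 patch of model predictions.
pred = [[1, 1, 5], [5, 2, 2]]
print(decode_mask(pred))
# [['LV', 'LV', 'VS'], ['VS', 'RV', 'RV']]
```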
model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:fd9e30c3eb7cc7d1ae85a4ada4c796c255a800c1d2462800a40698597c605e98
3
+ size 14895064
tmp-checkpoint-100/config.json ADDED
@@ -0,0 +1,98 @@
1
+ {
2
+ "_name_or_path": "nvidia/mit-b0",
3
+ "architectures": [
4
+ "SegformerForSemanticSegmentation"
5
+ ],
6
+ "attention_probs_dropout_prob": 0.0,
7
+ "classifier_dropout_prob": 0.1,
8
+ "decoder_hidden_size": 256,
9
+ "depths": [
10
+ 2,
11
+ 2,
12
+ 2,
13
+ 2
14
+ ],
15
+ "downsampling_rates": [
16
+ 1,
17
+ 4,
18
+ 8,
19
+ 16
20
+ ],
21
+ "drop_path_rate": 0.1,
22
+ "hidden_act": "gelu",
23
+ "hidden_dropout_prob": 0.0,
24
+ "hidden_sizes": [
25
+ 32,
26
+ 64,
27
+ 160,
28
+ 256
29
+ ],
30
+ "id2label": {
31
+ "0": "unlabeled",
32
+ "1": "LV",
33
+ "2": "RV",
34
+ "3": "RA",
35
+ "4": "LA",
36
+ "5": "VS",
37
+ "6": "AS",
38
+ "7": "MK",
39
+ "8": "TK",
40
+ "9": "ASD",
41
+ "10": "VSD",
42
+ "11": "AK"
43
+ },
44
+ "image_size": 224,
45
+ "initializer_range": 0.02,
46
+ "label2id": {
47
+ "AK": 11,
48
+ "AS": 6,
49
+ "ASD": 9,
50
+ "LA": 4,
51
+ "LV": 1,
52
+ "MK": 7,
53
+ "RA": 3,
54
+ "RV": 2,
55
+ "TK": 8,
56
+ "VS": 5,
57
+ "VSD": 10,
58
+ "unlabeled": 0
59
+ },
60
+ "layer_norm_eps": 1e-06,
61
+ "mlp_ratios": [
62
+ 4,
63
+ 4,
64
+ 4,
65
+ 4
66
+ ],
67
+ "model_type": "segformer",
68
+ "num_attention_heads": [
69
+ 1,
70
+ 2,
71
+ 5,
72
+ 8
73
+ ],
74
+ "num_channels": 3,
75
+ "num_encoder_blocks": 4,
76
+ "patch_sizes": [
77
+ 7,
78
+ 3,
79
+ 3,
80
+ 3
81
+ ],
82
+ "reshape_last_stage": true,
83
+ "semantic_loss_ignore_index": 255,
84
+ "sr_ratios": [
85
+ 8,
86
+ 4,
87
+ 2,
88
+ 1
89
+ ],
90
+ "strides": [
91
+ 4,
92
+ 2,
93
+ 2,
94
+ 2
95
+ ],
96
+ "torch_dtype": "float32",
97
+ "transformers_version": "4.37.1"
98
+ }
tmp-checkpoint-100/model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:f7f46049c1dab2c785257d9e74feef1c8f25be9e5d53316a509adb4529e1de35
3
+ size 14895064
tmp-checkpoint-100/optimizer.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:99b3a452b72319a5609f78e2513812e4f955a4f3ef2eb09ba0a4621c0980f250
3
+ size 29908730
tmp-checkpoint-100/rng_state.pth ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:a80b6b9f808f2068d87c34ef84a3deb4364ecc798e9c5134b59d2c64cc7b3e81
3
+ size 14244
tmp-checkpoint-100/scheduler.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:2efa67f56e073a99be558f038b62b808faf8b11b3e9cdf42d4946ca9680a3b19
3
+ size 1064
tmp-checkpoint-100/trainer_state.json ADDED
@@ -0,0 +1,796 @@
1
+ {
2
+ "best_metric": 0.9343371987342834,
3
+ "best_model_checkpoint": "segments-ECHO\\checkpoint-100",
4
+ "epoch": 0.625,
5
+ "eval_steps": 20,
6
+ "global_step": 100,
7
+ "is_hyper_param_search": false,
8
+ "is_local_process_zero": true,
9
+ "is_world_process_zero": true,
10
+ "log_history": [
11
+ {
12
+ "epoch": 0.01,
13
+ "learning_rate": 9.9975e-05,
14
+ "loss": 2.5254,
15
+ "step": 1
16
+ },
17
+ {
18
+ "epoch": 0.01,
19
+ "learning_rate": 9.995e-05,
20
+ "loss": 2.5039,
21
+ "step": 2
22
+ },
23
+ {
24
+ "epoch": 0.02,
25
+ "learning_rate": 9.9925e-05,
26
+ "loss": 2.4627,
27
+ "step": 3
28
+ },
29
+ {
30
+ "epoch": 0.03,
31
+ "learning_rate": 9.99e-05,
32
+ "loss": 2.4211,
33
+ "step": 4
34
+ },
35
+ {
36
+ "epoch": 0.03,
37
+ "learning_rate": 9.9875e-05,
38
+ "loss": 2.3994,
39
+ "step": 5
40
+ },
41
+ {
42
+ "epoch": 0.04,
43
+ "learning_rate": 9.985000000000001e-05,
44
+ "loss": 2.4166,
45
+ "step": 6
46
+ },
47
+ {
48
+ "epoch": 0.04,
49
+ "learning_rate": 9.9825e-05,
50
+ "loss": 2.3561,
51
+ "step": 7
52
+ },
53
+ {
54
+ "epoch": 0.05,
55
+ "learning_rate": 9.98e-05,
56
+ "loss": 2.2661,
57
+ "step": 8
58
+ },
59
+ {
60
+ "epoch": 0.06,
61
+ "learning_rate": 9.977500000000001e-05,
62
+ "loss": 2.3066,
63
+ "step": 9
64
+ },
65
+ {
66
+ "epoch": 0.06,
67
+ "learning_rate": 9.975000000000001e-05,
68
+ "loss": 2.2332,
69
+ "step": 10
70
+ },
71
+ {
72
+ "epoch": 0.07,
73
+ "learning_rate": 9.9725e-05,
74
+ "loss": 2.2073,
75
+ "step": 11
76
+ },
77
+ {
78
+ "epoch": 0.07,
79
+ "learning_rate": 9.970000000000001e-05,
80
+ "loss": 2.2208,
81
+ "step": 12
82
+ },
83
+ {
84
+ "epoch": 0.08,
85
+ "learning_rate": 9.967500000000001e-05,
86
+ "loss": 2.1216,
87
+ "step": 13
88
+ },
89
+ {
90
+ "epoch": 0.09,
91
+ "learning_rate": 9.965000000000001e-05,
92
+ "loss": 2.1051,
93
+ "step": 14
94
+ },
95
+ {
96
+ "epoch": 0.09,
97
+ "learning_rate": 9.9625e-05,
98
+ "loss": 2.1227,
99
+ "step": 15
100
+ },
101
+ {
102
+ "epoch": 0.1,
103
+ "learning_rate": 9.960000000000001e-05,
104
+ "loss": 2.0446,
105
+ "step": 16
106
+ },
107
+ {
108
+ "epoch": 0.11,
109
+ "learning_rate": 9.957500000000001e-05,
110
+ "loss": 2.0619,
111
+ "step": 17
112
+ },
113
+ {
114
+ "epoch": 0.11,
115
+ "learning_rate": 9.955000000000001e-05,
116
+ "loss": 2.015,
117
+ "step": 18
118
+ },
119
+ {
120
+ "epoch": 0.12,
121
+ "learning_rate": 9.952500000000001e-05,
122
+ "loss": 1.9739,
123
+ "step": 19
124
+ },
125
+ {
126
+ "epoch": 0.12,
127
+ "learning_rate": 9.95e-05,
128
+ "loss": 2.0322,
129
+ "step": 20
130
+ },
131
+ {
132
+ "epoch": 0.12,
133
+ "eval_accuracy_AK": 0.6263424445960992,
134
+ "eval_accuracy_AS": NaN,
135
+ "eval_accuracy_ASD": 0.0,
136
+ "eval_accuracy_LA": 0.017283506468198647,
137
+ "eval_accuracy_LV": 0.3903317252296394,
138
+ "eval_accuracy_MK": 0.0010812535450935904,
139
+ "eval_accuracy_RA": 0.08495028901734104,
140
+ "eval_accuracy_RV": 0.46802064873474486,
141
+ "eval_accuracy_TK": NaN,
142
+ "eval_accuracy_VS": 0.0,
143
+ "eval_accuracy_VSD": 0.10870013111069489,
144
+ "eval_accuracy_unlabeled": NaN,
145
+ "eval_iou_AK": 0.3646572320303637,
146
+ "eval_iou_AS": NaN,
147
+ "eval_iou_ASD": 0.0,
148
+ "eval_iou_LA": 0.017185325401453597,
149
+ "eval_iou_LV": 0.29703004778944986,
150
+ "eval_iou_MK": 0.0010515790925389602,
151
+ "eval_iou_RA": 0.07824425855254191,
152
+ "eval_iou_RV": 0.20849867275659106,
153
+ "eval_iou_TK": 0.0,
154
+ "eval_iou_VS": 0.0,
155
+ "eval_iou_VSD": 0.08233900373139623,
156
+ "eval_iou_unlabeled": 0.0,
157
+ "eval_loss": 2.2124316692352295,
158
+ "eval_mean_accuracy": 0.1885233331890902,
159
+ "eval_mean_iou": 0.09536419266857593,
160
+ "eval_overall_accuracy": 0.30325268684745055,
161
+ "eval_runtime": 4.3378,
162
+ "eval_samples_per_second": 18.673,
163
+ "eval_steps_per_second": 9.452,
164
+ "step": 20
165
+ },
166
+ {
167
+ "epoch": 0.13,
168
+ "learning_rate": 9.9475e-05,
169
+ "loss": 1.9305,
170
+ "step": 21
171
+ },
172
+ {
173
+ "epoch": 0.14,
174
+ "learning_rate": 9.945e-05,
175
+ "loss": 1.8974,
176
+ "step": 22
177
+ },
178
+ {
179
+ "epoch": 0.14,
180
+ "learning_rate": 9.9425e-05,
181
+ "loss": 1.8201,
182
+ "step": 23
183
+ },
184
+ {
185
+ "epoch": 0.15,
186
+ "learning_rate": 9.94e-05,
187
+ "loss": 1.9195,
188
+ "step": 24
189
+ },
190
+ {
191
+ "epoch": 0.16,
192
+ "learning_rate": 9.9375e-05,
193
+ "loss": 1.8381,
194
+ "step": 25
195
+ },
196
+ {
197
+ "epoch": 0.16,
198
+ "learning_rate": 9.935000000000002e-05,
199
+ "loss": 1.8615,
200
+ "step": 26
201
+ },
202
+ {
203
+ "epoch": 0.17,
204
+ "learning_rate": 9.9325e-05,
205
+ "loss": 1.9896,
206
+ "step": 27
207
+ },
208
+ {
209
+ "epoch": 0.17,
210
+ "learning_rate": 9.93e-05,
211
+ "loss": 1.7221,
212
+ "step": 28
213
+ },
214
+ {
215
+ "epoch": 0.18,
216
+ "learning_rate": 9.9275e-05,
217
+ "loss": 1.7829,
218
+ "step": 29
219
+ },
220
+ {
221
+ "epoch": 0.19,
222
+ "learning_rate": 9.925000000000001e-05,
223
+ "loss": 1.7313,
224
+ "step": 30
225
+ },
226
+ {
227
+ "epoch": 0.19,
228
+ "learning_rate": 9.9225e-05,
229
+ "loss": 1.7434,
230
+ "step": 31
231
+ },
232
+ {
233
+ "epoch": 0.2,
234
+ "learning_rate": 9.92e-05,
235
+ "loss": 1.7066,
236
+ "step": 32
237
+ },
238
+ {
239
+ "epoch": 0.21,
240
+ "learning_rate": 9.917500000000001e-05,
241
+ "loss": 1.7412,
242
+ "step": 33
243
+ },
244
+ {
245
+ "epoch": 0.21,
246
+ "learning_rate": 9.915000000000001e-05,
247
+ "loss": 1.6489,
248
+ "step": 34
249
+ },
250
+ {
251
+ "epoch": 0.22,
252
+ "learning_rate": 9.9125e-05,
253
+ "loss": 1.7579,
254
+ "step": 35
255
+ },
256
+ {
257
+ "epoch": 0.23,
258
+ "learning_rate": 9.910000000000001e-05,
259
+ "loss": 1.5982,
260
+ "step": 36
261
+ },
262
+ {
263
+ "epoch": 0.23,
264
+ "learning_rate": 9.907500000000001e-05,
265
+ "loss": 1.6152,
266
+ "step": 37
267
+ },
268
+ {
269
+ "epoch": 0.24,
270
+ "learning_rate": 9.905000000000001e-05,
271
+ "loss": 1.5696,
272
+ "step": 38
273
+ },
274
+ {
275
+ "epoch": 0.24,
276
+ "learning_rate": 9.9025e-05,
277
+ "loss": 1.5817,
278
+ "step": 39
279
+ },
280
+ {
281
+ "epoch": 0.25,
282
+ "learning_rate": 9.900000000000001e-05,
283
+ "loss": 1.6027,
284
+ "step": 40
285
+ },
286
+ {
287
+ "epoch": 0.25,
288
+ "eval_accuracy_AK": 0.4180375465440657,
289
+ "eval_accuracy_AS": NaN,
290
+ "eval_accuracy_ASD": 8.349196807267141e-05,
291
+ "eval_accuracy_LA": 0.08391232330500217,
292
+ "eval_accuracy_LV": 0.5149157899270969,
293
+ "eval_accuracy_MK": 0.0,
294
+ "eval_accuracy_RA": 0.026379190751445087,
295
+ "eval_accuracy_RV": 0.006093098673401302,
296
+ "eval_accuracy_TK": NaN,
297
+ "eval_accuracy_VS": 0.0,
298
+ "eval_accuracy_VSD": 0.0014422176437535118,
299
+ "eval_accuracy_unlabeled": NaN,
300
+ "eval_iou_AK": 0.3342379533570258,
301
+ "eval_iou_AS": NaN,
302
+ "eval_iou_ASD": 8.348987685243165e-05,
303
+ "eval_iou_LA": 0.07866686340848168,
304
+ "eval_iou_LV": 0.3418158910572593,
305
+ "eval_iou_MK": 0.0,
306
+ "eval_iou_RA": 0.026221859593881795,
307
+ "eval_iou_RV": 0.006072132718195484,
308
+ "eval_iou_TK": NaN,
309
+ "eval_iou_VS": 0.0,
310
+ "eval_iou_VSD": 0.0014407739014099,
311
+ "eval_iou_unlabeled": 0.0,
312
+ "eval_loss": 1.5649173259735107,
313
+ "eval_mean_accuracy": 0.11676262875698191,
314
+ "eval_mean_iou": 0.07885389639131064,
315
+ "eval_overall_accuracy": 0.2640260828145788,
316
+ "eval_runtime": 4.3349,
317
+ "eval_samples_per_second": 18.686,
318
+ "eval_steps_per_second": 9.458,
319
+ "step": 40
320
+ },
321
+ {
322
+ "epoch": 0.26,
323
+ "learning_rate": 9.897500000000001e-05,
324
+ "loss": 1.6129,
325
+ "step": 41
326
+ },
327
+ {
328
+ "epoch": 0.26,
329
+ "learning_rate": 9.895e-05,
330
+ "loss": 1.5371,
331
+ "step": 42
332
+ },
333
+ {
334
+ "epoch": 0.27,
335
+ "learning_rate": 9.8925e-05,
336
+ "loss": 1.4685,
337
+ "step": 43
338
+ },
339
+ {
340
+ "epoch": 0.28,
341
+ "learning_rate": 9.89e-05,
342
+ "loss": 1.4916,
343
+ "step": 44
344
+ },
345
+ {
346
+ "epoch": 0.28,
347
+ "learning_rate": 9.8875e-05,
348
+ "loss": 1.4807,
349
+ "step": 45
350
+ },
351
+ {
352
+ "epoch": 0.29,
353
+ "learning_rate": 9.885e-05,
354
+ "loss": 1.6242,
355
+ "step": 46
356
+ },
357
+ {
358
+ "epoch": 0.29,
359
+ "learning_rate": 9.8825e-05,
360
+ "loss": 1.4578,
361
+ "step": 47
362
+ },
363
+ {
364
+ "epoch": 0.3,
365
+ "learning_rate": 9.88e-05,
366
+ "loss": 1.3899,
367
+ "step": 48
368
+ },
369
+ {
370
+ "epoch": 0.31,
371
+ "learning_rate": 9.8775e-05,
372
+ "loss": 1.4259,
373
+ "step": 49
374
+ },
375
+ {
376
+ "epoch": 0.31,
377
+ "learning_rate": 9.875000000000002e-05,
378
+ "loss": 1.3936,
379
+ "step": 50
380
+ },
381
+ {
382
+ "epoch": 0.32,
383
+ "learning_rate": 9.8725e-05,
384
+ "loss": 1.4484,
385
+ "step": 51
386
+ },
387
+ {
388
+ "epoch": 0.33,
389
+ "learning_rate": 9.87e-05,
390
+ "loss": 1.4465,
391
+ "step": 52
392
+ },
393
+ {
394
+ "epoch": 0.33,
395
+ "learning_rate": 9.8675e-05,
396
+ "loss": 1.3384,
397
+ "step": 53
398
+ },
399
+ {
400
+ "epoch": 0.34,
401
+ "learning_rate": 9.865000000000001e-05,
402
+ "loss": 1.3987,
403
+ "step": 54
404
+ },
405
+ {
406
+ "epoch": 0.34,
407
+ "learning_rate": 9.8625e-05,
408
+ "loss": 1.325,
409
+ "step": 55
410
+ },
411
+ {
412
+ "epoch": 0.35,
413
+ "learning_rate": 9.86e-05,
414
+ "loss": 1.3986,
415
+ "step": 56
416
+ },
417
+ {
418
+ "epoch": 0.36,
419
+ "learning_rate": 9.857500000000001e-05,
420
+ "loss": 1.3668,
421
+ "step": 57
422
+ },
423
+ {
424
+ "epoch": 0.36,
425
+ "learning_rate": 9.855000000000001e-05,
426
+ "loss": 1.3011,
427
+ "step": 58
428
+ },
429
+ {
430
+ "epoch": 0.37,
431
+ "learning_rate": 9.8525e-05,
432
+ "loss": 1.3793,
433
+ "step": 59
434
+ },
435
+ {
436
+ "epoch": 0.38,
437
+ "learning_rate": 9.850000000000001e-05,
438
+ "loss": 1.2877,
439
+ "step": 60
440
+ },
441
+ {
442
+ "epoch": 0.38,
443
+ "eval_accuracy_AK": 0.39300851262526487,
444
+ "eval_accuracy_AS": NaN,
445
+ "eval_accuracy_ASD": 0.0,
446
+ "eval_accuracy_LA": 0.24209088456176642,
447
+ "eval_accuracy_LV": 0.46650240138698945,
448
+ "eval_accuracy_MK": 0.0,
449
+ "eval_accuracy_RA": 0.05467976878612717,
450
+ "eval_accuracy_RV": 0.005259041892968094,
451
+ "eval_accuracy_TK": NaN,
452
+ "eval_accuracy_VS": 0.0,
453
+ "eval_accuracy_VSD": 0.005047761753137292,
454
+ "eval_accuracy_unlabeled": NaN,
455
+ "eval_iou_AK": 0.33124112303256126,
456
+ "eval_iou_AS": NaN,
457
+ "eval_iou_ASD": 0.0,
458
+ "eval_iou_LA": 0.18770266500854244,
459
+ "eval_iou_LV": 0.3611907523728232,
460
+ "eval_iou_MK": 0.0,
461
+ "eval_iou_RA": 0.0528826157540955,
462
+ "eval_iou_RV": 0.005251120698764411,
463
+ "eval_iou_TK": NaN,
464
+ "eval_iou_VS": 0.0,
465
+ "eval_iou_VSD": 0.004983818770226537,
466
+ "eval_iou_unlabeled": 0.0,
467
+ "eval_loss": 1.261634111404419,
468
+ "eval_mean_accuracy": 0.12962093011180592,
469
+ "eval_mean_iou": 0.09432520956370134,
470
+ "eval_overall_accuracy": 0.26848080197537355,
471
+ "eval_runtime": 4.3522,
472
+ "eval_samples_per_second": 18.611,
473
+ "eval_steps_per_second": 9.421,
474
+ "step": 60
475
+ },
476
+ {
477
+ "epoch": 0.38,
478
+ "learning_rate": 9.847500000000001e-05,
479
+ "loss": 1.3054,
480
+ "step": 61
481
+ },
482
+ {
483
+ "epoch": 0.39,
484
+ "learning_rate": 9.845000000000001e-05,
485
+ "loss": 1.2689,
486
+ "step": 62
487
+ },
488
+ {
489
+ "epoch": 0.39,
490
+ "learning_rate": 9.842500000000001e-05,
491
+ "loss": 1.2455,
492
+ "step": 63
493
+ },
494
+ {
495
+ "epoch": 0.4,
496
+ "learning_rate": 9.84e-05,
497
+ "loss": 1.3569,
498
+ "step": 64
499
+ },
500
+ {
501
+ "epoch": 0.41,
502
+ "learning_rate": 9.8375e-05,
503
+ "loss": 1.3127,
504
+ "step": 65
505
+ },
506
+ {
507
+ "epoch": 0.41,
508
+ "learning_rate": 9.835e-05,
509
+ "loss": 1.2319,
510
+ "step": 66
511
+ },
512
+ {
513
+ "epoch": 0.42,
514
+ "learning_rate": 9.8325e-05,
515
+ "loss": 1.1801,
516
+ "step": 67
517
+ },
518
+ {
519
+ "epoch": 0.42,
520
+ "learning_rate": 9.83e-05,
521
+ "loss": 1.2914,
522
+ "step": 68
523
+ },
524
+ {
525
+ "epoch": 0.43,
526
+ "learning_rate": 9.8275e-05,
527
+ "loss": 1.2394,
528
+ "step": 69
529
+ },
530
+ {
531
+ "epoch": 0.44,
532
+ "learning_rate": 9.825e-05,
533
+ "loss": 1.1518,
534
+ "step": 70
535
+ },
536
+ {
537
+ "epoch": 0.44,
538
+ "learning_rate": 9.8225e-05,
539
+ "loss": 1.266,
540
+ "step": 71
541
+ },
542
+ {
543
+ "epoch": 0.45,
544
+ "learning_rate": 9.82e-05,
545
+ "loss": 1.234,
546
+ "step": 72
547
+ },
548
+ {
549
+ "epoch": 0.46,
550
+ "learning_rate": 9.8175e-05,
551
+ "loss": 1.1306,
552
+ "step": 73
553
+ },
554
+ {
555
+ "epoch": 0.46,
556
+ "learning_rate": 9.815000000000001e-05,
557
+ "loss": 1.106,
558
+ "step": 74
559
+ },
560
+ {
561
+ "epoch": 0.47,
562
+ "learning_rate": 9.8125e-05,
563
+ "loss": 1.0982,
564
+ "step": 75
565
+ },
566
+ {
567
+ "epoch": 0.47,
568
+ "learning_rate": 9.81e-05,
569
+ "loss": 1.2036,
570
+ "step": 76
571
+ },
572
+ {
573
+ "epoch": 0.48,
574
+ "learning_rate": 9.807500000000001e-05,
575
+ "loss": 1.1903,
576
+ "step": 77
577
+ },
578
+ {
579
+ "epoch": 0.49,
580
+ "learning_rate": 9.805000000000001e-05,
581
+ "loss": 1.2332,
582
+ "step": 78
583
+ },
584
+ {
585
+ "epoch": 0.49,
586
+ "learning_rate": 9.8025e-05,
587
+ "loss": 1.1477,
588
+ "step": 79
589
+ },
590
+ {
591
+ "epoch": 0.5,
592
+ "learning_rate": 9.8e-05,
593
+ "loss": 1.0981,
594
+ "step": 80
595
+ },
596
+ {
597
+ "epoch": 0.5,
598
+ "eval_accuracy_AK": 0.4862672557259677,
599
+ "eval_accuracy_AS": NaN,
600
+ "eval_accuracy_ASD": 0.0,
601
+ "eval_accuracy_LA": 0.07940032378468319,
602
+ "eval_accuracy_LV": 0.8150770489861409,
603
+ "eval_accuracy_MK": 0.0,
604
+ "eval_accuracy_RA": 0.008226589595375722,
605
+ "eval_accuracy_RV": 0.007904705873912139,
606
+ "eval_accuracy_TK": NaN,
607
+ "eval_accuracy_VS": 0.0,
608
+ "eval_accuracy_VSD": 0.0,
609
+ "eval_accuracy_unlabeled": NaN,
610
+ "eval_iou_AK": 0.4019844608304053,
611
+ "eval_iou_AS": NaN,
612
+ "eval_iou_ASD": 0.0,
613
+ "eval_iou_LA": 0.07502230751798765,
614
+ "eval_iou_LV": 0.47365251006325026,
615
+ "eval_iou_MK": 0.0,
616
+ "eval_iou_RA": 0.008195399704246958,
617
+ "eval_iou_RV": 0.00790374185772906,
618
+ "eval_iou_TK": NaN,
619
+ "eval_iou_VS": 0.0,
620
+ "eval_iou_VSD": 0.0,
621
+ "eval_iou_unlabeled": 0.0,
622
+ "eval_loss": 1.2207798957824707,
623
+ "eval_mean_accuracy": 0.15520843599623105,
624
+ "eval_mean_iou": 0.09667584199736193,
625
+ "eval_overall_accuracy": 0.3897765659817464,
626
+ "eval_runtime": 4.3143,
627
+ "eval_samples_per_second": 18.775,
628
+ "eval_steps_per_second": 9.503,
629
+ "step": 80
630
+ },
631
+ {
632
+ "epoch": 0.51,
633
+ "learning_rate": 9.797500000000001e-05,
634
+ "loss": 1.0832,
635
+ "step": 81
636
+ },
637
+ {
638
+ "epoch": 0.51,
639
+ "learning_rate": 9.795000000000001e-05,
640
+ "loss": 1.1685,
641
+ "step": 82
642
+ },
643
+ {
644
+ "epoch": 0.52,
645
+ "learning_rate": 9.7925e-05,
646
+ "loss": 1.1672,
647
+ "step": 83
648
+ },
649
+ {
650
+ "epoch": 0.53,
651
+ "learning_rate": 9.790000000000001e-05,
652
+ "loss": 1.0383,
653
+ "step": 84
654
+ },
655
+ {
656
+ "epoch": 0.53,
657
+ "learning_rate": 9.787500000000001e-05,
658
+ "loss": 1.0307,
659
+ "step": 85
660
+ },
661
+ {
662
+ "epoch": 0.54,
663
+ "learning_rate": 9.785e-05,
664
+ "loss": 1.0283,
665
+ "step": 86
666
+ },
667
+ {
668
+ "epoch": 0.54,
669
+ "learning_rate": 9.7825e-05,
670
+ "loss": 1.0414,
671
+ "step": 87
672
+ },
673
+ {
674
+ "epoch": 0.55,
675
+ "learning_rate": 9.78e-05,
676
+ "loss": 1.0416,
677
+ "step": 88
678
+ },
679
+ {
680
+ "epoch": 0.56,
681
+ "learning_rate": 9.7775e-05,
682
+ "loss": 0.9611,
683
+ "step": 89
684
+ },
685
+ {
686
+ "epoch": 0.56,
687
+ "learning_rate": 9.775e-05,
688
+ "loss": 0.9732,
689
+ "step": 90
690
+ },
691
+ {
692
+ "epoch": 0.57,
693
+ "learning_rate": 9.7725e-05,
694
+ "loss": 0.9857,
695
+ "step": 91
696
+ },
697
+ {
698
+ "epoch": 0.57,
699
+ "learning_rate": 9.77e-05,
700
+ "loss": 0.9002,
701
+ "step": 92
702
+ },
703
+ {
704
+ "epoch": 0.58,
705
+ "learning_rate": 9.7675e-05,
706
+ "loss": 1.0053,
707
+ "step": 93
708
+ },
709
+ {
710
+ "epoch": 0.59,
711
+ "learning_rate": 9.765e-05,
712
+ "loss": 0.9788,
713
+ "step": 94
714
+ },
715
+ {
716
+ "epoch": 0.59,
717
+ "learning_rate": 9.7625e-05,
718
+ "loss": 1.0315,
719
+ "step": 95
720
+ },
721
+ {
722
+ "epoch": 0.6,
723
+ "learning_rate": 9.76e-05,
724
+ "loss": 0.9625,
725
+ "step": 96
726
+ },
727
+ {
728
+ "epoch": 0.61,
729
+ "learning_rate": 9.7575e-05,
730
+ "loss": 1.0484,
731
+ "step": 97
732
+ },
733
+ {
734
+ "epoch": 0.61,
735
+ "learning_rate": 9.755000000000001e-05,
736
+ "loss": 0.9898,
737
+ "step": 98
738
+ },
739
+ {
740
+ "epoch": 0.62,
741
+ "learning_rate": 9.7525e-05,
742
+ "loss": 0.972,
743
+ "step": 99
744
+ },
745
+ {
746
+ "epoch": 0.62,
747
+ "learning_rate": 9.75e-05,
748
+ "loss": 1.0235,
749
+ "step": 100
750
+ },
751
+ {
752
+ "epoch": 0.62,
753
+ "eval_accuracy_AK": 0.4947409695532967,
754
+ "eval_accuracy_AS": NaN,
755
+ "eval_accuracy_ASD": 0.0,
756
+ "eval_accuracy_LA": 0.3015038749231761,
757
+ "eval_accuracy_LV": 0.8507606635581448,
758
+ "eval_accuracy_MK": 0.0,
759
+ "eval_accuracy_RA": 0.0422728323699422,
760
+ "eval_accuracy_RV": 0.01015217500484291,
761
+ "eval_accuracy_TK": NaN,
762
+ "eval_accuracy_VS": 0.0,
763
+ "eval_accuracy_VSD": 0.0,
764
+ "eval_accuracy_unlabeled": NaN,
765
+ "eval_iou_AK": 0.4058522236190776,
766
+ "eval_iou_AS": NaN,
767
+ "eval_iou_ASD": 0.0,
768
+ "eval_iou_LA": 0.22833526085383246,
769
+ "eval_iou_LV": 0.5319480132542299,
770
+ "eval_iou_MK": 0.0,
771
+ "eval_iou_RA": 0.041750206661581954,
772
+ "eval_iou_RV": 0.010136375439884668,
773
+ "eval_iou_TK": NaN,
774
+ "eval_iou_VS": 0.0,
775
+ "eval_iou_VSD": 0.0,
776
+ "eval_iou_unlabeled": 0.0,
777
+ "eval_loss": 0.9343371987342834,
778
+ "eval_mean_accuracy": 0.18882561282326696,
779
+ "eval_mean_iou": 0.12180220798286065,
780
+ "eval_overall_accuracy": 0.4418879928168235,
781
+ "eval_runtime": 4.3249,
782
+ "eval_samples_per_second": 18.729,
783
+ "eval_steps_per_second": 9.48,
784
+ "step": 100
785
+ }
786
+ ],
787
+ "logging_steps": 1,
788
+ "max_steps": 4000,
789
+ "num_input_tokens_seen": 0,
790
+ "num_train_epochs": 25,
791
+ "save_steps": 20,
792
+ "total_flos": 3508016460595200.0,
793
+ "train_batch_size": 2,
794
+ "trial_name": null,
795
+ "trial_params": null
796
+ }
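In the trainer state above, `log_history` mixes per-step training records with periodic eval records, and `best_model_checkpoint` points at the step with the lowest `eval_loss` (the `best_metric`). A minimal sketch of recovering that from the log, using an abridged, hypothetical subset of the eval entries shown in the file:

```python
# Abridged log_history mirroring trainer_state.json above; only the
# eval records (every 20 steps) carry an "eval_loss" key.
log_history = [
    {"step": 1, "loss": 2.5254},
    {"step": 20, "eval_loss": 2.2124316692352295},
    {"step": 40, "eval_loss": 1.5649173259735107},
    {"step": 60, "eval_loss": 1.261634111404419},
    {"step": 80, "eval_loss": 1.2207798957824707},
    {"step": 100, "eval_loss": 0.9343371987342834},
]

# Keep eval records only, then pick the one with the smallest loss.
evals = [e for e in log_history if "eval_loss" in e]
best = min(evals, key=lambda e: e["eval_loss"])
print(best["step"], best["eval_loss"])
# step 100 -> matches best_metric / best_model_checkpoint above
```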
tmp-checkpoint-100/training_args.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:240a7bc5922dec561dade824dd201689d9e6c30828528fa363211165f71097b9
3
+ size 4728
tmp-checkpoint-1000/config.json ADDED
@@ -0,0 +1,98 @@
1
+ {
2
+ "_name_or_path": "nvidia/mit-b0",
3
+ "architectures": [
4
+ "SegformerForSemanticSegmentation"
5
+ ],
6
+ "attention_probs_dropout_prob": 0.0,
7
+ "classifier_dropout_prob": 0.1,
8
+ "decoder_hidden_size": 256,
9
+ "depths": [
10
+ 2,
11
+ 2,
12
+ 2,
13
+ 2
14
+ ],
15
+ "downsampling_rates": [
16
+ 1,
17
+ 4,
18
+ 8,
19
+ 16
20
+ ],
21
+ "drop_path_rate": 0.1,
22
+ "hidden_act": "gelu",
23
+ "hidden_dropout_prob": 0.0,
24
+ "hidden_sizes": [
25
+ 32,
26
+ 64,
27
+ 160,
28
+ 256
29
+ ],
30
+ "id2label": {
31
+ "0": "unlabeled",
32
+ "1": "LV",
33
+ "2": "RV",
34
+ "3": "RA",
35
+ "4": "LA",
36
+ "5": "VS",
37
+ "6": "AS",
38
+ "7": "MK",
39
+ "8": "TK",
40
+ "9": "ASD",
41
+ "10": "VSD",
42
+ "11": "AK"
43
+ },
44
+ "image_size": 224,
45
+ "initializer_range": 0.02,
46
+ "label2id": {
47
+ "AK": 11,
48
+ "AS": 6,
49
+ "ASD": 9,
50
+ "LA": 4,
51
+ "LV": 1,
52
+ "MK": 7,
53
+ "RA": 3,
54
+ "RV": 2,
55
+ "TK": 8,
56
+ "VS": 5,
57
+ "VSD": 10,
58
+ "unlabeled": 0
59
+ },
60
+ "layer_norm_eps": 1e-06,
61
+ "mlp_ratios": [
62
+ 4,
63
+ 4,
64
+ 4,
65
+ 4
66
+ ],
67
+ "model_type": "segformer",
68
+ "num_attention_heads": [
69
+ 1,
70
+ 2,
71
+ 5,
72
+ 8
73
+ ],
74
+ "num_channels": 3,
75
+ "num_encoder_blocks": 4,
76
+ "patch_sizes": [
77
+ 7,
78
+ 3,
79
+ 3,
80
+ 3
81
+ ],
82
+ "reshape_last_stage": true,
83
+ "semantic_loss_ignore_index": 255,
84
+ "sr_ratios": [
85
+ 8,
86
+ 4,
87
+ 2,
88
+ 1
89
+ ],
90
+ "strides": [
91
+ 4,
92
+ 2,
93
+ 2,
94
+ 2
95
+ ],
96
+ "torch_dtype": "float32",
97
+ "transformers_version": "4.37.1"
98
+ }
tmp-checkpoint-1000/model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:864571f649718651887b70f21adf32ff151fb24795af6f0be7668cb2c30c60c6
3
+ size 14895064
tmp-checkpoint-1000/optimizer.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:a6eb43ca4304e8a784e882da2423841fdb18e44db6027ad4fef0b37f14c3e16c
3
+ size 29908730
tmp-checkpoint-1000/rng_state.pth ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:ea5d4ea01d7a4684086422d49f27169612d0e881f863013c2dca6d439ace778c
3
+ size 14244
tmp-checkpoint-1000/scheduler.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:0c95571717a68cc0150a9e3aa58414f166e0282aca232e92f44bf72b6955076b
3
+ size 1064
tmp-checkpoint-1000/trainer_state.json ADDED
The diff for this file is too large to render. See raw diff
 
tmp-checkpoint-1000/training_args.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:240a7bc5922dec561dade824dd201689d9e6c30828528fa363211165f71097b9
3
+ size 4728
tmp-checkpoint-1020/config.json ADDED
@@ -0,0 +1,98 @@
1
+ {
2
+ "_name_or_path": "nvidia/mit-b0",
3
+ "architectures": [
4
+ "SegformerForSemanticSegmentation"
5
+ ],
6
+ "attention_probs_dropout_prob": 0.0,
7
+ "classifier_dropout_prob": 0.1,
8
+ "decoder_hidden_size": 256,
9
+ "depths": [
10
+ 2,
11
+ 2,
12
+ 2,
13
+ 2
14
+ ],
15
+ "downsampling_rates": [
16
+ 1,
17
+ 4,
18
+ 8,
19
+ 16
20
+ ],
21
+ "drop_path_rate": 0.1,
22
+ "hidden_act": "gelu",
23
+ "hidden_dropout_prob": 0.0,
24
+ "hidden_sizes": [
25
+ 32,
26
+ 64,
27
+ 160,
28
+ 256
29
+ ],
30
+ "id2label": {
31
+ "0": "unlabeled",
32
+ "1": "LV",
33
+ "2": "RV",
34
+ "3": "RA",
35
+ "4": "LA",
36
+ "5": "VS",
37
+ "6": "AS",
38
+ "7": "MK",
39
+ "8": "TK",
40
+ "9": "ASD",
41
+ "10": "VSD",
42
+ "11": "AK"
43
+ },
44
+ "image_size": 224,
45
+ "initializer_range": 0.02,
46
+ "label2id": {
47
+ "AK": 11,
48
+ "AS": 6,
49
+ "ASD": 9,
50
+ "LA": 4,
51
+ "LV": 1,
52
+ "MK": 7,
53
+ "RA": 3,
54
+ "RV": 2,
55
+ "TK": 8,
56
+ "VS": 5,
57
+ "VSD": 10,
58
+ "unlabeled": 0
59
+ },
60
+ "layer_norm_eps": 1e-06,
61
+ "mlp_ratios": [
62
+ 4,
63
+ 4,
64
+ 4,
65
+ 4
66
+ ],
67
+ "model_type": "segformer",
68
+ "num_attention_heads": [
69
+ 1,
70
+ 2,
71
+ 5,
72
+ 8
73
+ ],
74
+ "num_channels": 3,
75
+ "num_encoder_blocks": 4,
76
+ "patch_sizes": [
77
+ 7,
78
+ 3,
79
+ 3,
80
+ 3
81
+ ],
82
+ "reshape_last_stage": true,
83
+ "semantic_loss_ignore_index": 255,
84
+ "sr_ratios": [
85
+ 8,
86
+ 4,
87
+ 2,
88
+ 1
89
+ ],
90
+ "strides": [
91
+ 4,
92
+ 2,
93
+ 2,
94
+ 2
95
+ ],
96
+ "torch_dtype": "float32",
97
+ "transformers_version": "4.37.1"
98
+ }
tmp-checkpoint-1020/model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:fd7731a442476db180c3ea4254feec562ee701199ed81e716632225072c556a3
3
+ size 14895064
tmp-checkpoint-1020/optimizer.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:f6b982e7224e1f97f650a9530f941dcf2c35176ffaaeaf862c46bcbdfe261f5c
3
+ size 29908730
tmp-checkpoint-1020/rng_state.pth ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:1fe9d95b72eb34d07a37ccf5fec924edb6830347cdadbf20728b722a5d4fdfb4
3
+ size 14244
tmp-checkpoint-1020/scheduler.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:50ce4f55721ee53d4cd44852b9f4c6232533b16f03f743a9ae8ff144fdc66f3a
3
+ size 1064
tmp-checkpoint-1020/trainer_state.json ADDED
The diff for this file is too large to render. See raw diff
 
tmp-checkpoint-1020/training_args.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:240a7bc5922dec561dade824dd201689d9e6c30828528fa363211165f71097b9
3
+ size 4728
tmp-checkpoint-1040/config.json ADDED
@@ -0,0 +1,98 @@
1
+ {
2
+ "_name_or_path": "nvidia/mit-b0",
3
+ "architectures": [
4
+ "SegformerForSemanticSegmentation"
5
+ ],
6
+ "attention_probs_dropout_prob": 0.0,
7
+ "classifier_dropout_prob": 0.1,
8
+ "decoder_hidden_size": 256,
9
+ "depths": [
10
+ 2,
11
+ 2,
12
+ 2,
13
+ 2
14
+ ],
15
+ "downsampling_rates": [
16
+ 1,
17
+ 4,
18
+ 8,
19
+ 16
20
+ ],
21
+ "drop_path_rate": 0.1,
22
+ "hidden_act": "gelu",
23
+ "hidden_dropout_prob": 0.0,
24
+ "hidden_sizes": [
25
+ 32,
26
+ 64,
27
+ 160,
28
+ 256
29
+ ],
30
+ "id2label": {
31
+ "0": "unlabeled",
32
+ "1": "LV",
33
+ "2": "RV",
34
+ "3": "RA",
35
+ "4": "LA",
36
+ "5": "VS",
37
+ "6": "AS",
38
+ "7": "MK",
39
+ "8": "TK",
40
+ "9": "ASD",
41
+ "10": "VSD",
42
+ "11": "AK"
43
+ },
44
+ "image_size": 224,
45
+ "initializer_range": 0.02,
46
+ "label2id": {
47
+ "AK": 11,
48
+ "AS": 6,
49
+ "ASD": 9,
50
+ "LA": 4,
51
+ "LV": 1,
52
+ "MK": 7,
53
+ "RA": 3,
54
+ "RV": 2,
55
+ "TK": 8,
56
+ "VS": 5,
57
+ "VSD": 10,
58
+ "unlabeled": 0
59
+ },
60
+ "layer_norm_eps": 1e-06,
61
+ "mlp_ratios": [
62
+ 4,
63
+ 4,
64
+ 4,
65
+ 4
66
+ ],
67
+ "model_type": "segformer",
68
+ "num_attention_heads": [
69
+ 1,
70
+ 2,
71
+ 5,
72
+ 8
73
+ ],
74
+ "num_channels": 3,
75
+ "num_encoder_blocks": 4,
76
+ "patch_sizes": [
77
+ 7,
78
+ 3,
79
+ 3,
80
+ 3
81
+ ],
82
+ "reshape_last_stage": true,
83
+ "semantic_loss_ignore_index": 255,
84
+ "sr_ratios": [
85
+ 8,
86
+ 4,
87
+ 2,
88
+ 1
89
+ ],
90
+ "strides": [
91
+ 4,
92
+ 2,
93
+ 2,
94
+ 2
95
+ ],
96
+ "torch_dtype": "float32",
97
+ "transformers_version": "4.37.1"
98
+ }
tmp-checkpoint-1040/model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:0d78f3a4788e82d1cf0b2f04d32c8a15a3a25686a1b5602c17a14b54308ea16d
3
+ size 14895064
tmp-checkpoint-1040/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:07df2a2c65784ddc9b3a0e3ee036d883aefbbd04323ec16e65978794fd1a680d
+ size 29908730
tmp-checkpoint-1040/rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:048ebca5800adab5c9ee9bf5f110a1d319caab669765af310dcdf2e72caf6bc9
+ size 14244
tmp-checkpoint-1040/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:31fa9619abd2673f5c10cda000a43c29908645b0fba23d322a57a76d545bce9d
+ size 1064
tmp-checkpoint-1040/trainer_state.json ADDED
The diff for this file is too large to render. See raw diff
 
tmp-checkpoint-1040/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:240a7bc5922dec561dade824dd201689d9e6c30828528fa363211165f71097b9
+ size 4728
tmp-checkpoint-1060/config.json ADDED
@@ -0,0 +1,98 @@
+ {
+   "_name_or_path": "nvidia/mit-b0",
+   "architectures": [
+     "SegformerForSemanticSegmentation"
+   ],
+   "attention_probs_dropout_prob": 0.0,
+   "classifier_dropout_prob": 0.1,
+   "decoder_hidden_size": 256,
+   "depths": [
+     2,
+     2,
+     2,
+     2
+   ],
+   "downsampling_rates": [
+     1,
+     4,
+     8,
+     16
+   ],
+   "drop_path_rate": 0.1,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.0,
+   "hidden_sizes": [
+     32,
+     64,
+     160,
+     256
+   ],
+   "id2label": {
+     "0": "unlabeled",
+     "1": "LV",
+     "2": "RV",
+     "3": "RA",
+     "4": "LA",
+     "5": "VS",
+     "6": "AS",
+     "7": "MK",
+     "8": "TK",
+     "9": "ASD",
+     "10": "VSD",
+     "11": "AK"
+   },
+   "image_size": 224,
+   "initializer_range": 0.02,
+   "label2id": {
+     "AK": 11,
+     "AS": 6,
+     "ASD": 9,
+     "LA": 4,
+     "LV": 1,
+     "MK": 7,
+     "RA": 3,
+     "RV": 2,
+     "TK": 8,
+     "VS": 5,
+     "VSD": 10,
+     "unlabeled": 0
+   },
+   "layer_norm_eps": 1e-06,
+   "mlp_ratios": [
+     4,
+     4,
+     4,
+     4
+   ],
+   "model_type": "segformer",
+   "num_attention_heads": [
+     1,
+     2,
+     5,
+     8
+   ],
+   "num_channels": 3,
+   "num_encoder_blocks": 4,
+   "patch_sizes": [
+     7,
+     3,
+     3,
+     3
+   ],
+   "reshape_last_stage": true,
+   "semantic_loss_ignore_index": 255,
+   "sr_ratios": [
+     8,
+     4,
+     2,
+     1
+   ],
+   "strides": [
+     4,
+     2,
+     2,
+     2
+   ],
+   "torch_dtype": "float32",
+   "transformers_version": "4.37.1"
+ }
tmp-checkpoint-1060/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7ccd810ffd210692f5be1896465136ecf5a852994e835736439789ede0763e98
+ size 14895064
tmp-checkpoint-1060/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:91cde7bf91351c7b804d12b5252c3e59c3ca9fece7b3305609b946e3c41e702d
+ size 29908730
tmp-checkpoint-1060/rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:51e836f148c1db06eb55df51f5d09533edbee84f6a5ea66d7485a6d2ce9f8309
+ size 14244
tmp-checkpoint-1060/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6de915a67ff54f2aed56819e44d4f6cd83f1636422876605db842c2b3733d57b
+ size 1064
tmp-checkpoint-1060/trainer_state.json ADDED
The diff for this file is too large to render. See raw diff
 
tmp-checkpoint-1060/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:240a7bc5922dec561dade824dd201689d9e6c30828528fa363211165f71097b9
+ size 4728
tmp-checkpoint-1080/config.json ADDED
@@ -0,0 +1,98 @@
+ {
+   "_name_or_path": "nvidia/mit-b0",
+   "architectures": [
+     "SegformerForSemanticSegmentation"
+   ],
+   "attention_probs_dropout_prob": 0.0,
+   "classifier_dropout_prob": 0.1,
+   "decoder_hidden_size": 256,
+   "depths": [
+     2,
+     2,
+     2,
+     2
+   ],
+   "downsampling_rates": [
+     1,
+     4,
+     8,
+     16
+   ],
+   "drop_path_rate": 0.1,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.0,
+   "hidden_sizes": [
+     32,
+     64,
+     160,
+     256
+   ],
+   "id2label": {
+     "0": "unlabeled",
+     "1": "LV",
+     "2": "RV",
+     "3": "RA",
+     "4": "LA",
+     "5": "VS",
+     "6": "AS",
+     "7": "MK",
+     "8": "TK",
+     "9": "ASD",
+     "10": "VSD",
+     "11": "AK"
+   },
+   "image_size": 224,
+   "initializer_range": 0.02,
+   "label2id": {
+     "AK": 11,
+     "AS": 6,
+     "ASD": 9,
+     "LA": 4,
+     "LV": 1,
+     "MK": 7,
+     "RA": 3,
+     "RV": 2,
+     "TK": 8,
+     "VS": 5,
+     "VSD": 10,
+     "unlabeled": 0
+   },
+   "layer_norm_eps": 1e-06,
+   "mlp_ratios": [
+     4,
+     4,
+     4,
+     4
+   ],
+   "model_type": "segformer",
+   "num_attention_heads": [
+     1,
+     2,
+     5,
+     8
+   ],
+   "num_channels": 3,
+   "num_encoder_blocks": 4,
+   "patch_sizes": [
+     7,
+     3,
+     3,
+     3
+   ],
+   "reshape_last_stage": true,
+   "semantic_loss_ignore_index": 255,
+   "sr_ratios": [
+     8,
+     4,
+     2,
+     1
+   ],
+   "strides": [
+     4,
+     2,
+     2,
+     2
+   ],
+   "torch_dtype": "float32",
+   "transformers_version": "4.37.1"
+ }
tmp-checkpoint-1080/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b33c7a9248130ccea0cb7495325ff03315a8d51b36b26e30ff0d522a92771084
+ size 14895064
tmp-checkpoint-1080/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:63cfaadbc41039c3eec6e3043db3a28eac7e0fe13b9f410f55a6c1308db4a28a
+ size 29908730
tmp-checkpoint-1080/rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e7680a3f9fec0adc9c14e32054f078047218cb495644244c24dca2566acf3bd1
+ size 14244
tmp-checkpoint-1080/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1c3814e4d55d5f50299142c72b405833dde9c1fec2df1a0b5937cc6363a25415
+ size 1064
tmp-checkpoint-1080/trainer_state.json ADDED
The diff for this file is too large to render. See raw diff
 
tmp-checkpoint-1080/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:240a7bc5922dec561dade824dd201689d9e6c30828528fa363211165f71097b9
+ size 4728
tmp-checkpoint-1100/config.json ADDED
@@ -0,0 +1,98 @@
+ {
+   "_name_or_path": "nvidia/mit-b0",
+   "architectures": [
+     "SegformerForSemanticSegmentation"
+   ],
+   "attention_probs_dropout_prob": 0.0,
+   "classifier_dropout_prob": 0.1,
+   "decoder_hidden_size": 256,
+   "depths": [
+     2,
+     2,
+     2,
+     2
+   ],
+   "downsampling_rates": [
+     1,
+     4,
+     8,
+     16
+   ],
+   "drop_path_rate": 0.1,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.0,
+   "hidden_sizes": [
+     32,
+     64,
+     160,
+     256
+   ],
+   "id2label": {
+     "0": "unlabeled",
+     "1": "LV",
+     "2": "RV",
+     "3": "RA",
+     "4": "LA",
+     "5": "VS",
+     "6": "AS",
+     "7": "MK",
+     "8": "TK",
+     "9": "ASD",
+     "10": "VSD",
+     "11": "AK"
+   },
+   "image_size": 224,
+   "initializer_range": 0.02,
+   "label2id": {
+     "AK": 11,
+     "AS": 6,
+     "ASD": 9,
+     "LA": 4,
+     "LV": 1,
+     "MK": 7,
+     "RA": 3,
+     "RV": 2,
+     "TK": 8,
+     "VS": 5,
+     "VSD": 10,
+     "unlabeled": 0
+   },
+   "layer_norm_eps": 1e-06,
+   "mlp_ratios": [
+     4,
+     4,
+     4,
+     4
+   ],
+   "model_type": "segformer",
+   "num_attention_heads": [
+     1,
+     2,
+     5,
+     8
+   ],
+   "num_channels": 3,
+   "num_encoder_blocks": 4,
+   "patch_sizes": [
+     7,
+     3,
+     3,
+     3
+   ],
+   "reshape_last_stage": true,
+   "semantic_loss_ignore_index": 255,
+   "sr_ratios": [
+     8,
+     4,
+     2,
+     1
+   ],
+   "strides": [
+     4,
+     2,
+     2,
+     2
+   ],
+   "torch_dtype": "float32",
+   "transformers_version": "4.37.1"
+ }
tmp-checkpoint-1100/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7bf046aa523afb3a2a1efd1b19f6068e949a1b3720efe607b437bcbf107867ac
+ size 14895064
tmp-checkpoint-1100/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:82b33fc0c63d3b09bf1a96fa35f67bb3872eeb5228c979746ef71a31390e74e7
+ size 29908730
tmp-checkpoint-1100/rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:34631a866dd46391bb2a084f5cdc2b5669297e0a3960f85024794a4a74e7b1ba
+ size 14244
tmp-checkpoint-1100/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f96a15a2425ccde044fac1230698babdc30ae39b5436064f12341bd408dac18e
+ size 1064