chugz committed on
Commit 967ad6b
1 Parent(s): fbadbfa

End of training
README.md CHANGED
@@ -1,199 +1,155 @@
  ---
- library_name: transformers
- tags: []
  ---
 
- # Model Card for Model ID
-
- <!-- Provide a quick summary of what the model is/does. -->
-
-
-
- ## Model Details
-
- ### Model Description
-
- <!-- Provide a longer summary of what this model is. -->
-
- This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
-
- - **Developed by:** [More Information Needed]
- - **Funded by [optional]:** [More Information Needed]
- - **Shared by [optional]:** [More Information Needed]
- - **Model type:** [More Information Needed]
- - **Language(s) (NLP):** [More Information Needed]
- - **License:** [More Information Needed]
- - **Finetuned from model [optional]:** [More Information Needed]
-
- ### Model Sources [optional]
-
- <!-- Provide the basic links for the model. -->
-
- - **Repository:** [More Information Needed]
- - **Paper [optional]:** [More Information Needed]
- - **Demo [optional]:** [More Information Needed]
-
- ## Uses
-
- <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
-
- ### Direct Use
-
- <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
-
- [More Information Needed]
-
- ### Downstream Use [optional]
-
- <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
-
- [More Information Needed]
-
- ### Out-of-Scope Use
-
- <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
-
- [More Information Needed]
-
- ## Bias, Risks, and Limitations
-
- <!-- This section is meant to convey both technical and sociotechnical limitations. -->
-
- [More Information Needed]
-
- ### Recommendations
-
- <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
-
- Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
-
- ## How to Get Started with the Model
-
- Use the code below to get started with the model.
-
- [More Information Needed]
-
- ## Training Details
-
- ### Training Data
-
- <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
-
- [More Information Needed]
-
- ### Training Procedure
-
- <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
-
- #### Preprocessing [optional]
-
- [More Information Needed]
-
-
- #### Training Hyperparameters
-
- - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
-
- #### Speeds, Sizes, Times [optional]
-
- <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
-
- [More Information Needed]
-
- ## Evaluation
-
- <!-- This section describes the evaluation protocols and provides the results. -->
-
- ### Testing Data, Factors & Metrics
-
- #### Testing Data
-
- <!-- This should link to a Dataset Card if possible. -->
-
- [More Information Needed]
-
- #### Factors
-
- <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
-
- [More Information Needed]
-
- #### Metrics
-
- <!-- These are the evaluation metrics being used, ideally with a description of why. -->
-
- [More Information Needed]
-
- ### Results
-
- [More Information Needed]
-
- #### Summary
-
-
-
- ## Model Examination [optional]
-
- <!-- Relevant interpretability work for the model goes here -->
-
- [More Information Needed]
-
- ## Environmental Impact
-
- <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
-
- Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
-
- - **Hardware Type:** [More Information Needed]
- - **Hours used:** [More Information Needed]
- - **Cloud Provider:** [More Information Needed]
- - **Compute Region:** [More Information Needed]
- - **Carbon Emitted:** [More Information Needed]
-
- ## Technical Specifications [optional]
-
- ### Model Architecture and Objective
-
- [More Information Needed]
-
- ### Compute Infrastructure
-
- [More Information Needed]
-
- #### Hardware
-
- [More Information Needed]
-
- #### Software
-
- [More Information Needed]
-
- ## Citation [optional]
-
- <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
-
- **BibTeX:**
-
- [More Information Needed]
-
- **APA:**
-
- [More Information Needed]
-
- ## Glossary [optional]
-
- <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
-
- [More Information Needed]
-
- ## More Information [optional]
-
- [More Information Needed]
-
- ## Model Card Authors [optional]
-
- [More Information Needed]
-
- ## Model Card Contact
-
- [More Information Needed]
  ---
+ license: other
+ base_model: nvidia/mit-b0
+ tags:
+ - vision
+ - image-segmentation
+ - generated_from_trainer
+ model-index:
+ - name: segformer-b0-practice-7-11
+   results: []
  ---
 
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # segformer-b0-practice-7-11
+
+ This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the chugz/SEM dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.1941
+ - Mean Iou: 0.5755
+ - Mean Accuracy: 0.7536
+ - Overall Accuracy: 0.9340
+ - Accuracy Background: nan
+ - Accuracy Silver: 0.9537
+ - Accuracy Glass: 0.0749
+ - Accuracy Silicon: 0.9909
+ - Accuracy Void: 0.8095
+ - Accuracy Interfacial void: 0.9391
+ - Iou Background: 0.0
+ - Iou Silver: 0.8956
+ - Iou Glass: 0.0717
+ - Iou Silicon: 0.9803
+ - Iou Void: 0.6885
+ - Iou Interfacial void: 0.8168
+
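The card itself ships no usage snippet, so here is a minimal inference sketch. It builds a randomly initialized SegFormer-b0 head with this repo's six-class label set so the shape flow runs offline; to use the trained weights, swap the two construction lines for `SegformerForSemanticSegmentation.from_pretrained(...)` on this repo's id (assumed to be `chugz/segformer-b0-practice-7-11`).

```python
import torch
from transformers import SegformerConfig, SegformerForSemanticSegmentation

# Label set copied from this repo's config.json. The model here is randomly
# initialized; replace the two lines below with
#   SegformerForSemanticSegmentation.from_pretrained("chugz/segformer-b0-practice-7-11")
# (repo id assumed from the card) to run the trained checkpoint.
id2label = {0: "background", 1: "silver", 2: "glass",
            3: "silicon", 4: "void", 5: "interfacial void"}
config = SegformerConfig(num_labels=6, id2label=id2label,
                         label2id={v: k for k, v in id2label.items()})
model = SegformerForSemanticSegmentation(config).eval()

pixel_values = torch.randn(1, 3, 128, 128)  # stand-in for a preprocessed SEM image
with torch.no_grad():
    logits = model(pixel_values=pixel_values).logits  # (1, 6, 32, 32): 1/4 resolution

# Upsample to the input size and take the per-pixel argmax to get a class map.
upsampled = torch.nn.functional.interpolate(
    logits, size=(128, 128), mode="bilinear", align_corners=False)
segmentation = upsampled.argmax(dim=1)[0]  # (128, 128) tensor of label ids
```

Note that SegFormer emits logits at 1/4 of the input resolution, so the upsampling step is needed before taking the argmax.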
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 6e-05
+ - train_batch_size: 2
+ - eval_batch_size: 2
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 50
+
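With no warmup listed, the `linear` scheduler decays the learning rate from 6e-05 straight to zero over the run. A small illustrative re-implementation; the ~1650 total optimizer steps are inferred from the step/epoch columns of the training results (step 20 at epoch 0.6061, i.e. 33 steps per epoch, for 50 epochs):

```python
# Illustrative linear decay (no warmup), matching the hyperparameters above.
# total_steps is an inference from the results table, not a logged value.
def linear_lr(step: int, total_steps: int, base_lr: float = 6e-05) -> float:
    return base_lr * max(0.0, (total_steps - step) / total_steps)

total_steps = 33 * 50  # ~1650 optimizer steps over the whole run
assert linear_lr(0, total_steps) == 6e-05          # full lr at the first step
assert linear_lr(total_steps, total_steps) == 0.0  # decayed to zero at the end
```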
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Silver | Accuracy Glass | Accuracy Silicon | Accuracy Void | Accuracy Interfacial void | Iou Background | Iou Silver | Iou Glass | Iou Silicon | Iou Void | Iou Interfacial void |
+ |:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:---------------:|:--------------:|:----------------:|:-------------:|:-------------------------:|:--------------:|:----------:|:---------:|:-----------:|:--------:|:--------------------:|
+ | 1.2415 | 0.6061 | 20 | 1.3040 | 0.2800 | 0.4306 | 0.7396 | nan | 0.9612 | 0.0069 | 0.9153 | 0.0 | 0.2698 | 0.0 | 0.5882 | 0.0069 | 0.8301 | 0.0 | 0.2549 |
+ | 0.9185 | 1.2121 | 40 | 0.7957 | 0.3642 | 0.5203 | 0.8249 | nan | 0.9524 | 0.0 | 0.9539 | 0.0000 | 0.6951 | 0.0 | 0.7181 | 0.0 | 0.8980 | 0.0000 | 0.5691 |
+ | 0.6451 | 1.8182 | 60 | 0.6017 | 0.3679 | 0.5206 | 0.8131 | nan | 0.8835 | 0.0 | 0.9456 | 0.0049 | 0.7688 | 0.0 | 0.6888 | 0.0 | 0.9053 | 0.0049 | 0.6086 |
+ | 0.5145 | 2.4242 | 80 | 0.5779 | 0.4033 | 0.5635 | 0.8393 | nan | 0.9514 | 0.0 | 0.9371 | 0.1596 | 0.7693 | 0.0 | 0.7321 | 0.0 | 0.9199 | 0.1516 | 0.6165 |
+ | 0.9509 | 3.0303 | 100 | 0.5006 | 0.4017 | 0.5540 | 0.8420 | nan | 0.9510 | 0.0 | 0.9548 | 0.1047 | 0.7592 | 0.0 | 0.7312 | 0.0 | 0.9273 | 0.1010 | 0.6506 |
+ | 0.6075 | 3.6364 | 120 | 0.4274 | 0.4688 | 0.6372 | 0.8776 | nan | 0.9414 | 0.0 | 0.9613 | 0.4300 | 0.8535 | 0.0 | 0.7953 | 0.0 | 0.9395 | 0.3709 | 0.7070 |
+ | 0.4155 | 4.2424 | 140 | 0.3949 | 0.4994 | 0.6781 | 0.8912 | nan | 0.9260 | 0.0 | 0.9560 | 0.5908 | 0.9177 | 0.0 | 0.8258 | 0.0 | 0.9412 | 0.4981 | 0.7313 |
+ | 0.4072 | 4.8485 | 160 | 0.3538 | 0.5040 | 0.6794 | 0.8961 | nan | 0.9463 | 0.0 | 0.9668 | 0.6083 | 0.8758 | 0.0 | 0.8231 | 0.0 | 0.9540 | 0.4978 | 0.7493 |
+ | 0.3085 | 5.4545 | 180 | 0.3441 | 0.5033 | 0.6814 | 0.8919 | nan | 0.9506 | 0.0 | 0.9536 | 0.6344 | 0.8684 | 0.0 | 0.8170 | 0.0 | 0.9417 | 0.5214 | 0.7398 |
+ | 0.2885 | 6.0606 | 200 | 0.3357 | 0.5136 | 0.6967 | 0.8989 | nan | 0.9082 | 0.0 | 0.9780 | 0.6987 | 0.8987 | 0.0 | 0.8389 | 0.0 | 0.9476 | 0.5652 | 0.7302 |
+ | 0.2754 | 6.6667 | 220 | 0.3052 | 0.5253 | 0.7061 | 0.9094 | nan | 0.9355 | 0.0 | 0.9714 | 0.6918 | 0.9319 | 0.0 | 0.8553 | 0.0 | 0.9577 | 0.5742 | 0.7643 |
+ | 0.2942 | 7.2727 | 240 | 0.2893 | 0.5226 | 0.6948 | 0.9087 | nan | 0.9388 | 0.0 | 0.9805 | 0.6373 | 0.9175 | 0.0 | 0.8466 | 0.0 | 0.9618 | 0.5504 | 0.7766 |
+ | 0.2324 | 7.8788 | 260 | 0.3018 | 0.5221 | 0.7053 | 0.9053 | nan | 0.9031 | 0.0 | 0.9754 | 0.6931 | 0.9549 | 0.0 | 0.8499 | 0.0 | 0.9659 | 0.5796 | 0.7370 |
+ | 0.2872 | 8.4848 | 280 | 0.2758 | 0.5339 | 0.7129 | 0.9139 | nan | 0.9160 | 0.0 | 0.9880 | 0.7207 | 0.9396 | 0.0 | 0.8585 | 0.0 | 0.9739 | 0.6018 | 0.7694 |
+ | 0.2076 | 9.0909 | 300 | 0.2615 | 0.5317 | 0.7011 | 0.9168 | nan | 0.9431 | 0.0 | 0.9920 | 0.6451 | 0.9253 | 0.0 | 0.8670 | 0.0 | 0.9715 | 0.5706 | 0.7811 |
+ | 0.2796 | 9.6970 | 320 | 0.2718 | 0.5226 | 0.6956 | 0.9078 | nan | 0.9573 | 0.0 | 0.9720 | 0.6536 | 0.8953 | 0.0 | 0.8501 | 0.0 | 0.9577 | 0.5505 | 0.7771 |
+ | 0.3215 | 10.3030 | 340 | 0.2627 | 0.5414 | 0.7207 | 0.9193 | nan | 0.9303 | 0.0 | 0.9847 | 0.7407 | 0.9476 | 0.0 | 0.8680 | 0.0 | 0.9735 | 0.6193 | 0.7878 |
+ | 0.209 | 10.9091 | 360 | 0.2516 | 0.5337 | 0.7073 | 0.9189 | nan | 0.9487 | 0.0 | 0.9898 | 0.6761 | 0.9221 | 0.0 | 0.8733 | 0.0 | 0.9734 | 0.5741 | 0.7818 |
+ | 0.2928 | 11.5152 | 380 | 0.2606 | 0.5457 | 0.7354 | 0.9225 | nan | 0.9284 | 0.0 | 0.9831 | 0.8203 | 0.9452 | 0.0 | 0.8774 | 0.0 | 0.9733 | 0.6401 | 0.7831 |
+ | 0.2246 | 12.1212 | 400 | 0.2519 | 0.5378 | 0.7107 | 0.9180 | nan | 0.9373 | 0.0 | 0.9906 | 0.6993 | 0.9263 | 0.0 | 0.8686 | 0.0 | 0.9691 | 0.6136 | 0.7755 |
+ | 0.2386 | 12.7273 | 420 | 0.2477 | 0.5443 | 0.7290 | 0.9212 | nan | 0.9407 | 0.0001 | 0.9828 | 0.7968 | 0.9245 | 0.0 | 0.8733 | 0.0001 | 0.9703 | 0.6294 | 0.7929 |
+ | 0.1734 | 13.3333 | 440 | 0.2285 | 0.5466 | 0.7234 | 0.9234 | nan | 0.9479 | 0.0005 | 0.9879 | 0.7531 | 0.9274 | 0.0 | 0.8791 | 0.0005 | 0.9747 | 0.6310 | 0.7940 |
+ | 0.1809 | 13.9394 | 460 | 0.2304 | 0.5501 | 0.7254 | 0.9259 | nan | 0.9451 | 0.0002 | 0.9923 | 0.7544 | 0.9349 | 0.0 | 0.8799 | 0.0002 | 0.9774 | 0.6386 | 0.8047 |
+ | 0.2162 | 14.5455 | 480 | 0.2431 | 0.5514 | 0.7364 | 0.9257 | nan | 0.9451 | 0.0022 | 0.9792 | 0.8061 | 0.9493 | 0.0 | 0.8825 | 0.0022 | 0.9714 | 0.6526 | 0.7996 |
+ | 0.1973 | 15.1515 | 500 | 0.2519 | 0.5480 | 0.7321 | 0.9188 | nan | 0.9306 | 0.0002 | 0.9773 | 0.8158 | 0.9364 | 0.0 | 0.8645 | 0.0002 | 0.9631 | 0.6636 | 0.7967 |
+ | 0.1218 | 15.7576 | 520 | 0.2275 | 0.5513 | 0.7291 | 0.9245 | nan | 0.9463 | 0.0027 | 0.9827 | 0.7704 | 0.9436 | 0.0 | 0.8842 | 0.0027 | 0.9712 | 0.6524 | 0.7972 |
+ | 0.1809 | 16.3636 | 540 | 0.2323 | 0.5540 | 0.7339 | 0.9271 | nan | 0.9406 | 0.0014 | 0.9876 | 0.7905 | 0.9495 | 0.0 | 0.8827 | 0.0014 | 0.9744 | 0.6583 | 0.8071 |
+ | 0.1484 | 16.9697 | 560 | 0.2364 | 0.5527 | 0.7374 | 0.9223 | nan | 0.9343 | 0.0027 | 0.9759 | 0.8231 | 0.9509 | 0.0 | 0.8694 | 0.0027 | 0.9657 | 0.6680 | 0.8105 |
+ | 0.1464 | 17.5758 | 580 | 0.2338 | 0.5519 | 0.7376 | 0.9254 | nan | 0.9397 | 0.0030 | 0.9849 | 0.8248 | 0.9354 | 0.0 | 0.8782 | 0.0030 | 0.9752 | 0.6523 | 0.8029 |
+ | 0.1389 | 18.1818 | 600 | 0.2355 | 0.5541 | 0.7350 | 0.9227 | nan | 0.9281 | 0.0044 | 0.9822 | 0.8067 | 0.9537 | 0.0 | 0.8759 | 0.0044 | 0.9722 | 0.6756 | 0.7967 |
+ | 0.115 | 18.7879 | 620 | 0.2175 | 0.5478 | 0.7175 | 0.9242 | nan | 0.9554 | 0.0080 | 0.9932 | 0.7125 | 0.9180 | 0.0 | 0.8806 | 0.0079 | 0.9735 | 0.6302 | 0.7947 |
+ | 0.1704 | 19.3939 | 640 | 0.2246 | 0.5552 | 0.7320 | 0.9283 | nan | 0.9413 | 0.0079 | 0.9919 | 0.7671 | 0.9517 | 0.0 | 0.8883 | 0.0078 | 0.9815 | 0.6585 | 0.7951 |
+ | 0.1537 | 20.0 | 660 | 0.2222 | 0.5590 | 0.7370 | 0.9299 | nan | 0.9461 | 0.0193 | 0.9919 | 0.7830 | 0.9445 | 0.0 | 0.8923 | 0.0189 | 0.9803 | 0.6620 | 0.8002 |
+ | 0.1605 | 20.6061 | 680 | 0.2174 | 0.5578 | 0.7417 | 0.9273 | nan | 0.9388 | 0.0084 | 0.9865 | 0.8359 | 0.9390 | 0.0 | 0.8833 | 0.0083 | 0.9749 | 0.6706 | 0.8096 |
+ | 0.1237 | 21.2121 | 700 | 0.2154 | 0.5639 | 0.7384 | 0.9312 | nan | 0.9496 | 0.0219 | 0.9935 | 0.7867 | 0.9404 | 0.0 | 0.8913 | 0.0214 | 0.9812 | 0.6799 | 0.8096 |
+ | 0.1288 | 21.8182 | 720 | 0.2214 | 0.5627 | 0.7430 | 0.9303 | nan | 0.9406 | 0.0237 | 0.9907 | 0.8101 | 0.9497 | 0.0 | 0.8926 | 0.0233 | 0.9799 | 0.6771 | 0.8034 |
+ | 0.122 | 22.4242 | 740 | 0.2160 | 0.5675 | 0.7469 | 0.9309 | nan | 0.9367 | 0.0376 | 0.9922 | 0.8161 | 0.9519 | 0.0 | 0.8921 | 0.0361 | 0.9816 | 0.6884 | 0.8067 |
+ | 0.1608 | 23.0303 | 760 | 0.2112 | 0.5613 | 0.7412 | 0.9304 | nan | 0.9417 | 0.0160 | 0.9934 | 0.8137 | 0.9410 | 0.0 | 0.8907 | 0.0157 | 0.9802 | 0.6695 | 0.8116 |
+ | 0.1258 | 23.6364 | 780 | 0.2230 | 0.5611 | 0.7425 | 0.9293 | nan | 0.9367 | 0.0196 | 0.9898 | 0.8148 | 0.9518 | 0.0 | 0.8894 | 0.0192 | 0.9808 | 0.6733 | 0.8036 |
+ | 0.1333 | 24.2424 | 800 | 0.2171 | 0.5606 | 0.7377 | 0.9280 | nan | 0.9420 | 0.0207 | 0.9871 | 0.7868 | 0.9520 | 0.0 | 0.8865 | 0.0203 | 0.9773 | 0.6734 | 0.8063 |
+ | 0.1562 | 24.8485 | 820 | 0.2183 | 0.5644 | 0.7477 | 0.9302 | nan | 0.9301 | 0.0299 | 0.9929 | 0.8333 | 0.9523 | 0.0 | 0.8882 | 0.0291 | 0.9813 | 0.6798 | 0.8081 |
+ | 0.1039 | 25.4545 | 840 | 0.2106 | 0.5648 | 0.7389 | 0.9302 | nan | 0.9506 | 0.0412 | 0.9920 | 0.7729 | 0.9379 | 0.0 | 0.8890 | 0.0392 | 0.9804 | 0.6668 | 0.8136 |
+ | 0.1079 | 26.0606 | 860 | 0.2099 | 0.5625 | 0.7449 | 0.9299 | nan | 0.9385 | 0.0233 | 0.9896 | 0.8246 | 0.9486 | 0.0 | 0.8917 | 0.0227 | 0.9796 | 0.6752 | 0.8056 |
+ | 0.0995 | 26.6667 | 880 | 0.2122 | 0.5620 | 0.7403 | 0.9310 | nan | 0.9464 | 0.0210 | 0.9913 | 0.7954 | 0.9474 | 0.0 | 0.8916 | 0.0207 | 0.9793 | 0.6690 | 0.8115 |
+ | 0.1326 | 27.2727 | 900 | 0.2150 | 0.5611 | 0.7428 | 0.9291 | nan | 0.9412 | 0.0198 | 0.9879 | 0.8193 | 0.9459 | 0.0 | 0.8868 | 0.0195 | 0.9785 | 0.6684 | 0.8130 |
+ | 0.1602 | 27.8788 | 920 | 0.2108 | 0.5645 | 0.7438 | 0.9312 | nan | 0.9388 | 0.0249 | 0.9925 | 0.8091 | 0.9537 | 0.0 | 0.8935 | 0.0243 | 0.9812 | 0.6835 | 0.8044 |
+ | 0.1758 | 28.4848 | 940 | 0.2116 | 0.5682 | 0.7461 | 0.9326 | nan | 0.9520 | 0.0347 | 0.9891 | 0.8089 | 0.9459 | 0.0 | 0.9004 | 0.0336 | 0.9775 | 0.6888 | 0.8089 |
+ | 0.3986 | 29.0909 | 960 | 0.1990 | 0.5783 | 0.7581 | 0.9314 | nan | 0.9414 | 0.1021 | 0.9878 | 0.8095 | 0.9496 | 0.0 | 0.8923 | 0.0941 | 0.9785 | 0.6951 | 0.8099 |
+ | 0.1477 | 29.6970 | 980 | 0.2202 | 0.5627 | 0.7442 | 0.9290 | nan | 0.9450 | 0.0213 | 0.9847 | 0.8258 | 0.9442 | 0.0 | 0.8857 | 0.0210 | 0.9749 | 0.6807 | 0.8142 |
+ | 0.1221 | 30.3030 | 1000 | 0.2288 | 0.5621 | 0.7452 | 0.9292 | nan | 0.9340 | 0.0226 | 0.9869 | 0.8218 | 0.9606 | 0.0 | 0.8922 | 0.0224 | 0.9775 | 0.6855 | 0.7950 |
+ | 0.1092 | 30.9091 | 1020 | 0.2079 | 0.5696 | 0.7507 | 0.9297 | nan | 0.9404 | 0.0511 | 0.9902 | 0.8393 | 0.9322 | 0.0 | 0.8880 | 0.0478 | 0.9802 | 0.6916 | 0.8100 |
+ | 0.1488 | 31.5152 | 1040 | 0.2147 | 0.5675 | 0.7501 | 0.9306 | nan | 0.9439 | 0.0550 | 0.9867 | 0.8164 | 0.9484 | 0.0 | 0.8939 | 0.0528 | 0.9773 | 0.6731 | 0.8078 |
+ | 0.1177 | 32.1212 | 1060 | 0.2240 | 0.5688 | 0.7470 | 0.9311 | nan | 0.9409 | 0.0466 | 0.9899 | 0.8028 | 0.9550 | 0.0 | 0.8952 | 0.0444 | 0.9801 | 0.6910 | 0.8022 |
+ | 0.1121 | 32.7273 | 1080 | 0.2043 | 0.5700 | 0.7471 | 0.9327 | nan | 0.9542 | 0.0467 | 0.9895 | 0.8043 | 0.9409 | 0.0 | 0.8949 | 0.0449 | 0.9791 | 0.6885 | 0.8129 |
+ | 0.1263 | 33.3333 | 1100 | 0.2120 | 0.5679 | 0.7516 | 0.9299 | nan | 0.9341 | 0.0447 | 0.9888 | 0.8415 | 0.9489 | 0.0 | 0.8896 | 0.0434 | 0.9785 | 0.6851 | 0.8110 |
+ | 0.0922 | 33.9394 | 1120 | 0.2104 | 0.5721 | 0.7507 | 0.9331 | nan | 0.9512 | 0.0605 | 0.9902 | 0.8086 | 0.9427 | 0.0 | 0.8984 | 0.0574 | 0.9797 | 0.6855 | 0.8118 |
+ | 0.1549 | 34.5455 | 1140 | 0.2276 | 0.5624 | 0.7428 | 0.9296 | nan | 0.9371 | 0.0369 | 0.9895 | 0.7912 | 0.9593 | 0.0 | 0.8951 | 0.0359 | 0.9800 | 0.6682 | 0.7951 |
+ | 0.1493 | 35.1515 | 1160 | 0.1981 | 0.5739 | 0.7532 | 0.9336 | nan | 0.9495 | 0.0723 | 0.9915 | 0.8094 | 0.9431 | 0.0 | 0.8946 | 0.0698 | 0.9817 | 0.6780 | 0.8197 |
+ | 0.1176 | 35.7576 | 1180 | 0.2030 | 0.5757 | 0.7552 | 0.9351 | nan | 0.9506 | 0.0684 | 0.9925 | 0.8196 | 0.9447 | 0.0 | 0.9026 | 0.0648 | 0.9820 | 0.6921 | 0.8129 |
+ | 0.229 | 36.3636 | 1200 | 0.2046 | 0.5730 | 0.7524 | 0.9337 | nan | 0.9468 | 0.0560 | 0.9917 | 0.8208 | 0.9466 | 0.0 | 0.8978 | 0.0539 | 0.9816 | 0.6888 | 0.8158 |
+ | 0.1419 | 36.9697 | 1220 | 0.2069 | 0.5695 | 0.7491 | 0.9322 | nan | 0.9410 | 0.0449 | 0.9909 | 0.8142 | 0.9546 | 0.0 | 0.8945 | 0.0439 | 0.9812 | 0.6880 | 0.8093 |
+ | 0.0725 | 37.5758 | 1240 | 0.2001 | 0.5724 | 0.7554 | 0.9322 | nan | 0.9400 | 0.0641 | 0.9916 | 0.8397 | 0.9418 | 0.0 | 0.8937 | 0.0612 | 0.9806 | 0.6830 | 0.8157 |
+ | 0.0653 | 38.1818 | 1260 | 0.2039 | 0.5729 | 0.7530 | 0.9344 | nan | 0.9474 | 0.0577 | 0.9916 | 0.8173 | 0.9510 | 0.0 | 0.8999 | 0.0561 | 0.9823 | 0.6843 | 0.8151 |
+ | 0.1446 | 38.7879 | 1280 | 0.2051 | 0.5729 | 0.7554 | 0.9325 | nan | 0.9349 | 0.0637 | 0.9926 | 0.8325 | 0.9531 | 0.0 | 0.8944 | 0.0619 | 0.9823 | 0.6853 | 0.8138 |
+ | 0.1172 | 39.3939 | 1300 | 0.2099 | 0.5719 | 0.7498 | 0.9320 | nan | 0.9499 | 0.0707 | 0.9881 | 0.7914 | 0.9486 | 0.0 | 0.8959 | 0.0674 | 0.9797 | 0.6778 | 0.8108 |
+ | 0.0907 | 40.0 | 1320 | 0.2099 | 0.5691 | 0.7500 | 0.9323 | nan | 0.9445 | 0.0536 | 0.9894 | 0.8095 | 0.9528 | 0.0 | 0.8966 | 0.0520 | 0.9803 | 0.6778 | 0.8078 |
+ | 0.1174 | 40.6061 | 1340 | 0.2221 | 0.5677 | 0.7461 | 0.9332 | nan | 0.9491 | 0.0481 | 0.9913 | 0.7887 | 0.9535 | 0.0 | 0.9008 | 0.0464 | 0.9819 | 0.6707 | 0.8064 |
+ | 0.1053 | 41.2121 | 1360 | 0.2092 | 0.5699 | 0.7493 | 0.9320 | nan | 0.9447 | 0.0571 | 0.9904 | 0.8045 | 0.9496 | 0.0 | 0.8970 | 0.0545 | 0.9810 | 0.6814 | 0.8056 |
+ | 0.1026 | 41.8182 | 1380 | 0.2012 | 0.5752 | 0.7541 | 0.9338 | nan | 0.9489 | 0.0759 | 0.9926 | 0.8129 | 0.9404 | 0.0 | 0.8954 | 0.0720 | 0.9819 | 0.6856 | 0.8164 |
+ | 0.1371 | 42.4242 | 1400 | 0.1945 | 0.5783 | 0.7575 | 0.9336 | nan | 0.9466 | 0.0953 | 0.9927 | 0.8133 | 0.9399 | 0.0 | 0.8959 | 0.0889 | 0.9823 | 0.6863 | 0.8162 |
+ | 0.0799 | 43.0303 | 1420 | 0.2107 | 0.5712 | 0.7508 | 0.9332 | nan | 0.9492 | 0.0619 | 0.9902 | 0.8045 | 0.9483 | 0.0 | 0.8972 | 0.0595 | 0.9816 | 0.6766 | 0.8124 |
+ | 0.1458 | 43.6364 | 1440 | 0.1906 | 0.5788 | 0.7577 | 0.9339 | nan | 0.9489 | 0.0892 | 0.9909 | 0.8185 | 0.9410 | 0.0 | 0.8943 | 0.0843 | 0.9805 | 0.6925 | 0.8213 |
+ | 0.112 | 44.2424 | 1460 | 0.2091 | 0.5726 | 0.7541 | 0.9329 | nan | 0.9418 | 0.0579 | 0.9901 | 0.8288 | 0.9518 | 0.0 | 0.8953 | 0.0560 | 0.9802 | 0.6893 | 0.8149 |
+ | 0.1311 | 44.8485 | 1480 | 0.2022 | 0.5730 | 0.7519 | 0.9326 | nan | 0.9512 | 0.0656 | 0.9888 | 0.8131 | 0.9408 | 0.0 | 0.8959 | 0.0622 | 0.9791 | 0.6862 | 0.8146 |
+ | 0.1474 | 45.4545 | 1500 | 0.2001 | 0.5743 | 0.7578 | 0.9328 | nan | 0.9410 | 0.0698 | 0.9908 | 0.8441 | 0.9436 | 0.0 | 0.8939 | 0.0671 | 0.9805 | 0.6887 | 0.8157 |
+ | 0.0953 | 46.0606 | 1520 | 0.2072 | 0.5764 | 0.7568 | 0.9338 | nan | 0.9420 | 0.0672 | 0.9906 | 0.8310 | 0.9530 | 0.0 | 0.8954 | 0.0651 | 0.9815 | 0.7000 | 0.8167 |
+ | 0.1038 | 46.6667 | 1540 | 0.2003 | 0.5760 | 0.7544 | 0.9339 | nan | 0.9465 | 0.0760 | 0.9921 | 0.8092 | 0.9480 | 0.0 | 0.8963 | 0.0728 | 0.9824 | 0.6886 | 0.8157 |
+ | 0.1362 | 47.2727 | 1560 | 0.1978 | 0.5755 | 0.7533 | 0.9327 | nan | 0.9482 | 0.0793 | 0.9908 | 0.8065 | 0.9416 | 0.0 | 0.8950 | 0.0751 | 0.9804 | 0.6895 | 0.8130 |
+ | 0.1052 | 47.8788 | 1580 | 0.2080 | 0.5750 | 0.7526 | 0.9336 | nan | 0.9484 | 0.0676 | 0.9896 | 0.8045 | 0.9528 | 0.0 | 0.8969 | 0.0653 | 0.9802 | 0.6947 | 0.8129 |
+ | 0.0813 | 48.4848 | 1600 | 0.2054 | 0.5744 | 0.7548 | 0.9338 | nan | 0.9439 | 0.0579 | 0.9919 | 0.8332 | 0.9471 | 0.0 | 0.8961 | 0.0559 | 0.9819 | 0.6965 | 0.8163 |
+ | 0.0992 | 49.0909 | 1620 | 0.2016 | 0.5765 | 0.7563 | 0.9344 | nan | 0.9467 | 0.0730 | 0.9922 | 0.8243 | 0.9455 | 0.0 | 0.8952 | 0.0702 | 0.9826 | 0.6912 | 0.8200 |
+ | 0.1604 | 49.6970 | 1640 | 0.1941 | 0.5755 | 0.7536 | 0.9340 | nan | 0.9537 | 0.0749 | 0.9909 | 0.8095 | 0.9391 | 0.0 | 0.8956 | 0.0717 | 0.9803 | 0.6885 | 0.8168 |
+
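The per-class columns in the results follow the usual semantic-segmentation definitions: accuracy is per-class recall (intersection over the class's labeled pixels) and IoU is intersection over union, each `nan` where undefined. That is why `Accuracy Background` is `nan` (the class never occurs in the labels) while `Iou Background` is 0.0 (driven only by false-positive background predictions). A toy sketch of that computation, not the exact `evaluate`/`mean_iou` implementation:

```python
import numpy as np

def per_class_stats(pred, label, num_classes):
    """Per-class IoU and accuracy (recall); nan where the quantity is undefined."""
    ious, accs = [], []
    for c in range(num_classes):
        p, t = (pred == c), (label == c)
        inter = np.logical_and(p, t).sum()
        union = np.logical_or(p, t).sum()
        ious.append(inter / union if union else float("nan"))
        accs.append(inter / t.sum() if t.sum() else float("nan"))
    return ious, accs

# Toy maps: class 0 never occurs in the label but is predicted once,
# reproducing the nan-accuracy / 0.0-IoU pattern seen for "background".
pred  = np.array([[0, 1], [3, 4]])
label = np.array([[1, 1], [3, 4]])
ious, accs = per_class_stats(pred, label, num_classes=6)
# accs[0] is nan while ious[0] == 0.0
```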
+
+ ### Framework versions
+
+ - Transformers 4.42.4
+ - Pytorch 2.3.0+cu121
+ - Datasets 2.20.0
+ - Tokenizers 0.19.1
config.json ADDED
@@ -0,0 +1,86 @@
+ {
+   "_name_or_path": "nvidia/mit-b0",
+   "architectures": [
+     "SegformerForSemanticSegmentation"
+   ],
+   "attention_probs_dropout_prob": 0.0,
+   "classifier_dropout_prob": 0.1,
+   "decoder_hidden_size": 256,
+   "depths": [
+     2,
+     2,
+     2,
+     2
+   ],
+   "downsampling_rates": [
+     1,
+     4,
+     8,
+     16
+   ],
+   "drop_path_rate": 0.1,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.0,
+   "hidden_sizes": [
+     32,
+     64,
+     160,
+     256
+   ],
+   "id2label": {
+     "0": "background",
+     "1": "silver",
+     "2": "glass",
+     "3": "silicon",
+     "4": "void",
+     "5": "interfacial void"
+   },
+   "image_size": 224,
+   "initializer_range": 0.02,
+   "label2id": {
+     "background": 0,
+     "glass": 2,
+     "interfacial void": 5,
+     "silicon": 3,
+     "silver": 1,
+     "void": 4
+   },
+   "layer_norm_eps": 1e-06,
+   "mlp_ratios": [
+     4,
+     4,
+     4,
+     4
+   ],
+   "model_type": "segformer",
+   "num_attention_heads": [
+     1,
+     2,
+     5,
+     8
+   ],
+   "num_channels": 3,
+   "num_encoder_blocks": 4,
+   "patch_sizes": [
+     7,
+     3,
+     3,
+     3
+   ],
+   "reshape_last_stage": true,
+   "semantic_loss_ignore_index": 255,
+   "sr_ratios": [
+     8,
+     4,
+     2,
+     1
+   ],
+   "strides": [
+     4,
+     2,
+     2,
+     2
+   ],
+   "torch_dtype": "float32",
+   "transformers_version": "4.42.4"
+ }
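For `transformers` to map logits channels to class names, the `id2label` and `label2id` maps in this config must be exact inverses. A quick consistency check over the values copied from the file above (note the keys of `id2label` are strings, as in the JSON):

```python
# Values copied from config.json above; id2label keys are strings in the file.
id2label = {"0": "background", "1": "silver", "2": "glass",
            "3": "silicon", "4": "void", "5": "interfacial void"}
label2id = {"background": 0, "glass": 2, "interfacial void": 5,
            "silicon": 3, "silver": 1, "void": 4}

# The two maps are inverses, and the class count matches the six per-class
# metrics reported in the README.
assert {int(k): v for k, v in id2label.items()} == {v: k for k, v in label2id.items()}
num_labels = len(id2label)  # 6
```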
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ef4536ee527cc5ae9f9401ef86e5054d632c3af72db128842a1b55383d456e7a
+ size 14888896
runs/Jul11_17-57-26_ae9983dcd873/events.out.tfevents.1720720649.ae9983dcd873.404.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ae4ac932101d1d24167e8e01053ef35a7a3ba031a43c564511cee67ce40e1ab7
+ size 447010
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4c1911047ac9efebb5f1c7c4d056380f6e9330d633ac97e696ad7b2698fd0c65
+ size 5176