update model card README.md

README.md CHANGED

@@ -1,8 +1,6 @@
 ---
 license: other
 tags:
-- vision
-- image-segmentation
 - generated_from_trainer
 model-index:
 - name: safety-utcustom-train-SF-RGBD-b0
@@ -14,17 +12,17 @@ should probably proofread and complete it, then remove this comment. -->
 
 # safety-utcustom-train-SF-RGBD-b0
 
-This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the
+This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
-- Mean Iou: 0.
-- Mean Accuracy: 0.
+- Loss: 0.1043
+- Mean Iou: 0.7188
+- Mean Accuracy: 0.7607
 - Overall Accuracy: 0.9815
 - Accuracy Unlabeled: nan
-- Accuracy Safe: 0.
-- Accuracy Unsafe: 0.
+- Accuracy Safe: 0.5261
+- Accuracy Unsafe: 0.9953
 - Iou Unlabeled: nan
-- Iou Safe: 0.
+- Iou Safe: 0.4564
 - Iou Unsafe: 0.9812
 
 ## Model description
@@ -51,7 +49,7 @@ The following hyperparameters were used during training:
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_ratio: 0.05
-- num_epochs:
+- num_epochs: 130
 
 ### Training results
 
@@ -141,42 +139,52 @@ The following hyperparameters were used during training:
 | 0.1218 | 82.0 | 820 | 0.4864 | nan | 0.9960 | 0.4300 | nan | 0.9807 | 0.1313 | 0.7412 | 0.7054 | 0.9810 |
 | 0.138 | 83.0 | 830 | 0.5097 | nan | 0.9955 | 0.4445 | nan | 0.9809 | 0.1318 | 0.7526 | 0.7127 | 0.9812 |
 | 0.1399 | 84.0 | 840 | 0.5067 | nan | 0.9957 | 0.4441 | nan | 0.9810 | 0.1290 | 0.7512 | 0.7126 | 0.9813 |
-| 0.1455 | 85.0 | 850 | 0.
-| 0.1466 | 86.0 | 860 | 0.
-| 0.1769 | 87.0 | 870 | 0.
-| 0.1453 | 88.0 | 880 | 0.
-| 0.133 | 89.0 | 890 | 0.
-| 0.1288 | 90.0 | 900 | 0.
-| 0.1318 | 91.0 | 910 | 0.
-| 0.1211 | 92.0 | 920 | 0.
-| 0.1137 | 93.0 | 930 | 0.
-| 0.1312 | 94.0 | 940 | 0.
-| 0.1591 | 95.0 | 950 | 0.
-| 0.1207 | 96.0 | 960 | 0.
-| 0.1203 | 97.0 | 970 | 0.
-| 0.1196 | 98.0 | 980 | 0.
-| 0.1304 | 99.0 | 990 | 0.
-| 0.1058 | 100.0 | 1000 | 0.
-| 0.1135 | 101.0 | 1010 | 0.
-| 0.1116 | 102.0 | 1020 | 0.
-| 0.1036 | 103.0 | 1030 | 0.
-| 0.1195 | 104.0 | 1040 | 0.
-| 0.1205 | 105.0 | 1050 | 0.
-| 0.1453 | 106.0 | 1060 | 0.
-| 0.0965 | 107.0 | 1070 | 0.
-| 0.1058 | 108.0 | 1080 | 0.
-| 0.1163 | 109.0 | 1090 | 0.
-| 0.1145 | 110.0 | 1100 | 0.
-| 0.1031 | 111.0 | 1110 | 0.
-| 0.1042 | 112.0 | 1120 | 0.
-| 0.1119 | 113.0 | 1130 | 0.
-| 0.1116 | 114.0 | 1140 | 0.
-| 0.1063 | 115.0 | 1150 | 0.
-| 0.1054 | 116.0 | 1160 | 0.
-| 0.1613 | 117.0 | 1170 | 0.
-| 0.1542 | 118.0 | 1180 | 0.
-| 0.1226 | 119.0 | 1190 | 0.
-| 0.1295 | 120.0 | 1200 | 0.
+| 0.1455 | 85.0 | 850 | 0.5024 | nan | 0.9957 | 0.4404 | nan | 0.9809 | 0.1277 | 0.7491 | 0.7106 | 0.9811 |
+| 0.1466 | 86.0 | 860 | 0.4920 | nan | 0.9959 | 0.4341 | nan | 0.9808 | 0.1243 | 0.7440 | 0.7074 | 0.9811 |
+| 0.1769 | 87.0 | 870 | 0.5737 | nan | 0.9924 | 0.4592 | nan | 0.9797 | 0.1317 | 0.7831 | 0.7194 | 0.9800 |
+| 0.1453 | 88.0 | 880 | 0.3341 | nan | 0.9978 | 0.3115 | nan | 0.9780 | 0.1254 | 0.6659 | 0.6447 | 0.9782 |
+| 0.133 | 89.0 | 890 | 0.5257 | nan | 0.9950 | 0.4518 | nan | 0.9809 | 0.1283 | 0.7603 | 0.7163 | 0.9812 |
+| 0.1288 | 90.0 | 900 | 0.5049 | nan | 0.9957 | 0.4420 | nan | 0.9809 | 0.1221 | 0.7503 | 0.7115 | 0.9812 |
+| 0.1318 | 91.0 | 910 | 0.4838 | nan | 0.9961 | 0.4290 | nan | 0.9807 | 0.1219 | 0.7400 | 0.7049 | 0.9810 |
+| 0.1211 | 92.0 | 920 | 0.5355 | nan | 0.9950 | 0.4596 | nan | 0.9811 | 0.1242 | 0.7652 | 0.7203 | 0.9814 |
+| 0.1137 | 93.0 | 930 | 0.5135 | nan | 0.9958 | 0.4517 | nan | 0.9813 | 0.1181 | 0.7547 | 0.7165 | 0.9816 |
+| 0.1312 | 94.0 | 940 | 0.4775 | nan | 0.9963 | 0.4262 | nan | 0.9807 | 0.1199 | 0.7369 | 0.7035 | 0.9810 |
+| 0.1591 | 95.0 | 950 | 0.5115 | nan | 0.9956 | 0.4473 | nan | 0.9810 | 0.1182 | 0.7536 | 0.7142 | 0.9813 |
+| 0.1207 | 96.0 | 960 | 0.5206 | nan | 0.9956 | 0.4544 | nan | 0.9812 | 0.1156 | 0.7581 | 0.7178 | 0.9815 |
+| 0.1203 | 97.0 | 970 | 0.5054 | nan | 0.9958 | 0.4439 | nan | 0.9810 | 0.1165 | 0.7506 | 0.7124 | 0.9813 |
+| 0.1196 | 98.0 | 980 | 0.5296 | nan | 0.9953 | 0.4585 | nan | 0.9812 | 0.1131 | 0.7624 | 0.7199 | 0.9815 |
+| 0.1304 | 99.0 | 990 | 0.5269 | nan | 0.9953 | 0.4568 | nan | 0.9812 | 0.1155 | 0.7611 | 0.7190 | 0.9815 |
+| 0.1058 | 100.0 | 1000 | 0.5163 | nan | 0.9955 | 0.4496 | nan | 0.9810 | 0.1144 | 0.7559 | 0.7153 | 0.9813 |
+| 0.1135 | 101.0 | 1010 | 0.4934 | nan | 0.9961 | 0.4368 | nan | 0.9809 | 0.1113 | 0.7447 | 0.7089 | 0.9812 |
+| 0.1116 | 102.0 | 1020 | 0.5878 | nan | 0.9932 | 0.4799 | nan | 0.9808 | 0.1128 | 0.7905 | 0.7304 | 0.9812 |
+| 0.1036 | 103.0 | 1030 | 0.4826 | nan | 0.9963 | 0.4304 | nan | 0.9809 | 0.1078 | 0.7394 | 0.7056 | 0.9811 |
+| 0.1195 | 104.0 | 1040 | 0.4364 | nan | 0.9966 | 0.3930 | nan | 0.9798 | 0.1110 | 0.7165 | 0.6864 | 0.9801 |
+| 0.1205 | 105.0 | 1050 | 0.5793 | nan | 0.9934 | 0.4762 | nan | 0.9808 | 0.1120 | 0.7864 | 0.7285 | 0.9812 |
+| 0.1453 | 106.0 | 1060 | 0.4707 | nan | 0.9964 | 0.4205 | nan | 0.9806 | 0.1110 | 0.7336 | 0.7005 | 0.9808 |
+| 0.0965 | 107.0 | 1070 | 0.5638 | nan | 0.9941 | 0.4723 | nan | 0.9811 | 0.1091 | 0.7789 | 0.7267 | 0.9814 |
+| 0.1058 | 108.0 | 1080 | 0.4881 | nan | 0.9962 | 0.4337 | nan | 0.9809 | 0.1085 | 0.7422 | 0.7073 | 0.9812 |
+| 0.1163 | 109.0 | 1090 | 0.5128 | nan | 0.9957 | 0.4493 | nan | 0.9811 | 0.1077 | 0.7542 | 0.7152 | 0.9814 |
+| 0.1145 | 110.0 | 1100 | 0.5228 | nan | 0.9954 | 0.4547 | nan | 0.9812 | 0.1081 | 0.7591 | 0.7179 | 0.9815 |
+| 0.1031 | 111.0 | 1110 | 0.5522 | nan | 0.9945 | 0.4673 | nan | 0.9811 | 0.1073 | 0.7733 | 0.7242 | 0.9814 |
+| 0.1042 | 112.0 | 1120 | 0.5490 | nan | 0.9947 | 0.4669 | nan | 0.9812 | 0.1064 | 0.7718 | 0.7241 | 0.9815 |
+| 0.1119 | 113.0 | 1130 | 0.5064 | nan | 0.9958 | 0.4449 | nan | 0.9811 | 0.1063 | 0.7511 | 0.7130 | 0.9813 |
+| 0.1116 | 114.0 | 1140 | 0.5172 | nan | 0.9956 | 0.4520 | nan | 0.9812 | 0.1074 | 0.7564 | 0.7166 | 0.9815 |
+| 0.1063 | 115.0 | 1150 | 0.5163 | nan | 0.9956 | 0.4511 | nan | 0.9812 | 0.1072 | 0.7560 | 0.7161 | 0.9814 |
+| 0.1054 | 116.0 | 1160 | 0.4994 | nan | 0.9960 | 0.4408 | nan | 0.9810 | 0.1065 | 0.7477 | 0.7109 | 0.9813 |
+| 0.1613 | 117.0 | 1170 | 0.5251 | nan | 0.9955 | 0.4576 | nan | 0.9813 | 0.1060 | 0.7603 | 0.7195 | 0.9816 |
+| 0.1542 | 118.0 | 1180 | 0.5454 | nan | 0.9947 | 0.4649 | nan | 0.9812 | 0.1058 | 0.7701 | 0.7230 | 0.9815 |
+| 0.1226 | 119.0 | 1190 | 0.5469 | nan | 0.9947 | 0.4658 | nan | 0.9812 | 0.1064 | 0.7708 | 0.7235 | 0.9815 |
+| 0.1295 | 120.0 | 1200 | 0.5437 | nan | 0.9948 | 0.4646 | nan | 0.9812 | 0.1060 | 0.7693 | 0.7229 | 0.9815 |
+| 0.1438 | 121.0 | 1210 | 0.1076 | 0.7084 | 0.7435 | 0.9812 | nan | 0.4909 | 0.9962 | nan | 0.4358 | 0.9809 |
+| 0.1391 | 122.0 | 1220 | 0.1081 | 0.7221 | 0.7683 | 0.9814 | nan | 0.5417 | 0.9948 | nan | 0.4630 | 0.9811 |
+| 0.1756 | 123.0 | 1230 | 0.1041 | 0.7233 | 0.7710 | 0.9814 | nan | 0.5473 | 0.9947 | nan | 0.4655 | 0.9811 |
+| 0.1174 | 124.0 | 1240 | 0.1029 | 0.7189 | 0.7614 | 0.9815 | nan | 0.5275 | 0.9953 | nan | 0.4566 | 0.9812 |
+| 0.1025 | 125.0 | 1250 | 0.1043 | 0.7100 | 0.7470 | 0.9812 | nan | 0.4980 | 0.9959 | nan | 0.4391 | 0.9809 |
+| 0.0997 | 126.0 | 1260 | 0.1038 | 0.7211 | 0.7638 | 0.9816 | nan | 0.5323 | 0.9953 | nan | 0.4609 | 0.9813 |
+| 0.1768 | 127.0 | 1270 | 0.1037 | 0.7204 | 0.7617 | 0.9817 | nan | 0.5279 | 0.9955 | nan | 0.4594 | 0.9814 |
+| 0.1527 | 128.0 | 1280 | 0.1027 | 0.7167 | 0.7564 | 0.9815 | nan | 0.5171 | 0.9956 | nan | 0.4522 | 0.9812 |
+| 0.1269 | 129.0 | 1290 | 0.1041 | 0.7178 | 0.7583 | 0.9815 | nan | 0.5211 | 0.9955 | nan | 0.4543 | 0.9812 |
+| 0.0968 | 130.0 | 1300 | 0.1043 | 0.7188 | 0.7607 | 0.9815 | nan | 0.5261 | 0.9953 | nan | 0.4564 | 0.9812 |
 
 
 ### Framework versions
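
The per-class figures in the card (Accuracy Safe/Unsafe, Iou Safe/Unsafe, Overall Accuracy) follow the usual semantic-segmentation definitions: per-class accuracy is recall over that class's ground-truth pixels, IoU is intersection over union, and the means average over classes while ignoring nan entries such as the Unlabeled class. A minimal NumPy sketch of those definitions (the 0 = safe, 1 = unsafe label ids are an illustrative assumption, not stated in the card):

```python
import numpy as np

def segmentation_metrics(pred, target, num_classes=2):
    """Per-class IoU and accuracy for flat integer label maps.

    Assumes label ids 0 = safe, 1 = unsafe (hypothetical mapping).
    "Accuracy" here is per-class recall, matching the trainer's
    "Accuracy <class>" columns; nan marks classes absent from target.
    """
    pred = np.asarray(pred).ravel()
    target = np.asarray(target).ravel()
    iou, acc = [], []
    for c in range(num_classes):
        inter = np.sum((pred == c) & (target == c))
        union = np.sum((pred == c) | (target == c))
        n_true = np.sum(target == c)
        iou.append(inter / union if union else float("nan"))
        acc.append(inter / n_true if n_true else float("nan"))
    return {
        "per_class_iou": iou,
        "per_class_accuracy": acc,
        "mean_iou": float(np.nanmean(iou)),
        "mean_accuracy": float(np.nanmean(acc)),
        "overall_accuracy": float(np.mean(pred == target)),
    }
```

With a heavy class imbalance like the one above, overall accuracy tracks the dominant "unsafe" class, which is why it sits near 0.98 while Iou Safe stays around 0.46.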
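
The scheduler entries (`lr_scheduler_type: linear`, `lr_scheduler_warmup_ratio: 0.05`) describe a ramp from zero to the base learning rate over the first 5% of optimizer steps, followed by a linear decay back to zero. A sketch of that shape (the `base_lr` value is illustrative; the card's learning rate is not shown in this diff):

```python
def linear_warmup_lr(step, total_steps, base_lr=6e-5, warmup_ratio=0.05):
    """Learning rate at a given optimizer step: linear warmup over the
    first warmup_ratio of training, then linear decay to zero."""
    warmup_steps = max(1, int(total_steps * warmup_ratio))
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

At 1300 total steps (130 epochs at 10 steps per epoch, as the results table shows), warmup would cover the first 65 steps.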