update model card README.md
---
license: other
tags:
- generated_from_trainer
model-index:
- name: safety-utcustom-train-SF-RGBD-b0
---

# safety-utcustom-train-SF-RGBD-b0

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1060
- Mean Iou: 0.7229
- Mean Accuracy: 0.7693
- Overall Accuracy: 0.9815
- Accuracy Unlabeled: nan
- Accuracy Safe: 0.5437
- Accuracy Unsafe: 0.9948
- Iou Unlabeled: nan
- Iou Safe: 0.4646
- Iou Unsafe: 0.9812
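The two mean metrics are the unweighted averages of the per-class values, with the ignored `Unlabeled` class excluded from the average (hence its `nan` entries). A quick arithmetic check against the numbers above:

```python
# Per-class evaluation values from the list above; "Unlabeled" is
# ignored during evaluation and therefore excluded from the means.
iou_safe, iou_unsafe = 0.4646, 0.9812
acc_safe, acc_unsafe = 0.5437, 0.9948

mean_iou = (iou_safe + iou_unsafe) / 2  # ≈ 0.7229, matching Mean Iou
mean_acc = (acc_safe + acc_unsafe) / 2  # ≈ 0.7693, matching Mean Accuracy
print(mean_iou, mean_acc)
```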

## Model description

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
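Given the 1,200 optimizer steps recorded in the results table (10 per epoch over 120 epochs), a warmup ratio of 0.05 corresponds to 60 warmup steps. A minimal sketch of the resulting `linear` schedule, expressed as a learning-rate multiplier (the peak learning rate itself is not shown in this excerpt):

```python
def linear_schedule(step, total_steps=1200, warmup_ratio=0.05):
    """LR multiplier: linear warmup over the first 5% of steps,
    then linear decay to zero, as the `linear` scheduler does."""
    warmup_steps = int(total_steps * warmup_ratio)  # 60 steps here
    if step < warmup_steps:
        return step / warmup_steps
    return max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_schedule(30))    # halfway through warmup -> 0.5
print(linear_schedule(60))    # warmup complete -> 1.0
print(linear_schedule(1200))  # final step -> 0.0
```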

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Safe | Accuracy Unsafe | Iou Unlabeled | Iou Safe | Iou Unsafe |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:-------------:|:---------------:|:-------------:|:--------:|:----------:|
| 0.151 | 66.0 | 660 | 0.4718 | nan | 0.9962 | 0.4195 | nan | 0.9804 | 0.1423 | 0.7340 | 0.7000 | 0.9807 |
| 0.166 | 67.0 | 670 | 0.4721 | nan | 0.9963 | 0.4208 | nan | 0.9805 | 0.1427 | 0.7342 | 0.7007 | 0.9808 |
| 0.1561 | 68.0 | 680 | 0.4916 | nan | 0.9959 | 0.4332 | nan | 0.9807 | 0.1420 | 0.7437 | 0.7070 | 0.9810 |
| 0.1501 | 69.0 | 690 | 0.4906 | nan | 0.9958 | 0.4311 | nan | 0.9806 | 0.1437 | 0.7432 | 0.7058 | 0.9809 |
| 0.1598 | 70.0 | 700 | 0.3445 | nan | 0.9977 | 0.3204 | nan | 0.9782 | 0.1379 | 0.6711 | 0.6493 | 0.9784 |
| 0.1431 | 71.0 | 710 | 0.4898 | nan | 0.9960 | 0.4325 | nan | 0.9807 | 0.1400 | 0.7429 | 0.7066 | 0.9810 |
| 0.164 | 72.0 | 720 | 0.4698 | nan | 0.9964 | 0.4196 | nan | 0.9805 | 0.1347 | 0.7331 | 0.7001 | 0.9808 |
| 0.1555 | 73.0 | 730 | 0.5271 | nan | 0.9937 | 0.4364 | nan | 0.9796 | 0.1368 | 0.7604 | 0.7080 | 0.9799 |
| 0.1924 | 74.0 | 740 | 0.4638 | nan | 0.9965 | 0.4159 | nan | 0.9805 | 0.1312 | 0.7301 | 0.6982 | 0.9808 |
| 0.1612 | 75.0 | 750 | 0.5052 | nan | 0.9956 | 0.4409 | nan | 0.9808 | 0.1340 | 0.7504 | 0.7108 | 0.9811 |
| 0.1234 | 76.0 | 760 | 0.5301 | nan | 0.9946 | 0.4501 | nan | 0.9806 | 0.1354 | 0.7624 | 0.7153 | 0.9809 |
| 0.1679 | 77.0 | 770 | 0.4644 | nan | 0.9964 | 0.4156 | nan | 0.9804 | 0.1323 | 0.7304 | 0.6980 | 0.9807 |
| 0.1375 | 78.0 | 780 | 0.4804 | nan | 0.9961 | 0.4263 | nan | 0.9806 | 0.1355 | 0.7383 | 0.7035 | 0.9809 |
| 0.1839 | 79.0 | 790 | 0.5070 | nan | 0.9955 | 0.4422 | nan | 0.9808 | 0.1319 | 0.7512 | 0.7115 | 0.9811 |
| 0.155 | 80.0 | 800 | 0.4846 | nan | 0.9961 | 0.4295 | nan | 0.9807 | 0.1298 | 0.7403 | 0.7051 | 0.9810 |
| 0.1219 | 81.0 | 810 | 0.4671 | nan | 0.9963 | 0.4167 | nan | 0.9804 | 0.1302 | 0.7317 | 0.6986 | 0.9807 |
| 0.1218 | 82.0 | 820 | 0.4864 | nan | 0.9960 | 0.4300 | nan | 0.9807 | 0.1313 | 0.7412 | 0.7054 | 0.9810 |
| 0.138 | 83.0 | 830 | 0.5097 | nan | 0.9955 | 0.4445 | nan | 0.9809 | 0.1318 | 0.7526 | 0.7127 | 0.9812 |
| 0.1399 | 84.0 | 840 | 0.5067 | nan | 0.9957 | 0.4441 | nan | 0.9810 | 0.1290 | 0.7512 | 0.7126 | 0.9813 |
| 0.1455 | 85.0 | 850 | 0.1277 | 0.7106 | 0.7491 | 0.9811 | nan | 0.5024 | 0.9957 | nan | 0.4404 | 0.9809 |
| 0.1466 | 86.0 | 860 | 0.1243 | 0.7074 | 0.7440 | 0.9811 | nan | 0.4920 | 0.9959 | nan | 0.4341 | 0.9808 |
| 0.1769 | 87.0 | 870 | 0.1317 | 0.7194 | 0.7831 | 0.9800 | nan | 0.5737 | 0.9924 | nan | 0.4592 | 0.9797 |
| 0.1453 | 88.0 | 880 | 0.1254 | 0.6447 | 0.6659 | 0.9782 | nan | 0.3341 | 0.9978 | nan | 0.3115 | 0.9780 |
| 0.133 | 89.0 | 890 | 0.1283 | 0.7163 | 0.7603 | 0.9812 | nan | 0.5257 | 0.9950 | nan | 0.4518 | 0.9809 |
| 0.1288 | 90.0 | 900 | 0.1221 | 0.7115 | 0.7503 | 0.9812 | nan | 0.5049 | 0.9957 | nan | 0.4420 | 0.9809 |
| 0.1318 | 91.0 | 910 | 0.1219 | 0.7049 | 0.7400 | 0.9810 | nan | 0.4838 | 0.9961 | nan | 0.4290 | 0.9807 |
| 0.1211 | 92.0 | 920 | 0.1242 | 0.7203 | 0.7652 | 0.9814 | nan | 0.5355 | 0.9950 | nan | 0.4596 | 0.9811 |
| 0.1137 | 93.0 | 930 | 0.1181 | 0.7165 | 0.7547 | 0.9816 | nan | 0.5135 | 0.9958 | nan | 0.4517 | 0.9813 |
| 0.1312 | 94.0 | 940 | 0.1199 | 0.7035 | 0.7369 | 0.9810 | nan | 0.4775 | 0.9963 | nan | 0.4262 | 0.9807 |
| 0.1591 | 95.0 | 950 | 0.1182 | 0.7142 | 0.7536 | 0.9813 | nan | 0.5115 | 0.9956 | nan | 0.4473 | 0.9810 |
| 0.1207 | 96.0 | 960 | 0.1156 | 0.7178 | 0.7581 | 0.9815 | nan | 0.5206 | 0.9956 | nan | 0.4544 | 0.9812 |
| 0.1203 | 97.0 | 970 | 0.1165 | 0.7124 | 0.7506 | 0.9813 | nan | 0.5054 | 0.9958 | nan | 0.4439 | 0.9810 |
| 0.1196 | 98.0 | 980 | 0.1131 | 0.7199 | 0.7624 | 0.9815 | nan | 0.5296 | 0.9953 | nan | 0.4585 | 0.9812 |
| 0.1304 | 99.0 | 990 | 0.1155 | 0.7190 | 0.7611 | 0.9815 | nan | 0.5269 | 0.9953 | nan | 0.4568 | 0.9812 |
| 0.1058 | 100.0 | 1000 | 0.1144 | 0.7153 | 0.7559 | 0.9813 | nan | 0.5163 | 0.9955 | nan | 0.4496 | 0.9810 |
| 0.1135 | 101.0 | 1010 | 0.1113 | 0.7089 | 0.7447 | 0.9812 | nan | 0.4934 | 0.9961 | nan | 0.4368 | 0.9809 |
| 0.1116 | 102.0 | 1020 | 0.1128 | 0.7304 | 0.7905 | 0.9812 | nan | 0.5878 | 0.9932 | nan | 0.4799 | 0.9808 |
| 0.1036 | 103.0 | 1030 | 0.1078 | 0.7056 | 0.7394 | 0.9811 | nan | 0.4826 | 0.9963 | nan | 0.4304 | 0.9809 |
| 0.1195 | 104.0 | 1040 | 0.1110 | 0.6864 | 0.7165 | 0.9801 | nan | 0.4364 | 0.9966 | nan | 0.3930 | 0.9798 |
| 0.1205 | 105.0 | 1050 | 0.1120 | 0.7285 | 0.7864 | 0.9812 | nan | 0.5793 | 0.9934 | nan | 0.4762 | 0.9808 |
| 0.1453 | 106.0 | 1060 | 0.1110 | 0.7005 | 0.7336 | 0.9808 | nan | 0.4707 | 0.9964 | nan | 0.4205 | 0.9806 |
| 0.0965 | 107.0 | 1070 | 0.1091 | 0.7267 | 0.7789 | 0.9814 | nan | 0.5638 | 0.9941 | nan | 0.4723 | 0.9811 |
| 0.1058 | 108.0 | 1080 | 0.1085 | 0.7073 | 0.7422 | 0.9812 | nan | 0.4881 | 0.9962 | nan | 0.4337 | 0.9809 |
| 0.1163 | 109.0 | 1090 | 0.1077 | 0.7152 | 0.7542 | 0.9814 | nan | 0.5128 | 0.9957 | nan | 0.4493 | 0.9811 |
| 0.1145 | 110.0 | 1100 | 0.1081 | 0.7179 | 0.7591 | 0.9815 | nan | 0.5228 | 0.9954 | nan | 0.4547 | 0.9812 |
| 0.1031 | 111.0 | 1110 | 0.1073 | 0.7242 | 0.7733 | 0.9814 | nan | 0.5522 | 0.9945 | nan | 0.4673 | 0.9811 |
| 0.1042 | 112.0 | 1120 | 0.1064 | 0.7241 | 0.7718 | 0.9815 | nan | 0.5490 | 0.9947 | nan | 0.4669 | 0.9812 |
| 0.1119 | 113.0 | 1130 | 0.1063 | 0.7130 | 0.7511 | 0.9813 | nan | 0.5064 | 0.9958 | nan | 0.4449 | 0.9811 |
| 0.1116 | 114.0 | 1140 | 0.1074 | 0.7166 | 0.7564 | 0.9815 | nan | 0.5172 | 0.9956 | nan | 0.4520 | 0.9812 |
| 0.1063 | 115.0 | 1150 | 0.1072 | 0.7161 | 0.7560 | 0.9814 | nan | 0.5163 | 0.9956 | nan | 0.4511 | 0.9812 |
| 0.1054 | 116.0 | 1160 | 0.1065 | 0.7109 | 0.7477 | 0.9813 | nan | 0.4994 | 0.9960 | nan | 0.4408 | 0.9810 |
| 0.1613 | 117.0 | 1170 | 0.1060 | 0.7195 | 0.7603 | 0.9816 | nan | 0.5251 | 0.9955 | nan | 0.4576 | 0.9813 |
| 0.1542 | 118.0 | 1180 | 0.1058 | 0.7230 | 0.7701 | 0.9815 | nan | 0.5454 | 0.9947 | nan | 0.4649 | 0.9812 |
| 0.1226 | 119.0 | 1190 | 0.1064 | 0.7235 | 0.7708 | 0.9815 | nan | 0.5469 | 0.9947 | nan | 0.4658 | 0.9812 |
| 0.1295 | 120.0 | 1200 | 0.1060 | 0.7229 | 0.7693 | 0.9815 | nan | 0.5437 | 0.9948 | nan | 0.4646 | 0.9812 |
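The per-class IoU values reported above come from comparing predicted and ground-truth segmentation masks, with unlabeled pixels ignored (hence the `nan` entries). A minimal sketch of that computation on toy 1-D masks; the class ids here are assumptions for illustration (0 = unlabeled, 1 = safe, 2 = unsafe):

```python
def per_class_iou(pred, target, num_classes=3, ignore_index=0):
    """Intersection-over-union per class; pixels whose ground truth
    is ignore_index are skipped. Classes with an empty union get None
    (reported as nan in the table)."""
    ious = []
    for cls in range(num_classes):
        if cls == ignore_index:
            ious.append(None)  # the ignored "unlabeled" class
            continue
        inter = union = 0
        for p, t in zip(pred, target):
            if t == ignore_index:
                continue  # unlabeled pixels do not count
            inter += (p == cls) and (t == cls)
            union += (p == cls) or (t == cls)
        ious.append(inter / union if union else None)
    return ious

# Toy masks: 8 "pixels", two labeled classes plus one unlabeled pixel.
pred   = [1, 1, 2, 2, 2, 1, 2, 2]
target = [1, 2, 2, 2, 2, 1, 1, 0]
print(per_class_iou(pred, target))  # -> [None, 0.5, 0.6]
```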

### Framework versions