nateraw committed on
Commit
988d8a1
1 Parent(s): 1590036

Training in progress epoch 5

Files changed (2)
  1. README.md +53 -30
  2. tf_model.h5 +1 -1
README.md CHANGED
@@ -14,24 +14,30 @@ probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Train Loss: 0.5609
- - Validation Loss: 0.5099
- - Validation Mean Iou: 0.2920
- - Validation Mean Accuracy: 0.3599
- - Validation Overall Accuracy: 0.8385
- - Validation Per Category Iou: [0. 0.70817583 0.84131144 0.66573523 0.81449696 0.38891117 nan 0.28124784 0.42659255 0. 0.80855146 0. 0. 0. 0. 0.46011866 0. 0. 0.65458792 0. 0.28411565 0.46758138 0. nan 0. 0.21849067 0. 0. 0.83829062 0.71207623 0.89929169 0. 0.02846127 0.13782635 0. ]
- - Validation Per Category Accuracy: [0. 0.88632871 0.91269832 0.79044294 0.88368528 0.57405218 nan 0.35035973 0.77610775 0. 0.8889696 0. 0. 0. 0. 0.6020786 0. 0. 0.74586521 0. 0.61602403 0.54519561 0. nan 0. 0.28447396 0. 0. 0.94520232 0.85544414 0.95994042 0. 0.04680851 0.21407134 0. ]
- - Epoch: 4
 
  ## Model description
 
@@ -55,63 +61,80 @@ The following hyperparameters were used during training:
 
  ### Training results
 
- | Train Loss | Validation Loss | Validation Mean Iou | Validation Mean Accuracy | Validation Overall Accuracy | Validation Per Category Iou | Validation Per Category Accuracy | Epoch |
- |:----------:|:---------------:|:-------------------:|:------------------------:|:---------------------------:|:---------------------------:|:--------------------------------:|:-----:|
- | 1.4089 | 0.8220 | 0.1975 | 0.2427 | 0.7701 | [0. 0.58353931 0.7655921 0.04209491 0.53135026 0.11779776 nan 0.07709853 0.15950712 0. 0.69634813 0. 0. 0. 0. 0. 0. 0. 0.61456822 0. 0.24971248 0.27129675 0. nan 0. 0.07697324 0. 0. 0.78576516 0.61267064 0.84564576 0. 0. 0.08904216 0. ] | [0. 0.88026971 0.93475302 0.04216372 0.5484085 0.13285614 nan 0.08669707 0.19044773 0. 0.90089024 0. 0. 0. 0. 0. 0. 0. 0.76783975 0. 0.42102101 0.28659817 0. nan 0. 0.08671771 0. 0. 0.89590301 0.74932576 0.9434814 0. 0. 0.14245566 0. ] | 0 |
- | 0.8462 | 0.6135 | 0.2551 | 0.2960 | 0.8200 | [0. 0.66967645 0.80571406 0.56416239 0.66692248 0.24744912 nan 0.23994505 0.28962463 0. 0.76504783 0. 0. 0. 0. 0.14111353 0. 0. 0.6924468 0. 0.27988701 0.41876094 0. nan 0. 0.14755829 0. 0. 0.81614463 0.68429711 0.87710938 0. 0. 0.11234171 0. ] | [0. 0.83805933 0.94928385 0.59586511 0.72913519 0.30595504 nan 0.3128234 0.34805831 0. 0.87847495 0. 0. 0. 0. 0.14205167 0. 0. 0.87543619 0. 0.36001144 0.49498574 0. nan 0. 0.18179115 0. 0. 0.92867923 0.7496178 0.92220166 0. 0. 0.15398549 0. ] | 1 |
- | 0.7134 | 0.5660 | 0.2780 | 0.3320 | 0.8286 | [0. 0.64791461 0.83800512 0.67301044 0.68120631 0.27361472 nan 0.26715802 0.43596999 0. 0.78649287 0. 0. 0. 0. 0.41256964 0. 0. 0.71114766 0. 0.31646321 0.44682442 0. nan 0. 0.17132551 0. 0. 0.81845697 0.67536699 0.88940936 0. 0. 0.1304862 0. ] | [0. 0.85958877 0.92084269 0.82341633 0.74725972 0.33495972 nan 0.40755277 0.56591531 0. 0.90641721 0. 0. 0. 0. 0.48144408 0. 0. 0.88294811 0. 0.46962078 0.47517397 0. nan 0. 0.20631607 0. 0. 0.90956851 0.85856042 0.94107052 0. 0. 0.16669713 0. ] | 2 |
- | 0.6320 | 0.5173 | 0.2894 | 0.3454 | 0.8435 | [0. 0.70789146 0.84902296 0.65266358 0.76099965 0.32934391 nan 0.29576422 0.43988204 0. 0.79276447 0. 0. 0. 0. 0.42668367 0. 0. 0.71717911 0. 0.32151249 0.50084444 0. nan 0. 0.18711455 0. 0. 0.82903803 0.68990498 0.8990059 0. 0.00213015 0.14819771 0. ] | [0. 0.84048763 0.93514369 0.68355212 0.88302113 0.458816 nan 0.38623272 0.69456442 0. 0.92379471 0. 0. 0. 0. 0.50677438 0. 0. 0.90362965 0. 0.4662386 0.57368294 0. nan 0. 0.23281768 0. 0. 0.9001526 0.86786434 0.95195314 0. 0.00333751 0.18532191 0. ] | 3 |
- | 0.5609 | 0.5099 | 0.2920 | 0.3599 | 0.8385 | [0. 0.70817583 0.84131144 0.66573523 0.81449696 0.38891117 nan 0.28124784 0.42659255 0. 0.80855146 0. 0. 0. 0. 0.46011866 0. 0. 0.65458792 0. 0.28411565 0.46758138 0. nan 0. 0.21849067 0. 0. 0.83829062 0.71207623 0.89929169 0. 0.02846127 0.13782635 0. ] | [0. 0.88632871 0.91269832 0.79044294 0.88368528 0.57405218 nan 0.35035973 0.77610775 0. 0.8889696 0. 0. 0. 0. 0.6020786 0. 0. 0.74586521 0. 0.61602403 0.54519561 0. nan 0. 0.28447396 0. 0. 0.94520232 0.85544414 0.95994042 0. 0.04680851 0.21407134 0. ] | 4 |
 
 
  ### Framework versions
 
 
  This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset.
  It achieves the following results on the evaluation set:
+ - Train Loss: 0.5256
+ - Validation Loss: 0.4741
+ - Validation Mean Iou: 0.3045
+ - Validation Mean Accuracy: 0.3598
+ - Validation Overall Accuracy: 0.8558
+ - Validation Per Category Iou: [0.00000000e+00 7.50159008e-01 8.53654462e-01 6.44928131e-01 7.90455244e-01 4.33599913e-01 nan 3.33472954e-01 4.74502513e-01 0.00000000e+00 8.01366017e-01 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00 4.67653814e-01 0.00000000e+00 0.00000000e+00 7.27412479e-01 0.00000000e+00 4.18946113e-01 5.04714837e-01 0.00000000e+00 nan 0.00000000e+00 2.00373855e-01 0.00000000e+00 0.00000000e+00 8.50200795e-01 7.41636173e-01 9.08320534e-01 2.77259907e-04 0.00000000e+00 1.45430716e-01 0.00000000e+00]
+ - Validation Per Category Accuracy: [0.00000000e+00 8.86487233e-01 9.05201886e-01 7.23139265e-01 8.91929263e-01 7.26675641e-01 nan 4.36386295e-01 6.64378543e-01 0.00000000e+00 8.89056843e-01 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00 5.65450644e-01 0.00000000e+00 0.00000000e+00 9.27446136e-01 0.00000000e+00 5.36031025e-01 5.84198054e-01 0.00000000e+00 nan 0.00000000e+00 2.42514534e-01 0.00000000e+00 0.00000000e+00 9.31954754e-01 8.26849708e-01 9.59880377e-01 2.79039335e-04 0.00000000e+00 1.77106051e-01 0.00000000e+00]
+ - Epoch: 5
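The `nan` entries in the per-category arrays mark categories absent from the evaluation split; the reported Validation Mean Iou is the mean of the per-category IoU values with those `nan` entries ignored. A minimal sketch of that check, assuming NumPy, with the values copied from the epoch-5 array above:

```python
import numpy as np

# Per-category IoU reported for epoch 5 (nan = category absent from eval split)
per_category_iou = np.array([
    0.00000000e+00, 7.50159008e-01, 8.53654462e-01, 6.44928131e-01,
    7.90455244e-01, 4.33599913e-01, np.nan, 3.33472954e-01,
    4.74502513e-01, 0.00000000e+00, 8.01366017e-01, 0.00000000e+00,
    0.00000000e+00, 0.00000000e+00, 0.00000000e+00, 4.67653814e-01,
    0.00000000e+00, 0.00000000e+00, 7.27412479e-01, 0.00000000e+00,
    4.18946113e-01, 5.04714837e-01, 0.00000000e+00, np.nan,
    0.00000000e+00, 2.00373855e-01, 0.00000000e+00, 0.00000000e+00,
    8.50200795e-01, 7.41636173e-01, 9.08320534e-01, 2.77259907e-04,
    0.00000000e+00, 1.45430716e-01, 0.00000000e+00,
])

# nanmean averages only the 33 non-nan categories
mean_iou = np.nanmean(per_category_iou)
print(round(float(mean_iou), 4))  # → 0.3045, the reported Validation Mean Iou
```

Note that absent categories count as `nan` and are excluded, while present-but-unpredicted categories count as `0.` and drag the mean down, which is why Mean Iou (0.3045) sits far below Overall Accuracy (0.8558).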
 
  ## Model description
 
 
  ### Training results
 
+ | Train Loss | Validation Loss | Validation Mean Iou | Validation Mean Accuracy | Validation Overall Accuracy | Validation Per Category Iou | Validation Per Category Accuracy | Epoch |
+ |:----------:|:---------------:|:-------------------:|:------------------------:|:---------------------------:|:---------------------------:|:--------------------------------:|:-----:|
+ | 1.4089 | 0.8220 | 0.1975 | 0.2427 | 0.7701 | [0. 0.58353931 0.7655921 0.04209491 0.53135026 0.11779776 nan 0.07709853 0.15950712 0. 0.69634813 0. 0. 0. 0. 0. 0. 0. 0.61456822 0. 0.24971248 0.27129675 0. nan 0. 0.07697324 0. 0. 0.78576516 0.61267064 0.84564576 0. 0. 0.08904216 0. ] | [0. 0.88026971 0.93475302 0.04216372 0.5484085 0.13285614 nan 0.08669707 0.19044773 0. 0.90089024 0. 0. 0. 0. 0. 0. 0. 0.76783975 0. 0.42102101 0.28659817 0. nan 0. 0.08671771 0. 0. 0.89590301 0.74932576 0.9434814 0. 0. 0.14245566 0. ] | 0 |
+ | 0.8462 | 0.6135 | 0.2551 | 0.2960 | 0.8200 | [0. 0.66967645 0.80571406 0.56416239 0.66692248 0.24744912 nan 0.23994505 0.28962463 0. 0.76504783 0. 0. 0. 0. 0.14111353 0. 0. 0.6924468 0. 0.27988701 0.41876094 0. nan 0. 0.14755829 0. 0. 0.81614463 0.68429711 0.87710938 0. 0. 0.11234171 0. ] | [0. 0.83805933 0.94928385 0.59586511 0.72913519 0.30595504 nan 0.3128234 0.34805831 0. 0.87847495 0. 0. 0. 0. 0.14205167 0. 0. 0.87543619 0. 0.36001144 0.49498574 0. nan 0. 0.18179115 0. 0. 0.92867923 0.7496178 0.92220166 0. 0. 0.15398549 0. ] | 1 |
+ | 0.7134 | 0.5660 | 0.2780 | 0.3320 | 0.8286 | [0. 0.64791461 0.83800512 0.67301044 0.68120631 0.27361472 nan 0.26715802 0.43596999 0. 0.78649287 0. 0. 0. 0. 0.41256964 0. 0. 0.71114766 0. 0.31646321 0.44682442 0. nan 0. 0.17132551 0. 0. 0.81845697 0.67536699 0.88940936 0. 0. 0.1304862 0. ] | [0. 0.85958877 0.92084269 0.82341633 0.74725972 0.33495972 nan 0.40755277 0.56591531 0. 0.90641721 0. 0. 0. 0. 0.48144408 0. 0. 0.88294811 0. 0.46962078 0.47517397 0. nan 0. 0.20631607 0. 0. 0.90956851 0.85856042 0.94107052 0. 0. 0.16669713 0. ] | 2 |
+ | 0.6320 | 0.5173 | 0.2894 | 0.3454 | 0.8435 | [0. 0.70789146 0.84902296 0.65266358 0.76099965 0.32934391 nan 0.29576422 0.43988204 0. 0.79276447 0. 0. 0. 0. 0.42668367 0. 0. 0.71717911 0. 0.32151249 0.50084444 0. nan 0. 0.18711455 0. 0. 0.82903803 0.68990498 0.8990059 0. 0.00213015 0.14819771 0. ] | [0. 0.84048763 0.93514369 0.68355212 0.88302113 0.458816 nan 0.38623272 0.69456442 0. 0.92379471 0. 0. 0. 0. 0.50677438 0. 0. 0.90362965 0. 0.4662386 0.57368294 0. nan 0. 0.23281768 0. 0. 0.9001526 0.86786434 0.95195314 0. 0.00333751 0.18532191 0. ] | 3 |
+ | 0.5609 | 0.5099 | 0.2920 | 0.3599 | 0.8385 | [0. 0.70817583 0.84131144 0.66573523 0.81449696 0.38891117 nan 0.28124784 0.42659255 0. 0.80855146 0. 0. 0. 0. 0.46011866 0. 0. 0.65458792 0. 0.28411565 0.46758138 0. nan 0. 0.21849067 0. 0. 0.83829062 0.71207623 0.89929169 0. 0.02846127 0.13782635 0. ] | [0. 0.88632871 0.91269832 0.79044294 0.88368528 0.57405218 nan 0.35035973 0.77610775 0. 0.8889696 0. 0. 0. 0. 0.6020786 0. 0. 0.74586521 0. 0.61602403 0.54519561 0. nan 0. 0.28447396 0. 0. 0.94520232 0.85544414 0.95994042 0. 0.04680851 0.21407134 0. ] | 4 |
+ | 0.5256 | 0.4741 | 0.3045 | 0.3598 | 0.8558 | [0.00000000e+00 7.50159008e-01 8.53654462e-01 6.44928131e-01 7.90455244e-01 4.33599913e-01 nan 3.33472954e-01 4.74502513e-01 0.00000000e+00 8.01366017e-01 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00 4.67653814e-01 0.00000000e+00 0.00000000e+00 7.27412479e-01 0.00000000e+00 4.18946113e-01 5.04714837e-01 0.00000000e+00 nan 0.00000000e+00 2.00373855e-01 0.00000000e+00 0.00000000e+00 8.50200795e-01 7.41636173e-01 9.08320534e-01 2.77259907e-04 0.00000000e+00 1.45430716e-01 0.00000000e+00] | [0.00000000e+00 8.86487233e-01 9.05201886e-01 7.23139265e-01 8.91929263e-01 7.26675641e-01 nan 4.36386295e-01 6.64378543e-01 0.00000000e+00 8.89056843e-01 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00 5.65450644e-01 0.00000000e+00 0.00000000e+00 9.27446136e-01 0.00000000e+00 5.36031025e-01 5.84198054e-01 0.00000000e+00 nan 0.00000000e+00 2.42514534e-01 0.00000000e+00 0.00000000e+00 9.31954754e-01 8.26849708e-01 9.59880377e-01 2.79039335e-04 0.00000000e+00 1.77106051e-01 0.00000000e+00] | 5 |
 
 
  ### Framework versions
tf_model.h5 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:e9d1e06bfc9072c99e20cb94871f766a50ddbba394aa4ba7de16eb778f725b35
+ oid sha256:290c60be67641026a83a4db3209f088433f519c85b0838182118a1e7448c614f
  size 15167588