HorcruxNo13 committed
Commit 0bd5730
Parent: efd7168

update model card README.md

Files changed (1):
  1. README.md +64 -56
README.md CHANGED
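
A note before the diff: Mean Iou in this card is the unweighted average of the two per-class IoUs, and the `unlabeled` class is never predicted (Iou Unlabeled: 0.0), so Mean Iou sits near half the foreground IoU in both the old and the new run. A minimal sketch of that arithmetic with the updated values:

```python
# Per-class IoUs reported for the updated run (values from the diff below).
iou_per_class = {"unlabeled": 0.0, "outline": 0.9859}

# Mean IoU averages classes without weighting, so a never-predicted class
# (IoU 0.0) pulls the mean down to about half of the other class's IoU.
mean_iou = sum(iou_per_class.values()) / len(iou_per_class)
print(mean_iou)  # 0.49295, which the card reports rounded as 0.4930
```

The same arithmetic explains the old run: (0.0 + 0.9903) / 2 = 0.49515 ≈ 0.4952.
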
@@ -14,14 +14,14 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.0236
-- Mean Iou: 0.4952
-- Mean Accuracy: 0.9903
-- Overall Accuracy: 0.9903
+- Loss: 0.0316
+- Mean Iou: 0.4930
+- Mean Accuracy: 0.9859
+- Overall Accuracy: 0.9859
 - Accuracy Unlabeled: nan
-- Accuracy Tool: 0.9903
+- Accuracy Outline: 0.9859
 - Iou Unlabeled: 0.0
-- Iou Tool: 0.9903
+- Iou Outline: 0.9859
 
 ## Model description
 
@@ -40,65 +40,73 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- learning_rate: 6e-05
-- train_batch_size: 2
-- eval_batch_size: 2
+- learning_rate: 0.0001
+- train_batch_size: 24
+- eval_batch_size: 24
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs: 50
+- num_epochs: 40
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Tool | Iou Unlabeled | Iou Tool |
-|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:-------------:|:-------------:|:--------:|
-| 0.1696 | 1.18 | 20 | 0.3490 | 0.4962 | 0.9924 | 0.9924 | nan | 0.9924 | 0.0 | 0.9924 |
-| 0.1045 | 2.35 | 40 | 0.0977 | 0.4878 | 0.9755 | 0.9755 | nan | 0.9755 | 0.0 | 0.9755 |
-| 0.0871 | 3.53 | 60 | 0.0650 | 0.4953 | 0.9905 | 0.9905 | nan | 0.9905 | 0.0 | 0.9905 |
-| 0.0542 | 4.71 | 80 | 0.0652 | 0.4956 | 0.9911 | 0.9911 | nan | 0.9911 | 0.0 | 0.9911 |
-| 0.0507 | 5.88 | 100 | 0.0573 | 0.4952 | 0.9905 | 0.9905 | nan | 0.9905 | 0.0 | 0.9905 |
-| 0.0702 | 7.06 | 120 | 0.0510 | 0.4942 | 0.9883 | 0.9883 | nan | 0.9883 | 0.0 | 0.9883 |
-| 0.0455 | 8.24 | 140 | 0.0487 | 0.4892 | 0.9784 | 0.9784 | nan | 0.9784 | 0.0 | 0.9784 |
-| 0.049 | 9.41 | 160 | 0.0430 | 0.4934 | 0.9867 | 0.9867 | nan | 0.9867 | 0.0 | 0.9867 |
-| 0.048 | 10.59 | 180 | 0.0409 | 0.4940 | 0.9881 | 0.9881 | nan | 0.9881 | 0.0 | 0.9881 |
-| 0.0476 | 11.76 | 200 | 0.0347 | 0.4965 | 0.9931 | 0.9931 | nan | 0.9931 | 0.0 | 0.9931 |
-| 0.048 | 12.94 | 220 | 0.0366 | 0.4972 | 0.9944 | 0.9944 | nan | 0.9944 | 0.0 | 0.9944 |
-| 0.0242 | 14.12 | 240 | 0.0341 | 0.4963 | 0.9926 | 0.9926 | nan | 0.9926 | 0.0 | 0.9926 |
-| 0.0274 | 15.29 | 260 | 0.0305 | 0.4966 | 0.9933 | 0.9933 | nan | 0.9933 | 0.0 | 0.9933 |
-| 0.0192 | 16.47 | 280 | 0.0318 | 0.4956 | 0.9913 | 0.9913 | nan | 0.9913 | 0.0 | 0.9913 |
-| 0.0388 | 17.65 | 300 | 0.0280 | 0.4966 | 0.9932 | 0.9932 | nan | 0.9932 | 0.0 | 0.9932 |
-| 0.0245 | 18.82 | 320 | 0.0280 | 0.4947 | 0.9894 | 0.9894 | nan | 0.9894 | 0.0 | 0.9894 |
-| 0.0268 | 20.0 | 340 | 0.0268 | 0.4949 | 0.9899 | 0.9899 | nan | 0.9899 | 0.0 | 0.9899 |
-| 0.0173 | 21.18 | 360 | 0.0278 | 0.4955 | 0.9910 | 0.9910 | nan | 0.9910 | 0.0 | 0.9910 |
-| 0.0275 | 22.35 | 380 | 0.0270 | 0.4957 | 0.9914 | 0.9914 | nan | 0.9914 | 0.0 | 0.9914 |
-| 0.0269 | 23.53 | 400 | 0.0271 | 0.4950 | 0.9899 | 0.9899 | nan | 0.9899 | 0.0 | 0.9899 |
-| 0.0371 | 24.71 | 420 | 0.0252 | 0.4938 | 0.9876 | 0.9876 | nan | 0.9876 | 0.0 | 0.9876 |
-| 0.0233 | 25.88 | 440 | 0.0264 | 0.4933 | 0.9867 | 0.9867 | nan | 0.9867 | 0.0 | 0.9867 |
-| 0.0181 | 27.06 | 460 | 0.0257 | 0.4959 | 0.9918 | 0.9918 | nan | 0.9918 | 0.0 | 0.9918 |
-| 0.0243 | 28.24 | 480 | 0.0255 | 0.4952 | 0.9904 | 0.9904 | nan | 0.9904 | 0.0 | 0.9904 |
-| 0.0144 | 29.41 | 500 | 0.0244 | 0.4956 | 0.9912 | 0.9912 | nan | 0.9912 | 0.0 | 0.9912 |
-| 0.0158 | 30.59 | 520 | 0.0251 | 0.4947 | 0.9894 | 0.9894 | nan | 0.9894 | 0.0 | 0.9894 |
-| 0.017 | 31.76 | 540 | 0.0247 | 0.4955 | 0.9911 | 0.9911 | nan | 0.9911 | 0.0 | 0.9911 |
-| 0.0179 | 32.94 | 560 | 0.0237 | 0.4965 | 0.9930 | 0.9930 | nan | 0.9930 | 0.0 | 0.9930 |
-| 0.0162 | 34.12 | 580 | 0.0238 | 0.4956 | 0.9911 | 0.9911 | nan | 0.9911 | 0.0 | 0.9911 |
-| 0.0191 | 35.29 | 600 | 0.0241 | 0.4950 | 0.9901 | 0.9901 | nan | 0.9901 | 0.0 | 0.9901 |
-| 0.0133 | 36.47 | 620 | 0.0241 | 0.4956 | 0.9911 | 0.9911 | nan | 0.9911 | 0.0 | 0.9911 |
-| 0.0118 | 37.65 | 640 | 0.0244 | 0.4948 | 0.9896 | 0.9896 | nan | 0.9896 | 0.0 | 0.9896 |
-| 0.0133 | 38.82 | 660 | 0.0228 | 0.4960 | 0.9921 | 0.9921 | nan | 0.9921 | 0.0 | 0.9921 |
-| 0.0197 | 40.0 | 680 | 0.0234 | 0.4957 | 0.9914 | 0.9914 | nan | 0.9914 | 0.0 | 0.9914 |
-| 0.0168 | 41.18 | 700 | 0.0232 | 0.4961 | 0.9922 | 0.9922 | nan | 0.9922 | 0.0 | 0.9922 |
-| 0.0119 | 42.35 | 720 | 0.0234 | 0.4957 | 0.9914 | 0.9914 | nan | 0.9914 | 0.0 | 0.9914 |
-| 0.0155 | 43.53 | 740 | 0.0243 | 0.4950 | 0.9900 | 0.9900 | nan | 0.9900 | 0.0 | 0.9900 |
-| 0.0126 | 44.71 | 760 | 0.0242 | 0.4949 | 0.9897 | 0.9897 | nan | 0.9897 | 0.0 | 0.9897 |
-| 0.0129 | 45.88 | 780 | 0.0242 | 0.4955 | 0.9910 | 0.9910 | nan | 0.9910 | 0.0 | 0.9910 |
-| 0.0116 | 47.06 | 800 | 0.0238 | 0.4953 | 0.9906 | 0.9906 | nan | 0.9906 | 0.0 | 0.9906 |
-| 0.0122 | 48.24 | 820 | 0.0239 | 0.4954 | 0.9908 | 0.9908 | nan | 0.9908 | 0.0 | 0.9908 |
-| 0.0164 | 49.41 | 840 | 0.0236 | 0.4952 | 0.9903 | 0.9903 | nan | 0.9903 | 0.0 | 0.9903 |
+| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Outline | Iou Unlabeled | Iou Outline |
+|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:-------------:|:-----------:|
+| 0.1277 | 0.8 | 20 | 0.1795 | 0.4907 | 0.9814 | 0.9814 | nan | 0.9814 | 0.0 | 0.9814 |
+| 0.1026 | 1.6 | 40 | 0.0920 | 0.4764 | 0.9529 | 0.9529 | nan | 0.9529 | 0.0 | 0.9529 |
+| 0.0934 | 2.4 | 60 | 0.0782 | 0.4859 | 0.9718 | 0.9718 | nan | 0.9718 | 0.0 | 0.9718 |
+| 0.0682 | 3.2 | 80 | 0.0656 | 0.4862 | 0.9724 | 0.9724 | nan | 0.9724 | 0.0 | 0.9724 |
+| 0.054 | 4.0 | 100 | 0.0584 | 0.4885 | 0.9769 | 0.9769 | nan | 0.9769 | 0.0 | 0.9769 |
+| 0.0529 | 4.8 | 120 | 0.0528 | 0.4894 | 0.9787 | 0.9787 | nan | 0.9787 | 0.0 | 0.9787 |
+| 0.0586 | 5.6 | 140 | 0.0498 | 0.4885 | 0.9771 | 0.9771 | nan | 0.9771 | 0.0 | 0.9771 |
+| 0.0538 | 6.4 | 160 | 0.0464 | 0.4878 | 0.9756 | 0.9756 | nan | 0.9756 | 0.0 | 0.9756 |
+| 0.0422 | 7.2 | 180 | 0.0443 | 0.4926 | 0.9851 | 0.9851 | nan | 0.9851 | 0.0 | 0.9851 |
+| 0.0517 | 8.0 | 200 | 0.0443 | 0.4914 | 0.9828 | 0.9828 | nan | 0.9828 | 0.0 | 0.9828 |
+| 0.0439 | 8.8 | 220 | 0.0409 | 0.4912 | 0.9824 | 0.9824 | nan | 0.9824 | 0.0 | 0.9824 |
+| 0.0357 | 9.6 | 240 | 0.0394 | 0.4899 | 0.9799 | 0.9799 | nan | 0.9799 | 0.0 | 0.9799 |
+| 0.0381 | 10.4 | 260 | 0.0393 | 0.4901 | 0.9801 | 0.9801 | nan | 0.9801 | 0.0 | 0.9801 |
+| 0.0362 | 11.2 | 280 | 0.0396 | 0.4931 | 0.9863 | 0.9863 | nan | 0.9863 | 0.0 | 0.9863 |
+| 0.0317 | 12.0 | 300 | 0.0373 | 0.4922 | 0.9844 | 0.9844 | nan | 0.9844 | 0.0 | 0.9844 |
+| 0.0342 | 12.8 | 320 | 0.0423 | 0.4950 | 0.9899 | 0.9899 | nan | 0.9899 | 0.0 | 0.9899 |
+| 0.0341 | 13.6 | 340 | 0.0374 | 0.4925 | 0.9849 | 0.9849 | nan | 0.9849 | 0.0 | 0.9849 |
+| 0.0347 | 14.4 | 360 | 0.0358 | 0.4921 | 0.9842 | 0.9842 | nan | 0.9842 | 0.0 | 0.9842 |
+| 0.0351 | 15.2 | 380 | 0.0358 | 0.4928 | 0.9855 | 0.9855 | nan | 0.9855 | 0.0 | 0.9855 |
+| 0.0589 | 16.0 | 400 | 0.0346 | 0.4908 | 0.9816 | 0.9816 | nan | 0.9816 | 0.0 | 0.9816 |
+| 0.0354 | 16.8 | 420 | 0.0353 | 0.4945 | 0.9891 | 0.9891 | nan | 0.9891 | 0.0 | 0.9891 |
+| 0.0349 | 17.6 | 440 | 0.0346 | 0.4899 | 0.9797 | 0.9797 | nan | 0.9797 | 0.0 | 0.9797 |
+| 0.0357 | 18.4 | 460 | 0.0340 | 0.4927 | 0.9855 | 0.9855 | nan | 0.9855 | 0.0 | 0.9855 |
+| 0.032 | 19.2 | 480 | 0.0348 | 0.4904 | 0.9808 | 0.9808 | nan | 0.9808 | 0.0 | 0.9808 |
+| 0.0365 | 20.0 | 500 | 0.0337 | 0.4924 | 0.9849 | 0.9849 | nan | 0.9849 | 0.0 | 0.9849 |
+| 0.0361 | 20.8 | 520 | 0.0334 | 0.4932 | 0.9863 | 0.9863 | nan | 0.9863 | 0.0 | 0.9863 |
+| 0.0411 | 21.6 | 540 | 0.0324 | 0.4921 | 0.9843 | 0.9843 | nan | 0.9843 | 0.0 | 0.9843 |
+| 0.0335 | 22.4 | 560 | 0.0329 | 0.4932 | 0.9864 | 0.9864 | nan | 0.9864 | 0.0 | 0.9864 |
+| 0.0285 | 23.2 | 580 | 0.0327 | 0.4924 | 0.9847 | 0.9847 | nan | 0.9847 | 0.0 | 0.9847 |
+| 0.0339 | 24.0 | 600 | 0.0328 | 0.4913 | 0.9827 | 0.9827 | nan | 0.9827 | 0.0 | 0.9827 |
+| 0.034 | 24.8 | 620 | 0.0323 | 0.4934 | 0.9869 | 0.9869 | nan | 0.9869 | 0.0 | 0.9869 |
+| 0.0314 | 25.6 | 640 | 0.0336 | 0.4940 | 0.9880 | 0.9880 | nan | 0.9880 | 0.0 | 0.9880 |
+| 0.029 | 26.4 | 660 | 0.0324 | 0.4926 | 0.9853 | 0.9853 | nan | 0.9853 | 0.0 | 0.9853 |
+| 0.0371 | 27.2 | 680 | 0.0324 | 0.4917 | 0.9833 | 0.9833 | nan | 0.9833 | 0.0 | 0.9833 |
+| 0.0288 | 28.0 | 700 | 0.0322 | 0.4931 | 0.9862 | 0.9862 | nan | 0.9862 | 0.0 | 0.9862 |
+| 0.0297 | 28.8 | 720 | 0.0320 | 0.4925 | 0.9849 | 0.9849 | nan | 0.9849 | 0.0 | 0.9849 |
+| 0.0256 | 29.6 | 740 | 0.0321 | 0.4923 | 0.9846 | 0.9846 | nan | 0.9846 | 0.0 | 0.9846 |
+| 0.033 | 30.4 | 760 | 0.0317 | 0.4926 | 0.9852 | 0.9852 | nan | 0.9852 | 0.0 | 0.9852 |
+| 0.0251 | 31.2 | 780 | 0.0328 | 0.4943 | 0.9887 | 0.9887 | nan | 0.9887 | 0.0 | 0.9887 |
+| 0.0286 | 32.0 | 800 | 0.0322 | 0.4938 | 0.9876 | 0.9876 | nan | 0.9876 | 0.0 | 0.9876 |
+| 0.0273 | 32.8 | 820 | 0.0318 | 0.4930 | 0.9859 | 0.9859 | nan | 0.9859 | 0.0 | 0.9859 |
+| 0.0289 | 33.6 | 840 | 0.0325 | 0.4937 | 0.9873 | 0.9873 | nan | 0.9873 | 0.0 | 0.9873 |
+| 0.0279 | 34.4 | 860 | 0.0325 | 0.4937 | 0.9874 | 0.9874 | nan | 0.9874 | 0.0 | 0.9874 |
+| 0.0284 | 35.2 | 880 | 0.0325 | 0.4940 | 0.9879 | 0.9879 | nan | 0.9879 | 0.0 | 0.9879 |
+| 0.0229 | 36.0 | 900 | 0.0317 | 0.4931 | 0.9861 | 0.9861 | nan | 0.9861 | 0.0 | 0.9861 |
+| 0.0256 | 36.8 | 920 | 0.0316 | 0.4927 | 0.9854 | 0.9854 | nan | 0.9854 | 0.0 | 0.9854 |
+| 0.0278 | 37.6 | 940 | 0.0319 | 0.4933 | 0.9867 | 0.9867 | nan | 0.9867 | 0.0 | 0.9867 |
+| 0.0301 | 38.4 | 960 | 0.0318 | 0.4932 | 0.9865 | 0.9865 | nan | 0.9865 | 0.0 | 0.9865 |
+| 0.0233 | 39.2 | 980 | 0.0319 | 0.4934 | 0.9868 | 0.9868 | nan | 0.9868 | 0.0 | 0.9868 |
+| 0.0256 | 40.0 | 1000 | 0.0316 | 0.4930 | 0.9859 | 0.9859 | nan | 0.9859 | 0.0 | 0.9859 |
 
 
 ### Framework versions
 
 - Transformers 4.28.0
-- Pytorch 2.1.0+cu121
-- Datasets 2.16.0
+- Pytorch 2.2.1+cu121
+- Datasets 2.18.0
 - Tokenizers 0.13.3
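
The updated hyperparameter list maps directly onto `transformers.TrainingArguments`. Below is a minimal sketch of that configuration, assuming single-device training (so the card's batch size of 24 becomes a per-device value); `output_dir` is a hypothetical name, and the author's actual training script may differ:

```python
from transformers import TrainingArguments

# Reconstruction of the reported hyperparameters for the updated run.
# Assumptions: single device, so train/eval batch size 24 maps onto the
# per_device_* arguments; output_dir is a placeholder, not the repo name.
args = TrainingArguments(
    output_dir="mit-b0-finetuned",  # hypothetical
    learning_rate=1e-4,             # learning_rate: 0.0001
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,                 # optimizer: Adam with betas=(0.9,0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,              # and epsilon=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=40,
)
```

With 24-image batches, the results table's epoch column advancing by 0.8 per 20-step eval interval implies 25 optimizer steps per epoch, i.e. roughly 600 training images.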