yijisuk committed on
Commit 69cedf9
1 Parent(s): 195b0a6

End of training
README.md CHANGED
@@ -17,14 +17,15 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [nvidia/mit-b1](https://huggingface.co/nvidia/mit-b1) on the yijisuk/ic-chip-sample dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.1915
- - Mean Iou: 0.4765
- - Mean Accuracy: 0.9531
- - Overall Accuracy: 0.9531
 - Accuracy Unlabeled: nan
- - Accuracy Circuit: 0.9531
 - Iou Unlabeled: 0.0
- - Iou Circuit: 0.9531
 
 ## Model description
 
@@ -53,58 +54,48 @@ The following hyperparameters were used during training:
 
 ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Circuit | Iou Unlabeled | Iou Circuit |
- |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:-------------:|:-----------:|
- | 0.7961 | 1.0 | 20 | 0.5776 | 0.3160 | 0.6320 | 0.6320 | nan | 0.6320 | 0.0 | 0.6320 |
- | 0.7261 | 2.0 | 40 | 0.4222 | 0.4655 | 0.9310 | 0.9310 | nan | 0.9310 | 0.0 | 0.9310 |
- | 0.3132 | 3.0 | 60 | 0.2869 | 0.4478 | 0.8956 | 0.8956 | nan | 0.8956 | 0.0 | 0.8956 |
- | 0.2224 | 4.0 | 80 | 0.2898 | 0.4817 | 0.9635 | 0.9635 | nan | 0.9635 | 0.0 | 0.9635 |
- | 0.1641 | 5.0 | 100 | 0.2861 | 0.4733 | 0.9466 | 0.9466 | nan | 0.9466 | 0.0 | 0.9466 |
- | 0.9802 | 6.0 | 120 | 0.3005 | 0.4790 | 0.9581 | 0.9581 | nan | 0.9581 | 0.0 | 0.9581 |
- | 0.1633 | 7.0 | 140 | 0.2953 | 0.4397 | 0.8794 | 0.8794 | nan | 0.8794 | 0.0 | 0.8794 |
- | 0.3674 | 8.0 | 160 | 0.2951 | 0.4809 | 0.9619 | 0.9619 | nan | 0.9619 | 0.0 | 0.9619 |
- | 0.1632 | 9.0 | 180 | 0.3007 | 0.4740 | 0.9480 | 0.9480 | nan | 0.9480 | 0.0 | 0.9480 |
- | 0.3719 | 10.0 | 200 | 0.2633 | 0.4687 | 0.9374 | 0.9374 | nan | 0.9374 | 0.0 | 0.9374 |
- | 0.2061 | 11.0 | 220 | 0.2544 | 0.4575 | 0.9150 | 0.9150 | nan | 0.9150 | 0.0 | 0.9150 |
- | 0.1756 | 12.0 | 240 | 0.2587 | 0.4856 | 0.9711 | 0.9711 | nan | 0.9711 | 0.0 | 0.9711 |
- | 0.366 | 13.0 | 260 | 0.2458 | 0.4883 | 0.9765 | 0.9765 | nan | 0.9765 | 0.0 | 0.9765 |
- | 0.2532 | 14.0 | 280 | 0.2742 | 0.4771 | 0.9543 | 0.9543 | nan | 0.9543 | 0.0 | 0.9543 |
- | 0.144 | 15.0 | 300 | 0.2424 | 0.4612 | 0.9223 | 0.9223 | nan | 0.9223 | 0.0 | 0.9223 |
- | 0.1314 | 16.0 | 320 | 0.2130 | 0.4745 | 0.9489 | 0.9489 | nan | 0.9489 | 0.0 | 0.9489 |
- | 1.4391 | 17.0 | 340 | 0.2156 | 0.4813 | 0.9626 | 0.9626 | nan | 0.9626 | 0.0 | 0.9626 |
- | 0.211 | 18.0 | 360 | 0.1995 | 0.4767 | 0.9533 | 0.9533 | nan | 0.9533 | 0.0 | 0.9533 |
- | 0.0792 | 19.0 | 380 | 0.2052 | 0.4855 | 0.9710 | 0.9710 | nan | 0.9710 | 0.0 | 0.9710 |
- | 1.1 | 20.0 | 400 | 0.1972 | 0.4712 | 0.9424 | 0.9424 | nan | 0.9424 | 0.0 | 0.9424 |
- | 0.067 | 21.0 | 420 | 0.2015 | 0.4697 | 0.9394 | 0.9394 | nan | 0.9394 | 0.0 | 0.9394 |
- | 0.1783 | 22.0 | 440 | 0.2100 | 0.4821 | 0.9642 | 0.9642 | nan | 0.9642 | 0.0 | 0.9642 |
- | 0.1594 | 23.0 | 460 | 0.1989 | 0.4746 | 0.9491 | 0.9491 | nan | 0.9491 | 0.0 | 0.9491 |
- | 0.2306 | 24.0 | 480 | 0.1957 | 0.4668 | 0.9337 | 0.9337 | nan | 0.9337 | 0.0 | 0.9337 |
- | 0.9809 | 25.0 | 500 | 0.1971 | 0.4802 | 0.9603 | 0.9603 | nan | 0.9603 | 0.0 | 0.9603 |
- | 0.1154 | 26.0 | 520 | 0.1957 | 0.4792 | 0.9585 | 0.9585 | nan | 0.9585 | 0.0 | 0.9585 |
- | 0.2142 | 27.0 | 540 | 0.1945 | 0.4827 | 0.9655 | 0.9655 | nan | 0.9655 | 0.0 | 0.9655 |
- | 0.177 | 28.0 | 560 | 0.1930 | 0.4725 | 0.9451 | 0.9451 | nan | 0.9451 | 0.0 | 0.9451 |
- | 0.2003 | 29.0 | 580 | 0.1965 | 0.4827 | 0.9654 | 0.9654 | nan | 0.9654 | 0.0 | 0.9654 |
- | 0.1977 | 30.0 | 600 | 0.1995 | 0.4861 | 0.9722 | 0.9722 | nan | 0.9722 | 0.0 | 0.9722 |
- | 0.1671 | 31.0 | 620 | 0.1946 | 0.4760 | 0.9520 | 0.9520 | nan | 0.9520 | 0.0 | 0.9520 |
- | 0.1449 | 32.0 | 640 | 0.1895 | 0.4642 | 0.9285 | 0.9285 | nan | 0.9285 | 0.0 | 0.9285 |
- | 0.2587 | 33.0 | 660 | 0.1920 | 0.4810 | 0.9619 | 0.9619 | nan | 0.9619 | 0.0 | 0.9619 |
- | 1.2053 | 34.0 | 680 | 0.1931 | 0.4790 | 0.9579 | 0.9579 | nan | 0.9579 | 0.0 | 0.9579 |
- | 0.1107 | 35.0 | 700 | 0.1951 | 0.4824 | 0.9647 | 0.9647 | nan | 0.9647 | 0.0 | 0.9647 |
- | 0.0821 | 36.0 | 720 | 0.1926 | 0.4788 | 0.9577 | 0.9577 | nan | 0.9577 | 0.0 | 0.9577 |
- | 0.5034 | 37.0 | 740 | 0.1903 | 0.4656 | 0.9311 | 0.9311 | nan | 0.9311 | 0.0 | 0.9311 |
- | 0.137 | 38.0 | 760 | 0.1892 | 0.4684 | 0.9368 | 0.9368 | nan | 0.9368 | 0.0 | 0.9368 |
- | 0.2861 | 39.0 | 780 | 0.1911 | 0.4762 | 0.9524 | 0.9524 | nan | 0.9524 | 0.0 | 0.9524 |
- | 0.965 | 40.0 | 800 | 0.1928 | 0.4716 | 0.9432 | 0.9432 | nan | 0.9432 | 0.0 | 0.9432 |
- | 0.138 | 41.0 | 820 | 0.1926 | 0.4742 | 0.9483 | 0.9483 | nan | 0.9483 | 0.0 | 0.9483 |
- | 0.0291 | 42.0 | 840 | 0.1888 | 0.4689 | 0.9378 | 0.9378 | nan | 0.9378 | 0.0 | 0.9378 |
- | 0.0624 | 43.0 | 860 | 0.1895 | 0.4684 | 0.9369 | 0.9369 | nan | 0.9369 | 0.0 | 0.9369 |
- | 0.0611 | 44.0 | 880 | 0.1915 | 0.4772 | 0.9545 | 0.9545 | nan | 0.9545 | 0.0 | 0.9545 |
- | 0.0322 | 45.0 | 900 | 0.1893 | 0.4670 | 0.9340 | 0.9340 | nan | 0.9340 | 0.0 | 0.9340 |
- | 0.0927 | 46.0 | 920 | 0.1901 | 0.4714 | 0.9428 | 0.9428 | nan | 0.9428 | 0.0 | 0.9428 |
- | 0.1752 | 47.0 | 940 | 0.1897 | 0.4758 | 0.9516 | 0.9516 | nan | 0.9516 | 0.0 | 0.9516 |
- | 0.1343 | 48.0 | 960 | 0.1906 | 0.4779 | 0.9559 | 0.9559 | nan | 0.9559 | 0.0 | 0.9559 |
- | 0.0765 | 49.0 | 980 | 0.1903 | 0.4732 | 0.9464 | 0.9464 | nan | 0.9464 | 0.0 | 0.9464 |
- | 0.048 | 50.0 | 1000 | 0.1915 | 0.4765 | 0.9531 | 0.9531 | nan | 0.9531 | 0.0 | 0.9531 |
 
 
 ### Framework versions
 
 
 This model is a fine-tuned version of [nvidia/mit-b1](https://huggingface.co/nvidia/mit-b1) on the yijisuk/ic-chip-sample dataset.
 It achieves the following results on the evaluation set:
+ - Loss: 0.2325
+ - Mean Iou: 0.4242
+ - Mean Accuracy: 0.8484
+ - Overall Accuracy: 0.8484
 - Accuracy Unlabeled: nan
+ - Accuracy Circuit: 0.8484
 - Iou Unlabeled: 0.0
+ - Iou Circuit: 0.8484
+ - Dice Coefficient: 0.8124
 
 ## Model description
 
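The per-class IoU and Dice values reported above can be reproduced from prediction and ground-truth masks with a few lines of NumPy. This is a minimal sketch, not code from this repository; the `pred` and `target` arrays are hypothetical examples.

```python
import numpy as np

def iou_and_dice(pred: np.ndarray, target: np.ndarray) -> tuple[float, float]:
    """Return (IoU, Dice) for two binary masks of the same shape."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    total = pred.sum() + target.sum()
    iou = inter / union if union else float("nan")
    dice = 2 * inter / total if total else float("nan")
    return float(iou), float(dice)

# Hypothetical 2x3 masks: intersection = 2 px, union = 4 px
pred = np.array([[1, 1, 0], [0, 1, 0]])
target = np.array([[1, 0, 0], [0, 1, 1]])
iou, dice = iou_and_dice(pred, target)
# iou = 2/4 = 0.5, dice = 2*2/(3+3) ≈ 0.6667
```

Note that with the "unlabeled" class pinned at IoU 0.0, the reported Mean Iou works out to half of Iou Circuit (0.8484 / 2 ≈ 0.4242), which matches the tables.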
 
 
 ### Training results
 
+ | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Circuit | Iou Unlabeled | Iou Circuit | Dice Coefficient |
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:-------------:|:-----------:|:----------------:|
+ | 0.6204 | 1.25 | 100 | 0.4305 | 0.2780 | 0.5561 | 0.5561 | nan | 0.5561 | 0.0 | 0.5561 | 0.4505 |
+ | 0.5362 | 2.5 | 200 | 0.3695 | 0.3415 | 0.6830 | 0.6830 | nan | 0.6830 | 0.0 | 0.6830 | 0.6151 |
+ | 0.5427 | 3.75 | 300 | 0.3586 | 0.2156 | 0.4312 | 0.4312 | nan | 0.4312 | 0.0 | 0.4312 | 0.3380 |
+ | 0.5232 | 5.0 | 400 | 0.3678 | 0.3097 | 0.6193 | 0.6193 | nan | 0.6193 | 0.0 | 0.6193 | 0.6195 |
+ | 0.5173 | 6.25 | 500 | 0.3283 | 0.3075 | 0.6151 | 0.6151 | nan | 0.6151 | 0.0 | 0.6151 | 0.5814 |
+ | 0.4834 | 7.5 | 600 | 0.2894 | 0.2849 | 0.5698 | 0.5698 | nan | 0.5698 | 0.0 | 0.5698 | 0.5510 |
+ | 0.4761 | 8.75 | 700 | 0.2636 | 0.4560 | 0.9120 | 0.9120 | nan | 0.9120 | 0.0 | 0.9120 | 0.8203 |
+ | 0.4556 | 10.0 | 800 | 0.2597 | 0.4140 | 0.8279 | 0.8279 | nan | 0.8279 | 0.0 | 0.8279 | 0.7991 |
+ | 0.46 | 11.25 | 900 | 0.2564 | 0.3546 | 0.7093 | 0.7093 | nan | 0.7093 | 0.0 | 0.7093 | 0.7022 |
+ | 0.4607 | 12.5 | 1000 | 0.2497 | 0.3432 | 0.6865 | 0.6865 | nan | 0.6865 | 0.0 | 0.6865 | 0.6894 |
+ | 0.437 | 13.75 | 1100 | 0.2284 | 0.4269 | 0.8539 | 0.8539 | nan | 0.8539 | 0.0 | 0.8539 | 0.8058 |
+ | 0.4405 | 15.0 | 1200 | 0.2368 | 0.3750 | 0.7500 | 0.7500 | nan | 0.7500 | 0.0 | 0.7500 | 0.7423 |
+ | 0.4203 | 16.25 | 1300 | 0.2590 | 0.3002 | 0.6004 | 0.6004 | nan | 0.6004 | 0.0 | 0.6004 | 0.6110 |
+ | 0.4292 | 17.5 | 1400 | 0.2272 | 0.4050 | 0.8100 | 0.8100 | nan | 0.8100 | 0.0 | 0.8100 | 0.7895 |
+ | 0.4222 | 18.75 | 1500 | 0.2373 | 0.3702 | 0.7403 | 0.7403 | nan | 0.7403 | 0.0 | 0.7403 | 0.7421 |
+ | 0.396 | 20.0 | 1600 | 0.2266 | 0.4013 | 0.8025 | 0.8025 | nan | 0.8025 | 0.0 | 0.8025 | 0.7815 |
+ | 0.406 | 21.25 | 1700 | 0.2344 | 0.4000 | 0.8000 | 0.8000 | nan | 0.8000 | 0.0 | 0.8000 | 0.7823 |
+ | 0.3998 | 22.5 | 1800 | 0.2204 | 0.4185 | 0.8369 | 0.8369 | nan | 0.8369 | 0.0 | 0.8369 | 0.8059 |
+ | 0.3915 | 23.75 | 1900 | 0.2418 | 0.3367 | 0.6734 | 0.6734 | nan | 0.6734 | 0.0 | 0.6734 | 0.6917 |
+ | 0.3836 | 25.0 | 2000 | 0.2231 | 0.3937 | 0.7874 | 0.7874 | nan | 0.7874 | 0.0 | 0.7874 | 0.7747 |
+ | 0.3824 | 26.25 | 2100 | 0.2249 | 0.4043 | 0.8086 | 0.8086 | nan | 0.8086 | 0.0 | 0.8086 | 0.7848 |
+ | 0.3869 | 27.5 | 2200 | 0.2233 | 0.3705 | 0.7411 | 0.7411 | nan | 0.7411 | 0.0 | 0.7411 | 0.7408 |
+ | 0.3707 | 28.75 | 2300 | 0.2259 | 0.4543 | 0.9086 | 0.9086 | nan | 0.9086 | 0.0 | 0.9086 | 0.8453 |
+ | 0.3671 | 30.0 | 2400 | 0.2335 | 0.4435 | 0.8870 | 0.8870 | nan | 0.8870 | 0.0 | 0.8870 | 0.8331 |
+ | 0.3758 | 31.25 | 2500 | 0.2324 | 0.4316 | 0.8631 | 0.8631 | nan | 0.8631 | 0.0 | 0.8631 | 0.8205 |
+ | 0.3768 | 32.5 | 2600 | 0.2324 | 0.3643 | 0.7286 | 0.7286 | nan | 0.7286 | 0.0 | 0.7286 | 0.7372 |
+ | 0.3657 | 33.75 | 2700 | 0.2357 | 0.3689 | 0.7378 | 0.7378 | nan | 0.7378 | 0.0 | 0.7378 | 0.7381 |
+ | 0.3558 | 35.0 | 2800 | 0.2264 | 0.3836 | 0.7673 | 0.7673 | nan | 0.7673 | 0.0 | 0.7673 | 0.7593 |
+ | 0.3586 | 36.25 | 2900 | 0.2265 | 0.4049 | 0.8098 | 0.8098 | nan | 0.8098 | 0.0 | 0.8098 | 0.7887 |
+ | 0.3435 | 37.5 | 3000 | 0.2269 | 0.4124 | 0.8248 | 0.8248 | nan | 0.8248 | 0.0 | 0.8248 | 0.7985 |
+ | 0.3659 | 38.75 | 3100 | 0.2282 | 0.3803 | 0.7606 | 0.7606 | nan | 0.7606 | 0.0 | 0.7606 | 0.7571 |
+ | 0.3482 | 40.0 | 3200 | 0.2233 | 0.4160 | 0.8320 | 0.8320 | nan | 0.8320 | 0.0 | 0.8320 | 0.8040 |
+ | 0.3452 | 41.25 | 3300 | 0.2208 | 0.4222 | 0.8445 | 0.8445 | nan | 0.8445 | 0.0 | 0.8445 | 0.8151 |
+ | 0.3582 | 42.5 | 3400 | 0.2332 | 0.4016 | 0.8032 | 0.8032 | nan | 0.8032 | 0.0 | 0.8032 | 0.7845 |
+ | 0.3254 | 43.75 | 3500 | 0.2171 | 0.4157 | 0.8314 | 0.8314 | nan | 0.8314 | 0.0 | 0.8314 | 0.8053 |
+ | 0.3485 | 45.0 | 3600 | 0.2422 | 0.4345 | 0.8690 | 0.8690 | nan | 0.8690 | 0.0 | 0.8690 | 0.8163 |
+ | 0.3401 | 46.25 | 3700 | 0.2263 | 0.4178 | 0.8356 | 0.8356 | nan | 0.8356 | 0.0 | 0.8356 | 0.8032 |
+ | 0.3276 | 47.5 | 3800 | 0.2226 | 0.4347 | 0.8694 | 0.8694 | nan | 0.8694 | 0.0 | 0.8694 | 0.8254 |
+ | 0.3637 | 48.75 | 3900 | 0.2284 | 0.4164 | 0.8329 | 0.8329 | nan | 0.8329 | 0.0 | 0.8329 | 0.8036 |
+ | 0.3181 | 50.0 | 4000 | 0.2325 | 0.4242 | 0.8484 | 0.8484 | nan | 0.8484 | 0.0 | 0.8484 | 0.8124 |
 
 
 ### Framework versions
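SegFormer-family models such as mit-b1 emit logits at 1/4 of the input resolution, so evaluating against full-size masks typically requires upsampling first. A minimal sketch of that post-processing step, with purely illustrative shapes (a 2-class head and a 512x512 input, not values taken from this repository):

```python
import torch
import torch.nn.functional as F

# Illustrative logits: (batch, num_classes, H/4, W/4)
logits = torch.randn(1, 2, 128, 128)

# Bilinearly upsample to the original image size before argmax
upsampled = F.interpolate(
    logits, size=(512, 512), mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)  # (batch, 512, 512) integer label map
```

The same argmax-after-upsampling output is what per-class accuracy and IoU would be computed against.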
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:b0000c9158f01fd43a3442ec71a5c9e8b91bedfbd3f904b064d769544158be8a
 size 54737376
 
 version https://git-lfs.github.com/spec/v1
+ oid sha256:d9016abe384a16713528f0645f9be02699d1ffb8d52e56d69591e7d13f4c0338
 size 54737376
runs/Jun24_21-20-34_Centauri/events.out.tfevents.1719231644.Centauri.34720.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:47825684efff18f1795a980fd83a7013a0e67dd8acbdff0ad43bd1da508cf7c7
+ size 40855
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:d7b7038b7b516e3ac795730df2bf7de1cd62210ed1e3b411751a572c49169e28
 size 4271
 
 version https://git-lfs.github.com/spec/v1
+ oid sha256:276c0a026ad750f95df60b3b61b685754307c26342ca6a184015f2e8827ab7d6
 size 4271