HorcruxNo13 committed
Commit f24e927 (1 parent: 5727563)

Model save
README.md CHANGED
@@ -15,14 +15,14 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.0522
-- Mean Iou: 0.3485
-- Mean Accuracy: 0.6969
-- Overall Accuracy: 0.6969
+- Loss: 0.0491
+- Mean Iou: 0.3531
+- Mean Accuracy: 0.7062
+- Overall Accuracy: 0.7062
 - Accuracy Unlabeled: nan
-- Accuracy Mass: 0.6969
+- Accuracy Mass: 0.7062
 - Iou Unlabeled: 0.0
-- Iou Mass: 0.6969
+- Iou Mass: 0.7062
 
 ## Model description
 
@@ -53,42 +53,42 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Mass | Iou Unlabeled | Iou Mass |
 |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:-------------:|:-------------:|:--------:|
-| 0.4117 | 1.25 | 20 | 0.4147 | 0.0579 | 0.1158 | 0.1158 | nan | 0.1158 | 0.0 | 0.1158 |
-| 0.2756 | 2.5 | 40 | 0.2454 | 0.1520 | 0.3040 | 0.3040 | nan | 0.3040 | 0.0 | 0.3040 |
-| 0.2029 | 3.75 | 60 | 0.1873 | 0.3150 | 0.6301 | 0.6301 | nan | 0.6301 | 0.0 | 0.6301 |
-| 0.1506 | 5.0 | 80 | 0.1403 | 0.3616 | 0.7232 | 0.7232 | nan | 0.7232 | 0.0 | 0.7232 |
-| 0.1177 | 6.25 | 100 | 0.1077 | 0.1634 | 0.3269 | 0.3269 | nan | 0.3269 | 0.0 | 0.3269 |
-| 0.088 | 7.5 | 120 | 0.0972 | 0.2268 | 0.4536 | 0.4536 | nan | 0.4536 | 0.0 | 0.4536 |
-| 0.0796 | 8.75 | 140 | 0.0895 | 0.3776 | 0.7551 | 0.7551 | nan | 0.7551 | 0.0 | 0.7551 |
-| 0.0702 | 10.0 | 160 | 0.0754 | 0.3785 | 0.7570 | 0.7570 | nan | 0.7570 | 0.0 | 0.7570 |
-| 0.0643 | 11.25 | 180 | 0.0654 | 0.3207 | 0.6414 | 0.6414 | nan | 0.6414 | 0.0 | 0.6414 |
-| 0.0566 | 12.5 | 200 | 0.0635 | 0.3408 | 0.6815 | 0.6815 | nan | 0.6815 | 0.0 | 0.6815 |
-| 0.0467 | 13.75 | 220 | 0.0684 | 0.3971 | 0.7942 | 0.7942 | nan | 0.7942 | 0.0 | 0.7942 |
-| 0.0481 | 15.0 | 240 | 0.0599 | 0.3713 | 0.7425 | 0.7425 | nan | 0.7425 | 0.0 | 0.7425 |
-| 0.0465 | 16.25 | 260 | 0.0603 | 0.3121 | 0.6241 | 0.6241 | nan | 0.6241 | 0.0 | 0.6241 |
-| 0.0409 | 17.5 | 280 | 0.0569 | 0.3441 | 0.6882 | 0.6882 | nan | 0.6882 | 0.0 | 0.6882 |
-| 0.0392 | 18.75 | 300 | 0.0565 | 0.3568 | 0.7135 | 0.7135 | nan | 0.7135 | 0.0 | 0.7135 |
-| 0.0287 | 20.0 | 320 | 0.0571 | 0.3237 | 0.6474 | 0.6474 | nan | 0.6474 | 0.0 | 0.6474 |
-| 0.032 | 21.25 | 340 | 0.0574 | 0.3209 | 0.6419 | 0.6419 | nan | 0.6419 | 0.0 | 0.6419 |
-| 0.0308 | 22.5 | 360 | 0.0551 | 0.3371 | 0.6742 | 0.6742 | nan | 0.6742 | 0.0 | 0.6742 |
-| 0.0274 | 23.75 | 380 | 0.0546 | 0.3561 | 0.7122 | 0.7122 | nan | 0.7122 | 0.0 | 0.7122 |
-| 0.0246 | 25.0 | 400 | 0.0534 | 0.3491 | 0.6981 | 0.6981 | nan | 0.6981 | 0.0 | 0.6981 |
-| 0.0252 | 26.25 | 420 | 0.0533 | 0.3661 | 0.7322 | 0.7322 | nan | 0.7322 | 0.0 | 0.7322 |
-| 0.0251 | 27.5 | 440 | 0.0542 | 0.3507 | 0.7014 | 0.7014 | nan | 0.7014 | 0.0 | 0.7014 |
-| 0.027 | 28.75 | 460 | 0.0527 | 0.3531 | 0.7062 | 0.7062 | nan | 0.7062 | 0.0 | 0.7062 |
-| 0.0259 | 30.0 | 480 | 0.0539 | 0.3757 | 0.7514 | 0.7514 | nan | 0.7514 | 0.0 | 0.7514 |
-| 0.0212 | 31.25 | 500 | 0.0537 | 0.3283 | 0.6565 | 0.6565 | nan | 0.6565 | 0.0 | 0.6565 |
-| 0.0223 | 32.5 | 520 | 0.0517 | 0.3511 | 0.7022 | 0.7022 | nan | 0.7022 | 0.0 | 0.7022 |
-| 0.027 | 33.75 | 540 | 0.0504 | 0.3552 | 0.7103 | 0.7103 | nan | 0.7103 | 0.0 | 0.7103 |
-| 0.026 | 35.0 | 560 | 0.0516 | 0.3596 | 0.7192 | 0.7192 | nan | 0.7192 | 0.0 | 0.7192 |
-| 0.0239 | 36.25 | 580 | 0.0525 | 0.3559 | 0.7119 | 0.7119 | nan | 0.7119 | 0.0 | 0.7119 |
-| 0.0218 | 37.5 | 600 | 0.0532 | 0.3374 | 0.6748 | 0.6748 | nan | 0.6748 | 0.0 | 0.6748 |
-| 0.0214 | 38.75 | 620 | 0.0513 | 0.3591 | 0.7183 | 0.7183 | nan | 0.7183 | 0.0 | 0.7183 |
-| 0.0187 | 40.0 | 640 | 0.0517 | 0.3660 | 0.7320 | 0.7320 | nan | 0.7320 | 0.0 | 0.7320 |
-| 0.0201 | 41.25 | 660 | 0.0521 | 0.3647 | 0.7295 | 0.7295 | nan | 0.7295 | 0.0 | 0.7295 |
-| 0.024 | 42.5 | 680 | 0.0520 | 0.3485 | 0.6970 | 0.6970 | nan | 0.6970 | 0.0 | 0.6970 |
-| 0.0198 | 43.75 | 700 | 0.0516 | 0.3623 | 0.7247 | 0.7247 | nan | 0.7247 | 0.0 | 0.7247 |
-| 0.0236 | 45.0 | 720 | 0.0522 | 0.3485 | 0.6969 | 0.6969 | nan | 0.6969 | 0.0 | 0.6969 |
+| 0.3512 | 1.25 | 20 | 0.3893 | 0.0773 | 0.1545 | 0.1545 | nan | 0.1545 | 0.0 | 0.1545 |
+| 0.2286 | 2.5 | 40 | 0.2047 | 0.1937 | 0.3874 | 0.3874 | nan | 0.3874 | 0.0 | 0.3874 |
+| 0.1657 | 3.75 | 60 | 0.1423 | 0.2491 | 0.4982 | 0.4982 | nan | 0.4982 | 0.0 | 0.4982 |
+| 0.1581 | 5.0 | 80 | 0.1117 | 0.2649 | 0.5299 | 0.5299 | nan | 0.5299 | 0.0 | 0.5299 |
+| 0.1063 | 6.25 | 100 | 0.0943 | 0.3327 | 0.6653 | 0.6653 | nan | 0.6653 | 0.0 | 0.6653 |
+| 0.0829 | 7.5 | 120 | 0.0782 | 0.2983 | 0.5966 | 0.5966 | nan | 0.5966 | 0.0 | 0.5966 |
+| 0.0808 | 8.75 | 140 | 0.0740 | 0.3257 | 0.6515 | 0.6515 | nan | 0.6515 | 0.0 | 0.6515 |
+| 0.0694 | 10.0 | 160 | 0.0725 | 0.3503 | 0.7005 | 0.7005 | nan | 0.7005 | 0.0 | 0.7005 |
+| 0.0589 | 11.25 | 180 | 0.0663 | 0.2629 | 0.5259 | 0.5259 | nan | 0.5259 | 0.0 | 0.5259 |
+| 0.0473 | 12.5 | 200 | 0.0604 | 0.3685 | 0.7369 | 0.7369 | nan | 0.7369 | 0.0 | 0.7369 |
+| 0.0433 | 13.75 | 220 | 0.0569 | 0.3055 | 0.6109 | 0.6109 | nan | 0.6109 | 0.0 | 0.6109 |
+| 0.0511 | 15.0 | 240 | 0.0546 | 0.3572 | 0.7145 | 0.7145 | nan | 0.7145 | 0.0 | 0.7145 |
+| 0.04 | 16.25 | 260 | 0.0536 | 0.3234 | 0.6467 | 0.6467 | nan | 0.6467 | 0.0 | 0.6467 |
+| 0.0365 | 17.5 | 280 | 0.0555 | 0.3086 | 0.6171 | 0.6171 | nan | 0.6171 | 0.0 | 0.6171 |
+| 0.0314 | 18.75 | 300 | 0.0505 | 0.3595 | 0.7191 | 0.7191 | nan | 0.7191 | 0.0 | 0.7191 |
+| 0.0295 | 20.0 | 320 | 0.0536 | 0.3079 | 0.6159 | 0.6159 | nan | 0.6159 | 0.0 | 0.6159 |
+| 0.0337 | 21.25 | 340 | 0.0490 | 0.3446 | 0.6891 | 0.6891 | nan | 0.6891 | 0.0 | 0.6891 |
+| 0.0325 | 22.5 | 360 | 0.0489 | 0.3946 | 0.7891 | 0.7891 | nan | 0.7891 | 0.0 | 0.7891 |
+| 0.0314 | 23.75 | 380 | 0.0514 | 0.3184 | 0.6368 | 0.6368 | nan | 0.6368 | 0.0 | 0.6368 |
+| 0.0267 | 25.0 | 400 | 0.0485 | 0.3572 | 0.7144 | 0.7144 | nan | 0.7144 | 0.0 | 0.7144 |
+| 0.0321 | 26.25 | 420 | 0.0490 | 0.3787 | 0.7573 | 0.7573 | nan | 0.7573 | 0.0 | 0.7573 |
+| 0.025 | 27.5 | 440 | 0.0474 | 0.3615 | 0.7230 | 0.7230 | nan | 0.7230 | 0.0 | 0.7230 |
+| 0.0225 | 28.75 | 460 | 0.0472 | 0.3660 | 0.7319 | 0.7319 | nan | 0.7319 | 0.0 | 0.7319 |
+| 0.0247 | 30.0 | 480 | 0.0502 | 0.3488 | 0.6976 | 0.6976 | nan | 0.6976 | 0.0 | 0.6976 |
+| 0.0216 | 31.25 | 500 | 0.0483 | 0.3536 | 0.7072 | 0.7072 | nan | 0.7072 | 0.0 | 0.7072 |
+| 0.0195 | 32.5 | 520 | 0.0508 | 0.3289 | 0.6578 | 0.6578 | nan | 0.6578 | 0.0 | 0.6578 |
+| 0.0259 | 33.75 | 540 | 0.0496 | 0.3846 | 0.7692 | 0.7692 | nan | 0.7692 | 0.0 | 0.7692 |
+| 0.0242 | 35.0 | 560 | 0.0487 | 0.3464 | 0.6928 | 0.6928 | nan | 0.6928 | 0.0 | 0.6928 |
+| 0.0217 | 36.25 | 580 | 0.0503 | 0.3325 | 0.6650 | 0.6650 | nan | 0.6650 | 0.0 | 0.6650 |
+| 0.0204 | 37.5 | 600 | 0.0502 | 0.3429 | 0.6858 | 0.6858 | nan | 0.6858 | 0.0 | 0.6858 |
+| 0.0204 | 38.75 | 620 | 0.0507 | 0.3457 | 0.6913 | 0.6913 | nan | 0.6913 | 0.0 | 0.6913 |
+| 0.0191 | 40.0 | 640 | 0.0494 | 0.3494 | 0.6988 | 0.6988 | nan | 0.6988 | 0.0 | 0.6988 |
+| 0.0204 | 41.25 | 660 | 0.0503 | 0.3426 | 0.6852 | 0.6852 | nan | 0.6852 | 0.0 | 0.6852 |
+| 0.019 | 42.5 | 680 | 0.0485 | 0.3616 | 0.7232 | 0.7232 | nan | 0.7232 | 0.0 | 0.7232 |
+| 0.0198 | 43.75 | 700 | 0.0494 | 0.3504 | 0.7008 | 0.7008 | nan | 0.7008 | 0.0 | 0.7008 |
+| 0.0212 | 45.0 | 720 | 0.0491 | 0.3531 | 0.7062 | 0.7062 | nan | 0.7062 | 0.0 | 0.7062 |
 
 
 ### Framework versions
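The evaluation is effectively two-class (unlabeled / mass), and in the table above Mean Accuracy, Overall Accuracy, Accuracy Mass, and Iou Mass always coincide while Iou Unlabeled stays at 0.0, so the reported Mean Iou is simply half of Iou Mass (e.g. 0.3531 = (0.0 + 0.7062) / 2 in the final row). A minimal NumPy sketch of that per-class IoU arithmetic (illustrative only, not the exact `evaluate` mean_iou implementation):

```python
import numpy as np

def per_class_iou(pred, target, num_classes):
    """IoU per class: |pred==c AND target==c| / |pred==c OR target==c|."""
    ious = []
    for c in range(num_classes):
        p, t = pred == c, target == c
        union = np.logical_or(p, t).sum()
        # A class absent from both masks has an undefined (nan) IoU.
        ious.append(np.nan if union == 0 else np.logical_and(p, t).sum() / union)
    return ious

# Toy 2-class case: class 0 ("unlabeled") is never predicted correctly,
# so its IoU is 0 and it halves the mean, as in the table above.
pred = np.array([1, 1, 1, 1])
target = np.array([0, 1, 1, 1])
ious = per_class_iou(pred, target, num_classes=2)
mean_iou = np.nanmean(ious)  # mean over classes present in pred or target
```

Here class 1 has intersection 3 and union 4 (IoU 0.75), class 0 has IoU 0, giving a mean IoU of 0.375 despite 75% pixel accuracy on the mass class.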
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:e31be23a2a4150de0bb3e9360ef1fae693d40bd5d55dd36c54e2684b4b6d3466
+oid sha256:488877b086542683ec3eddc27e62d34c4d0b3db349dbbf8df51ea558d03d6029
 size 14884776
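The weight diff above changes only a Git LFS pointer file: `oid` is the SHA-256 digest of the real artifact and `size` its byte length, so a downloaded file can be checked against the pointer locally. A small sketch of that check (the helper name and the toy pointer are illustrative, not part of this repo):

```python
import hashlib

def lfs_pointer_matches(pointer_text: str, blob: bytes) -> bool:
    """Check a Git LFS pointer against the blob it stands for.

    The pointer stores `oid sha256:<hex digest>` and `size <bytes>`;
    both must match the real file.
    """
    fields = dict(line.split(" ", 1) for line in pointer_text.strip().splitlines())
    algo, digest = fields["oid"].split(":", 1)
    return (algo == "sha256"
            and hashlib.sha256(blob).hexdigest() == digest
            and len(blob) == int(fields["size"]))

# Hypothetical pointer built for the 5-byte blob b"hello".
blob = b"hello"
pointer = (
    "version https://git-lfs.github.com/spec/v1\n"
    f"oid sha256:{hashlib.sha256(blob).hexdigest()}\n"
    f"size {len(blob)}\n"
)
ok = lfs_pointer_matches(pointer, blob)
```

Note that `size` alone is not enough to detect a change: the `model.safetensors` update above keeps the same 14884776-byte size while the digest differs.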
runs/Mar27_15-31-12_7287f272094f/events.out.tfevents.1711553477.7287f272094f.438.0 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:6755ff8c41086cbd7c86d5da0c36c89db9662d66010288991cf3d702c7969fff
+size 180859
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:6680295615b8eaee6ab75f97efdcd1cdc57418ff46ea19e5bab1bb5d55929465
+oid sha256:6ebf06f49edbc4fed3fcf1ab75ab136daf86155c38e6ae797384ac1b3a49b8ab
 size 4984