HorcruxNo13 committed
Commit 5727563
Parent: e4d595c

Model save
README.md CHANGED
@@ -1,5 +1,6 @@
  ---
  license: other
  tags:
  - generated_from_trainer
  model-index:
@@ -14,14 +15,14 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.0799
- - Mean Iou: 0.4629
- - Mean Accuracy: 0.9258
- - Overall Accuracy: 0.9258
  - Accuracy Unlabeled: nan
- - Accuracy Liver: 0.9258
  - Iou Unlabeled: 0.0
- - Iou Liver: 0.9258
 
  ## Model description
 
@@ -41,65 +42,58 @@ More information needed
 
  The following hyperparameters were used during training:
  - learning_rate: 0.0001
- - train_batch_size: 24
- - eval_batch_size: 24
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 35
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Liver | Iou Unlabeled | Iou Liver |
- |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:--------------:|:-------------:|:---------:|
- | 0.2837        | 0.8   | 20   | 0.3699          | 0.3876   | 0.7752        | 0.7752           | nan                | 0.7752         | 0.0           | 0.7752    |
- | 0.2264        | 1.6   | 40   | 0.1982          | 0.4222   | 0.8444        | 0.8444           | nan                | 0.8444         | 0.0           | 0.8444    |
- | 0.1687        | 2.4   | 60   | 0.1594          | 0.3988   | 0.7977        | 0.7977           | nan                | 0.7977         | 0.0           | 0.7977    |
- | 0.1489        | 3.2   | 80   | 0.1396          | 0.4050   | 0.8100        | 0.8100           | nan                | 0.8100         | 0.0           | 0.8100    |
- | 0.1111        | 4.0   | 100  | 0.1203          | 0.4223   | 0.8446        | 0.8446           | nan                | 0.8446         | 0.0           | 0.8446    |
- | 0.1115        | 4.8   | 120  | 0.1160          | 0.4512   | 0.9023        | 0.9023           | nan                | 0.9023         | 0.0           | 0.9023    |
- | 0.1081        | 5.6   | 140  | 0.1053          | 0.4504   | 0.9009        | 0.9009           | nan                | 0.9009         | 0.0           | 0.9009    |
- | 0.1111        | 6.4   | 160  | 0.0960          | 0.4526   | 0.9051        | 0.9051           | nan                | 0.9051         | 0.0           | 0.9051    |
- | 0.0904        | 7.2   | 180  | 0.0954          | 0.4646   | 0.9292        | 0.9292           | nan                | 0.9292         | 0.0           | 0.9292    |
- | 0.0868        | 8.0   | 200  | 0.0925          | 0.4593   | 0.9187        | 0.9187           | nan                | 0.9187         | 0.0           | 0.9187    |
- | 0.092         | 8.8   | 220  | 0.0852          | 0.4630   | 0.9261        | 0.9261           | nan                | 0.9261         | 0.0           | 0.9261    |
- | 0.0686        | 9.6   | 240  | 0.0897          | 0.4631   | 0.9263        | 0.9263           | nan                | 0.9263         | 0.0           | 0.9263    |
- | 0.0684        | 10.4  | 260  | 0.0939          | 0.4727   | 0.9455        | 0.9455           | nan                | 0.9455         | 0.0           | 0.9455    |
- | 0.0634        | 11.2  | 280  | 0.0919          | 0.4241   | 0.8483        | 0.8483           | nan                | 0.8483         | 0.0           | 0.8483    |
- | 0.059         | 12.0  | 300  | 0.0886          | 0.4727   | 0.9455        | 0.9455           | nan                | 0.9455         | 0.0           | 0.9455    |
- | 0.052         | 12.8  | 320  | 0.0764          | 0.4554   | 0.9108        | 0.9108           | nan                | 0.9108         | 0.0           | 0.9108    |
- | 0.0558        | 13.6  | 340  | 0.0769          | 0.4629   | 0.9258        | 0.9258           | nan                | 0.9258         | 0.0           | 0.9258    |
- | 0.0594        | 14.4  | 360  | 0.0770          | 0.4616   | 0.9231        | 0.9231           | nan                | 0.9231         | 0.0           | 0.9231    |
- | 0.0641        | 15.2  | 380  | 0.0844          | 0.4709   | 0.9417        | 0.9417           | nan                | 0.9417         | 0.0           | 0.9417    |
- | 0.0645        | 16.0  | 400  | 0.0790          | 0.4632   | 0.9263        | 0.9263           | nan                | 0.9263         | 0.0           | 0.9263    |
- | 0.0545        | 16.8  | 420  | 0.0776          | 0.4610   | 0.9220        | 0.9220           | nan                | 0.9220         | 0.0           | 0.9220    |
- | 0.056         | 17.6  | 440  | 0.0780          | 0.4541   | 0.9082        | 0.9082           | nan                | 0.9082         | 0.0           | 0.9082    |
- | 0.0472        | 18.4  | 460  | 0.0742          | 0.4595   | 0.9189        | 0.9189           | nan                | 0.9189         | 0.0           | 0.9189    |
- | 0.0478        | 19.2  | 480  | 0.0806          | 0.4690   | 0.9380        | 0.9380           | nan                | 0.9380         | 0.0           | 0.9380    |
- | 0.0523        | 20.0  | 500  | 0.0741          | 0.4550   | 0.9100        | 0.9100           | nan                | 0.9100         | 0.0           | 0.9100    |
- | 0.0401        | 20.8  | 520  | 0.0794          | 0.4637   | 0.9274        | 0.9274           | nan                | 0.9274         | 0.0           | 0.9274    |
- | 0.041         | 21.6  | 540  | 0.0772          | 0.4631   | 0.9262        | 0.9262           | nan                | 0.9262         | 0.0           | 0.9262    |
- | 0.0386        | 22.4  | 560  | 0.0795          | 0.4620   | 0.9240        | 0.9240           | nan                | 0.9240         | 0.0           | 0.9240    |
- | 0.0386        | 23.2  | 580  | 0.0761          | 0.4616   | 0.9232        | 0.9232           | nan                | 0.9232         | 0.0           | 0.9232    |
- | 0.0628        | 24.0  | 600  | 0.0778          | 0.4636   | 0.9271        | 0.9271           | nan                | 0.9271         | 0.0           | 0.9271    |
- | 0.0387        | 24.8  | 620  | 0.0782          | 0.4613   | 0.9226        | 0.9226           | nan                | 0.9226         | 0.0           | 0.9226    |
- | 0.0422        | 25.6  | 640  | 0.0778          | 0.4616   | 0.9233        | 0.9233           | nan                | 0.9233         | 0.0           | 0.9233    |
- | 0.0392        | 26.4  | 660  | 0.0838          | 0.4696   | 0.9393        | 0.9393           | nan                | 0.9393         | 0.0           | 0.9393    |
- | 0.04          | 27.2  | 680  | 0.0809          | 0.4658   | 0.9315        | 0.9315           | nan                | 0.9315         | 0.0           | 0.9315    |
- | 0.0341        | 28.0  | 700  | 0.0822          | 0.4667   | 0.9335        | 0.9335           | nan                | 0.9335         | 0.0           | 0.9335    |
- | 0.0329        | 28.8  | 720  | 0.0797          | 0.4639   | 0.9278        | 0.9278           | nan                | 0.9278         | 0.0           | 0.9278    |
- | 0.0373        | 29.6  | 740  | 0.0769          | 0.4582   | 0.9163        | 0.9163           | nan                | 0.9163         | 0.0           | 0.9163    |
- | 0.0366        | 30.4  | 760  | 0.0804          | 0.4632   | 0.9264        | 0.9264           | nan                | 0.9264         | 0.0           | 0.9264    |
- | 0.0432        | 31.2  | 780  | 0.0793          | 0.4587   | 0.9174        | 0.9174           | nan                | 0.9174         | 0.0           | 0.9174    |
- | 0.0328        | 32.0  | 800  | 0.0838          | 0.4688   | 0.9377        | 0.9377           | nan                | 0.9377         | 0.0           | 0.9377    |
- | 0.0377        | 32.8  | 820  | 0.0805          | 0.4643   | 0.9286        | 0.9286           | nan                | 0.9286         | 0.0           | 0.9286    |
- | 0.0327        | 33.6  | 840  | 0.0784          | 0.4614   | 0.9228        | 0.9228           | nan                | 0.9228         | 0.0           | 0.9228    |
- | 0.032         | 34.4  | 860  | 0.0799          | 0.4629   | 0.9258        | 0.9258           | nan                | 0.9258         | 0.0           | 0.9258    |
 
 
  ### Framework versions
 
- - Transformers 4.28.0
  - Pytorch 2.2.1+cu121
  - Datasets 2.18.0
- - Tokenizers 0.13.3
 
  ---
  license: other
+ base_model: nvidia/mit-b0
  tags:
  - generated_from_trainer
  model-index:
 
  This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset.
  It achieves the following results on the evaluation set:
+ - Loss: 0.0522
+ - Mean Iou: 0.3485
+ - Mean Accuracy: 0.6969
+ - Overall Accuracy: 0.6969
  - Accuracy Unlabeled: nan
+ - Accuracy Mass: 0.6969
  - Iou Unlabeled: 0.0
+ - Iou Mass: 0.6969
 
  ## Model description
 
  The following hyperparameters were used during training:
  - learning_rate: 0.0001
+ - train_batch_size: 32
+ - eval_batch_size: 32
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
+ - num_epochs: 45
 
  ### Training results
 
+ | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Mass | Iou Unlabeled | Iou Mass |
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:-------------:|:-------------:|:--------:|
+ | 0.4117        | 1.25  | 20   | 0.4147          | 0.0579   | 0.1158        | 0.1158           | nan                | 0.1158        | 0.0           | 0.1158   |
+ | 0.2756        | 2.5   | 40   | 0.2454          | 0.1520   | 0.3040        | 0.3040           | nan                | 0.3040        | 0.0           | 0.3040   |
+ | 0.2029        | 3.75  | 60   | 0.1873          | 0.3150   | 0.6301        | 0.6301           | nan                | 0.6301        | 0.0           | 0.6301   |
+ | 0.1506        | 5.0   | 80   | 0.1403          | 0.3616   | 0.7232        | 0.7232           | nan                | 0.7232        | 0.0           | 0.7232   |
+ | 0.1177        | 6.25  | 100  | 0.1077          | 0.1634   | 0.3269        | 0.3269           | nan                | 0.3269        | 0.0           | 0.3269   |
+ | 0.088         | 7.5   | 120  | 0.0972          | 0.2268   | 0.4536        | 0.4536           | nan                | 0.4536        | 0.0           | 0.4536   |
+ | 0.0796        | 8.75  | 140  | 0.0895          | 0.3776   | 0.7551        | 0.7551           | nan                | 0.7551        | 0.0           | 0.7551   |
+ | 0.0702        | 10.0  | 160  | 0.0754          | 0.3785   | 0.7570        | 0.7570           | nan                | 0.7570        | 0.0           | 0.7570   |
+ | 0.0643        | 11.25 | 180  | 0.0654          | 0.3207   | 0.6414        | 0.6414           | nan                | 0.6414        | 0.0           | 0.6414   |
+ | 0.0566        | 12.5  | 200  | 0.0635          | 0.3408   | 0.6815        | 0.6815           | nan                | 0.6815        | 0.0           | 0.6815   |
+ | 0.0467        | 13.75 | 220  | 0.0684          | 0.3971   | 0.7942        | 0.7942           | nan                | 0.7942        | 0.0           | 0.7942   |
+ | 0.0481        | 15.0  | 240  | 0.0599          | 0.3713   | 0.7425        | 0.7425           | nan                | 0.7425        | 0.0           | 0.7425   |
+ | 0.0465        | 16.25 | 260  | 0.0603          | 0.3121   | 0.6241        | 0.6241           | nan                | 0.6241        | 0.0           | 0.6241   |
+ | 0.0409        | 17.5  | 280  | 0.0569          | 0.3441   | 0.6882        | 0.6882           | nan                | 0.6882        | 0.0           | 0.6882   |
+ | 0.0392        | 18.75 | 300  | 0.0565          | 0.3568   | 0.7135        | 0.7135           | nan                | 0.7135        | 0.0           | 0.7135   |
+ | 0.0287        | 20.0  | 320  | 0.0571          | 0.3237   | 0.6474        | 0.6474           | nan                | 0.6474        | 0.0           | 0.6474   |
+ | 0.032         | 21.25 | 340  | 0.0574          | 0.3209   | 0.6419        | 0.6419           | nan                | 0.6419        | 0.0           | 0.6419   |
+ | 0.0308        | 22.5  | 360  | 0.0551          | 0.3371   | 0.6742        | 0.6742           | nan                | 0.6742        | 0.0           | 0.6742   |
+ | 0.0274        | 23.75 | 380  | 0.0546          | 0.3561   | 0.7122        | 0.7122           | nan                | 0.7122        | 0.0           | 0.7122   |
+ | 0.0246        | 25.0  | 400  | 0.0534          | 0.3491   | 0.6981        | 0.6981           | nan                | 0.6981        | 0.0           | 0.6981   |
+ | 0.0252        | 26.25 | 420  | 0.0533          | 0.3661   | 0.7322        | 0.7322           | nan                | 0.7322        | 0.0           | 0.7322   |
+ | 0.0251        | 27.5  | 440  | 0.0542          | 0.3507   | 0.7014        | 0.7014           | nan                | 0.7014        | 0.0           | 0.7014   |
+ | 0.027         | 28.75 | 460  | 0.0527          | 0.3531   | 0.7062        | 0.7062           | nan                | 0.7062        | 0.0           | 0.7062   |
+ | 0.0259        | 30.0  | 480  | 0.0539          | 0.3757   | 0.7514        | 0.7514           | nan                | 0.7514        | 0.0           | 0.7514   |
+ | 0.0212        | 31.25 | 500  | 0.0537          | 0.3283   | 0.6565        | 0.6565           | nan                | 0.6565        | 0.0           | 0.6565   |
+ | 0.0223        | 32.5  | 520  | 0.0517          | 0.3511   | 0.7022        | 0.7022           | nan                | 0.7022        | 0.0           | 0.7022   |
+ | 0.027         | 33.75 | 540  | 0.0504          | 0.3552   | 0.7103        | 0.7103           | nan                | 0.7103        | 0.0           | 0.7103   |
+ | 0.026         | 35.0  | 560  | 0.0516          | 0.3596   | 0.7192        | 0.7192           | nan                | 0.7192        | 0.0           | 0.7192   |
+ | 0.0239        | 36.25 | 580  | 0.0525          | 0.3559   | 0.7119        | 0.7119           | nan                | 0.7119        | 0.0           | 0.7119   |
+ | 0.0218        | 37.5  | 600  | 0.0532          | 0.3374   | 0.6748        | 0.6748           | nan                | 0.6748        | 0.0           | 0.6748   |
+ | 0.0214        | 38.75 | 620  | 0.0513          | 0.3591   | 0.7183        | 0.7183           | nan                | 0.7183        | 0.0           | 0.7183   |
+ | 0.0187        | 40.0  | 640  | 0.0517          | 0.3660   | 0.7320        | 0.7320           | nan                | 0.7320        | 0.0           | 0.7320   |
+ | 0.0201        | 41.25 | 660  | 0.0521          | 0.3647   | 0.7295        | 0.7295           | nan                | 0.7295        | 0.0           | 0.7295   |
+ | 0.024         | 42.5  | 680  | 0.0520          | 0.3485   | 0.6970        | 0.6970           | nan                | 0.6970        | 0.0           | 0.6970   |
+ | 0.0198        | 43.75 | 700  | 0.0516          | 0.3623   | 0.7247        | 0.7247           | nan                | 0.7247        | 0.0           | 0.7247   |
+ | 0.0236        | 45.0  | 720  | 0.0522          | 0.3485   | 0.6969        | 0.6969           | nan                | 0.6969        | 0.0           | 0.6969   |
 
 
  ### Framework versions
 
+ - Transformers 4.38.2
  - Pytorch 2.2.1+cu121
  - Datasets 2.18.0
+ - Tokenizers 0.15.2
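One detail worth noting when reading both versions of the results table: Mean Iou appears to be the unweighted average of the two per-class IoUs, and since Iou Unlabeled stays pinned at 0.0 throughout training, Mean Iou is always exactly half of Iou Mass (0.6969 / 2 ≈ 0.3485; likewise 0.9258 / 2 = 0.4629 in the old card). A minimal sketch of that averaging, with the same NaN-skipping the accuracy columns use (the `mean_iou` helper is illustrative, not code from this repo):

```python
import math

def mean_iou(per_class_iou):
    """Unweighted mean over classes, skipping NaN entries
    (mirroring how the 'nan' accuracy values are excluded)."""
    vals = [v for v in per_class_iou.values() if not math.isnan(v)]
    return sum(vals) / len(vals)

# Final evaluation row of the updated model card:
print(mean_iou({"unlabeled": 0.0, "mass": 0.6969}))  # 0.34845, reported as 0.3485
```

This also explains why Mean Iou looks low here: it is dragged down by a class the model never predicts, not by poor mass segmentation.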
config.json CHANGED
@@ -29,12 +29,12 @@
  ],
  "id2label": {
  "0": "unlabeled",
- "1": "liver"
  },
  "image_size": 224,
  "initializer_range": 0.02,
  "label2id": {
- "liver": 1,
  "unlabeled": 0
  },
  "layer_norm_eps": 1e-06,
@@ -74,5 +74,5 @@
  2
  ],
  "torch_dtype": "float32",
- "transformers_version": "4.28.0"
  }
 
  ],
  "id2label": {
  "0": "unlabeled",
+ "1": "mass"
  },
  "image_size": 224,
  "initializer_range": 0.02,
  "label2id": {
+ "mass": 1,
  "unlabeled": 0
  },
  "layer_norm_eps": 1e-06,
 
  2
  ],
  "torch_dtype": "float32",
+ "transformers_version": "4.38.2"
  }
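The label rename from liver to mass has to be applied to both mappings, because `id2label` and `label2id` are expected to be mutual inverses (with `id2label` keys serialized as strings in JSON). A quick consistency check one could run against the updated fragment (illustrative sketch, not part of the repo):

```python
import json

# The relevant fragment of the updated config.json:
config = json.loads("""
{
  "id2label": {"0": "unlabeled", "1": "mass"},
  "label2id": {"mass": 1, "unlabeled": 0}
}
""")

# JSON stores id2label keys as strings; normalize to ints before comparing.
id2label = {int(k): v for k, v in config["id2label"].items()}
assert {v: k for k, v in id2label.items()} == config["label2id"]
```

If the two mappings drift apart, label ids written by one code path will be decoded to the wrong class name by the other.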
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e31be23a2a4150de0bb3e9360ef1fae693d40bd5d55dd36c54e2684b4b6d3466
+ size 14884776
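The three added lines above are the entire committed content of `model.safetensors`: a Git LFS pointer (spec version, object hash, byte size) rather than the weights themselves. Parsing such a pointer is straightforward; the helper below is a hypothetical sketch, not repo code:

```python
POINTER = """version https://git-lfs.github.com/spec/v1
oid sha256:e31be23a2a4150de0bb3e9360ef1fae693d40bd5d55dd36c54e2684b4b6d3466
size 14884776
"""

def parse_lfs_pointer(text):
    """Split each 'key value' line of a Git LFS pointer into a dict."""
    return dict(line.split(" ", 1) for line in text.strip().splitlines())

info = parse_lfs_pointer(POINTER)
print(info["size"])  # "14884776" bytes (~14 MB, consistent with the small mit-b0 backbone)
```

The actual weights are fetched from LFS storage on checkout or via the Hub's resolve endpoint; the pointer only records their identity.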
runs/Mar27_13-00-55_b68f219c92b0/events.out.tfevents.1711544458.b68f219c92b0.5853.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:30c2409dd3d2e2f997c311db49aedc6578d5e2cc98f665b3e698df9641f6057c
+ size 180859
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:65df6fcfcbee4d9d0c1e3828cecf4b5d11285c72401707644462c596266e9e5e
- size 4088
 
  version https://git-lfs.github.com/spec/v1
+ oid sha256:6680295615b8eaee6ab75f97efdcd1cdc57418ff46ea19e5bab1bb5d55929465
+ size 4984