yolo12138 committed
Commit fde2297
1 Parent(s): 80cfd52

yolo12138/segformer-b2-cloth-parse-9

README.md ADDED
@@ -0,0 +1,125 @@
+ ---
+ license: mit
+ base_model: mattmdjaga/segformer_b2_clothes
+ tags:
+ - generated_from_trainer
+ datasets:
+ - cloth_parsing_mix
+ model-index:
+ - name: segformer-b2-cloth-parse-9
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # segformer-b2-cloth-parse-9
+
+ This model is a fine-tuned version of [mattmdjaga/segformer_b2_clothes](https://huggingface.co/mattmdjaga/segformer_b2_clothes) on the cloth_parsing_mix dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.0433
+ - Mean Iou: 0.8611
+ - Mean Accuracy: 0.9107
+ - Overall Accuracy: 0.9846
+ - Accuracy Background: 0.9964
+ - Accuracy Upper Torso: 0.9857
+ - Accuracy Left Pants: 0.9654
+ - Accuracy Right Patns: 0.9664
+ - Accuracy Skirts: 0.9065
+ - Accuracy Left Sleeve: 0.9591
+ - Accuracy Right Sleeve: 0.9662
+ - Accuracy Outer Collar: 0.6491
+ - Accuracy Inner Collar: 0.8015
+ - Iou Background: 0.9923
+ - Iou Upper Torso: 0.9655
+ - Iou Left Pants: 0.9017
+ - Iou Right Patns: 0.9085
+ - Iou Skirts: 0.8749
+ - Iou Left Sleeve: 0.9223
+ - Iou Right Sleeve: 0.9289
+ - Iou Outer Collar: 0.5394
+ - Iou Inner Collar: 0.7160
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
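A minimal inference sketch, assuming the image processor is loaded from the base checkpoint (this commit does not appear to add a `preprocessor_config.json`) and using a placeholder input image:

```python
# Hedged inference sketch. Assumptions: the image processor comes from the base
# checkpoint, and "example.jpg" is a placeholder input image path.
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

processor = SegformerImageProcessor.from_pretrained("mattmdjaga/segformer_b2_clothes")
model = SegformerForSemanticSegmentation.from_pretrained("yolo12138/segformer-b2-cloth-parse-9")
model.eval()

image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, 9, H/4, W/4) class scores

# Upsample the logits to the original resolution and take the per-pixel argmax
# to obtain an (H, W) mask of class ids 0-8.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]
```

The resulting class ids follow the `id2label` mapping in the `config.json` added later in this commit.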
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 1e-05
+ - train_batch_size: 12
+ - eval_batch_size: 12
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 5
+
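The hyperparameters above correspond roughly to the following `transformers.TrainingArguments`; this is a sketch of equivalent settings rather than the original training script, and `output_dir` is an assumed name:

```python
from transformers import TrainingArguments

# Approximate reconstruction of the listed hyperparameters; output_dir is illustrative.
training_args = TrainingArguments(
    output_dir="segformer-b2-cloth-parse-9",
    learning_rate=1e-5,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=12,
    num_train_epochs=5,
    seed=42,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```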
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Upper Torso | Accuracy Left Pants | Accuracy Right Patns | Accuracy Skirts | Accuracy Left Sleeve | Accuracy Right Sleeve | Accuracy Outer Collar | Accuracy Inner Collar | Iou Background | Iou Upper Torso | Iou Left Pants | Iou Right Patns | Iou Skirts | Iou Left Sleeve | Iou Right Sleeve | Iou Outer Collar | Iou Inner Collar |
+ |:-------------:|:-----:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:--------------------:|:-------------------:|:--------------------:|:---------------:|:--------------------:|:---------------------:|:---------------------:|:---------------------:|:--------------:|:---------------:|:--------------:|:---------------:|:----------:|:---------------:|:----------------:|:----------------:|:----------------:|
+ | 0.1054 | 0.11 | 500 | 0.1180 | 0.7305 | 0.7971 | 0.9670 | 0.9902 | 0.9720 | 0.9654 | 0.9756 | 0.8036 | 0.9226 | 0.9289 | 0.0716 | 0.5444 | 0.9830 | 0.9234 | 0.8752 | 0.8765 | 0.7370 | 0.8236 | 0.8232 | 0.0703 | 0.4628 |
+ | 0.1033 | 0.22 | 1000 | 0.0851 | 0.7862 | 0.8418 | 0.9746 | 0.9924 | 0.9829 | 0.9665 | 0.9653 | 0.8491 | 0.9145 | 0.9226 | 0.3219 | 0.6608 | 0.9866 | 0.9424 | 0.8858 | 0.8875 | 0.8105 | 0.8538 | 0.8614 | 0.2833 | 0.5642 |
+ | 0.0944 | 0.32 | 1500 | 0.0713 | 0.8077 | 0.8595 | 0.9773 | 0.9941 | 0.9833 | 0.9566 | 0.9625 | 0.8924 | 0.9094 | 0.9181 | 0.4414 | 0.6774 | 0.9880 | 0.9481 | 0.8937 | 0.8950 | 0.8437 | 0.8668 | 0.8751 | 0.3629 | 0.5958 |
+ | 0.0746 | 0.43 | 2000 | 0.0683 | 0.8190 | 0.8770 | 0.9783 | 0.9941 | 0.9796 | 0.9652 | 0.9722 | 0.8656 | 0.9480 | 0.9562 | 0.4882 | 0.7236 | 0.9888 | 0.9497 | 0.9070 | 0.9127 | 0.8306 | 0.8790 | 0.8870 | 0.3945 | 0.6218 |
+ | 0.0548 | 0.54 | 2500 | 0.0666 | 0.8187 | 0.8713 | 0.9787 | 0.9951 | 0.9831 | 0.9580 | 0.9606 | 0.8651 | 0.9215 | 0.9453 | 0.4839 | 0.7293 | 0.9893 | 0.9514 | 0.8939 | 0.9006 | 0.8245 | 0.8812 | 0.8964 | 0.4010 | 0.6298 |
+ | 0.0728 | 0.65 | 3000 | 0.0591 | 0.8271 | 0.8806 | 0.9804 | 0.9945 | 0.9839 | 0.9624 | 0.9659 | 0.8982 | 0.9399 | 0.9430 | 0.4884 | 0.7493 | 0.9900 | 0.9551 | 0.8940 | 0.8966 | 0.8583 | 0.8930 | 0.9011 | 0.4100 | 0.6458 |
+ | 0.0505 | 0.75 | 3500 | 0.0648 | 0.8218 | 0.8745 | 0.9797 | 0.9947 | 0.9847 | 0.9858 | 0.9905 | 0.8402 | 0.9500 | 0.9587 | 0.4480 | 0.7178 | 0.9900 | 0.9534 | 0.9022 | 0.9037 | 0.8223 | 0.8944 | 0.9017 | 0.3881 | 0.6402 |
+ | 0.0601 | 0.86 | 4000 | 0.0568 | 0.8415 | 0.8951 | 0.9817 | 0.9952 | 0.9817 | 0.9632 | 0.9640 | 0.9170 | 0.9521 | 0.9541 | 0.5781 | 0.7508 | 0.9903 | 0.9576 | 0.9138 | 0.9199 | 0.8716 | 0.9010 | 0.9106 | 0.4562 | 0.6529 |
+ | 0.0438 | 0.97 | 4500 | 0.0569 | 0.8431 | 0.8925 | 0.9815 | 0.9947 | 0.9844 | 0.9764 | 0.9838 | 0.8870 | 0.9492 | 0.9595 | 0.5561 | 0.7416 | 0.9903 | 0.9560 | 0.9287 | 0.9370 | 0.8585 | 0.9000 | 0.9089 | 0.4524 | 0.6559 |
+ | 0.0617 | 1.08 | 5000 | 0.0529 | 0.8417 | 0.8933 | 0.9816 | 0.9952 | 0.9841 | 0.9602 | 0.9631 | 0.8922 | 0.9475 | 0.9533 | 0.5797 | 0.7642 | 0.9907 | 0.9571 | 0.9097 | 0.9126 | 0.8488 | 0.9044 | 0.9158 | 0.4687 | 0.6678 |
+ | 0.0452 | 1.19 | 5500 | 0.0557 | 0.8351 | 0.8935 | 0.9812 | 0.9949 | 0.9842 | 0.9644 | 0.9667 | 0.8781 | 0.9494 | 0.9604 | 0.5961 | 0.7471 | 0.9906 | 0.9588 | 0.8803 | 0.8885 | 0.8349 | 0.9069 | 0.9169 | 0.4743 | 0.6645 |
+ | 0.0571 | 1.29 | 6000 | 0.0551 | 0.8351 | 0.8934 | 0.9810 | 0.9957 | 0.9831 | 0.9652 | 0.9693 | 0.8562 | 0.9593 | 0.9569 | 0.5959 | 0.7586 | 0.9910 | 0.9579 | 0.8842 | 0.8879 | 0.8188 | 0.9084 | 0.9155 | 0.4774 | 0.6749 |
+ | 0.0778 | 1.4 | 6500 | 0.0537 | 0.8430 | 0.8994 | 0.9818 | 0.9948 | 0.9839 | 0.9872 | 0.9921 | 0.8702 | 0.9587 | 0.9635 | 0.5790 | 0.7656 | 0.9911 | 0.9579 | 0.9044 | 0.9093 | 0.8458 | 0.9060 | 0.9157 | 0.4760 | 0.6808 |
+ | 0.0392 | 1.51 | 7000 | 0.0491 | 0.8503 | 0.9069 | 0.9830 | 0.9954 | 0.9823 | 0.9645 | 0.9666 | 0.9205 | 0.9534 | 0.9599 | 0.6214 | 0.7984 | 0.9916 | 0.9607 | 0.9123 | 0.9139 | 0.8755 | 0.9072 | 0.9180 | 0.4907 | 0.6830 |
+ | 0.0376 | 1.62 | 7500 | 0.0514 | 0.8442 | 0.9010 | 0.9819 | 0.9954 | 0.9832 | 0.9652 | 0.9660 | 0.8850 | 0.9525 | 0.9598 | 0.6257 | 0.7762 | 0.9914 | 0.9586 | 0.8944 | 0.9053 | 0.8355 | 0.9104 | 0.9215 | 0.4965 | 0.6838 |
+ | 0.0391 | 1.73 | 8000 | 0.0492 | 0.8422 | 0.8993 | 0.9819 | 0.9958 | 0.9836 | 0.9641 | 0.9671 | 0.8692 | 0.9561 | 0.9661 | 0.6159 | 0.7756 | 0.9916 | 0.9596 | 0.8882 | 0.8930 | 0.8338 | 0.9103 | 0.9189 | 0.4982 | 0.6860 |
+ | 0.0446 | 1.83 | 8500 | 0.0491 | 0.8515 | 0.9079 | 0.9829 | 0.9960 | 0.9836 | 0.9890 | 0.9913 | 0.8770 | 0.9505 | 0.9631 | 0.6458 | 0.7751 | 0.9916 | 0.9603 | 0.9114 | 0.9161 | 0.8559 | 0.9100 | 0.9217 | 0.5096 | 0.6867 |
+ | 0.041 | 1.94 | 9000 | 0.0482 | 0.8464 | 0.8978 | 0.9825 | 0.9958 | 0.9848 | 0.9619 | 0.9668 | 0.8822 | 0.9569 | 0.9659 | 0.5961 | 0.7703 | 0.9916 | 0.9602 | 0.8958 | 0.9018 | 0.8438 | 0.9148 | 0.9231 | 0.4966 | 0.6899 |
+ | 0.0744 | 2.05 | 9500 | 0.0474 | 0.8523 | 0.9018 | 0.9834 | 0.9961 | 0.9840 | 0.9598 | 0.9633 | 0.9195 | 0.9471 | 0.9644 | 0.6055 | 0.7766 | 0.9919 | 0.9619 | 0.9095 | 0.9125 | 0.8697 | 0.9113 | 0.9238 | 0.5010 | 0.6889 |
+ | 0.0433 | 2.16 | 10000 | 0.0471 | 0.8581 | 0.9103 | 0.9842 | 0.9951 | 0.9843 | 0.9617 | 0.9646 | 0.9416 | 0.9549 | 0.9718 | 0.6305 | 0.7879 | 0.9915 | 0.9644 | 0.9100 | 0.9155 | 0.8976 | 0.9145 | 0.9245 | 0.5127 | 0.6920 |
+ | 0.0412 | 2.26 | 10500 | 0.0468 | 0.8574 | 0.9042 | 0.9835 | 0.9956 | 0.9848 | 0.9628 | 0.9669 | 0.9023 | 0.9615 | 0.9677 | 0.6115 | 0.7847 | 0.9918 | 0.9601 | 0.9248 | 0.9286 | 0.8656 | 0.9177 | 0.9245 | 0.5073 | 0.6964 |
+ | 0.0489 | 2.37 | 11000 | 0.0496 | 0.8511 | 0.9029 | 0.9832 | 0.9956 | 0.9858 | 0.9905 | 0.9948 | 0.8694 | 0.9574 | 0.9654 | 0.5748 | 0.7926 | 0.9921 | 0.9604 | 0.9066 | 0.9086 | 0.8615 | 0.9167 | 0.9228 | 0.4913 | 0.7004 |
+ | 0.0388 | 2.48 | 11500 | 0.0450 | 0.8594 | 0.9036 | 0.9849 | 0.9957 | 0.9857 | 0.9621 | 0.9648 | 0.9620 | 0.9493 | 0.9604 | 0.5733 | 0.7793 | 0.9922 | 0.9649 | 0.9155 | 0.9205 | 0.9076 | 0.9138 | 0.9257 | 0.4941 | 0.7002 |
+ | 0.0409 | 2.59 | 12000 | 0.0493 | 0.8579 | 0.9124 | 0.9844 | 0.9955 | 0.9853 | 0.9928 | 0.9929 | 0.9083 | 0.9573 | 0.9671 | 0.6288 | 0.7832 | 0.9921 | 0.9651 | 0.9046 | 0.9086 | 0.8842 | 0.9196 | 0.9267 | 0.5175 | 0.7026 |
+ | 0.0477 | 2.7 | 12500 | 0.0436 | 0.8610 | 0.9051 | 0.9848 | 0.9957 | 0.9868 | 0.9639 | 0.9675 | 0.9478 | 0.9445 | 0.9590 | 0.5972 | 0.7831 | 0.9919 | 0.9654 | 0.9187 | 0.9251 | 0.9029 | 0.9126 | 0.9253 | 0.5035 | 0.7034 |
+ | 0.0488 | 2.8 | 13000 | 0.0450 | 0.8577 | 0.9076 | 0.9842 | 0.9963 | 0.9848 | 0.9712 | 0.9695 | 0.9132 | 0.9493 | 0.9621 | 0.6188 | 0.8026 | 0.9924 | 0.9635 | 0.9095 | 0.9124 | 0.8742 | 0.9172 | 0.9276 | 0.5157 | 0.7065 |
+ | 0.0879 | 2.91 | 13500 | 0.0516 | 0.8453 | 0.8949 | 0.9819 | 0.9960 | 0.9867 | 0.9631 | 0.9665 | 0.8325 | 0.9618 | 0.9678 | 0.6033 | 0.7763 | 0.9919 | 0.9574 | 0.8955 | 0.9007 | 0.8088 | 0.9206 | 0.9245 | 0.5069 | 0.7013 |
+ | 0.0525 | 3.02 | 14000 | 0.0474 | 0.8521 | 0.9053 | 0.9830 | 0.9959 | 0.9849 | 0.9850 | 0.9925 | 0.8703 | 0.9481 | 0.9597 | 0.6076 | 0.8038 | 0.9923 | 0.9600 | 0.9050 | 0.9099 | 0.8420 | 0.9143 | 0.9263 | 0.5148 | 0.7044 |
+ | 0.0455 | 3.13 | 14500 | 0.0435 | 0.8579 | 0.9111 | 0.9842 | 0.9953 | 0.9852 | 0.9646 | 0.9672 | 0.9255 | 0.9569 | 0.9654 | 0.6514 | 0.7888 | 0.9923 | 0.9642 | 0.8971 | 0.9055 | 0.8780 | 0.9182 | 0.9284 | 0.5327 | 0.7046 |
+ | 0.0454 | 3.24 | 15000 | 0.0451 | 0.8599 | 0.9161 | 0.9844 | 0.9953 | 0.9858 | 0.9895 | 0.9907 | 0.8944 | 0.9635 | 0.9692 | 0.6643 | 0.7925 | 0.9924 | 0.9645 | 0.9061 | 0.9107 | 0.8803 | 0.9202 | 0.9236 | 0.5356 | 0.7058 |
+ | 0.0687 | 3.34 | 15500 | 0.0496 | 0.8482 | 0.9017 | 0.9827 | 0.9959 | 0.9869 | 0.9715 | 0.9676 | 0.8483 | 0.9616 | 0.9672 | 0.6235 | 0.7932 | 0.9922 | 0.9614 | 0.8904 | 0.8909 | 0.8269 | 0.9187 | 0.9218 | 0.5249 | 0.7069 |
+ | 0.0555 | 3.45 | 16000 | 0.0445 | 0.8568 | 0.9081 | 0.9838 | 0.9964 | 0.9858 | 0.9649 | 0.9681 | 0.8880 | 0.9585 | 0.9610 | 0.6510 | 0.7995 | 0.9922 | 0.9635 | 0.8996 | 0.9073 | 0.8582 | 0.9230 | 0.9257 | 0.5328 | 0.7093 |
+ | 0.0528 | 3.56 | 16500 | 0.0477 | 0.8549 | 0.9053 | 0.9833 | 0.9958 | 0.9875 | 0.9668 | 0.9677 | 0.8740 | 0.9512 | 0.9631 | 0.6512 | 0.7902 | 0.9920 | 0.9618 | 0.9021 | 0.9036 | 0.8486 | 0.9185 | 0.9254 | 0.5348 | 0.7070 |
+ | 0.043 | 3.67 | 17000 | 0.0439 | 0.8633 | 0.9173 | 0.9849 | 0.9960 | 0.9851 | 0.9860 | 0.9893 | 0.9114 | 0.9555 | 0.9656 | 0.6623 | 0.8046 | 0.9921 | 0.9666 | 0.9083 | 0.9158 | 0.8910 | 0.9197 | 0.9262 | 0.5391 | 0.7111 |
+ | 0.0372 | 3.77 | 17500 | 0.0474 | 0.8555 | 0.9039 | 0.9836 | 0.9959 | 0.9876 | 0.9626 | 0.9647 | 0.8818 | 0.9556 | 0.9623 | 0.6393 | 0.7858 | 0.9921 | 0.9623 | 0.8999 | 0.9065 | 0.8526 | 0.9218 | 0.9264 | 0.5299 | 0.7082 |
+ | 0.0614 | 3.88 | 18000 | 0.0463 | 0.8564 | 0.9088 | 0.9839 | 0.9959 | 0.9853 | 0.9644 | 0.9662 | 0.9035 | 0.9569 | 0.9638 | 0.6413 | 0.8025 | 0.9921 | 0.9643 | 0.8967 | 0.9020 | 0.8607 | 0.9202 | 0.9276 | 0.5330 | 0.7111 |
+ | 0.0413 | 3.99 | 18500 | 0.0453 | 0.8579 | 0.9123 | 0.9841 | 0.9963 | 0.9848 | 0.9794 | 0.9828 | 0.8865 | 0.9613 | 0.9695 | 0.6526 | 0.7977 | 0.9922 | 0.9648 | 0.8991 | 0.9047 | 0.8629 | 0.9221 | 0.9274 | 0.5369 | 0.7112 |
+ | 0.0386 | 4.1 | 19000 | 0.0438 | 0.8578 | 0.9109 | 0.9842 | 0.9959 | 0.9844 | 0.9649 | 0.9667 | 0.9154 | 0.9580 | 0.9662 | 0.6408 | 0.8062 | 0.9924 | 0.9644 | 0.8973 | 0.9025 | 0.8683 | 0.9196 | 0.9279 | 0.5340 | 0.7134 |
+ | 0.0541 | 4.21 | 19500 | 0.0443 | 0.8577 | 0.9118 | 0.9840 | 0.9957 | 0.9847 | 0.9829 | 0.9872 | 0.8935 | 0.9594 | 0.9686 | 0.6265 | 0.8077 | 0.9921 | 0.9641 | 0.9017 | 0.9079 | 0.8621 | 0.9203 | 0.9277 | 0.5298 | 0.7133 |
+ | 0.0409 | 4.31 | 20000 | 0.0433 | 0.8560 | 0.9083 | 0.9840 | 0.9959 | 0.9860 | 0.9670 | 0.9687 | 0.9020 | 0.9578 | 0.9632 | 0.6421 | 0.7918 | 0.9922 | 0.9652 | 0.8921 | 0.8966 | 0.8633 | 0.9206 | 0.9278 | 0.5349 | 0.7117 |
+ | 0.0398 | 4.42 | 20500 | 0.0451 | 0.8581 | 0.9102 | 0.9840 | 0.9960 | 0.9859 | 0.9687 | 0.9685 | 0.8885 | 0.9597 | 0.9684 | 0.6554 | 0.8004 | 0.9922 | 0.9638 | 0.9000 | 0.9042 | 0.8595 | 0.9232 | 0.9266 | 0.5395 | 0.7144 |
+ | 0.038 | 4.53 | 21000 | 0.0464 | 0.8608 | 0.9123 | 0.9843 | 0.9959 | 0.9866 | 0.9885 | 0.9907 | 0.8739 | 0.9616 | 0.9678 | 0.6398 | 0.8056 | 0.9921 | 0.9639 | 0.9088 | 0.9160 | 0.8657 | 0.9238 | 0.9273 | 0.5347 | 0.7150 |
+ | 0.0295 | 4.64 | 21500 | 0.0433 | 0.8596 | 0.9094 | 0.9840 | 0.9960 | 0.9864 | 0.9641 | 0.9664 | 0.8985 | 0.9535 | 0.9582 | 0.6581 | 0.8033 | 0.9922 | 0.9633 | 0.9056 | 0.9102 | 0.8619 | 0.9195 | 0.9276 | 0.5408 | 0.7151 |
+ | 0.0318 | 4.75 | 22000 | 0.0439 | 0.8600 | 0.9127 | 0.9842 | 0.9964 | 0.9848 | 0.9665 | 0.9676 | 0.8929 | 0.9627 | 0.9689 | 0.6656 | 0.8089 | 0.9923 | 0.9643 | 0.9007 | 0.9080 | 0.8645 | 0.9223 | 0.9283 | 0.5444 | 0.7156 |
+ | 0.0377 | 4.85 | 22500 | 0.0429 | 0.8619 | 0.9125 | 0.9846 | 0.9963 | 0.9849 | 0.9633 | 0.9666 | 0.9115 | 0.9609 | 0.9689 | 0.6527 | 0.8069 | 0.9923 | 0.9654 | 0.9052 | 0.9104 | 0.8762 | 0.9217 | 0.9288 | 0.5407 | 0.7166 |
+ | 0.0419 | 4.96 | 23000 | 0.0433 | 0.8611 | 0.9107 | 0.9846 | 0.9964 | 0.9857 | 0.9654 | 0.9664 | 0.9065 | 0.9591 | 0.9662 | 0.6491 | 0.8015 | 0.9923 | 0.9655 | 0.9017 | 0.9085 | 0.8749 | 0.9223 | 0.9289 | 0.5394 | 0.7160 |
+
+
+ ### Framework versions
+
+ - Transformers 4.35.2
+ - Pytorch 2.1.1
+ - Datasets 2.15.0
+ - Tokenizers 0.15.0
config.json ADDED
@@ -0,0 +1,92 @@
+ {
+   "_name_or_path": "mattmdjaga/segformer_b2_clothes",
+   "architectures": [
+     "SegformerForSemanticSegmentation"
+   ],
+   "attention_probs_dropout_prob": 0.0,
+   "classifier_dropout_prob": 0.1,
+   "decoder_hidden_size": 768,
+   "depths": [
+     3,
+     4,
+     6,
+     3
+   ],
+   "downsampling_rates": [
+     1,
+     4,
+     8,
+     16
+   ],
+   "drop_path_rate": 0.1,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.0,
+   "hidden_sizes": [
+     64,
+     128,
+     320,
+     512
+   ],
+   "id2label": {
+     "0": "background",
+     "1": "upper_torso",
+     "2": "left_pants",
+     "3": "right_patns",
+     "4": "skirts",
+     "5": "left_sleeve",
+     "6": "right_sleeve",
+     "7": "outer_collar",
+     "8": "inner_collar"
+   },
+   "image_size": 224,
+   "initializer_range": 0.02,
+   "label2id": {
+     "background": 0,
+     "inner_collar": 8,
+     "left_pants": 2,
+     "left_sleeve": 5,
+     "outer_collar": 7,
+     "right_patns": 3,
+     "right_sleeve": 6,
+     "skirts": 4,
+     "upper_torso": 1
+   },
+   "layer_norm_eps": 1e-06,
+   "mlp_ratios": [
+     4,
+     4,
+     4,
+     4
+   ],
+   "model_type": "segformer",
+   "num_attention_heads": [
+     1,
+     2,
+     5,
+     8
+   ],
+   "num_channels": 3,
+   "num_encoder_blocks": 4,
+   "patch_sizes": [
+     7,
+     3,
+     3,
+     3
+   ],
+   "reshape_last_stage": true,
+   "semantic_loss_ignore_index": 255,
+   "sr_ratios": [
+     8,
+     4,
+     2,
+     1
+   ],
+   "strides": [
+     4,
+     2,
+     2,
+     2
+   ],
+   "torch_dtype": "float32",
+   "transformers_version": "4.35.2"
+ }
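The `id2label` map above defines the nine cloth-parsing classes the model predicts. A small sketch of decoding a predicted class-id mask with that map, assuming an arbitrary illustrative colour palette:

```python
import numpy as np
from transformers import SegformerConfig

# Read the label map shipped in this commit's config.json.
config = SegformerConfig.from_pretrained("yolo12138/segformer-b2-cloth-parse-9")
print(config.id2label)  # 0 -> background, 1 -> upper_torso, ..., 8 -> inner_collar

# Arbitrary example palette: one RGB colour per class id (9 classes).
palette = np.array([
    [0, 0, 0],        # background
    [220, 20, 60],    # upper_torso
    [0, 128, 0],      # left_pants
    [0, 0, 255],      # right_patns
    [255, 215, 0],    # skirts
    [255, 0, 255],    # left_sleeve
    [0, 255, 255],    # right_sleeve
    [128, 128, 0],    # outer_collar
    [0, 128, 128],    # inner_collar
], dtype=np.uint8)

def colorize(pred_mask: np.ndarray) -> np.ndarray:
    """Map an (H, W) array of class ids (e.g. pred_mask.numpy() from the
    inference sketch earlier) to an (H, W, 3) RGB visualisation."""
    return palette[pred_mask]
```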
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:63f48a76dfcc041d417c99e0949abafc962da9fcaaaf8ba74b8b030319bf1142
+ size 109465548
runs/Dec16_11-06-45_theone-ubuntu/events.out.tfevents.1702696006.theone-ubuntu.3281.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:28d043eebf2d49f7d8b9fba8ae1a9219b7e07ec9980fabea78c330cdd206215d
+ size 442524
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f7a7392a119b59fa082fe57e495708b5f0675b299b632f677d2744e91d1c340a
+ size 4600