vishalkatheriya18 committed on
Commit 87081d6
1 Parent(s): 9711163

cloth_classification
README.md CHANGED
@@ -22,7 +22,7 @@ model-index:
   metrics:
   - name: Accuracy
     type: accuracy
-     value: 0.625
+     value: 0.8288288288288288
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -32,8 +32,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [facebook/convnextv2-tiny-1k-224](https://huggingface.co/facebook/convnextv2-tiny-1k-224) on the imagefolder dataset.
 It achieves the following results on the evaluation set:
- - Loss: 1.2021
- - Accuracy: 0.625
+ - Loss: 0.7923
+ - Accuracy: 0.8288
 
 ## Model description
 
@@ -61,99 +61,161 @@ The following hyperparameters were used during training:
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_ratio: 0.1
- - num_epochs: 130
+ - num_epochs: 150
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy |
 |:-------------:|:-------:|:----:|:---------------:|:--------:|
- | No log | 1.0 | 1 | 2.0300 | 0.0 |
- | No log | 2.0 | 3 | 2.0208 | 0.0 |
- | No log | 3.0 | 5 | 1.9970 | 0.0 |
- | No log | 4.0 | 6 | 1.9853 | 0.125 |
- | No log | 5.0 | 7 | 1.9666 | 0.125 |
- | No log | 6.0 | 9 | 1.9215 | 0.25 |
- | 1.024 | 7.0 | 11 | 1.8757 | 0.125 |
- | 1.024 | 8.0 | 12 | 1.8580 | 0.125 |
- | 1.024 | 9.0 | 13 | 1.8413 | 0.125 |
- | 1.024 | 10.0 | 15 | 1.7954 | 0.375 |
- | 1.024 | 11.0 | 17 | 1.7510 | 0.5 |
- | 1.024 | 12.0 | 18 | 1.7309 | 0.625 |
- | 1.024 | 13.0 | 19 | 1.7132 | 0.625 |
- | 0.8487 | 14.0 | 21 | 1.6768 | 0.625 |
- | 0.8487 | 15.0 | 23 | 1.6402 | 0.625 |
- | 0.8487 | 16.0 | 24 | 1.6197 | 0.625 |
- | 0.8487 | 17.0 | 25 | 1.5952 | 0.625 |
- | 0.8487 | 18.0 | 27 | 1.5259 | 0.625 |
- | 0.8487 | 19.0 | 29 | 1.4599 | 0.625 |
- | 0.6549 | 20.0 | 30 | 1.4526 | 0.625 |
- | 0.6549 | 21.0 | 31 | 1.4459 | 0.625 |
- | 0.6549 | 22.0 | 33 | 1.4222 | 0.625 |
- | 0.6549 | 23.0 | 35 | 1.4136 | 0.625 |
- | 0.6549 | 24.0 | 36 | 1.4238 | 0.625 |
- | 0.6549 | 25.0 | 37 | 1.4286 | 0.625 |
- | 0.6549 | 26.0 | 39 | 1.4231 | 0.625 |
- | 0.479 | 27.0 | 41 | 1.3964 | 0.625 |
- | 0.479 | 28.0 | 42 | 1.3757 | 0.625 |
- | 0.479 | 29.0 | 43 | 1.3501 | 0.625 |
- | 0.479 | 30.0 | 45 | 1.2779 | 0.625 |
- | 0.479 | 31.0 | 47 | 1.2360 | 0.625 |
- | 0.479 | 32.0 | 48 | 1.2185 | 0.625 |
- | 0.479 | 33.0 | 49 | 1.1920 | 0.625 |
- | 0.3504 | 34.0 | 51 | 1.1326 | 0.625 |
- | 0.3504 | 35.0 | 53 | 1.1018 | 0.625 |
- | 0.3504 | 36.0 | 54 | 1.0970 | 0.625 |
- | 0.3504 | 37.0 | 55 | 1.1030 | 0.625 |
- | 0.3504 | 38.0 | 57 | 1.1378 | 0.625 |
- | 0.3504 | 39.0 | 59 | 1.1720 | 0.625 |
- | 0.2864 | 40.0 | 60 | 1.1867 | 0.625 |
- | 0.2864 | 41.0 | 61 | 1.1960 | 0.625 |
- | 0.2864 | 42.0 | 63 | 1.1959 | 0.625 |
- | 0.2864 | 43.0 | 65 | 1.1727 | 0.625 |
- | 0.2864 | 44.0 | 66 | 1.1653 | 0.625 |
- | 0.2864 | 45.0 | 67 | 1.1644 | 0.625 |
- | 0.2864 | 46.0 | 69 | 1.1809 | 0.625 |
- | 0.2357 | 47.0 | 71 | 1.1902 | 0.625 |
- | 0.2357 | 48.0 | 72 | 1.1872 | 0.625 |
- | 0.2357 | 49.0 | 73 | 1.1894 | 0.625 |
- | 0.2357 | 50.0 | 75 | 1.1982 | 0.625 |
- | 0.2357 | 51.0 | 77 | 1.2418 | 0.625 |
- | 0.2357 | 52.0 | 78 | 1.2575 | 0.625 |
- | 0.2357 | 53.0 | 79 | 1.2708 | 0.625 |
- | 0.1561 | 54.0 | 81 | 1.2666 | 0.625 |
- | 0.1561 | 55.0 | 83 | 1.2241 | 0.625 |
- | 0.1561 | 56.0 | 84 | 1.2089 | 0.625 |
- | 0.1561 | 57.0 | 85 | 1.1914 | 0.625 |
- | 0.1561 | 58.0 | 87 | 1.1559 | 0.625 |
- | 0.1561 | 59.0 | 89 | 1.1387 | 0.625 |
- | 0.1453 | 60.0 | 90 | 1.1337 | 0.625 |
- | 0.1453 | 61.0 | 91 | 1.1290 | 0.625 |
- | 0.1453 | 62.0 | 93 | 1.1369 | 0.625 |
- | 0.1453 | 63.0 | 95 | 1.1439 | 0.625 |
- | 0.1453 | 64.0 | 96 | 1.1448 | 0.625 |
- | 0.1453 | 65.0 | 97 | 1.1530 | 0.625 |
- | 0.1453 | 66.0 | 99 | 1.1718 | 0.625 |
- | 0.1271 | 67.0 | 101 | 1.1965 | 0.625 |
- | 0.1271 | 68.0 | 102 | 1.2092 | 0.625 |
- | 0.1271 | 69.0 | 103 | 1.2176 | 0.625 |
- | 0.1271 | 70.0 | 105 | 1.2337 | 0.625 |
- | 0.1271 | 71.0 | 107 | 1.2376 | 0.625 |
- | 0.1271 | 72.0 | 108 | 1.2384 | 0.625 |
- | 0.1271 | 73.0 | 109 | 1.2378 | 0.625 |
- | 0.1153 | 74.0 | 111 | 1.2385 | 0.625 |
- | 0.1153 | 75.0 | 113 | 1.2316 | 0.625 |
- | 0.1153 | 76.0 | 114 | 1.2274 | 0.625 |
- | 0.1153 | 77.0 | 115 | 1.2252 | 0.625 |
- | 0.1153 | 78.0 | 117 | 1.2196 | 0.625 |
- | 0.1153 | 79.0 | 119 | 1.2145 | 0.625 |
- | 0.0882 | 80.0 | 120 | 1.2130 | 0.625 |
- | 0.0882 | 81.0 | 121 | 1.2117 | 0.625 |
- | 0.0882 | 82.0 | 123 | 1.2097 | 0.625 |
- | 0.0882 | 83.0 | 125 | 1.2075 | 0.625 |
- | 0.0882 | 84.0 | 126 | 1.2054 | 0.625 |
- | 0.0882 | 85.0 | 127 | 1.2039 | 0.625 |
- | 0.0882 | 86.0 | 129 | 1.2025 | 0.625 |
- | 0.0987 | 86.6667 | 130 | 1.2021 | 0.625 |
+ | 3.5919 | 0.992 | 31 | 3.5736 | 0.0315 |
+ | 3.5392 | 1.984 | 62 | 3.4990 | 0.0473 |
+ | 3.4591 | 2.976 | 93 | 3.3992 | 0.1104 |
+ | 3.3361 | 4.0 | 125 | 3.2526 | 0.2523 |
+ | 3.2066 | 4.992 | 156 | 3.0851 | 0.3761 |
+ | 3.0024 | 5.984 | 187 | 2.8376 | 0.4437 |
+ | 2.8094 | 6.976 | 218 | 2.6320 | 0.4910 |
+ | 2.509 | 8.0 | 250 | 2.3765 | 0.5360 |
+ | 2.2526 | 8.992 | 281 | 2.0400 | 0.5923 |
+ | 1.9442 | 9.984 | 312 | 1.7940 | 0.6396 |
+ | 1.7672 | 10.9760 | 343 | 1.5892 | 0.6824 |
+ | 1.5273 | 12.0 | 375 | 1.3500 | 0.7185 |
+ | 1.3854 | 12.992 | 406 | 1.2243 | 0.7162 |
+ | 1.197 | 13.984 | 437 | 1.1022 | 0.7387 |
+ | 1.1114 | 14.9760 | 468 | 1.0138 | 0.7613 |
+ | 0.9364 | 16.0 | 500 | 0.9164 | 0.7748 |
+ | 0.8755 | 16.992 | 531 | 0.9058 | 0.7523 |
+ | 0.7473 | 17.984 | 562 | 0.8045 | 0.7928 |
+ | 0.7189 | 18.976 | 593 | 0.7735 | 0.7883 |
+ | 0.6461 | 20.0 | 625 | 0.6876 | 0.8198 |
+ | 0.6041 | 20.992 | 656 | 0.7212 | 0.7973 |
+ | 0.5016 | 21.984 | 687 | 0.6611 | 0.8198 |
+ | 0.4996 | 22.976 | 718 | 0.6110 | 0.8153 |
+ | 0.4825 | 24.0 | 750 | 0.6476 | 0.8063 |
+ | 0.434 | 24.992 | 781 | 0.6793 | 0.8041 |
+ | 0.4296 | 25.984 | 812 | 0.6015 | 0.8018 |
+ | 0.36 | 26.976 | 843 | 0.6615 | 0.8063 |
+ | 0.3646 | 28.0 | 875 | 0.6059 | 0.8221 |
+ | 0.3542 | 28.992 | 906 | 0.6973 | 0.7928 |
+ | 0.3091 | 29.984 | 937 | 0.6400 | 0.8266 |
+ | 0.2774 | 30.976 | 968 | 0.5798 | 0.8266 |
+ | 0.3166 | 32.0 | 1000 | 0.6134 | 0.8333 |
+ | 0.2878 | 32.992 | 1031 | 0.6353 | 0.8063 |
+ | 0.2529 | 33.984 | 1062 | 0.6628 | 0.8243 |
+ | 0.2601 | 34.976 | 1093 | 0.6367 | 0.8041 |
+ | 0.2208 | 36.0 | 1125 | 0.6313 | 0.8288 |
+ | 0.2342 | 36.992 | 1156 | 0.5969 | 0.8378 |
+ | 0.2122 | 37.984 | 1187 | 0.6391 | 0.8198 |
+ | 0.1791 | 38.976 | 1218 | 0.6771 | 0.8108 |
+ | 0.2113 | 40.0 | 1250 | 0.7035 | 0.8086 |
+ | 0.1703 | 40.992 | 1281 | 0.7096 | 0.8153 |
+ | 0.1751 | 41.984 | 1312 | 0.5964 | 0.8446 |
+ | 0.1889 | 42.976 | 1343 | 0.6607 | 0.8446 |
+ | 0.1791 | 44.0 | 1375 | 0.7000 | 0.8243 |
+ | 0.1372 | 44.992 | 1406 | 0.6866 | 0.8243 |
+ | 0.1785 | 45.984 | 1437 | 0.6621 | 0.8266 |
+ | 0.1469 | 46.976 | 1468 | 0.6391 | 0.8266 |
+ | 0.1628 | 48.0 | 1500 | 0.6623 | 0.8356 |
+ | 0.1425 | 48.992 | 1531 | 0.6443 | 0.8288 |
+ | 0.1727 | 49.984 | 1562 | 0.6361 | 0.8446 |
+ | 0.1442 | 50.976 | 1593 | 0.6397 | 0.8491 |
+ | 0.1386 | 52.0 | 1625 | 0.6835 | 0.8423 |
+ | 0.1564 | 52.992 | 1656 | 0.7072 | 0.8266 |
+ | 0.1151 | 53.984 | 1687 | 0.6835 | 0.8311 |
+ | 0.1446 | 54.976 | 1718 | 0.7347 | 0.8198 |
+ | 0.1353 | 56.0 | 1750 | 0.6935 | 0.8401 |
+ | 0.13 | 56.992 | 1781 | 0.7337 | 0.8198 |
+ | 0.1312 | 57.984 | 1812 | 0.6625 | 0.8311 |
+ | 0.1201 | 58.976 | 1843 | 0.6956 | 0.8243 |
+ | 0.1411 | 60.0 | 1875 | 0.7290 | 0.8243 |
+ | 0.1116 | 60.992 | 1906 | 0.7052 | 0.8356 |
+ | 0.1251 | 61.984 | 1937 | 0.6915 | 0.8311 |
+ | 0.1101 | 62.976 | 1968 | 0.6457 | 0.8378 |
+ | 0.0883 | 64.0 | 2000 | 0.6553 | 0.8378 |
+ | 0.1225 | 64.992 | 2031 | 0.6454 | 0.8401 |
+ | 0.1135 | 65.984 | 2062 | 0.6616 | 0.8514 |
+ | 0.1009 | 66.976 | 2093 | 0.6375 | 0.8536 |
+ | 0.1027 | 68.0 | 2125 | 0.6754 | 0.8266 |
+ | 0.0925 | 68.992 | 2156 | 0.7497 | 0.8176 |
+ | 0.0878 | 69.984 | 2187 | 0.6573 | 0.8491 |
+ | 0.1093 | 70.976 | 2218 | 0.7015 | 0.8356 |
+ | 0.1024 | 72.0 | 2250 | 0.6907 | 0.8446 |
+ | 0.0934 | 72.992 | 2281 | 0.7059 | 0.8356 |
+ | 0.103 | 73.984 | 2312 | 0.7159 | 0.8356 |
+ | 0.0974 | 74.976 | 2343 | 0.7324 | 0.8266 |
+ | 0.1049 | 76.0 | 2375 | 0.7397 | 0.8311 |
+ | 0.097 | 76.992 | 2406 | 0.7529 | 0.8176 |
+ | 0.0816 | 77.984 | 2437 | 0.7175 | 0.8423 |
+ | 0.0902 | 78.976 | 2468 | 0.7745 | 0.8288 |
+ | 0.0827 | 80.0 | 2500 | 0.7017 | 0.8423 |
+ | 0.0818 | 80.992 | 2531 | 0.7712 | 0.8243 |
+ | 0.076 | 81.984 | 2562 | 0.7341 | 0.8423 |
+ | 0.0837 | 82.976 | 2593 | 0.7242 | 0.8491 |
+ | 0.0743 | 84.0 | 2625 | 0.6999 | 0.8446 |
+ | 0.0552 | 84.992 | 2656 | 0.6875 | 0.8401 |
+ | 0.0762 | 85.984 | 2687 | 0.6743 | 0.8581 |
+ | 0.0742 | 86.976 | 2718 | 0.7027 | 0.8446 |
+ | 0.0708 | 88.0 | 2750 | 0.7367 | 0.8356 |
+ | 0.086 | 88.992 | 2781 | 0.6905 | 0.8401 |
+ | 0.0575 | 89.984 | 2812 | 0.7041 | 0.8423 |
+ | 0.0733 | 90.976 | 2843 | 0.6465 | 0.8423 |
+ | 0.0701 | 92.0 | 2875 | 0.7066 | 0.8401 |
+ | 0.0782 | 92.992 | 2906 | 0.6955 | 0.8243 |
+ | 0.0754 | 93.984 | 2937 | 0.6836 | 0.8468 |
+ | 0.0545 | 94.976 | 2968 | 0.7290 | 0.8288 |
+ | 0.0913 | 96.0 | 3000 | 0.7665 | 0.8266 |
+ | 0.0816 | 96.992 | 3031 | 0.7661 | 0.8311 |
+ | 0.0696 | 97.984 | 3062 | 0.6921 | 0.8356 |
+ | 0.0627 | 98.976 | 3093 | 0.7070 | 0.8446 |
+ | 0.0562 | 100.0 | 3125 | 0.7442 | 0.8401 |
+ | 0.0742 | 100.992 | 3156 | 0.7000 | 0.8423 |
+ | 0.0545 | 101.984 | 3187 | 0.7312 | 0.8401 |
+ | 0.0635 | 102.976 | 3218 | 0.7231 | 0.8491 |
+ | 0.0608 | 104.0 | 3250 | 0.7332 | 0.8333 |
+ | 0.0769 | 104.992 | 3281 | 0.7328 | 0.8356 |
+ | 0.057 | 105.984 | 3312 | 0.6954 | 0.8378 |
+ | 0.0447 | 106.976 | 3343 | 0.7006 | 0.8423 |
+ | 0.0629 | 108.0 | 3375 | 0.7149 | 0.8423 |
+ | 0.0394 | 108.992 | 3406 | 0.7469 | 0.8378 |
+ | 0.0602 | 109.984 | 3437 | 0.7274 | 0.8468 |
+ | 0.0635 | 110.976 | 3468 | 0.7495 | 0.8446 |
+ | 0.0565 | 112.0 | 3500 | 0.7885 | 0.8401 |
+ | 0.035 | 112.992 | 3531 | 0.7178 | 0.8468 |
+ | 0.0604 | 113.984 | 3562 | 0.7574 | 0.8356 |
+ | 0.0507 | 114.976 | 3593 | 0.7901 | 0.8266 |
+ | 0.05 | 116.0 | 3625 | 0.7730 | 0.8198 |
+ | 0.0465 | 116.992 | 3656 | 0.7967 | 0.8401 |
+ | 0.042 | 117.984 | 3687 | 0.7767 | 0.8423 |
+ | 0.0609 | 118.976 | 3718 | 0.7872 | 0.8378 |
+ | 0.0379 | 120.0 | 3750 | 0.7685 | 0.8514 |
+ | 0.0579 | 120.992 | 3781 | 0.7709 | 0.8423 |
+ | 0.0471 | 121.984 | 3812 | 0.7601 | 0.8423 |
+ | 0.0488 | 122.976 | 3843 | 0.8231 | 0.8356 |
+ | 0.0531 | 124.0 | 3875 | 0.8016 | 0.8378 |
+ | 0.0446 | 124.992 | 3906 | 0.7806 | 0.8423 |
+ | 0.0479 | 125.984 | 3937 | 0.7668 | 0.8378 |
+ | 0.0525 | 126.976 | 3968 | 0.7874 | 0.8288 |
+ | 0.0512 | 128.0 | 4000 | 0.7652 | 0.8311 |
+ | 0.0473 | 128.992 | 4031 | 0.7721 | 0.8356 |
+ | 0.0579 | 129.984 | 4062 | 0.7607 | 0.8356 |
+ | 0.0444 | 130.976 | 4093 | 0.7917 | 0.8356 |
+ | 0.0462 | 132.0 | 4125 | 0.7877 | 0.8333 |
+ | 0.0483 | 132.992 | 4156 | 0.8122 | 0.8401 |
+ | 0.042 | 133.984 | 4187 | 0.7956 | 0.8378 |
+ | 0.0439 | 134.976 | 4218 | 0.8281 | 0.8311 |
+ | 0.0458 | 136.0 | 4250 | 0.7723 | 0.8446 |
+ | 0.0307 | 136.992 | 4281 | 0.7686 | 0.8446 |
+ | 0.0481 | 137.984 | 4312 | 0.7834 | 0.8378 |
+ | 0.0503 | 138.976 | 4343 | 0.7987 | 0.8378 |
+ | 0.038 | 140.0 | 4375 | 0.8156 | 0.8311 |
+ | 0.0472 | 140.992 | 4406 | 0.8030 | 0.8356 |
+ | 0.0282 | 141.984 | 4437 | 0.7884 | 0.8378 |
+ | 0.0541 | 142.976 | 4468 | 0.7969 | 0.8311 |
+ | 0.0415 | 144.0 | 4500 | 0.7899 | 0.8333 |
+ | 0.0579 | 144.992 | 4531 | 0.7979 | 0.8266 |
+ | 0.048 | 145.984 | 4562 | 0.7935 | 0.8288 |
+ | 0.0353 | 146.976 | 4593 | 0.7933 | 0.8288 |
+ | 0.0438 | 148.0 | 4625 | 0.7916 | 0.8288 |
+ | 0.0487 | 148.8 | 4650 | 0.7923 | 0.8288 |
 
 
 ### Framework versions
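The updated evaluation metrics can be cross-checked against each other. A minimal sketch (plain Python, with the values copied from the diff above) recovers the evaluation-set size from the reported runtime and throughput:

```python
# Figures reported in the updated model card / eval_results.json.
eval_runtime = 9.372                 # seconds
samples_per_second = 47.375
eval_accuracy = 0.8288288288288288

# Evaluation-set size implied by runtime x throughput.
n_eval = round(eval_runtime * samples_per_second)   # 444 images

# The repeating decimal is a ratio of small integers: correct / total.
n_correct = round(eval_accuracy * n_eval)           # 368 correct predictions
assert abs(n_correct / n_eval - eval_accuracy) < 1e-12
print(n_eval, n_correct)
```

So the reported 0.8288… accuracy corresponds to roughly 368 of 444 evaluation images classified correctly.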
all_results.json CHANGED
@@ -1,13 +1,13 @@
 {
-     "epoch": 86.66666666666667,
-     "eval_accuracy": 0.625,
-     "eval_loss": 1.2021381855010986,
-     "eval_runtime": 0.2304,
-     "eval_samples_per_second": 34.717,
-     "eval_steps_per_second": 4.34,
-     "total_flos": 1.574865655328932e+17,
-     "train_loss": 0.3546037518061124,
-     "train_runtime": 225.362,
-     "train_samples_per_second": 41.533,
-     "train_steps_per_second": 0.577
+     "epoch": 148.8,
+     "eval_accuracy": 0.8288288288288288,
+     "eval_loss": 0.792281985282898,
+     "eval_runtime": 9.372,
+     "eval_samples_per_second": 47.375,
+     "eval_steps_per_second": 1.494,
+     "total_flos": 1.4980450073050055e+19,
+     "train_loss": 0.38013411539216196,
+     "train_runtime": 17110.8333,
+     "train_samples_per_second": 35.03,
+     "train_steps_per_second": 0.272
 }
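The fractional final epoch of 148.8 is consistent with the step counts in the training table above: epoch 4.0 is logged at step 125, i.e. 31.25 optimizer steps per epoch, and the last logged step is 4650. A quick check:

```python
# Step counts taken from the updated training-results table.
steps_per_epoch = 125 / 4        # epoch 4.0 falls on step 125 -> 31.25
last_step = 4650

final_epoch = last_step / steps_per_epoch
print(final_epoch)               # 148.8, matching "epoch" in all_results.json
```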
config.json CHANGED
@@ -18,26 +18,84 @@
     768
   ],
   "id2label": {
-     "0": "Joggers",
-     "1": "capri",
-     "2": "jeans",
-     "3": "legging",
-     "4": "plazzo",
-     "5": "shorts",
-     "6": "skirt",
-     "7": "trouser"
+     "0": "Co_ords",
+     "1": "Joggers",
+     "2": "Kaftan",
+     "3": "anarkali",
+     "4": "blazer",
+     "5": "capri",
+     "6": "cardigan",
+     "7": "cloaks_abaya",
+     "8": "coat",
+     "9": "dress",
+     "10": "dungaree",
+     "11": "ethnic",
+     "12": "gown",
+     "13": "jacket",
+     "14": "jeans",
+     "15": "jumpsuit",
+     "16": "kurta",
+     "17": "kurti",
+     "18": "legging",
+     "19": "lehengas",
+     "20": "plazzo",
+     "21": "poncho",
+     "22": "robe",
+     "23": "salwar",
+     "24": "salwar_suit",
+     "25": "saree",
+     "26": "sharara",
+     "27": "shirt",
+     "28": "shorts",
+     "29": "shrug",
+     "30": "skirt",
+     "31": "summer_jacket",
+     "32": "sweatshirt_hoodie",
+     "33": "trouser",
+     "34": "tshirt",
+     "35": "tunic",
+     "36": "waistcoat"
   },
   "image_size": 224,
   "initializer_range": 0.02,
   "label2id": {
-     "Joggers": 0,
-     "capri": 1,
-     "jeans": 2,
-     "legging": 3,
-     "plazzo": 4,
-     "shorts": 5,
-     "skirt": 6,
-     "trouser": 7
+     "Co_ords": 0,
+     "Joggers": 1,
+     "Kaftan": 2,
+     "anarkali": 3,
+     "blazer": 4,
+     "capri": 5,
+     "cardigan": 6,
+     "cloaks_abaya": 7,
+     "coat": 8,
+     "dress": 9,
+     "dungaree": 10,
+     "ethnic": 11,
+     "gown": 12,
+     "jacket": 13,
+     "jeans": 14,
+     "jumpsuit": 15,
+     "kurta": 16,
+     "kurti": 17,
+     "legging": 18,
+     "lehengas": 19,
+     "plazzo": 20,
+     "poncho": 21,
+     "robe": 22,
+     "salwar": 23,
+     "salwar_suit": 24,
+     "saree": 25,
+     "sharara": 26,
+     "shirt": 27,
+     "shorts": 28,
+     "shrug": 29,
+     "skirt": 30,
+     "summer_jacket": 31,
+     "sweatshirt_hoodie": 32,
+     "trouser": 33,
+     "tshirt": 34,
+     "tunic": 35,
+     "waistcoat": 36
   },
   "layer_norm_eps": 1e-12,
   "model_type": "convnextv2",
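The retrained checkpoint distinguishes 37 garment classes instead of the original 8. A small sketch (class list copied from the diff above) shows how `id2label` and `label2id` mirror each other, which is how a predicted class index is mapped back to a garment name:

```python
# Class names copied from the updated config.json, in index order.
classes = [
    "Co_ords", "Joggers", "Kaftan", "anarkali", "blazer", "capri",
    "cardigan", "cloaks_abaya", "coat", "dress", "dungaree", "ethnic",
    "gown", "jacket", "jeans", "jumpsuit", "kurta", "kurti", "legging",
    "lehengas", "plazzo", "poncho", "robe", "salwar", "salwar_suit",
    "saree", "sharara", "shirt", "shorts", "shrug", "skirt",
    "summer_jacket", "sweatshirt_hoodie", "trouser", "tshirt", "tunic",
    "waistcoat",
]

# Rebuild the two config maps; they must be mutual inverses.
id2label = {str(i): name for i, name in enumerate(classes)}
label2id = {name: i for i, name in enumerate(classes)}

assert len(classes) == 37
assert all(label2id[id2label[str(i)]] == i for i in range(len(classes)))
print(id2label["34"])  # tshirt
```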
eval_results.json CHANGED
@@ -1,8 +1,8 @@
 {
-     "epoch": 86.66666666666667,
-     "eval_accuracy": 0.625,
-     "eval_loss": 1.2021381855010986,
-     "eval_runtime": 0.2304,
-     "eval_samples_per_second": 34.717,
-     "eval_steps_per_second": 4.34
+     "epoch": 148.8,
+     "eval_accuracy": 0.8288288288288288,
+     "eval_loss": 0.792281985282898,
+     "eval_runtime": 9.372,
+     "eval_samples_per_second": 47.375,
+     "eval_steps_per_second": 1.494
 }
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:76e72daa47f949e84cf8be0ea3c00e00e92ed8369d17437418d3c618795149c2
- size 111514288
+ oid sha256:23f0ad911524e5393d0353d2ba65cf95db78326383b067e74626dc125fd101bc
+ size 111603516
train_results.json CHANGED
@@ -1,8 +1,8 @@
 {
-     "epoch": 86.66666666666667,
-     "total_flos": 1.574865655328932e+17,
-     "train_loss": 0.3546037518061124,
-     "train_runtime": 225.362,
-     "train_samples_per_second": 41.533,
-     "train_steps_per_second": 0.577
+     "epoch": 148.8,
+     "total_flos": 1.4980450073050055e+19,
+     "train_loss": 0.38013411539216196,
+     "train_runtime": 17110.8333,
+     "train_samples_per_second": 35.03,
+     "train_steps_per_second": 0.272
 }
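The training throughput figures also hang together. Samples per second divided by steps per second gives samples consumed per optimizer step, which for the numbers above suggests an effective batch size of roughly 128 (the reported values are rounded, so this is only an approximate inference, not a logged hyperparameter):

```python
# Throughput figures from the updated train_results.json.
train_runtime = 17110.8333      # seconds
samples_per_second = 35.03
steps_per_second = 0.272

# Samples consumed per optimizer step ~= effective batch size (approximate).
effective_batch = samples_per_second / steps_per_second
print(effective_batch)          # ~128.8

# Sanity check: runtime x steps/s should land near the 4650 logged steps.
approx_steps = steps_per_second * train_runtime
```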
trainer_state.json CHANGED
The diff for this file is too large to render.
 
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:a96c4a20e581fe9e08fca4557c67ff0e68b5b590c00d24f0cf2780c856e33981
+ oid sha256:5e02f090ffcd093a81ba391b1f4e4874e6e59077dfe4afb50ffcd8c2e4630b2a
 size 5240