interestAI committed on
Commit
5348972
1 Parent(s): 349f6bf

End of training

Files changed (2)
  1. README.md +86 -77
  2. model.safetensors +1 -1
README.md CHANGED
@@ -22,7 +22,7 @@ model-index:
   metrics:
   - name: Accuracy
     type: accuracy
-    value: 0.9090909090909091
+    value: 0.9393939393939394
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -32,8 +32,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.1787
-- Accuracy: 0.9091
+- Loss: 0.1902
+- Accuracy: 0.9394
 
 ## Model description
 
@@ -61,85 +61,94 @@ The following hyperparameters were used during training:
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_ratio: 0.1
-- num_epochs: 100
+- num_epochs: 112
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy |
 |:-------------:|:-------:|:----:|:---------------:|:--------:|
-| No log | 0.7273 | 2 | 0.2246 | 0.9212 |
-| No log | 1.8182 | 5 | 0.1985 | 0.9333 |
-| No log | 2.9091 | 8 | 0.3016 | 0.8788 |
-| 0.1764 | 4.0 | 11 | 0.1781 | 0.9455 |
-| 0.1764 | 4.7273 | 13 | 0.2224 | 0.9091 |
-| 0.1764 | 5.8182 | 16 | 0.1740 | 0.9394 |
-| 0.1764 | 6.9091 | 19 | 0.2611 | 0.8848 |
-| 0.1981 | 8.0 | 22 | 0.2415 | 0.9212 |
-| 0.1981 | 8.7273 | 24 | 0.2253 | 0.9091 |
-| 0.1981 | 9.8182 | 27 | 0.2396 | 0.9030 |
-| 0.1993 | 10.9091 | 30 | 0.2822 | 0.9030 |
-| 0.1993 | 12.0 | 33 | 0.2739 | 0.8788 |
-| 0.1993 | 12.7273 | 35 | 0.1977 | 0.9333 |
-| 0.1993 | 13.8182 | 38 | 0.1926 | 0.9091 |
-| 0.178 | 14.9091 | 41 | 0.2566 | 0.9091 |
-| 0.178 | 16.0 | 44 | 0.2286 | 0.9030 |
-| 0.178 | 16.7273 | 46 | 0.2647 | 0.9152 |
-| 0.178 | 17.8182 | 49 | 0.2424 | 0.8970 |
-| 0.1947 | 18.9091 | 52 | 0.2525 | 0.8909 |
-| 0.1947 | 20.0 | 55 | 0.2102 | 0.9212 |
-| 0.1947 | 20.7273 | 57 | 0.2037 | 0.9273 |
-| 0.1997 | 21.8182 | 60 | 0.1405 | 0.9576 |
-| 0.1997 | 22.9091 | 63 | 0.2292 | 0.8970 |
-| 0.1997 | 24.0 | 66 | 0.1884 | 0.9394 |
-| 0.1997 | 24.7273 | 68 | 0.2245 | 0.8970 |
-| 0.1853 | 25.8182 | 71 | 0.1941 | 0.9455 |
-| 0.1853 | 26.9091 | 74 | 0.1821 | 0.9455 |
-| 0.1853 | 28.0 | 77 | 0.1761 | 0.9455 |
-| 0.1853 | 28.7273 | 79 | 0.1923 | 0.9212 |
-| 0.1779 | 29.8182 | 82 | 0.2422 | 0.8909 |
-| 0.1779 | 30.9091 | 85 | 0.2035 | 0.9394 |
-| 0.1779 | 32.0 | 88 | 0.2369 | 0.9212 |
-| 0.1944 | 32.7273 | 90 | 0.1982 | 0.9091 |
-| 0.1944 | 33.8182 | 93 | 0.1995 | 0.9152 |
-| 0.1944 | 34.9091 | 96 | 0.1886 | 0.9333 |
-| 0.1944 | 36.0 | 99 | 0.1938 | 0.9333 |
-| 0.1183 | 36.7273 | 101 | 0.1553 | 0.9394 |
-| 0.1183 | 37.8182 | 104 | 0.2268 | 0.8970 |
-| 0.1183 | 38.9091 | 107 | 0.1980 | 0.9152 |
-| 0.1647 | 40.0 | 110 | 0.1553 | 0.9333 |
-| 0.1647 | 40.7273 | 112 | 0.1020 | 0.9697 |
-| 0.1647 | 41.8182 | 115 | 0.2174 | 0.9152 |
-| 0.1647 | 42.9091 | 118 | 0.1499 | 0.9333 |
-| 0.1321 | 44.0 | 121 | 0.1632 | 0.9576 |
-| 0.1321 | 44.7273 | 123 | 0.1793 | 0.9333 |
-| 0.1321 | 45.8182 | 126 | 0.2242 | 0.9091 |
-| 0.1321 | 46.9091 | 129 | 0.2656 | 0.9030 |
-| 0.1616 | 48.0 | 132 | 0.1614 | 0.9273 |
-| 0.1616 | 48.7273 | 134 | 0.1677 | 0.9333 |
-| 0.1616 | 49.8182 | 137 | 0.1946 | 0.9333 |
-| 0.1682 | 50.9091 | 140 | 0.1380 | 0.9576 |
-| 0.1682 | 52.0 | 143 | 0.2623 | 0.8848 |
-| 0.1682 | 52.7273 | 145 | 0.1565 | 0.9455 |
-| 0.1682 | 53.8182 | 148 | 0.1886 | 0.9152 |
-| 0.1719 | 54.9091 | 151 | 0.1943 | 0.9152 |
-| 0.1719 | 56.0 | 154 | 0.2413 | 0.8909 |
-| 0.1719 | 56.7273 | 156 | 0.1563 | 0.9394 |
-| 0.1719 | 57.8182 | 159 | 0.1194 | 0.9636 |
-| 0.1775 | 58.9091 | 162 | 0.1687 | 0.9333 |
-| 0.1775 | 60.0 | 165 | 0.1041 | 0.9818 |
-| 0.1775 | 60.7273 | 167 | 0.2387 | 0.8848 |
-| 0.1636 | 61.8182 | 170 | 0.1202 | 0.9576 |
-| 0.1636 | 62.9091 | 173 | 0.1462 | 0.9333 |
-| 0.1636 | 64.0 | 176 | 0.1630 | 0.9394 |
-| 0.1636 | 64.7273 | 178 | 0.2130 | 0.9152 |
-| 0.1841 | 65.8182 | 181 | 0.1948 | 0.8970 |
-| 0.1841 | 66.9091 | 184 | 0.1859 | 0.9333 |
-| 0.1841 | 68.0 | 187 | 0.2148 | 0.9152 |
-| 0.1841 | 68.7273 | 189 | 0.1626 | 0.9515 |
-| 0.1586 | 69.8182 | 192 | 0.1820 | 0.9212 |
-| 0.1586 | 70.9091 | 195 | 0.1124 | 0.9455 |
-| 0.1586 | 72.0 | 198 | 0.1298 | 0.9394 |
-| 0.1383 | 72.7273 | 200 | 0.1787 | 0.9091 |
+| No log | 0.7273 | 2 | 1.1089 | 0.2909 |
+| No log | 1.8182 | 5 | 1.0961 | 0.3758 |
+| No log | 2.9091 | 8 | 1.0822 | 0.3758 |
+| 1.0988 | 4.0 | 11 | 1.0480 | 0.5939 |
+| 1.0988 | 4.7273 | 13 | 1.0333 | 0.6061 |
+| 1.0988 | 5.8182 | 16 | 0.9881 | 0.6667 |
+| 1.0988 | 6.9091 | 19 | 0.9274 | 0.6970 |
+| 1.0061 | 8.0 | 22 | 0.8714 | 0.8364 |
+| 1.0061 | 8.7273 | 24 | 0.8210 | 0.7879 |
+| 1.0061 | 9.8182 | 27 | 0.7619 | 0.8121 |
+| 0.8136 | 10.9091 | 30 | 0.6483 | 0.8667 |
+| 0.8136 | 12.0 | 33 | 0.6937 | 0.7939 |
+| 0.8136 | 12.7273 | 35 | 0.5885 | 0.8545 |
+| 0.8136 | 13.8182 | 38 | 0.6046 | 0.8182 |
+| 0.642 | 14.9091 | 41 | 0.5518 | 0.8364 |
+| 0.642 | 16.0 | 44 | 0.5370 | 0.8242 |
+| 0.642 | 16.7273 | 46 | 0.4765 | 0.8788 |
+| 0.642 | 17.8182 | 49 | 0.4416 | 0.8606 |
+| 0.5145 | 18.9091 | 52 | 0.4140 | 0.8970 |
+| 0.5145 | 20.0 | 55 | 0.4007 | 0.8970 |
+| 0.5145 | 20.7273 | 57 | 0.3803 | 0.8970 |
+| 0.4226 | 21.8182 | 60 | 0.3167 | 0.9394 |
+| 0.4226 | 22.9091 | 63 | 0.3398 | 0.9030 |
+| 0.4226 | 24.0 | 66 | 0.3147 | 0.9273 |
+| 0.4226 | 24.7273 | 68 | 0.3273 | 0.8970 |
+| 0.3282 | 25.8182 | 71 | 0.3125 | 0.9030 |
+| 0.3282 | 26.9091 | 74 | 0.2712 | 0.9212 |
+| 0.3282 | 28.0 | 77 | 0.2871 | 0.9273 |
+| 0.3282 | 28.7273 | 79 | 0.2534 | 0.9273 |
+| 0.3076 | 29.8182 | 82 | 0.2620 | 0.9273 |
+| 0.3076 | 30.9091 | 85 | 0.3845 | 0.8848 |
+| 0.3076 | 32.0 | 88 | 0.2495 | 0.9273 |
+| 0.3081 | 32.7273 | 90 | 0.3018 | 0.9091 |
+| 0.3081 | 33.8182 | 93 | 0.2204 | 0.9455 |
+| 0.3081 | 34.9091 | 96 | 0.2769 | 0.9152 |
+| 0.3081 | 36.0 | 99 | 0.2261 | 0.9394 |
+| 0.2451 | 36.7273 | 101 | 0.2092 | 0.9515 |
+| 0.2451 | 37.8182 | 104 | 0.3196 | 0.8727 |
+| 0.2451 | 38.9091 | 107 | 0.2629 | 0.9091 |
+| 0.2741 | 40.0 | 110 | 0.2360 | 0.9333 |
+| 0.2741 | 40.7273 | 112 | 0.1927 | 0.9515 |
+| 0.2741 | 41.8182 | 115 | 0.2834 | 0.9030 |
+| 0.2741 | 42.9091 | 118 | 0.2173 | 0.9394 |
+| 0.244 | 44.0 | 121 | 0.1997 | 0.9394 |
+| 0.244 | 44.7273 | 123 | 0.2163 | 0.9273 |
+| 0.244 | 45.8182 | 126 | 0.2865 | 0.8970 |
+| 0.244 | 46.9091 | 129 | 0.2483 | 0.9152 |
+| 0.224 | 48.0 | 132 | 0.1707 | 0.9576 |
+| 0.224 | 48.7273 | 134 | 0.1988 | 0.9455 |
+| 0.224 | 49.8182 | 137 | 0.2168 | 0.9455 |
+| 0.213 | 50.9091 | 140 | 0.1807 | 0.9576 |
+| 0.213 | 52.0 | 143 | 0.2478 | 0.9152 |
+| 0.213 | 52.7273 | 145 | 0.1975 | 0.9455 |
+| 0.213 | 53.8182 | 148 | 0.2218 | 0.9212 |
+| 0.2298 | 54.9091 | 151 | 0.2046 | 0.9455 |
+| 0.2298 | 56.0 | 154 | 0.2557 | 0.9152 |
+| 0.2298 | 56.7273 | 156 | 0.1962 | 0.9394 |
+| 0.2298 | 57.8182 | 159 | 0.1879 | 0.9394 |
+| 0.2189 | 58.9091 | 162 | 0.1983 | 0.9576 |
+| 0.2189 | 60.0 | 165 | 0.1285 | 0.9697 |
+| 0.2189 | 60.7273 | 167 | 0.2227 | 0.9212 |
+| 0.211 | 61.8182 | 170 | 0.1671 | 0.9515 |
+| 0.211 | 62.9091 | 173 | 0.1489 | 0.9636 |
+| 0.211 | 64.0 | 176 | 0.1842 | 0.9394 |
+| 0.211 | 64.7273 | 178 | 0.1687 | 0.9636 |
+| 0.1834 | 65.8182 | 181 | 0.2118 | 0.9091 |
+| 0.1834 | 66.9091 | 184 | 0.2191 | 0.9273 |
+| 0.1834 | 68.0 | 187 | 0.2014 | 0.9273 |
+| 0.1834 | 68.7273 | 189 | 0.1861 | 0.9515 |
+| 0.1846 | 69.8182 | 192 | 0.1309 | 0.9758 |
+| 0.1846 | 70.9091 | 195 | 0.1236 | 0.9636 |
+| 0.1846 | 72.0 | 198 | 0.1541 | 0.9455 |
+| 0.1581 | 72.7273 | 200 | 0.1577 | 0.9576 |
+| 0.1581 | 73.8182 | 203 | 0.1927 | 0.9273 |
+| 0.1581 | 74.9091 | 206 | 0.2247 | 0.9273 |
+| 0.1581 | 76.0 | 209 | 0.1811 | 0.9576 |
+| 0.1742 | 76.7273 | 211 | 0.2190 | 0.9273 |
+| 0.1742 | 77.8182 | 214 | 0.1487 | 0.9697 |
+| 0.1742 | 78.9091 | 217 | 0.1836 | 0.9576 |
+| 0.1837 | 80.0 | 220 | 0.1228 | 0.9758 |
+| 0.1837 | 80.7273 | 222 | 0.1400 | 0.9636 |
+| 0.1837 | 81.4545 | 224 | 0.1902 | 0.9394 |
 
 
 ### Framework versions
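The diff above lists `lr_scheduler_type: linear` with `lr_scheduler_warmup_ratio: 0.1` and bumps `num_epochs` from 100 to 112. As a rough illustration of what that scheduler does, here is a minimal standard-library sketch of linear warmup followed by linear decay (the same shape `transformers` uses for its linear schedule). The base learning rate is a placeholder, since the card excerpt does not show the `learning_rate` line; 224 total steps matches the last step logged in the new table.

```python
# Sketch of a linear warmup + linear decay LR schedule, matching
# lr_scheduler_type: linear and lr_scheduler_warmup_ratio: 0.1 above.
# base_lr is a placeholder value, not taken from the model card.

def linear_schedule_lr(step: int, total_steps: int, base_lr: float,
                       warmup_ratio: float = 0.1) -> float:
    """Learning rate at a given optimizer step."""
    warmup_steps = int(warmup_ratio * total_steps)
    if step < warmup_steps:
        # Linear warmup from 0 up to base_lr.
        return base_lr * step / max(1, warmup_steps)
    # Linear decay from base_lr down to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# Example: the new run logs its last step at 224.
peak = linear_schedule_lr(22, 224, 1.0)   # end of warmup (0.1 * 224 steps)
final = linear_schedule_lr(224, 224, 1.0) # fully decayed
```

This is only a sketch of the schedule's shape under those assumptions, not the Trainer's exact implementation.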
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:022af29e7c22239103800f6b4c1407514020b44cb8519c9f940f4e5d1adb213a
+oid sha256:2ea9ea1224e521a9b1ae6ee3e80940bb8f1ce4295db23e7a9192eafff37ee331
 size 343227052
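The `model.safetensors` entry is a Git LFS pointer, so the commit only changes the `oid sha256:` line; the weights themselves live in LFS storage. A quick standard-library sketch for checking a locally downloaded weights file against that pointer (the local filename is an assumption):

```python
import hashlib

def lfs_sha256(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file and return its sha256 hex digest, the value stored
    after `oid sha256:` in a Git LFS pointer file."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# The new pointer's oid from this commit:
EXPECTED = "2ea9ea1224e521a9b1ae6ee3e80940bb8f1ce4295db23e7a9192eafff37ee331"
# assert lfs_sha256("model.safetensors") == EXPECTED  # needs the 343 MB file locally
```

Streaming in chunks keeps memory flat even for multi-gigabyte checkpoints.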