ansilmbabl committed
Commit 46387da
1 Parent(s): f0a6f3b

Model save
README.md CHANGED
@@ -22,7 +22,7 @@ model-index:
    metrics:
    - name: Accuracy
      type: accuracy
-     value: 0.5027777777777778
+     value: 0.46055555555555555
  ---

  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -32,8 +32,8 @@ should probably proofread and complete it, then remove this comment. -->

  This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
  It achieves the following results on the evaluation set:
- - Loss: 1.1881
- - Accuracy: 0.5028
+ - Loss: 1.7837
+ - Accuracy: 0.4606

  ## Model description

@@ -61,42 +61,112 @@ The following hyperparameters were used during training:
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - lr_scheduler_warmup_ratio: 0.1
- - num_epochs: 30
+ - num_epochs: 100

  ### Training results

  | Training Loss | Epoch | Step | Validation Loss | Accuracy |
  |:-------------:|:-----:|:----:|:---------------:|:--------:|
- | 1.5695 | 1.0 | 56 | 1.3563 | 0.4128 |
- | 1.6007 | 1.99 | 112 | 1.3816 | 0.4006 |
- | 1.6462 | 2.99 | 168 | 1.3451 | 0.4106 |
- | 1.641 | 4.0 | 225 | 1.3663 | 0.4128 |
- | 1.6453 | 5.0 | 281 | 1.3354 | 0.4139 |
- | 1.6279 | 5.99 | 337 | 1.2851 | 0.44 |
- | 1.606 | 6.99 | 393 | 1.4716 | 0.3772 |
- | 1.6047 | 8.0 | 450 | 1.2676 | 0.4472 |
- | 1.6002 | 9.0 | 506 | 1.2915 | 0.4528 |
- | 1.576 | 9.99 | 562 | 1.2531 | 0.4622 |
- | 1.5889 | 10.99 | 618 | 1.2340 | 0.4683 |
- | 1.529 | 12.0 | 675 | 1.2794 | 0.45 |
- | 1.5413 | 13.0 | 731 | 1.2660 | 0.4644 |
- | 1.5411 | 13.99 | 787 | 1.2251 | 0.4833 |
- | 1.5384 | 14.99 | 843 | 1.2273 | 0.4667 |
- | 1.541 | 16.0 | 900 | 1.2076 | 0.49 |
- | 1.537 | 17.0 | 956 | 1.2191 | 0.4756 |
- | 1.5212 | 17.99 | 1012 | 1.2236 | 0.4661 |
- | 1.4958 | 18.99 | 1068 | 1.2056 | 0.4894 |
- | 1.5101 | 20.0 | 1125 | 1.2054 | 0.4861 |
- | 1.4913 | 21.0 | 1181 | 1.2043 | 0.4956 |
- | 1.4598 | 21.99 | 1237 | 1.1938 | 0.4856 |
- | 1.4848 | 22.99 | 1293 | 1.2126 | 0.4889 |
- | 1.4511 | 24.0 | 1350 | 1.2124 | 0.4894 |
- | 1.4512 | 25.0 | 1406 | 1.1899 | 0.4994 |
- | 1.4324 | 25.99 | 1462 | 1.2020 | 0.495 |
- | 1.4453 | 26.99 | 1518 | 1.1930 | 0.4989 |
- | 1.4135 | 28.0 | 1575 | 1.1905 | 0.5011 |
- | 1.4317 | 29.0 | 1631 | 1.1872 | 0.5022 |
- | 1.3923 | 29.87 | 1680 | 1.1881 | 0.5028 |
+ | 1.4297 | 1.0 | 56 | 1.1976 | 0.4933 |
+ | 1.4078 | 1.99 | 112 | 1.1964 | 0.5011 |
+ | 1.417 | 2.99 | 168 | 1.2025 | 0.4961 |
+ | 1.4163 | 4.0 | 225 | 1.2295 | 0.4883 |
+ | 1.4318 | 5.0 | 281 | 1.2330 | 0.495 |
+ | 1.4383 | 5.99 | 337 | 1.2162 | 0.5022 |
+ | 1.4212 | 6.99 | 393 | 1.2634 | 0.4717 |
+ | 1.4346 | 8.0 | 450 | 1.3083 | 0.4689 |
+ | 1.419 | 9.0 | 506 | 1.2719 | 0.4806 |
+ | 1.4252 | 9.99 | 562 | 1.3048 | 0.4911 |
+ | 1.4522 | 10.99 | 618 | 1.2708 | 0.4794 |
+ | 1.3748 | 12.0 | 675 | 1.3720 | 0.4383 |
+ | 1.3966 | 13.0 | 731 | 1.3095 | 0.4594 |
+ | 1.4507 | 13.99 | 787 | 1.2430 | 0.485 |
+ | 1.4033 | 14.99 | 843 | 1.2728 | 0.4794 |
+ | 1.3972 | 16.0 | 900 | 1.2611 | 0.4883 |
+ | 1.4136 | 17.0 | 956 | 1.3166 | 0.45 |
+ | 1.3992 | 17.99 | 1012 | 1.3103 | 0.4856 |
+ | 1.3614 | 18.99 | 1068 | 1.3302 | 0.4422 |
+ | 1.3747 | 20.0 | 1125 | 1.2919 | 0.4856 |
+ | 1.3868 | 21.0 | 1181 | 1.3166 | 0.4728 |
+ | 1.3399 | 21.99 | 1237 | 1.3200 | 0.4672 |
+ | 1.3943 | 22.99 | 1293 | 1.2920 | 0.4811 |
+ | 1.3635 | 24.0 | 1350 | 1.3109 | 0.4833 |
+ | 1.3724 | 25.0 | 1406 | 1.3100 | 0.4644 |
+ | 1.3141 | 25.99 | 1462 | 1.3263 | 0.4978 |
+ | 1.3576 | 26.99 | 1518 | 1.3307 | 0.4772 |
+ | 1.3022 | 28.0 | 1575 | 1.3409 | 0.4978 |
+ | 1.2982 | 29.0 | 1631 | 1.3962 | 0.4583 |
+ | 1.2657 | 29.99 | 1687 | 1.3329 | 0.4817 |
+ | 1.3152 | 30.99 | 1743 | 1.2973 | 0.49 |
+ | 1.2924 | 32.0 | 1800 | 1.3159 | 0.4833 |
+ | 1.214 | 33.0 | 1856 | 1.3955 | 0.4833 |
+ | 1.2717 | 33.99 | 1912 | 1.4583 | 0.46 |
+ | 1.2692 | 34.99 | 1968 | 1.3504 | 0.4939 |
+ | 1.2127 | 36.0 | 2025 | 1.3784 | 0.4833 |
+ | 1.1956 | 37.0 | 2081 | 1.4184 | 0.4817 |
+ | 1.2408 | 37.99 | 2137 | 1.3849 | 0.4944 |
+ | 1.1699 | 38.99 | 2193 | 1.4298 | 0.4844 |
+ | 1.1727 | 40.0 | 2250 | 1.4331 | 0.4772 |
+ | 1.1485 | 41.0 | 2306 | 1.4597 | 0.4672 |
+ | 1.1668 | 41.99 | 2362 | 1.4429 | 0.4783 |
+ | 1.1881 | 42.99 | 2418 | 1.4555 | 0.4839 |
+ | 1.1204 | 44.0 | 2475 | 1.4648 | 0.4783 |
+ | 1.1523 | 45.0 | 2531 | 1.4744 | 0.4733 |
+ | 1.1206 | 45.99 | 2587 | 1.4792 | 0.4906 |
+ | 1.1135 | 46.99 | 2643 | 1.5009 | 0.4678 |
+ | 1.1227 | 48.0 | 2700 | 1.5480 | 0.4733 |
+ | 1.1017 | 49.0 | 2756 | 1.5907 | 0.4644 |
+ | 1.1601 | 49.99 | 2812 | 1.5136 | 0.47 |
+ | 1.1239 | 50.99 | 2868 | 1.5384 | 0.4789 |
+ | 1.09 | 52.0 | 2925 | 1.5716 | 0.4711 |
+ | 1.1023 | 53.0 | 2981 | 1.5736 | 0.4728 |
+ | 1.1038 | 53.99 | 3037 | 1.5919 | 0.4556 |
+ | 1.058 | 54.99 | 3093 | 1.5534 | 0.4772 |
+ | 1.0405 | 56.0 | 3150 | 1.5788 | 0.4717 |
+ | 1.0172 | 57.0 | 3206 | 1.5855 | 0.4767 |
+ | 1.0036 | 57.99 | 3262 | 1.6425 | 0.455 |
+ | 1.0124 | 58.99 | 3318 | 1.6039 | 0.4678 |
+ | 1.0647 | 60.0 | 3375 | 1.5891 | 0.4572 |
+ | 1.0143 | 61.0 | 3431 | 1.6265 | 0.4483 |
+ | 1.0051 | 61.99 | 3487 | 1.6208 | 0.4633 |
+ | 0.9571 | 62.99 | 3543 | 1.6874 | 0.4483 |
+ | 0.9838 | 64.0 | 3600 | 1.6778 | 0.4517 |
+ | 0.9995 | 65.0 | 3656 | 1.6248 | 0.4722 |
+ | 1.0374 | 65.99 | 3712 | 1.6645 | 0.4667 |
+ | 0.9483 | 66.99 | 3768 | 1.6307 | 0.4611 |
+ | 0.9825 | 68.0 | 3825 | 1.6662 | 0.4661 |
+ | 1.0023 | 69.0 | 3881 | 1.6650 | 0.46 |
+ | 0.9642 | 69.99 | 3937 | 1.6953 | 0.4494 |
+ | 0.9687 | 70.99 | 3993 | 1.7076 | 0.4661 |
+ | 0.9542 | 72.0 | 4050 | 1.7012 | 0.4656 |
+ | 0.9378 | 73.0 | 4106 | 1.7056 | 0.4533 |
+ | 0.9542 | 73.99 | 4162 | 1.7331 | 0.4572 |
+ | 0.9035 | 74.99 | 4218 | 1.7459 | 0.4417 |
+ | 0.9631 | 76.0 | 4275 | 1.7236 | 0.465 |
+ | 0.8759 | 77.0 | 4331 | 1.7294 | 0.455 |
+ | 0.9218 | 77.99 | 4387 | 1.7654 | 0.4578 |
+ | 0.9077 | 78.99 | 4443 | 1.7234 | 0.4594 |
+ | 0.8924 | 80.0 | 4500 | 1.7256 | 0.4683 |
+ | 0.9156 | 81.0 | 4556 | 1.7320 | 0.4678 |
+ | 0.806 | 81.99 | 4612 | 1.7348 | 0.4661 |
+ | 0.8863 | 82.99 | 4668 | 1.7514 | 0.4606 |
+ | 0.8698 | 84.0 | 4725 | 1.7484 | 0.4661 |
+ | 0.8623 | 85.0 | 4781 | 1.7420 | 0.4778 |
+ | 0.8643 | 85.99 | 4837 | 1.7636 | 0.4617 |
+ | 0.8914 | 86.99 | 4893 | 1.7552 | 0.465 |
+ | 0.837 | 88.0 | 4950 | 1.7552 | 0.4644 |
+ | 0.8217 | 89.0 | 5006 | 1.7532 | 0.4639 |
+ | 0.8601 | 89.99 | 5062 | 1.7447 | 0.4683 |
+ | 0.8293 | 90.99 | 5118 | 1.7622 | 0.4611 |
+ | 0.8301 | 92.0 | 5175 | 1.7616 | 0.4633 |
+ | 0.7752 | 93.0 | 5231 | 1.7585 | 0.4722 |
+ | 0.8533 | 93.99 | 5287 | 1.7842 | 0.4617 |
+ | 0.8156 | 94.99 | 5343 | 1.7837 | 0.4622 |
+ | 0.8094 | 96.0 | 5400 | 1.7896 | 0.4583 |
+ | 0.839 | 97.0 | 5456 | 1.7835 | 0.465 |
+ | 0.839 | 97.99 | 5512 | 1.7883 | 0.46 |
+ | 0.7763 | 98.99 | 5568 | 1.7838 | 0.4594 |
+ | 0.8186 | 99.56 | 5600 | 1.7837 | 0.4606 |


  ### Framework versions
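
The hyperparameter list in the README hunk above maps directly onto `transformers.TrainingArguments`. The sketch below is a hedged reconstruction, not the author's actual training script: only the Adam betas/epsilon, the linear scheduler, the 0.1 warmup ratio, and the new 100-epoch count come from the card; the learning rate, batch size, output directory, label count, and evaluation strategy are not part of this diff and are marked as placeholders or assumptions.

```python
# Hedged sketch: wiring the hyperparameters shown in the README diff into the
# Hugging Face Trainer. Values marked "from the card" appear in the diff;
# everything marked "placeholder"/"assumption" is illustrative only.
from transformers import (
    AutoModelForImageClassification,
    TrainingArguments,
    Trainer,
)

model = AutoModelForImageClassification.from_pretrained(
    "microsoft/swin-tiny-patch4-window7-224",  # base checkpoint named in the card
    num_labels=6,                              # placeholder: label count not shown in the diff
    ignore_mismatched_sizes=True,
)

args = TrainingArguments(
    output_dir="swin-tiny-imagefolder",        # placeholder
    num_train_epochs=100,                      # from the card (was 30 before this commit)
    lr_scheduler_type="linear",                # from the card
    warmup_ratio=0.1,                          # from the card
    adam_beta1=0.9,                            # from the card
    adam_beta2=0.999,                          # from the card
    adam_epsilon=1e-8,                         # from the card
    evaluation_strategy="epoch",               # assumption: eval once per epoch, matching the results table
    save_strategy="epoch",                     # assumption
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=...,         # imagefolder train split goes here
#                   eval_dataset=...)          # imagefolder validation split goes here
# trainer.train()
```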
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:c7b92027871d8e0dc49761dcd47326bf4677e31db1fb9f84a1e6a2b5437ad2d4
+ oid sha256:6702ff24bdbb6b75f10514fa0e012ada0133d475e0837f40172dcf9e91a038d7
  size 110364364
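
Both binary files in this commit are stored as Git LFS pointers of the form shown above (`version`, `oid sha256:…`, `size`). As a small illustration of that pointer format, the sketch below parses a pointer file and checks a downloaded blob against the recorded hash and size; the file paths are placeholders, not paths from this repository.

```python
# Hedged sketch: parsing a Git LFS pointer (version / oid / size) and verifying
# a downloaded blob against it. Paths are placeholders.
import hashlib

def read_lfs_pointer(path):
    """Parse a Git LFS pointer file into a dict of its key/value fields."""
    fields = {}
    with open(path) as f:
        for line in f:
            key, _, value = line.strip().partition(" ")
            fields[key] = value
    return fields

def verify_blob(pointer_path, blob_path):
    """Return True if the blob's sha256 digest and byte size match the pointer."""
    pointer = read_lfs_pointer(pointer_path)
    expected_oid = pointer["oid"].removeprefix("sha256:")
    with open(blob_path, "rb") as f:
        data = f.read()
    return (hashlib.sha256(data).hexdigest() == expected_oid
            and len(data) == int(pointer["size"]))

# Example usage (placeholder paths):
# verify_blob("model.safetensors.pointer", "model.safetensors")
```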
runs/Feb13_16-55-04_e2e-86-172/events.out.tfevents.1707823520.e2e-86-172.524489.7 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:0201e4d37b5eac52f76412a7bc69c465cc0a268d968a557ff4ff7f7502749d2b
- size 124418
+ oid sha256:7b6de765d41214ac921cc5ee0260d6b9804acb7f817099ed17b2320caaa811ef
+ size 125723
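
The `events.out.tfevents.*` file updated above is the TensorBoard log for this run. If you want to inspect the curves behind the training-results table, one hedged option is the `EventAccumulator` API from the `tensorboard` package; the scalar tag names used below (`eval/accuracy`) follow the usual Hugging Face Trainer convention and should be confirmed against `ea.Tags()`.

```python
# Hedged sketch: reading scalar series from the TensorBoard event file above.
# The run directory and tag name are assumptions; check ea.Tags() for the
# exact names recorded in this run.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

ea = EventAccumulator("runs/Feb13_16-55-04_e2e-86-172")  # directory containing the tfevents file
ea.Reload()                                              # load events from disk

print(ea.Tags()["scalars"])                              # list available scalar series
for event in ea.Scalars("eval/accuracy"):                # assumed tag name
    print(event.step, event.value)
```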