End of training

Browse files:
- README.md (+109 -109)
- model.safetensors (+1 -1)
- train.ipynb (+0 -0)
- training_args.bin (+1 -1)

README.md (CHANGED)
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.8323615160349854
    - name: F1
      type: f1
      value: 0.8275029898684891
    - name: Precision
      type: precision
      value: 0.834013028184158
    - name: Recall
      type: recall
      value: 0.8284605497111518
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

This model is a fine-tuned version of [google/siglip-base-patch16-224](https://huggingface.co/google/siglip-base-patch16-224) on the stanford-dogs dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5447
- Accuracy: 0.8324
- F1: 0.8275
- Precision: 0.8340
- Recall: 0.8285
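The accuracy and F1/precision/recall figures above come from comparing predicted breed IDs against ground-truth labels on the evaluation set. A minimal pure-Python sketch of how such scores are computed; the label arrays are illustrative, and macro averaging is an assumption here, since the card does not state which averaging the Trainer's metric function used:

```python
def macro_scores(y_true, y_pred):
    """Per-class precision/recall/F1, macro-averaged so every class counts equally."""
    classes = sorted(set(y_true) | set(y_pred))
    precs, recs, f1s = [], [], []
    for c in classes:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
        precs.append(prec)
        recs.append(rec)
    n = len(classes)
    return sum(precs) / n, sum(recs) / n, sum(f1s) / n

# Illustrative label IDs (stanford-dogs has 120 breeds; only three appear here).
y_true = [0, 1, 2, 2, 1, 0]
y_pred = [0, 1, 2, 1, 1, 0]
accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
precision, recall, f1 = macro_scores(y_true, y_pred)
```

In practice these would be computed over all evaluation images at once, feeding the Trainer's `compute_metrics` hook.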
## Model description

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:------:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 4.8988 | 0.0777 | 10 | 4.4703 | 0.0632 | 0.0290 | 0.0456 | 0.0624 |
| 4.4323 | 0.1553 | 20 | 3.8317 | 0.1540 | 0.1033 | 0.1490 | 0.1435 |
| 3.8517 | 0.2330 | 30 | 2.9889 | 0.2787 | 0.2215 | 0.3131 | 0.2661 |
| 3.4059 | 0.3107 | 40 | 2.3481 | 0.3754 | 0.3339 | 0.4429 | 0.3702 |
| 2.8496 | 0.3883 | 50 | 2.3529 | 0.3649 | 0.3426 | 0.5046 | 0.3637 |
| 2.597 | 0.4660 | 60 | 1.6990 | 0.5350 | 0.5160 | 0.6056 | 0.5289 |
| 2.2791 | 0.5437 | 70 | 1.5456 | 0.5649 | 0.5345 | 0.6426 | 0.5591 |
| 2.056 | 0.6214 | 80 | 1.5037 | 0.5678 | 0.5557 | 0.6359 | 0.5658 |
| 1.9135 | 0.6990 | 90 | 1.5768 | 0.5413 | 0.5097 | 0.6302 | 0.5321 |
| 1.8408 | 0.7767 | 100 | 1.1497 | 0.6591 | 0.6394 | 0.6927 | 0.6535 |
| 1.7106 | 0.8544 | 110 | 1.2396 | 0.6365 | 0.6200 | 0.6801 | 0.6297 |
| 1.7172 | 0.9320 | 120 | 1.0894 | 0.6820 | 0.6715 | 0.7272 | 0.6766 |
| 1.6366 | 1.0097 | 130 | 1.0108 | 0.6963 | 0.6866 | 0.7387 | 0.6907 |
| 1.3805 | 1.0874 | 140 | 0.9943 | 0.6941 | 0.6838 | 0.7329 | 0.6878 |
| 1.4473 | 1.1650 | 150 | 0.9784 | 0.7034 | 0.6917 | 0.7437 | 0.6999 |
| 1.3215 | 1.2427 | 160 | 1.0036 | 0.6922 | 0.6767 | 0.7445 | 0.6862 |
| 1.3711 | 1.3204 | 170 | 0.9941 | 0.6859 | 0.6797 | 0.7414 | 0.6807 |
| 1.2312 | 1.3981 | 180 | 0.9691 | 0.6973 | 0.6904 | 0.7373 | 0.6970 |
| 1.3214 | 1.4757 | 190 | 0.9573 | 0.7106 | 0.6934 | 0.7435 | 0.7041 |
| 1.2569 | 1.5534 | 200 | 0.9337 | 0.7155 | 0.7062 | 0.7480 | 0.7147 |
| 1.2645 | 1.6311 | 210 | 0.8849 | 0.7298 | 0.7231 | 0.7586 | 0.7264 |
| 1.2608 | 1.7087 | 220 | 0.8403 | 0.7264 | 0.7153 | 0.7580 | 0.7232 |
| 1.2059 | 1.7864 | 230 | 0.8654 | 0.7293 | 0.7240 | 0.7632 | 0.7274 |
| 1.1956 | 1.8641 | 240 | 0.7840 | 0.7524 | 0.7435 | 0.7721 | 0.7498 |
| 1.1926 | 1.9417 | 250 | 0.8357 | 0.7383 | 0.7326 | 0.7800 | 0.7359 |
| 1.1563 | 2.0194 | 260 | 0.8298 | 0.7413 | 0.7332 | 0.7727 | 0.7359 |
| 0.9693 | 2.0971 | 270 | 0.7872 | 0.7512 | 0.7434 | 0.7717 | 0.7475 |
| 0.9372 | 2.1748 | 280 | 0.7755 | 0.7561 | 0.7502 | 0.7704 | 0.7527 |
| 1.0188 | 2.2524 | 290 | 0.7516 | 0.7612 | 0.7539 | 0.7832 | 0.7566 |
| 0.8951 | 2.3301 | 300 | 0.7819 | 0.7510 | 0.7408 | 0.7678 | 0.7457 |
| 0.8975 | 2.4078 | 310 | 0.8678 | 0.7298 | 0.7221 | 0.7643 | 0.7269 |
| 0.9194 | 2.4854 | 320 | 0.7628 | 0.7655 | 0.7555 | 0.7908 | 0.7596 |
| 0.8753 | 2.5631 | 330 | 0.7341 | 0.7668 | 0.7567 | 0.7876 | 0.7624 |
| 0.8798 | 2.6408 | 340 | 0.7475 | 0.7600 | 0.7541 | 0.7839 | 0.7589 |
| 0.9025 | 2.7184 | 350 | 0.7138 | 0.7694 | 0.7632 | 0.7889 | 0.7676 |
| 0.8974 | 2.7961 | 360 | 0.7128 | 0.7736 | 0.7668 | 0.7868 | 0.7694 |
| 0.8956 | 2.8738 | 370 | 0.7460 | 0.7636 | 0.7580 | 0.7855 | 0.7618 |
| 0.8629 | 2.9515 | 380 | 0.7315 | 0.7675 | 0.7590 | 0.7853 | 0.7616 |
| 0.8477 | 3.0291 | 390 | 0.7071 | 0.7738 | 0.7674 | 0.7933 | 0.7705 |
| 0.6569 | 3.1068 | 400 | 0.7051 | 0.7787 | 0.7681 | 0.7907 | 0.7723 |
| 0.691 | 3.1845 | 410 | 0.6839 | 0.7840 | 0.7768 | 0.8040 | 0.7780 |
| 0.6823 | 3.2621 | 420 | 0.6759 | 0.7852 | 0.7768 | 0.7935 | 0.7810 |
| 0.7074 | 3.3398 | 430 | 0.6757 | 0.7835 | 0.7795 | 0.8003 | 0.7812 |
| 0.6721 | 3.4175 | 440 | 0.6905 | 0.7889 | 0.7811 | 0.7999 | 0.7851 |
| 0.7367 | 3.4951 | 450 | 0.6906 | 0.7830 | 0.7750 | 0.7939 | 0.7812 |
| 0.6784 | 3.5728 | 460 | 0.6663 | 0.7937 | 0.7863 | 0.8039 | 0.7913 |
| 0.6661 | 3.6505 | 470 | 0.6949 | 0.7840 | 0.7762 | 0.7990 | 0.7804 |
| 0.6648 | 3.7282 | 480 | 0.6440 | 0.7971 | 0.7922 | 0.8119 | 0.7937 |
| 0.7052 | 3.8058 | 490 | 0.6983 | 0.7823 | 0.7748 | 0.7917 | 0.7784 |
| 0.7213 | 3.8835 | 500 | 0.6627 | 0.7930 | 0.7877 | 0.8059 | 0.7878 |
| 0.6638 | 3.9612 | 510 | 0.6402 | 0.7971 | 0.7910 | 0.8050 | 0.7929 |
| 0.6242 | 4.0388 | 520 | 0.6487 | 0.7983 | 0.7925 | 0.8090 | 0.7961 |
| 0.5233 | 4.1165 | 530 | 0.6648 | 0.7942 | 0.7859 | 0.8033 | 0.7899 |
| 0.5677 | 4.1942 | 540 | 0.6201 | 0.8076 | 0.8017 | 0.8141 | 0.8044 |
| 0.5325 | 4.2718 | 550 | 0.6332 | 0.8039 | 0.7970 | 0.8110 | 0.8018 |
| 0.5479 | 4.3495 | 560 | 0.6283 | 0.8083 | 0.8028 | 0.8143 | 0.8047 |
| 0.5485 | 4.4272 | 570 | 0.6005 | 0.8122 | 0.8090 | 0.8183 | 0.8101 |
| 0.5521 | 4.5049 | 580 | 0.6273 | 0.8069 | 0.8029 | 0.8169 | 0.8040 |
| 0.5607 | 4.5825 | 590 | 0.6291 | 0.8069 | 0.8020 | 0.8203 | 0.8027 |
| 0.5263 | 4.6602 | 600 | 0.6218 | 0.8076 | 0.8033 | 0.8192 | 0.8026 |
| 0.5798 | 4.7379 | 610 | 0.5982 | 0.8178 | 0.8134 | 0.8275 | 0.8138 |
| 0.5593 | 4.8155 | 620 | 0.6212 | 0.8105 | 0.8075 | 0.8209 | 0.8067 |
| 0.58 | 4.8932 | 630 | 0.5949 | 0.8166 | 0.8111 | 0.8250 | 0.8121 |
| 0.4746 | 4.9709 | 640 | 0.6007 | 0.8180 | 0.8122 | 0.8273 | 0.8122 |
| 0.4821 | 5.0485 | 650 | 0.5929 | 0.8183 | 0.8131 | 0.8234 | 0.8138 |
| 0.4221 | 5.1262 | 660 | 0.6179 | 0.8086 | 0.8017 | 0.8151 | 0.8044 |
| 0.4615 | 5.2039 | 670 | 0.5937 | 0.8195 | 0.8136 | 0.8228 | 0.8150 |
| 0.4078 | 5.2816 | 680 | 0.5970 | 0.8132 | 0.8095 | 0.8213 | 0.8085 |
| 0.4551 | 5.3592 | 690 | 0.5937 | 0.8132 | 0.8100 | 0.8210 | 0.8103 |
| 0.4211 | 5.4369 | 700 | 0.5834 | 0.8180 | 0.8140 | 0.8236 | 0.8134 |
| 0.4055 | 5.5146 | 710 | 0.5938 | 0.8173 | 0.8114 | 0.8239 | 0.8116 |
| 0.4284 | 5.5922 | 720 | 0.5988 | 0.8134 | 0.8102 | 0.8182 | 0.8103 |
| 0.4113 | 5.6699 | 730 | 0.6067 | 0.8132 | 0.8072 | 0.8198 | 0.8094 |
| 0.3689 | 5.7476 | 740 | 0.6013 | 0.8134 | 0.8081 | 0.8201 | 0.8099 |
| 0.3788 | 5.8252 | 750 | 0.5993 | 0.8090 | 0.8024 | 0.8146 | 0.8048 |
| 0.427 | 5.9029 | 760 | 0.5807 | 0.8222 | 0.8173 | 0.8262 | 0.8185 |
| 0.4027 | 5.9806 | 770 | 0.5829 | 0.8239 | 0.8182 | 0.8289 | 0.8191 |
| 0.3971 | 6.0583 | 780 | 0.5741 | 0.8243 | 0.8218 | 0.8300 | 0.8209 |
| 0.3543 | 6.1359 | 790 | 0.5662 | 0.8246 | 0.8206 | 0.8296 | 0.8203 |
| 0.3304 | 6.2136 | 800 | 0.5678 | 0.8253 | 0.8216 | 0.8323 | 0.8219 |
| 0.3065 | 6.2913 | 810 | 0.5797 | 0.8214 | 0.8167 | 0.8279 | 0.8175 |
| 0.2913 | 6.3689 | 820 | 0.5769 | 0.8212 | 0.8162 | 0.8250 | 0.8167 |
| 0.3447 | 6.4466 | 830 | 0.5726 | 0.8202 | 0.8165 | 0.8256 | 0.8168 |
| 0.3064 | 6.5243 | 840 | 0.5750 | 0.8241 | 0.8207 | 0.8310 | 0.8208 |
| 0.3106 | 6.6019 | 850 | 0.5631 | 0.8285 | 0.8247 | 0.8355 | 0.8246 |
| 0.297 | 6.6796 | 860 | 0.5591 | 0.8282 | 0.8238 | 0.8321 | 0.8244 |
| 0.2967 | 6.7573 | 870 | 0.5623 | 0.8243 | 0.8198 | 0.8279 | 0.8206 |
| 0.3157 | 6.8350 | 880 | 0.5617 | 0.8222 | 0.8177 | 0.8247 | 0.8182 |
| 0.3129 | 6.9126 | 890 | 0.5638 | 0.8251 | 0.8200 | 0.8283 | 0.8210 |
| 0.2994 | 6.9903 | 900 | 0.5578 | 0.8270 | 0.8210 | 0.8288 | 0.8233 |
| 0.31 | 7.0680 | 910 | 0.5498 | 0.8304 | 0.8262 | 0.8315 | 0.8267 |
| 0.2733 | 7.1456 | 920 | 0.5547 | 0.8280 | 0.8230 | 0.8291 | 0.8242 |
| 0.2496 | 7.2233 | 930 | 0.5527 | 0.8292 | 0.8255 | 0.8319 | 0.8255 |
| 0.2398 | 7.3010 | 940 | 0.5562 | 0.8287 | 0.8240 | 0.8305 | 0.8250 |
| 0.2758 | 7.3786 | 950 | 0.5509 | 0.8311 | 0.8272 | 0.8337 | 0.8279 |
| 0.2539 | 7.4563 | 960 | 0.5521 | 0.8297 | 0.8243 | 0.8310 | 0.8260 |
| 0.2891 | 7.5340 | 970 | 0.5492 | 0.8314 | 0.8266 | 0.8337 | 0.8275 |
| 0.239 | 7.6117 | 980 | 0.5466 | 0.8321 | 0.8271 | 0.8337 | 0.8283 |
| 0.23 | 7.6893 | 990 | 0.5449 | 0.8324 | 0.8275 | 0.8338 | 0.8285 |
| 0.2565 | 7.7670 | 1000 | 0.5447 | 0.8324 | 0.8275 | 0.8340 | 0.8285 |
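The headline metrics correspond to the final row: validation loss bottoms out at step 1000 (0.5447) after flattening through epoch 7. A small sketch for scanning rows in this pipe-delimited table format to find the checkpoint with the lowest validation loss; only the last four rows are transcribed here:

```python
# Last four rows of the training table, in the card's pipe-delimited format.
table = """\
| 0.2891 | 7.5340 | 970  | 0.5492 | 0.8314 | 0.8266 | 0.8337 | 0.8275 |
| 0.239  | 7.6117 | 980  | 0.5466 | 0.8321 | 0.8271 | 0.8337 | 0.8283 |
| 0.23   | 7.6893 | 990  | 0.5449 | 0.8324 | 0.8275 | 0.8338 | 0.8285 |
| 0.2565 | 7.7670 | 1000 | 0.5447 | 0.8324 | 0.8275 | 0.8340 | 0.8285 |"""

def best_checkpoint(markdown_rows):
    """Return (step, validation_loss) for the row with the lowest validation loss."""
    best = None
    for line in markdown_rows.splitlines():
        # Columns: train loss, epoch, step, val loss, accuracy, F1, precision, recall.
        cells = [c.strip() for c in line.strip().strip("|").split("|")]
        step, val_loss = int(cells[2]), float(cells[3])
        if best is None or val_loss < best[1]:
            best = (step, val_loss)
    return best
```

Over these rows `best_checkpoint` selects step 1000, matching the evaluation figures at the top of the card.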
### Framework versions
model.safetensors (CHANGED)

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:430eb0443753e815fee4ce3a8c5a4147e7dede6d710c7ee44c4369a1da1c2b06
 size 371930976
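`model.safetensors` above (and `training_args.bin` below) are Git LFS pointer files: `oid sha256:` records the SHA-256 digest of the actual blob and `size` its byte length, so a download can be verified locally. A minimal verification sketch; the file name and payload are illustrative stand-ins for the real 371 MB weights:

```python
import hashlib

def lfs_digest_and_size(path, chunk_size=1 << 20):
    """Stream a file; return (hex sha256, byte size), the two fields an LFS pointer records."""
    digest, size = hashlib.sha256(), 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
            size += len(chunk)
    return digest.hexdigest(), size

# Illustrative check against a tiny local file rather than the real weights.
with open("demo.bin", "wb") as f:
    f.write(b"hello")
oid, size = lfs_digest_and_size("demo.bin")
```

A mismatch between the computed digest and the pointer's `oid` would indicate a corrupted or incomplete download.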
train.ipynb (CHANGED)

The diff for this file is too large to render. See raw diff.
training_args.bin (CHANGED)

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:6a04dfd289d2c5c1771454fdd51e6b0dd1d82f36bcd427ac5c5f1eb46e0c72cd
 size 5112