carolinetfls committed on
Commit
08e7a0f
1 Parent(s): 4f095e8

update model card README.md

Files changed (1)
  1. README.md +163 -0
README.md ADDED
@@ -0,0 +1,163 @@
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: plant-seedlings-freeze-0-6-aug-3-all-train-2
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.949048496009822
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# plant-seedlings-freeze-0-6-aug-3-all-train-2

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1846
- Accuracy: 0.9490

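A minimal inference sketch, assuming the checkpoint is published on the Hub as `carolinetfls/plant-seedlings-freeze-0-6-aug-3-all-train-2` (the committer's namespace) and that `seedling.jpg` is a local test image:

```python
from transformers import pipeline
from PIL import Image

# Repo id assumed from the committer's namespace and the model name above.
classifier = pipeline(
    "image-classification",
    model="carolinetfls/plant-seedlings-freeze-0-6-aug-3-all-train-2",
)

# Classify a single seedling photo (placeholder file name).
image = Image.open("seedling.jpg").convert("RGB")
for prediction in classifier(image):
    print(prediction["label"], round(prediction["score"], 4))
```
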
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 22
- mixed_precision_training: Native AMP

+ ### Training results
64
+
65
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy |
66
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|
67
+ | 0.6118 | 0.25 | 100 | 0.6427 | 0.7901 |
68
+ | 0.5478 | 0.49 | 200 | 0.5610 | 0.8232 |
69
+ | 0.3263 | 0.74 | 300 | 0.4523 | 0.8508 |
70
+ | 0.3938 | 0.98 | 400 | 0.3913 | 0.8649 |
71
+ | 0.3764 | 1.23 | 500 | 0.4459 | 0.8539 |
72
+ | 0.422 | 1.47 | 600 | 0.3761 | 0.8711 |
73
+ | 0.491 | 1.72 | 700 | 0.3525 | 0.8729 |
74
+ | 0.361 | 1.97 | 800 | 0.3738 | 0.8699 |
75
+ | 0.2441 | 2.21 | 900 | 0.3580 | 0.8828 |
76
+ | 0.4054 | 2.46 | 1000 | 0.4232 | 0.8692 |
77
+ | 0.3191 | 2.7 | 1100 | 0.2954 | 0.8969 |
78
+ | 0.343 | 2.95 | 1200 | 0.3528 | 0.8785 |
79
+ | 0.1623 | 3.19 | 1300 | 0.2624 | 0.9122 |
80
+ | 0.3418 | 3.44 | 1400 | 0.4062 | 0.8686 |
81
+ | 0.2535 | 3.69 | 1500 | 0.3043 | 0.8975 |
82
+ | 0.3356 | 3.93 | 1600 | 0.2746 | 0.9104 |
83
+ | 0.2092 | 4.18 | 1700 | 0.3080 | 0.9048 |
84
+ | 0.2423 | 4.42 | 1800 | 0.2958 | 0.9018 |
85
+ | 0.3758 | 4.67 | 1900 | 0.2949 | 0.9055 |
86
+ | 0.3434 | 4.91 | 2000 | 0.2647 | 0.9251 |
87
+ | 0.1809 | 5.16 | 2100 | 0.3192 | 0.9036 |
88
+ | 0.1617 | 5.41 | 2200 | 0.3036 | 0.8975 |
89
+ | 0.3044 | 5.65 | 2300 | 0.3053 | 0.8956 |
90
+ | 0.1709 | 5.9 | 2400 | 0.3879 | 0.8846 |
91
+ | 0.2963 | 6.14 | 2500 | 0.3243 | 0.8932 |
92
+ | 0.2314 | 6.39 | 2600 | 0.2632 | 0.9147 |
93
+ | 0.1128 | 6.63 | 2700 | 0.1934 | 0.9374 |
94
+ | 0.3211 | 6.88 | 2800 | 0.3639 | 0.8901 |
95
+ | 0.1108 | 7.13 | 2900 | 0.2748 | 0.9116 |
96
+ | 0.1128 | 7.37 | 3000 | 0.3050 | 0.9091 |
97
+ | 0.1648 | 7.62 | 3100 | 0.2830 | 0.9134 |
98
+ | 0.0887 | 7.86 | 3200 | 0.2707 | 0.9134 |
99
+ | 0.198 | 8.11 | 3300 | 0.2978 | 0.9116 |
100
+ | 0.1902 | 8.35 | 3400 | 0.2946 | 0.9042 |
101
+ | 0.1294 | 8.6 | 3500 | 0.2440 | 0.9227 |
102
+ | 0.2045 | 8.85 | 3600 | 0.2637 | 0.9104 |
103
+ | 0.2953 | 9.09 | 3700 | 0.2741 | 0.9141 |
104
+ | 0.2298 | 9.34 | 3800 | 0.2652 | 0.9177 |
105
+ | 0.2703 | 9.58 | 3900 | 0.2832 | 0.9091 |
106
+ | 0.261 | 9.83 | 4000 | 0.2521 | 0.9239 |
107
+ | 0.1135 | 10.07 | 4100 | 0.2647 | 0.9227 |
108
+ | 0.2153 | 10.32 | 4200 | 0.2623 | 0.9165 |
109
+ | 0.2826 | 10.57 | 4300 | 0.2619 | 0.9134 |
110
+ | 0.14 | 10.81 | 4400 | 0.2275 | 0.9300 |
111
+ | 0.1469 | 11.06 | 4500 | 0.2015 | 0.9282 |
112
+ | 0.1961 | 11.3 | 4600 | 0.2150 | 0.9269 |
113
+ | 0.1918 | 11.55 | 4700 | 0.2377 | 0.9288 |
114
+ | 0.2371 | 11.79 | 4800 | 0.2622 | 0.9184 |
115
+ | 0.0774 | 12.04 | 4900 | 0.2443 | 0.9239 |
116
+ | 0.136 | 12.29 | 5000 | 0.2577 | 0.9196 |
117
+ | 0.2154 | 12.53 | 5100 | 0.2278 | 0.9300 |
118
+ | 0.0926 | 12.78 | 5200 | 0.2209 | 0.9349 |
119
+ | 0.16 | 13.02 | 5300 | 0.2616 | 0.9196 |
120
+ | 0.0983 | 13.27 | 5400 | 0.2337 | 0.9276 |
121
+ | 0.1474 | 13.51 | 5500 | 0.2231 | 0.9355 |
122
+ | 0.1653 | 13.76 | 5600 | 0.2356 | 0.9245 |
123
+ | 0.099 | 14.0 | 5700 | 0.1976 | 0.9417 |
124
+ | 0.1248 | 14.25 | 5800 | 0.2684 | 0.9257 |
125
+ | 0.1565 | 14.5 | 5900 | 0.2197 | 0.9294 |
126
+ | 0.1752 | 14.74 | 6000 | 0.2312 | 0.9368 |
127
+ | 0.1962 | 14.99 | 6100 | 0.1968 | 0.9398 |
128
+ | 0.1373 | 15.23 | 6200 | 0.1925 | 0.9435 |
129
+ | 0.1003 | 15.48 | 6300 | 0.2182 | 0.9325 |
130
+ | 0.0511 | 15.72 | 6400 | 0.1993 | 0.9454 |
131
+ | 0.0401 | 15.97 | 6500 | 0.1941 | 0.9417 |
132
+ | 0.1051 | 16.22 | 6600 | 0.2161 | 0.9349 |
133
+ | 0.0593 | 16.46 | 6700 | 0.1940 | 0.9423 |
134
+ | 0.1215 | 16.71 | 6800 | 0.2579 | 0.9269 |
135
+ | 0.0568 | 16.95 | 6900 | 0.1968 | 0.9423 |
136
+ | 0.087 | 17.2 | 7000 | 0.1827 | 0.9441 |
137
+ | 0.0666 | 17.44 | 7100 | 0.2130 | 0.9454 |
138
+ | 0.0971 | 17.69 | 7200 | 0.2082 | 0.9398 |
139
+ | 0.1444 | 17.94 | 7300 | 0.2233 | 0.9337 |
140
+ | 0.0687 | 18.18 | 7400 | 0.2127 | 0.9429 |
141
+ | 0.0586 | 18.43 | 7500 | 0.2116 | 0.9380 |
142
+ | 0.0501 | 18.67 | 7600 | 0.2089 | 0.9441 |
143
+ | 0.0849 | 18.92 | 7700 | 0.1946 | 0.9490 |
144
+ | 0.0702 | 19.16 | 7800 | 0.2154 | 0.9454 |
145
+ | 0.0542 | 19.41 | 7900 | 0.1922 | 0.9478 |
146
+ | 0.0617 | 19.66 | 8000 | 0.2004 | 0.9423 |
147
+ | 0.061 | 19.9 | 8100 | 0.1923 | 0.9466 |
148
+ | 0.145 | 20.15 | 8200 | 0.1738 | 0.9503 |
149
+ | 0.0443 | 20.39 | 8300 | 0.2079 | 0.9429 |
150
+ | 0.0462 | 20.64 | 8400 | 0.1900 | 0.9478 |
151
+ | 0.0642 | 20.88 | 8500 | 0.1792 | 0.9460 |
152
+ | 0.0715 | 21.13 | 8600 | 0.1926 | 0.9466 |
153
+ | 0.0606 | 21.38 | 8700 | 0.1596 | 0.9527 |
154
+ | 0.0677 | 21.62 | 8800 | 0.2054 | 0.9454 |
155
+ | 0.0807 | 21.87 | 8900 | 0.1846 | 0.9490 |
156
+
157
+
158
+ ### Framework versions
159
+
160
+ - Transformers 4.28.1
161
+ - Pytorch 2.0.0+cu118
162
+ - Datasets 2.11.0
163
+ - Tokenizers 0.13.3