venetis committed
Commit 82db05a
1 Parent(s): 64d1c70

update model card README.md

Files changed (1): README.md (added, +165 -0)
 
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- rock-glacier-dataset
metrics:
- accuracy
model-index:
- name: hf_train_output
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: rock-glacier-dataset
      type: rock-glacier-dataset
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.9258241758241759
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# hf_train_output

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the rock-glacier-dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3894
- Accuracy: 0.9258

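## How to use

The card does not yet include a usage example, so the snippet below is a minimal inference sketch. It assumes the checkpoint is published on the Hub as `venetis/hf_train_output` (an assumption based on the card's author and model name; substitute the actual repository id) and that it loads through the standard `transformers` image-classification pipeline, as the ViT base model does.

```python
from PIL import Image
from transformers import pipeline

# The repository id is an assumption based on the card's author and model
# name; replace it with the real Hub id if it differs.
classifier = pipeline("image-classification", model="venetis/hf_train_output")

# "example_patch.png" is a placeholder for any input image from the
# rock-glacier-dataset domain.
image = Image.open("example_patch.png")

for prediction in classifier(image):
    print(f"{prediction['label']}: {prediction['score']:.4f}")
```
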
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an equivalent `TrainingArguments` sketch is shown after the list):
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP

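
For reference, these settings correspond roughly to the `TrainingArguments` sketched below. This is a reconstruction under assumptions rather than the original training script: the output directory is a placeholder, the evaluation/save cadence is not recorded in the card, and the Adam betas and epsilon listed above are simply the `Trainer` defaults.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the reported hyperparameters; output_dir is a
# placeholder and any settings not listed in the card are left at defaults.
training_args = TrainingArguments(
    output_dir="hf_train_output",    # placeholder, not the original path
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    fp16=True,                       # "Native AMP" mixed precision
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults,
    # so no explicit optimizer arguments are needed here.
)
```
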
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.5619 | 0.55 | 50 | 0.5432 | 0.7692 |
| 0.4582 | 1.1 | 100 | 0.4435 | 0.8352 |
| 0.3548 | 1.65 | 150 | 0.3739 | 0.8599 |
| 0.217 | 2.2 | 200 | 0.2913 | 0.9093 |
| 0.1709 | 2.75 | 250 | 0.2619 | 0.9148 |
| 0.0919 | 3.3 | 300 | 0.2475 | 0.9148 |
| 0.0652 | 3.85 | 350 | 0.3275 | 0.8901 |
| 0.0495 | 4.4 | 400 | 0.2515 | 0.9093 |
| 0.0321 | 4.95 | 450 | 0.2878 | 0.9066 |
| 0.0247 | 5.49 | 500 | 0.2612 | 0.9148 |
| 0.017 | 6.04 | 550 | 0.2687 | 0.9176 |
| 0.0131 | 6.59 | 600 | 0.3062 | 0.9093 |
| 0.0113 | 7.14 | 650 | 0.2587 | 0.9231 |
| 0.0099 | 7.69 | 700 | 0.2815 | 0.9203 |
| 0.009 | 8.24 | 750 | 0.2675 | 0.9286 |
| 0.0084 | 8.79 | 800 | 0.2711 | 0.9286 |
| 0.0077 | 9.34 | 850 | 0.2663 | 0.9313 |
| 0.0073 | 9.89 | 900 | 0.3003 | 0.9258 |
| 0.0069 | 10.44 | 950 | 0.2758 | 0.9313 |
| 0.0064 | 10.99 | 1000 | 0.2999 | 0.9258 |
| 0.0061 | 11.54 | 1050 | 0.2931 | 0.9313 |
| 0.0057 | 12.09 | 1100 | 0.2989 | 0.9313 |
| 0.0056 | 12.64 | 1150 | 0.2974 | 0.9313 |
| 0.0053 | 13.19 | 1200 | 0.3099 | 0.9258 |
| 0.005 | 13.74 | 1250 | 0.3131 | 0.9313 |
| 0.0049 | 14.29 | 1300 | 0.3201 | 0.9258 |
| 0.0046 | 14.84 | 1350 | 0.3109 | 0.9313 |
| 0.0045 | 15.38 | 1400 | 0.3168 | 0.9313 |
| 0.0043 | 15.93 | 1450 | 0.3226 | 0.9231 |
| 0.0042 | 16.48 | 1500 | 0.3234 | 0.9231 |
| 0.0041 | 17.03 | 1550 | 0.3283 | 0.9258 |
| 0.0039 | 17.58 | 1600 | 0.3304 | 0.9258 |
| 0.0038 | 18.13 | 1650 | 0.3321 | 0.9231 |
| 0.0037 | 18.68 | 1700 | 0.3362 | 0.9231 |
| 0.0036 | 19.23 | 1750 | 0.3307 | 0.9286 |
| 0.0035 | 19.78 | 1800 | 0.3357 | 0.9231 |
| 0.0034 | 20.33 | 1850 | 0.3244 | 0.9313 |
| 0.0033 | 20.88 | 1900 | 0.3497 | 0.9231 |
| 0.0032 | 21.43 | 1950 | 0.3443 | 0.9231 |
| 0.0031 | 21.98 | 2000 | 0.3398 | 0.9286 |
| 0.003 | 22.53 | 2050 | 0.3388 | 0.9286 |
| 0.003 | 23.08 | 2100 | 0.3399 | 0.9286 |
| 0.0029 | 23.63 | 2150 | 0.3548 | 0.9231 |
| 0.0028 | 24.18 | 2200 | 0.3475 | 0.9286 |
| 0.0028 | 24.73 | 2250 | 0.3480 | 0.9286 |
| 0.0027 | 25.27 | 2300 | 0.3542 | 0.9231 |
| 0.0026 | 25.82 | 2350 | 0.3589 | 0.9231 |
| 0.0026 | 26.37 | 2400 | 0.3449 | 0.9286 |
| 0.0025 | 26.92 | 2450 | 0.3604 | 0.9231 |
| 0.0025 | 27.47 | 2500 | 0.3493 | 0.9286 |
| 0.0024 | 28.02 | 2550 | 0.3631 | 0.9258 |
| 0.0024 | 28.57 | 2600 | 0.3590 | 0.9258 |
| 0.0023 | 29.12 | 2650 | 0.3604 | 0.9258 |
| 0.0023 | 29.67 | 2700 | 0.3667 | 0.9258 |
| 0.0022 | 30.22 | 2750 | 0.3571 | 0.9286 |
| 0.0022 | 30.77 | 2800 | 0.3660 | 0.9258 |
| 0.0021 | 31.32 | 2850 | 0.3638 | 0.9286 |
| 0.0021 | 31.87 | 2900 | 0.3729 | 0.9258 |
| 0.0021 | 32.42 | 2950 | 0.3706 | 0.9258 |
| 0.002 | 32.97 | 3000 | 0.3669 | 0.9286 |
| 0.002 | 33.52 | 3050 | 0.3740 | 0.9258 |
| 0.002 | 34.07 | 3100 | 0.3693 | 0.9286 |
| 0.002 | 34.62 | 3150 | 0.3700 | 0.9286 |
| 0.0019 | 35.16 | 3200 | 0.3752 | 0.9258 |
| 0.0019 | 35.71 | 3250 | 0.3753 | 0.9258 |
| 0.0019 | 36.26 | 3300 | 0.3721 | 0.9286 |
| 0.0018 | 36.81 | 3350 | 0.3764 | 0.9258 |
| 0.0018 | 37.36 | 3400 | 0.3758 | 0.9258 |
| 0.0018 | 37.91 | 3450 | 0.3775 | 0.9258 |
| 0.0018 | 38.46 | 3500 | 0.3812 | 0.9258 |
| 0.0018 | 39.01 | 3550 | 0.3817 | 0.9258 |
| 0.0017 | 39.56 | 3600 | 0.3815 | 0.9258 |
| 0.0017 | 40.11 | 3650 | 0.3825 | 0.9258 |
| 0.0017 | 40.66 | 3700 | 0.3852 | 0.9258 |
| 0.0017 | 41.21 | 3750 | 0.3854 | 0.9258 |
| 0.0017 | 41.76 | 3800 | 0.3823 | 0.9258 |
| 0.0016 | 42.31 | 3850 | 0.3829 | 0.9258 |
| 0.0016 | 42.86 | 3900 | 0.3873 | 0.9258 |
| 0.0016 | 43.41 | 3950 | 0.3842 | 0.9258 |
| 0.0016 | 43.96 | 4000 | 0.3857 | 0.9258 |
| 0.0016 | 44.51 | 4050 | 0.3873 | 0.9258 |
| 0.0016 | 45.05 | 4100 | 0.3878 | 0.9258 |
| 0.0016 | 45.6 | 4150 | 0.3881 | 0.9258 |
| 0.0016 | 46.15 | 4200 | 0.3888 | 0.9258 |
| 0.0016 | 46.7 | 4250 | 0.3891 | 0.9258 |
| 0.0016 | 47.25 | 4300 | 0.3878 | 0.9258 |
| 0.0016 | 47.8 | 4350 | 0.3890 | 0.9258 |
| 0.0016 | 48.35 | 4400 | 0.3890 | 0.9258 |
| 0.0015 | 48.9 | 4450 | 0.3895 | 0.9258 |
| 0.0015 | 49.45 | 4500 | 0.3896 | 0.9258 |
| 0.0015 | 50.0 | 4550 | 0.3894 | 0.9258 |

### Framework versions

- Transformers 4.24.0
- Pytorch 1.12.1+cu113
- Datasets 2.7.0
- Tokenizers 0.13.2