davanstrien (HF staff) committed
Commit 14385b5
Parent(s): 4ac1f00

update model card README.md

Files changed (1):
  1. README.md (+147, -7)
README.md CHANGED
@@ -1,8 +1,6 @@
 ---
 tags:
 - generated_from_trainer
-datasets:
-- davanstrien/manuscript_noisy_labels_iiif
 model-index:
 - name: clip-roberta-finetuned
   results: []
@@ -13,7 +11,9 @@ should probably proofread and complete it, then remove this comment. -->
 
 # clip-roberta-finetuned
 
-This model is a fine-tuned version of [./clip-roberta](https://huggingface.co/./clip-roberta) on the davanstrien/manuscript_noisy_labels_iiif dataset.
+This model was trained from scratch on the None dataset.
+It achieves the following results on the evaluation set:
+- Loss: 2.6894
 
 ## Model description
 
@@ -33,20 +33,160 @@
 
 The following hyperparameters were used during training:
 - learning_rate: 5e-05
-- train_batch_size: 64
-- eval_batch_size: 64
+- train_batch_size: 128
+- eval_batch_size: 256
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs: 3.0
+- num_epochs: 10.0
+- mixed_precision_training: Native AMP
 
 ### Training results
 
+| Training Loss | Epoch | Step | Validation Loss |
+|:-------------:|:-----:|:-----:|:---------------:|
+| 2.9841 | 0.07 | 500 | 3.4112 |
+| 2.72 | 0.15 | 1000 | 3.3430 |
+| 2.6319 | 0.22 | 1500 | 3.2295 |
+| 2.5781 | 0.29 | 2000 | 3.1645 |
+| 2.5339 | 0.36 | 2500 | 3.1226 |
+| 2.503 | 0.44 | 3000 | 3.0856 |
+| 2.4581 | 0.51 | 3500 | 3.0639 |
+| 2.4494 | 0.58 | 4000 | 3.0415 |
+| 2.4275 | 0.65 | 4500 | 3.0245 |
+| 2.3909 | 0.73 | 5000 | 2.9991 |
+| 2.3902 | 0.8 | 5500 | 2.9931 |
+| 2.3741 | 0.87 | 6000 | 2.9612 |
+| 2.3536 | 0.95 | 6500 | 2.9509 |
+| 2.3392 | 1.02 | 7000 | 2.9289 |
+| 2.3083 | 1.09 | 7500 | 2.9214 |
+| 2.3094 | 1.16 | 8000 | 2.9153 |
+| 2.2864 | 1.24 | 8500 | 2.9034 |
+| 2.2893 | 1.31 | 9000 | 2.8963 |
+| 2.2697 | 1.38 | 9500 | 2.8847 |
+| 2.2762 | 1.46 | 10000 | 2.8665 |
+| 2.2667 | 1.53 | 10500 | 2.8536 |
+| 2.2548 | 1.6 | 11000 | 2.8472 |
+| 2.238 | 1.67 | 11500 | 2.8491 |
+| 2.2423 | 1.75 | 12000 | 2.8257 |
+| 2.2406 | 1.82 | 12500 | 2.8287 |
+| 2.2248 | 1.89 | 13000 | 2.8193 |
+| 2.223 | 1.96 | 13500 | 2.8101 |
+| 2.1995 | 2.04 | 14000 | 2.8027 |
+| 2.1834 | 2.11 | 14500 | 2.7880 |
+| 2.1723 | 2.18 | 15000 | 2.7783 |
+| 2.1651 | 2.26 | 15500 | 2.7739 |
+| 2.1575 | 2.33 | 16000 | 2.7825 |
+| 2.1598 | 2.4 | 16500 | 2.7660 |
+| 2.1667 | 2.47 | 17000 | 2.7578 |
+| 2.1565 | 2.55 | 17500 | 2.7580 |
+| 2.1558 | 2.62 | 18000 | 2.7561 |
+| 2.1642 | 2.69 | 18500 | 2.7512 |
+| 2.1374 | 2.77 | 19000 | 2.7361 |
+| 2.1402 | 2.84 | 19500 | 2.7385 |
+| 2.1326 | 2.91 | 20000 | 2.7235 |
+| 2.1272 | 2.98 | 20500 | 2.7183 |
+| 2.0954 | 3.06 | 21000 | 2.7156 |
+| 2.0842 | 3.13 | 21500 | 2.7065 |
+| 2.0859 | 3.2 | 22000 | 2.7089 |
+| 2.0856 | 3.27 | 22500 | 2.6962 |
+| 2.0775 | 3.35 | 23000 | 2.6931 |
+| 2.0821 | 3.42 | 23500 | 2.6933 |
+| 2.0706 | 3.49 | 24000 | 2.7011 |
+| 2.0689 | 3.57 | 24500 | 2.7009 |
+| 2.0807 | 3.64 | 25000 | 2.6825 |
+| 2.0639 | 3.71 | 25500 | 2.6744 |
+| 2.0742 | 3.78 | 26000 | 2.6777 |
+| 2.0789 | 3.86 | 26500 | 2.6689 |
+| 2.0594 | 3.93 | 27000 | 2.6566 |
+| 2.056 | 4.0 | 27500 | 2.6676 |
+| 2.0223 | 4.08 | 28000 | 2.6711 |
+| 2.0185 | 4.15 | 28500 | 2.6568 |
+| 2.018 | 4.22 | 29000 | 2.6567 |
+| 2.0036 | 4.29 | 29500 | 2.6545 |
+| 2.0238 | 4.37 | 30000 | 2.6559 |
+| 2.0091 | 4.44 | 30500 | 2.6450 |
+| 2.0096 | 4.51 | 31000 | 2.6389 |
+| 2.0083 | 4.58 | 31500 | 2.6401 |
+| 2.0012 | 4.66 | 32000 | 2.6399 |
+| 2.0166 | 4.73 | 32500 | 2.6289 |
+| 1.9963 | 4.8 | 33000 | 2.6348 |
+| 1.9943 | 4.88 | 33500 | 2.6240 |
+| 2.0099 | 4.95 | 34000 | 2.6190 |
+| 1.9895 | 5.02 | 34500 | 2.6308 |
+| 1.9581 | 5.09 | 35000 | 2.6385 |
+| 1.9502 | 5.17 | 35500 | 2.6237 |
+| 1.9485 | 5.24 | 36000 | 2.6248 |
+| 1.9643 | 5.31 | 36500 | 2.6279 |
+| 1.9535 | 5.38 | 37000 | 2.6185 |
+| 1.9575 | 5.46 | 37500 | 2.6146 |
+| 1.9475 | 5.53 | 38000 | 2.6093 |
+| 1.9434 | 5.6 | 38500 | 2.6090 |
+| 1.954 | 5.68 | 39000 | 2.6027 |
+| 1.9509 | 5.75 | 39500 | 2.6107 |
+| 1.9454 | 5.82 | 40000 | 2.5980 |
+| 1.9479 | 5.89 | 40500 | 2.6016 |
+| 1.9539 | 5.97 | 41000 | 2.5971 |
+| 1.9119 | 6.04 | 41500 | 2.6228 |
+| 1.8974 | 6.11 | 42000 | 2.6169 |
+| 1.9038 | 6.19 | 42500 | 2.6027 |
+| 1.9008 | 6.26 | 43000 | 2.6027 |
+| 1.9142 | 6.33 | 43500 | 2.6011 |
+| 1.8783 | 6.4 | 44000 | 2.5960 |
+| 1.8896 | 6.48 | 44500 | 2.6111 |
+| 1.8975 | 6.55 | 45000 | 2.5889 |
+| 1.9048 | 6.62 | 45500 | 2.6007 |
+| 1.9049 | 6.69 | 46000 | 2.5972 |
+| 1.8969 | 6.77 | 46500 | 2.6053 |
+| 1.9105 | 6.84 | 47000 | 2.5893 |
+| 1.8921 | 6.91 | 47500 | 2.5883 |
+| 1.8918 | 6.99 | 48000 | 2.5792 |
+| 1.8671 | 7.06 | 48500 | 2.6041 |
+| 1.8551 | 7.13 | 49000 | 2.6070 |
+| 1.8555 | 7.2 | 49500 | 2.6148 |
+| 1.8543 | 7.28 | 50000 | 2.6077 |
+| 1.8485 | 7.35 | 50500 | 2.6131 |
+| 1.8474 | 7.42 | 51000 | 2.6039 |
+| 1.8474 | 7.5 | 51500 | 2.5973 |
+| 1.8442 | 7.57 | 52000 | 2.5946 |
+| 1.8329 | 7.64 | 52500 | 2.6069 |
+| 1.8551 | 7.71 | 53000 | 2.5923 |
+| 1.8433 | 7.79 | 53500 | 2.5922 |
+| 1.851 | 7.86 | 54000 | 2.5993 |
+| 1.8313 | 7.93 | 54500 | 2.5960 |
+| 1.8298 | 8.0 | 55000 | 2.6058 |
+| 1.8159 | 8.08 | 55500 | 2.6286 |
+| 1.817 | 8.15 | 56000 | 2.6348 |
+| 1.8066 | 8.22 | 56500 | 2.6411 |
+| 1.7935 | 8.3 | 57000 | 2.6338 |
+| 1.809 | 8.37 | 57500 | 2.6290 |
+| 1.812 | 8.44 | 58000 | 2.6258 |
+| 1.79 | 8.51 | 58500 | 2.6321 |
+| 1.8046 | 8.59 | 59000 | 2.6291 |
+| 1.7975 | 8.66 | 59500 | 2.6283 |
+| 1.7968 | 8.73 | 60000 | 2.6284 |
+| 1.7779 | 8.81 | 60500 | 2.6257 |
+| 1.7664 | 8.88 | 61000 | 2.6232 |
+| 1.792 | 8.95 | 61500 | 2.6305 |
+| 1.7725 | 9.02 | 62000 | 2.6525 |
+| 1.7563 | 9.1 | 62500 | 2.6794 |
+| 1.7606 | 9.17 | 63000 | 2.6784 |
+| 1.7666 | 9.24 | 63500 | 2.6798 |
+| 1.7551 | 9.31 | 64000 | 2.6813 |
+| 1.7578 | 9.39 | 64500 | 2.6830 |
+| 1.7483 | 9.46 | 65000 | 2.6833 |
+| 1.7431 | 9.53 | 65500 | 2.6884 |
+| 1.743 | 9.61 | 66000 | 2.6932 |
+| 1.7395 | 9.68 | 66500 | 2.6927 |
+| 1.7473 | 9.75 | 67000 | 2.6904 |
+| 1.7413 | 9.82 | 67500 | 2.6892 |
+| 1.7437 | 9.9 | 68000 | 2.6898 |
+| 1.7546 | 9.97 | 68500 | 2.6894 |
 
 
 ### Framework versions
 
 - Transformers 4.21.0.dev0
-- Pytorch 1.12.0+cu113
+- Pytorch 1.12.0+cu102
 - Datasets 2.3.2
 - Tokenizers 0.12.1
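
For reference, the hyperparameters recorded in this commit map onto a 🤗 Transformers `TrainingArguments` configuration roughly as follows. This is a minimal sketch, not part of the commit: the `generated_from_trainer` tag implies the `Trainer` API was used, but `output_dir` is a placeholder, and the card does not say whether the batch sizes are per device or total.

```python
# Minimal sketch reconstructing the run configuration from the card's
# hyperparameters. Assumptions: Trainer API (implied by the
# `generated_from_trainer` tag); output_dir is a placeholder; batch sizes
# are treated as per-device values.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="clip-roberta-finetuned",  # placeholder, not from the commit
    learning_rate=5e-5,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=256,
    seed=42,
    adam_beta1=0.9,       # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,    # epsilon=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=10.0,
    fp16=True,            # "mixed_precision_training: Native AMP"
)
```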
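The removed line in the second hunk shows the run started from a local `./clip-roberta` checkpoint, which matches the dual-encoder setup (CLIP vision tower plus RoBERTa text tower) built by the `transformers` contrastive image-text example. Assuming that architecture, inference could look like the sketch below; the repo id, image URL, and captions are hypothetical.

```python
# Sketch: image-text similarity scoring, assuming the checkpoint is a
# VisionTextDualEncoderModel (CLIP vision encoder + RoBERTa text encoder).
# The repo id, image URL, and captions are hypothetical placeholders.
import requests
from PIL import Image
from transformers import (
    AutoFeatureExtractor,
    AutoTokenizer,
    VisionTextDualEncoderModel,
)

repo_id = "davanstrien/clip-roberta-finetuned"  # assumed repo id
model = VisionTextDualEncoderModel.from_pretrained(repo_id)
tokenizer = AutoTokenizer.from_pretrained(repo_id)
feature_extractor = AutoFeatureExtractor.from_pretrained(repo_id)

image = Image.open(requests.get("https://example.org/page.jpg", stream=True).raw)
texts = ["an illuminated initial", "a page of printed text"]

inputs = tokenizer(texts, padding=True, return_tensors="pt")
pixel_values = feature_extractor(images=image, return_tensors="pt").pixel_values

outputs = model(
    input_ids=inputs.input_ids,
    attention_mask=inputs.attention_mask,
    pixel_values=pixel_values,
)
# Higher logits mean the caption is a better match for the image.
print(outputs.logits_per_image)
```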