kksukk committed
Commit dace55c
1 Parent(s): 24671c5

update model card README.md

Files changed (1):
1. README.md (added, +285 −0)
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- zeroth_korean_asr
metrics:
- wer
model-index:
- name: hubert_zeroth_gpu
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: zeroth_korean_asr
      type: zeroth_korean_asr
      config: clean
      split: train
      args: clean
    metrics:
    - name: Wer
      type: wer
      value: 1.0
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# hubert_zeroth_gpu

This model is a fine-tuned version of [facebook/hubert-base-ls960](https://huggingface.co/facebook/hubert-base-ls960) on the zeroth_korean_asr dataset.
It achieves the following results on the evaluation set:
- Loss: 4.8302
- Wer: 1.0

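A minimal inference sketch with 🤗 Transformers is shown below. The repository id `kksukk/hubert_zeroth_gpu` and the audio file name are assumptions for illustration, and the example assumes the checkpoint ships its processor and CTC tokenizer; given the reported WER of 1.0, the decoded output is likely to be empty or degenerate.

```python
import torch
import librosa
from transformers import AutoProcessor, HubertForCTC

# Assumed repository id for this checkpoint; adjust to the actual path.
model_id = "kksukk/hubert_zeroth_gpu"

processor = AutoProcessor.from_pretrained(model_id)
model = HubertForCTC.from_pretrained(model_id).eval()

# hubert-base-ls960 expects 16 kHz mono audio; "sample.wav" is a placeholder.
speech, _ = librosa.load("sample.wav", sr=16_000)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding of the most likely token at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```
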
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after the list):
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
- mixed_precision_training: Native AMP

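As a rough illustration, the values above map onto a `TrainingArguments` configuration along the following lines. This is a sketch, not the original training script: `output_dir`, the logging cadence, and the 100-step evaluation interval (inferred from the results table below) are assumptions.

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters above expressed as TrainingArguments.
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults.
training_args = TrainingArguments(
    output_dir="hubert_zeroth_gpu",  # assumed
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,   # effective train batch size of 32
    num_train_epochs=30,
    lr_scheduler_type="linear",
    warmup_steps=500,
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision
    evaluation_strategy="steps",
    eval_steps=100,                  # matches the 100-step cadence in the table
    logging_steps=100,
)
```
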
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:---:|
| 26.5222 | 0.14 | 100 | 10.9084 | 1.0 |
| 6.6076 | 0.29 | 200 | 4.8783 | 1.0 |
| 4.8383 | 0.43 | 300 | 4.8768 | 1.0 |
| 4.8372 | 0.57 | 400 | 4.8608 | 1.0 |
| 4.8298 | 0.72 | 500 | 4.8625 | 1.0 |
| 4.8377 | 0.86 | 600 | 4.8646 | 1.0 |
| 4.829 | 1.01 | 700 | 4.8472 | 1.0 |
| 4.8282 | 1.15 | 800 | 4.8435 | 1.0 |
| 4.8282 | 1.29 | 900 | 4.8438 | 1.0 |
| 4.8299 | 1.44 | 1000 | 4.8540 | 1.0 |
| 4.8276 | 1.58 | 1100 | 4.8408 | 1.0 |
| 4.8306 | 1.72 | 1200 | 4.8390 | 1.0 |
| 4.8315 | 1.87 | 1300 | 4.8426 | 1.0 |
| 4.8296 | 2.01 | 1400 | 4.8418 | 1.0 |
| 4.829 | 2.16 | 1500 | 4.8475 | 1.0 |
| 4.8324 | 2.3 | 1600 | 4.8409 | 1.0 |
| 4.8299 | 2.44 | 1700 | 4.8360 | 1.0 |
| 4.8285 | 2.59 | 1800 | 4.8419 | 1.0 |
| 4.8267 | 2.73 | 1900 | 4.8355 | 1.0 |
| 4.8232 | 2.87 | 2000 | 4.8445 | 1.0 |
| 4.8179 | 3.02 | 2100 | 4.8390 | 1.0 |
| 4.8248 | 3.16 | 2200 | 4.8506 | 1.0 |
| 4.8184 | 3.3 | 2300 | 4.8392 | 1.0 |
| 4.8268 | 3.45 | 2400 | 4.8509 | 1.0 |
| 4.8315 | 3.59 | 2500 | 4.8469 | 1.0 |
| 4.8249 | 3.74 | 2600 | 4.8457 | 1.0 |
| 4.8244 | 3.88 | 2700 | 4.8414 | 1.0 |
| 4.8226 | 4.02 | 2800 | 4.8333 | 1.0 |
| 4.8275 | 4.17 | 2900 | 4.8344 | 1.0 |
| 4.8218 | 4.31 | 3000 | 4.8351 | 1.0 |
| 4.8199 | 4.45 | 3100 | 4.8386 | 1.0 |
| 4.825 | 4.6 | 3200 | 4.8344 | 1.0 |
| 4.828 | 4.74 | 3300 | 4.8372 | 1.0 |
| 4.8228 | 4.89 | 3400 | 4.8349 | 1.0 |
| 4.8264 | 5.03 | 3500 | 4.8344 | 1.0 |
| 4.8237 | 5.17 | 3600 | 4.8332 | 1.0 |
| 4.8269 | 5.32 | 3700 | 4.8376 | 1.0 |
| 4.833 | 5.46 | 3800 | 4.8380 | 1.0 |
| 4.8188 | 5.6 | 3900 | 4.8352 | 1.0 |
| 4.8208 | 5.75 | 4000 | 4.8354 | 1.0 |
| 4.8177 | 5.89 | 4100 | 4.8291 | 1.0 |
| 4.8208 | 6.03 | 4200 | 4.8500 | 1.0 |
| 4.8242 | 6.18 | 4300 | 4.8369 | 1.0 |
| 4.8222 | 6.32 | 4400 | 4.8366 | 1.0 |
| 4.8259 | 6.47 | 4500 | 4.8369 | 1.0 |
| 4.8231 | 6.61 | 4600 | 4.8319 | 1.0 |
| 4.825 | 6.75 | 4700 | 4.8363 | 1.0 |
| 4.8245 | 6.9 | 4800 | 4.8420 | 1.0 |
| 4.8139 | 7.04 | 4900 | 4.8427 | 1.0 |
| 4.8202 | 7.18 | 5000 | 4.8393 | 1.0 |
| 4.8196 | 7.33 | 5100 | 4.8380 | 1.0 |
| 4.8199 | 7.47 | 5200 | 4.8364 | 1.0 |
| 4.8264 | 7.61 | 5300 | 4.8414 | 1.0 |
| 4.8259 | 7.76 | 5400 | 4.8397 | 1.0 |
| 4.8215 | 7.9 | 5500 | 4.8376 | 1.0 |
| 4.8198 | 8.05 | 5600 | 4.8344 | 1.0 |
| 4.828 | 8.19 | 5700 | 4.8314 | 1.0 |
| 4.8246 | 8.33 | 5800 | 4.8361 | 1.0 |
| 4.8167 | 8.48 | 5900 | 4.8336 | 1.0 |
| 4.8174 | 8.62 | 6000 | 4.8345 | 1.0 |
| 4.8283 | 8.76 | 6100 | 4.8363 | 1.0 |
| 4.8231 | 8.91 | 6200 | 4.8345 | 1.0 |
| 4.8191 | 9.05 | 6300 | 4.8327 | 1.0 |
| 4.8144 | 9.2 | 6400 | 4.8299 | 1.0 |
| 4.8206 | 9.34 | 6500 | 4.8281 | 1.0 |
| 4.822 | 9.48 | 6600 | 4.8329 | 1.0 |
| 4.8228 | 9.63 | 6700 | 4.8309 | 1.0 |
| 4.8239 | 9.77 | 6800 | 4.8348 | 1.0 |
| 4.8245 | 9.91 | 6900 | 4.8309 | 1.0 |
| 4.8173 | 10.06 | 7000 | 4.8303 | 1.0 |
| 4.8188 | 10.2 | 7100 | 4.8335 | 1.0 |
| 4.8208 | 10.34 | 7200 | 4.8290 | 1.0 |
| 4.8228 | 10.49 | 7300 | 4.8316 | 1.0 |
| 4.8226 | 10.63 | 7400 | 4.8272 | 1.0 |
| 4.824 | 10.78 | 7500 | 4.8309 | 1.0 |
| 4.8175 | 10.92 | 7600 | 4.8317 | 1.0 |
| 4.8234 | 11.06 | 7700 | 4.8271 | 1.0 |
| 4.8188 | 11.21 | 7800 | 4.8291 | 1.0 |
| 4.8182 | 11.35 | 7900 | 4.8340 | 1.0 |
| 4.8224 | 11.49 | 8000 | 4.8309 | 1.0 |
| 4.8207 | 11.64 | 8100 | 4.8308 | 1.0 |
| 4.8207 | 11.78 | 8200 | 4.8301 | 1.0 |
| 4.822 | 11.93 | 8300 | 4.8281 | 1.0 |
| 4.8199 | 12.07 | 8400 | 4.8301 | 1.0 |
| 4.8198 | 12.21 | 8500 | 4.8337 | 1.0 |
| 4.8212 | 12.36 | 8600 | 4.8310 | 1.0 |
| 4.8211 | 12.5 | 8700 | 4.8304 | 1.0 |
| 4.8226 | 12.64 | 8800 | 4.8303 | 1.0 |
| 4.8224 | 12.79 | 8900 | 4.8312 | 1.0 |
| 4.8146 | 12.93 | 9000 | 4.8362 | 1.0 |
| 4.8173 | 13.07 | 9100 | 4.8321 | 1.0 |
| 4.816 | 13.22 | 9200 | 4.8347 | 1.0 |
| 4.8219 | 13.36 | 9300 | 4.8377 | 1.0 |
| 4.8251 | 13.51 | 9400 | 4.8403 | 1.0 |
| 4.8173 | 13.65 | 9500 | 4.8387 | 1.0 |
| 4.8226 | 13.79 | 9600 | 4.8375 | 1.0 |
| 4.8137 | 13.94 | 9700 | 4.8364 | 1.0 |
| 4.819 | 14.08 | 9800 | 4.8323 | 1.0 |
| 4.8258 | 14.22 | 9900 | 4.8329 | 1.0 |
| 4.8097 | 14.37 | 10000 | 4.8293 | 1.0 |
| 4.8247 | 14.51 | 10100 | 4.8311 | 1.0 |
| 4.8197 | 14.66 | 10200 | 4.8306 | 1.0 |
| 4.8201 | 14.8 | 10300 | 4.8308 | 1.0 |
| 4.8158 | 14.94 | 10400 | 4.8319 | 1.0 |
| 4.818 | 15.09 | 10500 | 4.8306 | 1.0 |
| 4.8216 | 15.23 | 10600 | 4.8343 | 1.0 |
| 4.8096 | 15.37 | 10700 | 4.8326 | 1.0 |
| 4.8248 | 15.52 | 10800 | 4.8323 | 1.0 |
| 4.8178 | 15.66 | 10900 | 4.8358 | 1.0 |
| 4.8191 | 15.8 | 11000 | 4.8338 | 1.0 |
| 4.8248 | 15.95 | 11100 | 4.8359 | 1.0 |
| 4.8095 | 16.09 | 11200 | 4.8392 | 1.0 |
| 4.8196 | 16.24 | 11300 | 4.8374 | 1.0 |
| 4.827 | 16.38 | 11400 | 4.8346 | 1.0 |
| 4.8165 | 16.52 | 11500 | 4.8365 | 1.0 |
| 4.8206 | 16.67 | 11600 | 4.8344 | 1.0 |
| 4.8169 | 16.81 | 11700 | 4.8344 | 1.0 |
| 4.8164 | 16.95 | 11800 | 4.8390 | 1.0 |
| 4.8159 | 17.1 | 11900 | 4.8367 | 1.0 |
| 4.8202 | 17.24 | 12000 | 4.8375 | 1.0 |
| 4.8156 | 17.39 | 12100 | 4.8362 | 1.0 |
| 4.8174 | 17.53 | 12200 | 4.8410 | 1.0 |
| 4.8188 | 17.67 | 12300 | 4.8323 | 1.0 |
| 4.8167 | 17.82 | 12400 | 4.8319 | 1.0 |
| 4.8229 | 17.96 | 12500 | 4.8347 | 1.0 |
| 4.8179 | 18.1 | 12600 | 4.8320 | 1.0 |
| 4.8182 | 18.25 | 12700 | 4.8384 | 1.0 |
| 4.8151 | 18.39 | 12800 | 4.8374 | 1.0 |
| 4.8212 | 18.53 | 12900 | 4.8346 | 1.0 |
| 4.8241 | 18.68 | 13000 | 4.8344 | 1.0 |
| 4.8184 | 18.82 | 13100 | 4.8352 | 1.0 |
| 4.8174 | 18.97 | 13200 | 4.8357 | 1.0 |
| 4.8092 | 19.11 | 13300 | 4.8332 | 1.0 |
| 4.8149 | 19.25 | 13400 | 4.8347 | 1.0 |
| 4.813 | 19.4 | 13500 | 4.8376 | 1.0 |
| 4.8226 | 19.54 | 13600 | 4.8343 | 1.0 |
| 4.8175 | 19.68 | 13700 | 4.8320 | 1.0 |
| 4.8203 | 19.83 | 13800 | 4.8339 | 1.0 |
| 4.8227 | 19.97 | 13900 | 4.8324 | 1.0 |
| 4.8177 | 20.11 | 14000 | 4.8356 | 1.0 |
| 4.824 | 20.26 | 14100 | 4.8339 | 1.0 |
| 4.815 | 20.4 | 14200 | 4.8342 | 1.0 |
| 4.8189 | 20.55 | 14300 | 4.8340 | 1.0 |
| 4.8115 | 20.69 | 14400 | 4.8319 | 1.0 |
| 4.8162 | 20.83 | 14500 | 4.8288 | 1.0 |
| 4.8183 | 20.98 | 14600 | 4.8321 | 1.0 |
| 4.8189 | 21.12 | 14700 | 4.8315 | 1.0 |
| 4.8123 | 21.26 | 14800 | 4.8311 | 1.0 |
| 4.8165 | 21.41 | 14900 | 4.8321 | 1.0 |
| 4.8247 | 21.55 | 15000 | 4.8309 | 1.0 |
| 4.8165 | 21.7 | 15100 | 4.8313 | 1.0 |
| 4.815 | 21.84 | 15200 | 4.8354 | 1.0 |
| 4.8234 | 21.98 | 15300 | 4.8300 | 1.0 |
| 4.8134 | 22.13 | 15400 | 4.8284 | 1.0 |
| 4.8178 | 22.27 | 15500 | 4.8298 | 1.0 |
| 4.8128 | 22.41 | 15600 | 4.8309 | 1.0 |
| 4.8185 | 22.56 | 15700 | 4.8291 | 1.0 |
| 4.8177 | 22.7 | 15800 | 4.8288 | 1.0 |
| 4.8208 | 22.84 | 15900 | 4.8306 | 1.0 |
| 4.8183 | 22.99 | 16000 | 4.8277 | 1.0 |
| 4.8135 | 23.13 | 16100 | 4.8286 | 1.0 |
| 4.8116 | 23.28 | 16200 | 4.8275 | 1.0 |
| 4.816 | 23.42 | 16300 | 4.8290 | 1.0 |
| 4.8203 | 23.56 | 16400 | 4.8292 | 1.0 |
| 4.8198 | 23.71 | 16500 | 4.8299 | 1.0 |
| 4.8203 | 23.85 | 16600 | 4.8294 | 1.0 |
| 4.8177 | 23.99 | 16700 | 4.8286 | 1.0 |
| 4.8153 | 24.14 | 16800 | 4.8275 | 1.0 |
| 4.8201 | 24.28 | 16900 | 4.8259 | 1.0 |
| 4.8189 | 24.43 | 17000 | 4.8289 | 1.0 |
| 4.8219 | 24.57 | 17100 | 4.8280 | 1.0 |
| 4.8148 | 24.71 | 17200 | 4.8284 | 1.0 |
| 4.8113 | 24.86 | 17300 | 4.8286 | 1.0 |
| 4.8133 | 25.0 | 17400 | 4.8293 | 1.0 |
| 4.8164 | 25.14 | 17500 | 4.8302 | 1.0 |
| 4.8231 | 25.29 | 17600 | 4.8278 | 1.0 |
| 4.8136 | 25.43 | 17700 | 4.8296 | 1.0 |
| 4.8118 | 25.57 | 17800 | 4.8288 | 1.0 |
| 4.8139 | 25.72 | 17900 | 4.8280 | 1.0 |
| 4.8144 | 25.86 | 18000 | 4.8282 | 1.0 |
| 4.8206 | 26.01 | 18100 | 4.8279 | 1.0 |
| 4.8096 | 26.15 | 18200 | 4.8281 | 1.0 |
| 4.8177 | 26.29 | 18300 | 4.8271 | 1.0 |
| 4.8222 | 26.44 | 18400 | 4.8289 | 1.0 |
| 4.8148 | 26.58 | 18500 | 4.8282 | 1.0 |
| 4.8148 | 26.72 | 18600 | 4.8277 | 1.0 |
| 4.819 | 26.87 | 18700 | 4.8283 | 1.0 |
| 4.8138 | 27.01 | 18800 | 4.8290 | 1.0 |
| 4.8094 | 27.16 | 18900 | 4.8292 | 1.0 |
| 4.8236 | 27.3 | 19000 | 4.8282 | 1.0 |
| 4.8208 | 27.44 | 19100 | 4.8293 | 1.0 |
| 4.816 | 27.59 | 19200 | 4.8281 | 1.0 |
| 4.8103 | 27.73 | 19300 | 4.8294 | 1.0 |
| 4.8152 | 27.87 | 19400 | 4.8297 | 1.0 |
| 4.8158 | 28.02 | 19500 | 4.8305 | 1.0 |
| 4.8121 | 28.16 | 19600 | 4.8294 | 1.0 |
| 4.8199 | 28.3 | 19700 | 4.8292 | 1.0 |
| 4.8185 | 28.45 | 19800 | 4.8288 | 1.0 |
| 4.8199 | 28.59 | 19900 | 4.8288 | 1.0 |
| 4.8102 | 28.74 | 20000 | 4.8292 | 1.0 |
| 4.8168 | 28.88 | 20100 | 4.8291 | 1.0 |
| 4.8117 | 29.02 | 20200 | 4.8304 | 1.0 |
| 4.8156 | 29.17 | 20300 | 4.8295 | 1.0 |
| 4.8126 | 29.31 | 20400 | 4.8296 | 1.0 |
| 4.8193 | 29.45 | 20500 | 4.8302 | 1.0 |
| 4.8175 | 29.6 | 20600 | 4.8301 | 1.0 |
| 4.8167 | 29.74 | 20700 | 4.8301 | 1.0 |
| 4.8137 | 29.89 | 20800 | 4.8302 | 1.0 |

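The Wer column above is the word error rate; a value of 1.0 means no reference word was recovered at any evaluation step. A minimal sketch of how such a score can be reproduced with the `evaluate` library follows (the original run may have computed WER differently, and the strings are placeholder examples):

```python
import evaluate

# Word error rate as reported in the table above; strings are placeholders.
wer_metric = evaluate.load("wer")

references = ["안녕하세요 반갑습니다"]  # what the speaker actually said
predictions = ["다른 출력"]            # a hypothesis with no correct words

# 2 substitutions over 2 reference words -> WER = 1.0
print(wer_metric.compute(predictions=predictions, references=references))
```
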
### Framework versions

- Transformers 4.24.0
- Pytorch 1.13.0+cu117
- Datasets 2.0.0
- Tokenizers 0.13.2