Yurii Paniv committed
Commit 2162892
1 Parent(s): 72470a2

Update model to WER 27.99%

README.md CHANGED
@@ -10,7 +10,7 @@ datasets:
  - common_voice
  model-index:
  - name: wav2vec2-xls-r-300m-uk
- results:
  - task:
  name: Speech Recognition
  type: automatic-speech-recognition
@@ -21,7 +21,7 @@ model-index:
  metrics:
  - name: Test WER
  type: wer
- value: 31.59
  ---

  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -29,11 +29,11 @@ should probably proofread and complete it, then remove this comment. -->

  # wav2vec2-xls-r-300m-uk

- This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the Common Voice 7.0 dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.4754
- - Wer: 0.3159
- - Cer: 0.0739

  ## Model description

@@ -52,149 +52,75 @@ More information needed
  ### Training hyperparameters

  The following hyperparameters were used during training:
- - learning_rate: 0.0005
  - train_batch_size: 8
- - eval_batch_size: 4
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - lr_scheduler_warmup_steps: 500
- - num_epochs: 180.0
  - mixed_precision_training: Native AMP

  ### Training results

- | Training Loss | Epoch | Step | Cer | Validation Loss | Wer |
- |:-------------:|:------:|:------:|:------:|:---------------:|:------:|
- | 3.0162 | 0.12 | 500 | 1.0 | 3.1486 | 1.0 |
- | 1.6532 | 0.24 | 1000 | 0.4583 | 1.3737 | 0.9951 |
- | 1.3941 | 0.37 | 1500 | 0.3709 | 1.1033 | 0.9866 |
- | 1.3275 | 0.49 | 2000 | 0.3487 | 1.0937 | 0.9540 |
- | 1.2648 | 0.61 | 2500 | 0.3137 | 0.9403 | 0.9450 |
- | 1.3085 | 0.73 | 3000 | 0.3090 | 0.9275 | 0.9288 |
- | 1.1934 | 0.85 | 3500 | 0.2816 | 0.8737 | 0.8882 |
- | 1.1909 | 0.98 | 4000 | 0.2780 | 0.8657 | 0.8698 |
- | 1.0647 | 1.1 | 4500 | 0.2660 | 0.8246 | 0.8817 |
- | 1.1362 | 1.22 | 5000 | 0.2711 | 0.8032 | 0.9086 |
- | 1.0994 | 1.34 | 5500 | 0.2462 | 0.7719 | 0.8306 |
- | 1.1 | 1.46 | 6000 | 0.2561 | 0.7853 | 0.8401 |
- | 1.0629 | 1.59 | 6500 | 0.2459 | 0.7809 | 0.8245 |
- | 1.1032 | 1.71 | 7000 | 0.2427 | 0.7638 | 0.8227 |
- | 1.0171 | 1.83 | 7500 | 0.2332 | 0.7411 | 0.8087 |
- | 1.0591 | 1.95 | 8000 | 0.2362 | 0.7332 | 0.8274 |
- | 0.9725 | 2.07 | 8500 | 0.2217 | 0.7190 | 0.7847 |
- | 1.03 | 2.2 | 9000 | 0.2356 | 0.7176 | 0.8255 |
- | 0.9939 | 2.32 | 9500 | 0.2471 | 0.7189 | 0.8653 |
- | 0.9564 | 2.44 | 10000 | 0.2270 | 0.7050 | 0.7984 |
- | 0.966 | 2.56 | 10500 | 0.2200 | 0.6984 | 0.7738 |
- | 0.9858 | 2.68 | 11000 | 0.2255 | 0.6885 | 0.8050 |
- | 0.9484 | 2.81 | 11500 | 0.2183 | 0.6879 | 0.7646 |
- | 0.9244 | 2.93 | 12000 | 0.2166 | 0.6590 | 0.7744 |
- | 0.9224 | 3.05 | 12500 | 0.2035 | 0.6523 | 0.7477 |
- | 0.9148 | 3.17 | 13000 | 0.2054 | 0.6522 | 0.7507 |
- | 0.9227 | 3.29 | 13500 | 0.2037 | 0.6420 | 0.7541 |
- | 0.8935 | 3.42 | 14000 | 0.2014 | 0.6442 | 0.7416 |
- | 0.9257 | 3.54 | 14500 | 0.1986 | 0.6285 | 0.7263 |
- | 0.9194 | 3.66 | 15000 | 0.1938 | 0.6117 | 0.72 |
- | 0.9158 | 3.78 | 15500 | 0.1942 | 0.6197 | 0.7234 |
- | 0.9079 | 3.9 | 16000 | 0.1939 | 0.6110 | 0.7187 |
- | 0.8748 | 4.03 | 16500 | 0.1924 | 0.6182 | 0.7096 |
- | 0.8646 | 4.15 | 17000 | 0.1894 | 0.6105 | 0.7057 |
- | 0.8455 | 4.27 | 17500 | 0.1912 | 0.6236 | 0.7036 |
- | 0.8922 | 4.39 | 18000 | 0.1921 | 0.5946 | 0.7341 |
- | 0.892 | 4.51 | 18500 | 0.1869 | 0.5912 | 0.7142 |
- | 0.8652 | 4.64 | 19000 | 0.1871 | 0.6005 | 0.6966 |
- | 0.899 | 4.76 | 19500 | 0.1828 | 0.5773 | 0.6981 |
- | 0.8552 | 4.88 | 20000 | 0.1805 | 0.5840 | 0.6875 |
- | 0.8581 | 5.0 | 20500 | 0.1900 | 0.5941 | 0.7327 |
- | 0.8571 | 5.12 | 21000 | 0.1846 | 0.5919 | 0.7049 |
- | 0.7979 | 5.25 | 21500 | 0.1748 | 0.5704 | 0.6698 |
- | 0.8348 | 5.37 | 22000 | 0.1789 | 0.5869 | 0.6766 |
- | 0.7843 | 5.49 | 22500 | 0.1750 | 0.5732 | 0.6732 |
- | 0.855 | 5.61 | 23000 | 0.1687 | 0.5448 | 0.6520 |
- | 0.7774 | 5.73 | 23500 | 0.1759 | 0.5685 | 0.6818 |
- | 0.8622 | 5.86 | 24000 | 0.1742 | 0.5598 | 0.6687 |
- | 0.7968 | 5.98 | 24500 | 0.1699 | 0.5589 | 0.6577 |
- | 0.8253 | 6.1 | 25000 | 0.1689 | 0.5601 | 0.6617 |
- | 0.7947 | 6.22 | 25500 | 0.1678 | 0.5527 | 0.6472 |
- | 0.8273 | 6.34 | 26000 | 0.1723 | 0.5426 | 0.6673 |
- | 0.8085 | 6.47 | 26500 | 0.1682 | 0.5464 | 0.6476 |
- | 0.8164 | 6.59 | 27000 | 0.1653 | 0.5460 | 0.6329 |
- | 0.755 | 6.71 | 27500 | 0.1694 | 0.5420 | 0.6614 |
- | 0.822 | 6.83 | 28000 | 0.1699 | 0.5540 | 0.6493 |
- | 0.7957 | 6.95 | 28500 | 0.1630 | 0.5358 | 0.6373 |
- | 0.7739 | 7.08 | 29000 | 0.1727 | 0.5662 | 0.6696 |
- | 0.7833 | 7.2 | 29500 | 0.1594 | 0.5323 | 0.6227 |
- | 0.7737 | 7.32 | 30000 | 0.1613 | 0.5349 | 0.6303 |
- | 0.7697 | 7.44 | 30500 | 0.1623 | 0.5315 | 0.6386 |
- | 0.7647 | 7.56 | 31000 | 0.1608 | 0.5346 | 0.6219 |
- | 0.7123 | 7.69 | 31500 | 0.1561 | 0.5195 | 0.6110 |
- | 0.7412 | 7.81 | 32000 | 0.1613 | 0.5385 | 0.6256 |
- | 0.7702 | 7.93 | 32500 | 0.1614 | 0.5291 | 0.6343 |
- | 0.7561 | 8.05 | 33000 | 0.1553 | 0.5044 | 0.6138 |
- | 0.6707 | 8.78 | 36000 | 0.1484 | 0.4949 | 0.5881 |
- | 0.719 | 9.52 | 39000 | 0.1508 | 0.5014 | 0.5959 |
- | 0.6563 | 10.25 | 42000 | 0.1442 | 0.4852 | 0.5691 |
- | 0.7166 | 10.98 | 45000 | 0.1437 | 0.4731 | 0.5718 |
- | 0.6627 | 11.71 | 48000 | 0.1421 | 0.4787 | 0.5595 |
- | 0.6642 | 12.45 | 51000 | 0.1353 | 0.4787 | 0.5417 |
- | 0.615 | 13.18 | 54000 | 0.1324 | 0.4704 | 0.5297 |
- | 0.6308 | 13.91 | 57000 | 0.1298 | 0.4570 | 0.5181 |
- | 0.6169 | 14.64 | 60000 | 0.1291 | 0.4514 | 0.5106 |
- | 0.5731 | 15.37 | 63000 | 0.1259 | 0.4462 | 0.5028 |
- | 0.5328 | 16.11 | 66000 | 0.1246 | 0.4535 | 0.5023 |
- | 0.5743 | 16.84 | 69000 | 0.1255 | 0.4555 | 0.5069 |
- | 0.5363 | 17.57 | 72000 | 0.1214 | 0.4389 | 0.4915 |
- | 0.5078 | 18.3 | 75000 | 0.1222 | 0.4525 | 0.4915 |
- | 0.5075 | 19.03 | 78000 | 0.1208 | 0.4532 | 0.4871 |
- | 0.5461 | 19.77 | 81000 | 0.1196 | 0.4401 | 0.4813 |
- | 0.5044 | 20.5 | 84000 | 0.1144 | 0.4268 | 0.4654 |
- | 0.4332 | 21.23 | 87000 | 0.1138 | 0.4383 | 0.4626 |
- | 0.4671 | 21.96 | 90000 | 0.1118 | 0.4198 | 0.4547 |
- | 0.4451 | 22.69 | 93000 | 0.1119 | 0.4426 | 0.4509 |
- | 0.4319 | 23.43 | 96000 | 0.1096 | 0.4272 | 0.4472 |
- | 0.3624 | 24.16 | 99000 | 0.1078 | 0.4347 | 0.4437 |
- | 0.4512 | 24.89 | 102000 | 0.1102 | 0.4271 | 0.4471 |
- | 0.4049 | 25.62 | 105000 | 0.1071 | 0.4207 | 0.4349 |
- | 0.4134 | 26.35 | 108000 | 0.1061 | 0.4302 | 0.4351 |
- | 0.4083 | 27.09 | 111000 | 0.1062 | 0.4583 | 0.4320 |
- | 0.4618 | 27.82 | 114000 | 0.1046 | 0.4229 | 0.4281 |
- | 0.4538 | 28.55 | 117000 | 0.1022 | 0.4060 | 0.42 |
- | 0.4378 | 29.28 | 120000 | 0.1030 | 0.4239 | 0.4161 |
- | 0.4062 | 30.01 | 123000 | 0.1012 | 0.4130 | 0.4171 |
- | 0.3903 | 30.75 | 126000 | 0.1006 | 0.4134 | 0.4124 |
- | 0.369 | 31.48 | 129000 | 0.0976 | 0.4163 | 0.4007 |
- | 0.3896 | 32.21 | 132000 | 0.0986 | 0.3985 | 0.4015 |
- | 0.3912 | 32.94 | 135000 | 0.0964 | 0.4103 | 0.3948 |
- | 0.3995 | 33.67 | 138000 | 0.0975 | 0.3962 | 0.4024 |
- | 0.4042 | 34.41 | 141000 | 0.0940 | 0.4196 | 0.3947 |
- | 0.4055 | 35.14 | 144000 | 0.0949 | 0.3956 | 0.3882 |
- | 0.3831 | 35.87 | 147000 | 0.0933 | 0.3962 | 0.3842 |
- | 0.408 | 36.6 | 150000 | 0.0914 | 0.4019 | 0.3781 |
- | 0.3632 | 37.34 | 153000 | 0.0917 | 0.4083 | 0.3814 |
- | 0.381 | 38.07 | 156000 | 0.0914 | 0.4063 | 0.3738 |
- | 0.3891 | 38.8 | 159000 | 0.0900 | 0.4060 | 0.3734 |
- | 0.3668 | 39.53 | 162000 | 0.0893 | 0.4087 | 0.3701 |
- | 0.3243 | 133.39 | 165000 | 0.3808 | 0.3460 | 0.0820 |
- | 0.2861 | 135.81 | 168000 | 0.3986 | 0.3321 | 0.0788 |
- | 0.2684 | 138.24 | 171000 | 0.4015 | 0.3299 | 0.0774 |
- | 0.3027 | 140.66 | 174000 | 0.4023 | 0.3272 | 0.0771 |
- | 0.2742 | 143.09 | 177000 | 0.4133 | 0.3273 | 0.0770 |
- | 0.2339 | 145.51 | 180000 | 0.4287 | 0.3268 | 0.0771 |
- | 0.2547 | 147.94 | 183000 | 0.4396 | 0.3254 | 0.0768 |
- | 0.2072 | 150.36 | 186000 | 0.4586 | 0.3289 | 0.0774 |
- | 0.2444 | 152.79 | 189000 | 0.4524 | 0.3239 | 0.0762 |
- | 0.2272 | 155.21 | 192000 | 0.4620 | 0.3222 | 0.0759 |
- | 0.2102 | 157.64 | 195000 | 0.4533 | 0.3212 | 0.0754 |
- | 0.2231 | 160.06 | 198000 | 0.4563 | 0.3183 | 0.0745 |
- | 0.2096 | 162.49 | 201000 | 0.4669 | 0.3183 | 0.0747 |
- | 0.2173 | 164.92 | 204000 | 0.4704 | 0.3180 | 0.0746 |
- | 0.1797 | 167.34 | 207000 | 0.4653 | 0.3169 | 0.0739 |
- | 0.1841 | 169.77 | 210000 | 0.4726 | 0.3164 | 0.0737 |
- | 0.1774 | 172.19 | 213000 | 0.4742 | 0.3162 | 0.0738 |
- | 0.1819 | 174.62 | 216000 | 0.4720 | 0.3149 | 0.0737 |
- | 0.1746 | 177.04 | 219000 | 0.4736 | 0.3153 | 0.0738 |
- | 0.2101 | 179.47 | 222000 | 0.4756 | 0.3161 | 0.0738 |


  ### Framework versions
 
  - common_voice
  model-index:
  - name: wav2vec2-xls-r-300m-uk
+ results:
  - task:
  name: Speech Recognition
  type: automatic-speech-recognition

  metrics:
  - name: Test WER
  type: wer
+ value: 27.99
  ---

  <!-- This model card has been generated automatically according to the information the Trainer had access to. You

  # wav2vec2-xls-r-300m-uk

+ This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice dataset.
  It achieves the following results on the evaluation set:
+ - Loss: 0.4165
+ - Wer: 0.2799
+ - Cer: 0.0601

  ## Model description

  ### Training hyperparameters

  The following hyperparameters were used during training:
+ - learning_rate: 0.0003
  - train_batch_size: 8
+ - eval_batch_size: 8
  - seed: 42
+ - gradient_accumulation_steps: 20
+ - total_train_batch_size: 160
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - lr_scheduler_warmup_steps: 500
+ - num_epochs: 500
  - mixed_precision_training: Native AMP

  ### Training results

+ | Training Loss | Epoch | Step | Cer | Validation Loss | Wer |
+ |:-------------:|:------:|:-----:|:------:|:---------------:|:------:|
+ | 4.3982 | 9.3 | 400 | 0.1437 | 0.5218 | 0.6507 |
+ | 0.229 | 18.6 | 800 | 0.0848 | 0.3679 | 0.4048 |
+ | 0.1054 | 27.9 | 1200 | 0.0778 | 0.3813 | 0.3670 |
+ | 0.0784 | 37.21 | 1600 | 0.0747 | 0.3839 | 0.3550 |
+ | 0.066 | 46.51 | 2000 | 0.0736 | 0.3970 | 0.3443 |
+ | 0.0603 | 55.8 | 2400 | 0.0722 | 0.3702 | 0.3393 |
+ | 0.0539 | 65.11 | 2800 | 0.0724 | 0.3762 | 0.3388 |
+ | 0.0497 | 74.41 | 3200 | 0.0713 | 0.3623 | 0.3414 |
+ | 0.0432 | 83.71 | 3600 | 0.0725 | 0.3847 | 0.3346 |
+ | 0.0438 | 93.02 | 4000 | 0.0750 | 0.4058 | 0.3393 |
+ | 0.0413 | 102.32 | 4400 | 0.0727 | 0.3957 | 0.3363 |
+ | 0.039 | 111.62 | 4800 | 0.0718 | 0.3865 | 0.3330 |
+ | 0.0356 | 120.92 | 5200 | 0.0711 | 0.3860 | 0.3319 |
+ | 0.0336 | 130.23 | 5600 | 0.0700 | 0.3902 | 0.3242 |
+ | 0.034 | 139.53 | 6000 | 0.0732 | 0.3930 | 0.3337 |
+ | 0.0273 | 148.83 | 6400 | 0.0748 | 0.3912 | 0.3375 |
+ | 0.027 | 158.14 | 6800 | 0.0752 | 0.4266 | 0.3434 |
+ | 0.028 | 167.44 | 7200 | 0.0708 | 0.3895 | 0.3227 |
+ | 0.0241 | 176.73 | 7600 | 0.0727 | 0.3967 | 0.3294 |
+ | 0.0241 | 186.05 | 8000 | 0.0712 | 0.4058 | 0.3255 |
+ | 0.0209 | 195.34 | 8400 | 0.0702 | 0.4102 | 0.3233 |
+ | 0.0206 | 204.64 | 8800 | 0.0699 | 0.4075 | 0.3194 |
+ | 0.0172 | 213.94 | 9200 | 0.0695 | 0.4222 | 0.3191 |
+ | 0.0166 | 223.25 | 9600 | 0.0678 | 0.3860 | 0.3135 |
+ | 0.0156 | 232.55 | 10000 | 0.0677 | 0.4035 | 0.3117 |
+ | 0.0149 | 241.85 | 10400 | 0.0677 | 0.3951 | 0.3087 |
+ | 0.0142 | 251.16 | 10800 | 0.0674 | 0.3972 | 0.3097 |
+ | 0.0134 | 260.46 | 11200 | 0.0675 | 0.4069 | 0.3111 |
+ | 0.0116 | 269.76 | 11600 | 0.0697 | 0.4189 | 0.3161 |
+ | 0.0119 | 279.07 | 12000 | 0.0648 | 0.3902 | 0.3008 |
+ | 0.0098 | 288.37 | 12400 | 0.0652 | 0.4095 | 0.3002 |
+ | 0.0091 | 297.67 | 12800 | 0.0644 | 0.3892 | 0.2990 |
+ | 0.0094 | 306.96 | 13200 | 0.0647 | 0.4026 | 0.2983 |
+ | 0.0081 | 316.28 | 13600 | 0.0646 | 0.4303 | 0.2978 |
+ | 0.0079 | 325.57 | 14000 | 0.0643 | 0.4044 | 0.2980 |
+ | 0.0072 | 334.87 | 14400 | 0.0655 | 0.3828 | 0.2999 |
+ | 0.0081 | 344.18 | 14800 | 0.0668 | 0.4108 | 0.3046 |
+ | 0.0088 | 353.48 | 15200 | 0.0654 | 0.4019 | 0.2993 |
+ | 0.0088 | 362.78 | 15600 | 0.0681 | 0.4073 | 0.3091 |
+ | 0.0079 | 372.09 | 16000 | 0.0667 | 0.4204 | 0.3055 |
+ | 0.0072 | 381.39 | 16400 | 0.0656 | 0.4030 | 0.3028 |
+ | 0.0073 | 390.69 | 16800 | 0.0677 | 0.4032 | 0.3081 |
+ | 0.0069 | 399.99 | 17200 | 0.0669 | 0.4130 | 0.3021 |
+ | 0.0063 | 409.3 | 17600 | 0.0651 | 0.4072 | 0.2979 |
+ | 0.0059 | 418.6 | 18000 | 0.0640 | 0.4110 | 0.2969 |
+ | 0.0056 | 427.9 | 18400 | 0.0647 | 0.4229 | 0.2995 |
+ | 0.005 | 437.21 | 18800 | 0.0624 | 0.4118 | 0.2885 |
+ | 0.0046 | 446.51 | 19200 | 0.0615 | 0.4111 | 0.2841 |
+ | 0.0043 | 455.8 | 19600 | 0.0616 | 0.4071 | 0.2850 |
+ | 0.0038 | 465.11 | 20000 | 0.0624 | 0.4268 | 0.2867 |
+ | 0.0035 | 474.41 | 20400 | 0.0605 | 0.4117 | 0.2820 |
+ | 0.0035 | 483.71 | 20800 | 0.0602 | 0.4155 | 0.2819 |
+ | 0.0034 | 493.02 | 21200 | 0.0601 | 0.4165 | 0.2799 |


  ### Framework versions
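A minimal inference sketch for the updated checkpoint, assuming a local clone of this repository, 16 kHz mono input, and the `transformers`/`torchaudio` packages; the repository path and audio file name are placeholders:

```python
# Sketch: transcribe one clip with the fine-tuned wav2vec2-xls-r-300m-uk checkpoint.
# Assumes a local clone of this repo in the current directory and a mono WAV file.
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_dir = "./"  # placeholder: local clone of this repository
processor = Wav2Vec2Processor.from_pretrained(model_dir)
model = Wav2Vec2ForCTC.from_pretrained(model_dir)
model.eval()

# Load audio and resample to the 16 kHz rate declared in preprocessor_config.json.
waveform, sample_rate = torchaudio.load("sample.wav")  # placeholder file
target_rate = processor.feature_extractor.sampling_rate
if sample_rate != target_rate:
    waveform = torchaudio.functional.resample(waveform, sample_rate, target_rate)

inputs = processor(
    waveform.squeeze().numpy(),
    sampling_rate=target_rate,
    return_tensors="pt",
)

with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits

# Greedy CTC decoding; the WER/CER in the card are computed against transcriptions like this.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```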
added_tokens.json CHANGED
@@ -1 +1 @@
- {"<s>": 37, "</s>": 38}
+ {"<s>": 36, "</s>": 37}
all_results.json DELETED
@@ -1,15 +0,0 @@
1
- {
2
- "epoch": 180.0,
3
- "eval_cer": 0.07386836134410645,
4
- "eval_loss": 0.4753616452217102,
5
- "eval_runtime": 157.9734,
6
- "eval_samples": 4193,
7
- "eval_samples_per_second": 26.542,
8
- "eval_steps_per_second": 6.64,
9
- "eval_wer": 0.31588907014681894,
10
- "train_loss": 0.061750437557927515,
11
- "train_runtime": 29496.4676,
12
- "train_samples": 9889,
13
- "train_samples_per_second": 60.347,
14
- "train_steps_per_second": 7.549
15
- }
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
config.json CHANGED
@@ -52,6 +52,7 @@
  "feat_proj_dropout": 0.0,
  "feat_quantizer_dropout": 0.0,
  "final_dropout": 0.0,
  "hidden_act": "gelu",
  "hidden_dropout": 0.0,
  "hidden_size": 1024,
@@ -59,12 +60,12 @@
  "intermediate_size": 4096,
  "layer_norm_eps": 1e-05,
  "layerdrop": 0.0,
- "mask_feature_length": 64,
  "mask_feature_min_masks": 0,
- "mask_feature_prob": 0.1,
  "mask_time_length": 10,
  "mask_time_min_masks": 2,
- "mask_time_prob": 0.3,
  "model_type": "wav2vec2",
  "num_adapter_layers": 3,
  "num_attention_heads": 16,
@@ -76,10 +77,10 @@
  "num_hidden_layers": 24,
  "num_negatives": 100,
  "output_hidden_size": 1024,
- "pad_token_id": 36,
  "proj_codevector_dim": 768,
  "torch_dtype": "float32",
  "transformers_version": "4.14.1",
  "use_weighted_layer_sum": false,
- "vocab_size": 39
  }

  "feat_proj_dropout": 0.0,
  "feat_quantizer_dropout": 0.0,
  "final_dropout": 0.0,
+ "gradient_checkpointing": false,
  "hidden_act": "gelu",
  "hidden_dropout": 0.0,
  "hidden_size": 1024,

  "intermediate_size": 4096,
  "layer_norm_eps": 1e-05,
  "layerdrop": 0.0,
+ "mask_feature_length": 10,
  "mask_feature_min_masks": 0,
+ "mask_feature_prob": 0.0,
  "mask_time_length": 10,
  "mask_time_min_masks": 2,
+ "mask_time_prob": 0.05,
  "model_type": "wav2vec2",
  "num_adapter_layers": 3,
  "num_attention_heads": 16,

  "num_hidden_layers": 24,
  "num_negatives": 100,
  "output_hidden_size": 1024,
+ "pad_token_id": 35,
  "proj_codevector_dim": 768,
  "torch_dtype": "float32",
  "transformers_version": "4.14.1",
  "use_weighted_layer_sum": false,
+ "vocab_size": 38
  }
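The masking fields changed above are standard `Wav2Vec2Config` arguments in `transformers`; a small sketch of the new values expressed in code, assuming all other fields keep their defaults:

```python
# Sketch: the config.json fields touched by this commit, as Wav2Vec2Config kwargs.
from transformers import Wav2Vec2Config

config = Wav2Vec2Config(
    vocab_size=38,           # 36 entries in vocab.json plus <s>/</s> from added_tokens.json
    pad_token_id=35,         # index of [PAD] in the new vocab.json
    mask_time_prob=0.05,     # SpecAugment time masking, reduced from 0.3
    mask_time_length=10,
    mask_feature_prob=0.0,   # feature-axis masking disabled (was 0.1)
    mask_feature_length=10,  # was 64
)
print(config.mask_time_prob, config.vocab_size)
```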
eval_results.json DELETED
@@ -1,10 +0,0 @@
1
- {
2
- "epoch": 180.0,
3
- "eval_cer": 0.07386836134410645,
4
- "eval_loss": 0.4753616452217102,
5
- "eval_runtime": 157.9734,
6
- "eval_samples": 4193,
7
- "eval_samples_per_second": 26.542,
8
- "eval_steps_per_second": 6.64,
9
- "eval_wer": 0.31588907014681894
10
- }
 
 
 
 
 
 
 
 
 
 
 
preprocessor_config.json CHANGED
@@ -3,7 +3,7 @@
  "feature_extractor_type": "Wav2Vec2FeatureExtractor",
  "feature_size": 1,
  "padding_side": "right",
- "padding_value": 0,
  "return_attention_mask": true,
  "sampling_rate": 16000
  }

  "feature_extractor_type": "Wav2Vec2FeatureExtractor",
  "feature_size": 1,
  "padding_side": "right",
+ "padding_value": 0.0,
  "return_attention_mask": true,
  "sampling_rate": 16000
  }
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:6fec030a8b93bf8846c4285ae762c4b49729965350f37f48ffae03557b3e1d55
- size 1262083569

  version https://git-lfs.github.com/spec/v1
+ oid sha256:d59ff36212cbdd264b1851e4e7f8ebc8f3108665363c9ed12efdd9f2754c7337
+ size 1262079473
special_tokens_map.json CHANGED
@@ -1 +1 @@
- {"bos_token": "<s>", "eos_token": "</s>", "unk_token": "[UNK]", "pad_token": "[PAD]", "additional_special_tokens": [{"content": "<s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, {"content": "</s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, {"content": "<s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, {"content": "</s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, {"content": "<s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, {"content": "</s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, {"content": "<s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, {"content": "</s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, {"content": "<s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, {"content": "</s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, {"content": "<s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, {"content": "</s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, {"content": "<s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, {"content": "</s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}]}
+ {"bos_token": "<s>", "eos_token": "</s>", "unk_token": "[UNK]", "pad_token": "[PAD]", "additional_special_tokens": [{"content": "<s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, {"content": "</s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}]}
tokenizer_config.json CHANGED
@@ -1 +1 @@
- {"unk_token": "[UNK]", "bos_token": "<s>", "eos_token": "</s>", "pad_token": "[PAD]", "do_lower_case": false, "word_delimiter_token": "|", "special_tokens_map_file": null, "tokenizer_file": null, "name_or_path": "./wav2vec2-xls-r-300m-uk", "tokenizer_class": "Wav2Vec2CTCTokenizer"}
+ {"unk_token": "[UNK]", "bos_token": "<s>", "eos_token": "</s>", "pad_token": "[PAD]", "do_lower_case": false, "word_delimiter_token": "|", "special_tokens_map_file": null, "tokenizer_file": null, "name_or_path": "./", "tokenizer_class": "Wav2Vec2CTCTokenizer"}
train_results.json DELETED
@@ -1,8 +0,0 @@
1
- {
2
- "epoch": 180.0,
3
- "train_loss": 0.061750437557927515,
4
- "train_runtime": 29496.4676,
5
- "train_samples": 9889,
6
- "train_samples_per_second": 60.347,
7
- "train_steps_per_second": 7.549
8
- }
 
 
 
 
 
 
 
 
 
trainer_state.json CHANGED
The diff for this file is too large to render. See raw diff
 
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:4eaf0322682a5dc5352958832b28880acc9c8d1d79076deb7e8bccb9368ac2f8
  size 2927

  version https://git-lfs.github.com/spec/v1
+ oid sha256:9a9f2d55e2d30c3efb297191f66a757c1a52ea11bcbc472646289172e2a0d182
  size 2927
vocab.json CHANGED
@@ -1 +1 @@
- {"ʼ": 1, "а": 2, "б": 3, "в": 4, "г": 5, "д": 6, "е": 7, "ж": 8, "з": 9, "и": 10, "й": 11, "к": 12, "л": 13, "м": 14, "н": 15, "о": 16, "п": 17, "р": 18, "с": 19, "т": 20, "у": 21, "ф": 22, "х": 23, "ц": 24, "ч": 25, "ш": 26, "щ": 27, "ь": 28, "ю": 29, "я": 30, "є": 31, "і": 32, "ї": 33, "ґ": 34, "|": 0, "[UNK]": 35, "[PAD]": 36}
+ {"а": 1, "б": 2, "в": 3, "г": 4, "д": 5, "е": 6, "ж": 7, "з": 8, "и": 9, "й": 10, "к": 11, "л": 12, "м": 13, "н": 14, "о": 15, "п": 16, "р": 17, "с": 18, "т": 19, "у": 20, "ф": 21, "х": 22, "ц": 23, "ч": 24, "ш": 25, "щ": 26, "ь": 27, "ю": 28, "я": 29, "є": 30, "і": 31, "ї": 32, "ґ": 33, "|": 0, "[UNK]": 34, "[PAD]": 35}