g8a9 committed 6bb2849 (1 parent: 644389b)

update model card README.md

Files changed (1): README.md added (+239 -0)
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: wav2vec2-xls-r-300m-italian-augmented-multids
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# wav2vec2-xls-r-300m-italian-augmented-multids

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unspecified dataset (the dataset name was not recorded by the Trainer).
It achieves the following results on the evaluation set (a note on recomputing WER follows the list):
- Loss: 0.1633
- Wer: 0.1636

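WER (word error rate) is the word-level edit distance between the model's transcripts and the reference transcripts, divided by the number of reference words. As an illustration only (the evaluation script and data are not part of this card), the metric can be recomputed with the `wer` metric shipped with the `datasets` version listed under Framework versions; the transcripts below are made-up placeholders:

```python
from datasets import load_metric  # datasets 1.18.x; the "wer" metric uses jiwer internally

# Placeholder transcripts for illustration; the card does not include the evaluation data.
references = ["buongiorno a tutti", "come stai oggi"]
predictions = ["buongiorno a tutti", "come stai"]

wer_metric = load_metric("wer")
print(wer_metric.compute(predictions=predictions, references=references))
# WER = (substitutions + insertions + deletions) / total reference words
```
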
## Model description

More information needed

## Intended uses & limitations

More information needed

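The card does not yet document intended usage, so the following is only a sketch of how a fine-tuned wav2vec2 CTC checkpoint such as this one is typically used for Italian transcription with the `transformers` ASR pipeline. The repository id below is assumed from the model name and the committer's namespace and may not match the actual Hub path:

```python
from transformers import pipeline

# Assumed Hub repository id; replace it with the actual path to this checkpoint.
MODEL_ID = "g8a9/wav2vec2-xls-r-300m-italian-augmented-multids"

asr = pipeline("automatic-speech-recognition", model=MODEL_ID)

# Any audio file readable by ffmpeg; the pipeline resamples it to the model's 16 kHz input.
print(asr("sample_italian.wav")["text"])
```
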
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a rough `TrainingArguments` equivalent is sketched after the list):
- learning_rate: 0.0003
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10.0
- mixed_precision_training: Native AMP

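As a rough sketch only (the actual training script is not included in this card), the values above correspond approximately to the following `transformers.TrainingArguments`; `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

# Approximate mapping of the hyperparameters listed above; not the original script.
training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-italian-augmented-multids",  # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=10.0,
    fp16=True,  # "Native AMP" mixed-precision training
)
```
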
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| No log | 0.06 | 400 | 0.7508 | 0.7354 |
| 2.3127 | 0.11 | 800 | 0.5888 | 0.5882 |
| 0.7256 | 0.17 | 1200 | 0.5121 | 0.5247 |
| 0.6692 | 0.22 | 1600 | 0.4774 | 0.5028 |
| 0.6384 | 0.28 | 2000 | 0.4832 | 0.4885 |
| 0.6384 | 0.33 | 2400 | 0.4410 | 0.4581 |
| 0.6199 | 0.39 | 2800 | 0.4160 | 0.4331 |
| 0.5972 | 0.44 | 3200 | 0.4136 | 0.4275 |
| 0.6048 | 0.5 | 3600 | 0.4362 | 0.4538 |
| 0.5627 | 0.55 | 4000 | 0.4313 | 0.4469 |
| 0.5627 | 0.61 | 4400 | 0.4425 | 0.4579 |
| 0.5855 | 0.66 | 4800 | 0.3859 | 0.4133 |
| 0.5702 | 0.72 | 5200 | 0.3974 | 0.4097 |
| 0.55 | 0.77 | 5600 | 0.3931 | 0.4134 |
| 0.5624 | 0.83 | 6000 | 0.3900 | 0.4126 |
| 0.5624 | 0.88 | 6400 | 0.3622 | 0.3899 |
| 0.5615 | 0.94 | 6800 | 0.3755 | 0.4067 |
| 0.5472 | 0.99 | 7200 | 0.3980 | 0.4284 |
| 0.5663 | 1.05 | 7600 | 0.3553 | 0.3782 |
| 0.5189 | 1.1 | 8000 | 0.3538 | 0.3726 |
| 0.5189 | 1.16 | 8400 | 0.3425 | 0.3624 |
| 0.518 | 1.21 | 8800 | 0.3431 | 0.3651 |
| 0.5399 | 1.27 | 9200 | 0.3442 | 0.3573 |
| 0.5303 | 1.32 | 9600 | 0.3241 | 0.3404 |
| 0.5043 | 1.38 | 10000 | 0.3175 | 0.3378 |
| 0.5043 | 1.43 | 10400 | 0.3265 | 0.3501 |
| 0.4968 | 1.49 | 10800 | 0.3539 | 0.3703 |
| 0.5102 | 1.54 | 11200 | 0.3323 | 0.3506 |
| 0.5008 | 1.6 | 11600 | 0.3188 | 0.3433 |
| 0.4996 | 1.65 | 12000 | 0.3162 | 0.3388 |
| 0.4996 | 1.71 | 12400 | 0.3353 | 0.3552 |
| 0.5007 | 1.76 | 12800 | 0.3152 | 0.3317 |
| 0.4956 | 1.82 | 13200 | 0.3207 | 0.3430 |
| 0.5205 | 1.87 | 13600 | 0.3239 | 0.3430 |
| 0.4829 | 1.93 | 14000 | 0.3134 | 0.3266 |
| 0.4829 | 1.98 | 14400 | 0.3039 | 0.3291 |
| 0.5251 | 2.04 | 14800 | 0.2944 | 0.3169 |
| 0.4872 | 2.09 | 15200 | 0.3061 | 0.3228 |
| 0.4805 | 2.15 | 15600 | 0.3034 | 0.3152 |
| 0.4949 | 2.2 | 16000 | 0.2896 | 0.3066 |
| 0.4949 | 2.26 | 16400 | 0.3059 | 0.3344 |
| 0.468 | 2.31 | 16800 | 0.2932 | 0.3111 |
| 0.4637 | 2.37 | 17200 | 0.2890 | 0.3074 |
| 0.4638 | 2.42 | 17600 | 0.2893 | 0.3112 |
| 0.4728 | 2.48 | 18000 | 0.2832 | 0.3013 |
| 0.4728 | 2.54 | 18400 | 0.2921 | 0.3065 |
| 0.456 | 2.59 | 18800 | 0.2961 | 0.3104 |
| 0.4628 | 2.65 | 19200 | 0.2886 | 0.3109 |
| 0.4534 | 2.7 | 19600 | 0.2828 | 0.3020 |
| 0.4578 | 2.76 | 20000 | 0.2805 | 0.3026 |
| 0.4578 | 2.81 | 20400 | 0.2796 | 0.2987 |
| 0.4702 | 2.87 | 20800 | 0.2748 | 0.2906 |
| 0.4487 | 2.92 | 21200 | 0.2819 | 0.3008 |
| 0.4411 | 2.98 | 21600 | 0.2722 | 0.2868 |
| 0.4631 | 3.03 | 22000 | 0.2814 | 0.2974 |
| 0.4631 | 3.09 | 22400 | 0.2762 | 0.2894 |
| 0.4591 | 3.14 | 22800 | 0.2802 | 0.2980 |
| 0.4349 | 3.2 | 23200 | 0.2748 | 0.2951 |
| 0.4339 | 3.25 | 23600 | 0.2792 | 0.2927 |
| 0.4254 | 3.31 | 24000 | 0.2712 | 0.2911 |
| 0.4254 | 3.36 | 24400 | 0.2719 | 0.2892 |
| 0.4317 | 3.42 | 24800 | 0.2686 | 0.2861 |
| 0.4282 | 3.47 | 25200 | 0.2632 | 0.2861 |
| 0.4262 | 3.53 | 25600 | 0.2633 | 0.2817 |
| 0.4162 | 3.58 | 26000 | 0.2561 | 0.2765 |
| 0.4162 | 3.64 | 26400 | 0.2613 | 0.2847 |
| 0.414 | 3.69 | 26800 | 0.2679 | 0.2824 |
| 0.4132 | 3.75 | 27200 | 0.2569 | 0.2813 |
| 0.405 | 3.8 | 27600 | 0.2589 | 0.2785 |
| 0.4128 | 3.86 | 28000 | 0.2611 | 0.2714 |
| 0.4128 | 3.91 | 28400 | 0.2548 | 0.2731 |
| 0.4174 | 3.97 | 28800 | 0.2574 | 0.2716 |
| 0.421 | 4.02 | 29200 | 0.2529 | 0.2700 |
| 0.4109 | 4.08 | 29600 | 0.2547 | 0.2682 |
| 0.4027 | 4.13 | 30000 | 0.2578 | 0.2758 |
| 0.4027 | 4.19 | 30400 | 0.2511 | 0.2715 |
| 0.4075 | 4.24 | 30800 | 0.2507 | 0.2601 |
| 0.3947 | 4.3 | 31200 | 0.2552 | 0.2711 |
| 0.4042 | 4.35 | 31600 | 0.2530 | 0.2695 |
| 0.3907 | 4.41 | 32000 | 0.2543 | 0.2738 |
| 0.3907 | 4.46 | 32400 | 0.2491 | 0.2629 |
| 0.3895 | 4.52 | 32800 | 0.2471 | 0.2611 |
| 0.3901 | 4.57 | 33200 | 0.2404 | 0.2559 |
| 0.3818 | 4.63 | 33600 | 0.2378 | 0.2583 |
| 0.3831 | 4.68 | 34000 | 0.2341 | 0.2499 |
| 0.3831 | 4.74 | 34400 | 0.2379 | 0.2560 |
| 0.3808 | 4.79 | 34800 | 0.2418 | 0.2553 |
| 0.4015 | 4.85 | 35200 | 0.2378 | 0.2565 |
| 0.407 | 4.9 | 35600 | 0.2375 | 0.2535 |
| 0.38 | 4.96 | 36000 | 0.2329 | 0.2451 |
| 0.38 | 5.02 | 36400 | 0.2541 | 0.2737 |
| 0.3753 | 5.07 | 36800 | 0.2475 | 0.2580 |
| 0.3701 | 5.13 | 37200 | 0.2356 | 0.2484 |
| 0.3627 | 5.18 | 37600 | 0.2422 | 0.2552 |
| 0.3652 | 5.24 | 38000 | 0.2353 | 0.2518 |
| 0.3652 | 5.29 | 38400 | 0.2328 | 0.2452 |
| 0.3667 | 5.35 | 38800 | 0.2358 | 0.2478 |
| 0.3711 | 5.4 | 39200 | 0.2340 | 0.2463 |
| 0.361 | 5.46 | 39600 | 0.2375 | 0.2452 |
| 0.3655 | 5.51 | 40000 | 0.2292 | 0.2387 |
| 0.3655 | 5.57 | 40400 | 0.2330 | 0.2432 |
| 0.3637 | 5.62 | 40800 | 0.2242 | 0.2396 |
| 0.3516 | 5.68 | 41200 | 0.2284 | 0.2394 |
| 0.3498 | 5.73 | 41600 | 0.2254 | 0.2343 |
| 0.3626 | 5.79 | 42000 | 0.2191 | 0.2318 |
| 0.3626 | 5.84 | 42400 | 0.2261 | 0.2399 |
| 0.3719 | 5.9 | 42800 | 0.2261 | 0.2411 |
| 0.3563 | 5.95 | 43200 | 0.2259 | 0.2416 |
| 0.3574 | 6.01 | 43600 | 0.2148 | 0.2249 |
| 0.3339 | 6.06 | 44000 | 0.2173 | 0.2237 |
| 0.3339 | 6.12 | 44400 | 0.2133 | 0.2238 |
| 0.3303 | 6.17 | 44800 | 0.2193 | 0.2297 |
| 0.331 | 6.23 | 45200 | 0.2122 | 0.2205 |
| 0.3372 | 6.28 | 45600 | 0.2083 | 0.2215 |
| 0.3427 | 6.34 | 46000 | 0.2079 | 0.2163 |
| 0.3427 | 6.39 | 46400 | 0.2072 | 0.2154 |
| 0.3215 | 6.45 | 46800 | 0.2067 | 0.2170 |
| 0.3246 | 6.5 | 47200 | 0.2089 | 0.2183 |
| 0.3217 | 6.56 | 47600 | 0.2030 | 0.2130 |
| 0.3309 | 6.61 | 48000 | 0.2020 | 0.2123 |
| 0.3309 | 6.67 | 48400 | 0.2054 | 0.2133 |
| 0.3343 | 6.72 | 48800 | 0.2013 | 0.2128 |
| 0.3213 | 6.78 | 49200 | 0.1971 | 0.2064 |
| 0.3145 | 6.83 | 49600 | 0.2029 | 0.2107 |
| 0.3274 | 6.89 | 50000 | 0.2038 | 0.2136 |
| 0.3274 | 6.94 | 50400 | 0.1991 | 0.2064 |
| 0.3202 | 7.0 | 50800 | 0.1970 | 0.2083 |
| 0.314 | 7.05 | 51200 | 0.1970 | 0.2035 |
| 0.3031 | 7.11 | 51600 | 0.1943 | 0.2053 |
| 0.3004 | 7.16 | 52000 | 0.1942 | 0.1985 |
| 0.3004 | 7.22 | 52400 | 0.1941 | 0.2003 |
| 0.3029 | 7.27 | 52800 | 0.1936 | 0.2008 |
| 0.2915 | 7.33 | 53200 | 0.1935 | 0.1995 |
| 0.3005 | 7.38 | 53600 | 0.1943 | 0.2032 |
| 0.2984 | 7.44 | 54000 | 0.1913 | 0.1978 |
| 0.2984 | 7.5 | 54400 | 0.1907 | 0.1965 |
| 0.2978 | 7.55 | 54800 | 0.1881 | 0.1958 |
| 0.2944 | 7.61 | 55200 | 0.1887 | 0.1966 |
| 0.3004 | 7.66 | 55600 | 0.1870 | 0.1930 |
| 0.3099 | 7.72 | 56000 | 0.1906 | 0.1976 |
| 0.3099 | 7.77 | 56400 | 0.1856 | 0.1939 |
| 0.2917 | 7.83 | 56800 | 0.1883 | 0.1961 |
| 0.2924 | 7.88 | 57200 | 0.1864 | 0.1930 |
| 0.3061 | 7.94 | 57600 | 0.1831 | 0.1872 |
| 0.2834 | 7.99 | 58000 | 0.1835 | 0.1896 |
| 0.2834 | 8.05 | 58400 | 0.1828 | 0.1875 |
| 0.2807 | 8.1 | 58800 | 0.1820 | 0.1874 |
| 0.2765 | 8.16 | 59200 | 0.1807 | 0.1869 |
| 0.2737 | 8.21 | 59600 | 0.1810 | 0.1848 |
| 0.2722 | 8.27 | 60000 | 0.1795 | 0.1829 |
| 0.2722 | 8.32 | 60400 | 0.1785 | 0.1826 |
| 0.272 | 8.38 | 60800 | 0.1802 | 0.1836 |
| 0.268 | 8.43 | 61200 | 0.1771 | 0.1813 |
| 0.2695 | 8.49 | 61600 | 0.1773 | 0.1821 |
| 0.2686 | 8.54 | 62000 | 0.1756 | 0.1814 |
| 0.2686 | 8.6 | 62400 | 0.1740 | 0.1770 |
| 0.2687 | 8.65 | 62800 | 0.1748 | 0.1769 |
| 0.2686 | 8.71 | 63200 | 0.1734 | 0.1766 |
| 0.2683 | 8.76 | 63600 | 0.1722 | 0.1759 |
| 0.2686 | 8.82 | 64000 | 0.1719 | 0.1760 |
| 0.2686 | 8.87 | 64400 | 0.1720 | 0.1743 |
| 0.2626 | 8.93 | 64800 | 0.1696 | 0.1742 |
| 0.2587 | 8.98 | 65200 | 0.1690 | 0.1718 |
| 0.2554 | 9.04 | 65600 | 0.1704 | 0.1722 |
| 0.2537 | 9.09 | 66000 | 0.1702 | 0.1721 |
| 0.2537 | 9.15 | 66400 | 0.1696 | 0.1717 |
| 0.2511 | 9.2 | 66800 | 0.1685 | 0.1701 |
| 0.2473 | 9.26 | 67200 | 0.1696 | 0.1704 |
| 0.2458 | 9.31 | 67600 | 0.1686 | 0.1698 |
| 0.2476 | 9.37 | 68000 | 0.1675 | 0.1687 |
| 0.2476 | 9.42 | 68400 | 0.1659 | 0.1673 |
| 0.2463 | 9.48 | 68800 | 0.1664 | 0.1674 |
| 0.2481 | 9.53 | 69200 | 0.1661 | 0.1670 |
| 0.2411 | 9.59 | 69600 | 0.1658 | 0.1663 |
| 0.2445 | 9.64 | 70000 | 0.1652 | 0.1660 |
| 0.2445 | 9.7 | 70400 | 0.1646 | 0.1654 |
| 0.2407 | 9.75 | 70800 | 0.1646 | 0.1641 |
| 0.2483 | 9.81 | 71200 | 0.1641 | 0.1641 |
| 0.245 | 9.86 | 71600 | 0.1635 | 0.1643 |
| 0.2402 | 9.92 | 72000 | 0.1638 | 0.1634 |
| 0.2402 | 9.98 | 72400 | 0.1633 | 0.1636 |

### Framework versions

- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.3
- Tokenizers 0.11.0
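
If re-running evaluation or resuming training gives different numbers, a quick sanity check is to compare the locally installed versions against the list above; a minimal sketch:

```python
import datasets
import tokenizers
import torch
import transformers

# Print local versions to compare against the ones recorded in this card.
for module in (transformers, torch, datasets, tokenizers):
    print(module.__name__, module.__version__)
```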