lgris committed
Commit 45ac3e0
1 Parent(s): 601b7c6

update model card README.md

Files changed (1): README.md (+155, -0)
README.md ADDED

---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- common_voice
model-index:
- name: wav2vec2-xls-r-1b-cv8
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# wav2vec2-xls-r-1b-cv8

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2008
- Wer: 0.1840
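As a quick way to try the checkpoint, the snippet below runs CTC inference with the 🤗 Transformers API. It is a minimal sketch, not part of the original card: the repo id `lgris/wav2vec2-xls-r-1b-cv8` and the input file `sample.wav` are assumptions; substitute whatever id this model is actually published under.

```python
# Minimal CTC inference sketch. The repo id and audio path below are
# assumptions for illustration, not values stated in this card.
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "lgris/wav2vec2-xls-r-1b-cv8"  # assumed repo id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# wav2vec 2.0 expects 16 kHz mono audio; resample if needed.
waveform, sample_rate = torchaudio.load("sample.wav")  # assumed input file
waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy decoding: best token per frame, then collapse repeats and blanks.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```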

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
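The YAML header does name common_voice as the dataset. Purely as a hedged illustration of how such data is commonly loaded and resampled for wav2vec 2.0 fine-tuning (the `"pt"` language config is an assumption; the card does not state a language):

```python
# Hedged illustration of loading the dataset named in the YAML header.
# The "pt" language config is an assumption; this card does not name a language.
from datasets import load_dataset, Audio

common_voice = load_dataset("common_voice", "pt", split="train")
# Resample the audio column to the 16 kHz rate wav2vec 2.0 expects.
common_voice = common_voice.cast_column("audio", Audio(sampling_rate=16_000))
print(common_voice[0]["sentence"])
```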

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (mirrored in the sketch after this list):
- learning_rate: 7.5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- num_epochs: 30.0
- mixed_precision_training: Native AMP
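As a reading aid only, here is how those values map onto 🤗 Transformers `TrainingArguments`. This is a hedged sketch, not the author's training script; the `output_dir` is a placeholder.

```python
# Sketch of the hyperparameters above expressed as TrainingArguments.
# output_dir is a placeholder; the actual training script is not part of this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./wav2vec2-xls-r-1b-cv8",  # assumed path
    learning_rate=7.5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=4,  # 4 x 4 = total_train_batch_size of 16
    adam_beta1=0.9,                 # the card lists Adam with these betas/epsilon;
    adam_beta2=0.999,               # Trainer's default optimizer takes them here
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=2000,
    num_train_epochs=30.0,
    fp16=True,                      # Native AMP mixed-precision training
)
```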

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 2.1172 | 0.32 | 500 | 1.2852 | 0.9783 |
| 1.4152 | 0.64 | 1000 | 0.6434 | 0.6105 |
| 1.4342 | 0.96 | 1500 | 0.4844 | 0.3989 |
| 1.4657 | 1.29 | 2000 | 0.5080 | 0.4490 |
| 1.4961 | 1.61 | 2500 | 0.4764 | 0.4264 |
| 1.4515 | 1.93 | 3000 | 0.4519 | 0.4068 |
| 1.3924 | 2.25 | 3500 | 0.4472 | 0.4132 |
| 1.4524 | 2.57 | 4000 | 0.4455 | 0.3939 |
| 1.4328 | 2.89 | 4500 | 0.4369 | 0.4069 |
| 1.3456 | 3.22 | 5000 | 0.4234 | 0.3774 |
| 1.3725 | 3.54 | 5500 | 0.4387 | 0.3789 |
| 1.3812 | 3.86 | 6000 | 0.4298 | 0.3825 |
| 1.3282 | 4.18 | 6500 | 0.4025 | 0.3703 |
| 1.3326 | 4.5 | 7000 | 0.3917 | 0.3502 |
| 1.3028 | 4.82 | 7500 | 0.3889 | 0.3582 |
| 1.293 | 5.14 | 8000 | 0.3859 | 0.3496 |
| 1.321 | 5.47 | 8500 | 0.3875 | 0.3576 |
| 1.3165 | 5.79 | 9000 | 0.3927 | 0.3589 |
| 1.2701 | 6.11 | 9500 | 0.4058 | 0.3621 |
| 1.2718 | 6.43 | 10000 | 0.4211 | 0.3916 |
| 1.2683 | 6.75 | 10500 | 0.3968 | 0.3620 |
| 1.2643 | 7.07 | 11000 | 0.4128 | 0.3848 |
| 1.2485 | 7.4 | 11500 | 0.3849 | 0.3727 |
| 1.2608 | 7.72 | 12000 | 0.3770 | 0.3474 |
| 1.2388 | 8.04 | 12500 | 0.3774 | 0.3574 |
| 1.2524 | 8.36 | 13000 | 0.3789 | 0.3550 |
| 1.2458 | 8.68 | 13500 | 0.3770 | 0.3410 |
| 1.2505 | 9.0 | 14000 | 0.3638 | 0.3403 |
| 1.2254 | 9.32 | 14500 | 0.3770 | 0.3509 |
| 1.2459 | 9.65 | 15000 | 0.3592 | 0.3349 |
| 1.2049 | 9.97 | 15500 | 0.3600 | 0.3428 |
| 1.2097 | 10.29 | 16000 | 0.3626 | 0.3347 |
| 1.1988 | 10.61 | 16500 | 0.3740 | 0.3269 |
| 1.1671 | 10.93 | 17000 | 0.3548 | 0.3245 |
| 1.1532 | 11.25 | 17500 | 0.3394 | 0.3140 |
| 1.1459 | 11.58 | 18000 | 0.3349 | 0.3156 |
| 1.1511 | 11.9 | 18500 | 0.3272 | 0.3110 |
| 1.1465 | 12.22 | 19000 | 0.3348 | 0.3084 |
| 1.1426 | 12.54 | 19500 | 0.3193 | 0.3027 |
| 1.1278 | 12.86 | 20000 | 0.3318 | 0.3021 |
| 1.149 | 13.18 | 20500 | 0.3169 | 0.2947 |
| 1.114 | 13.5 | 21000 | 0.3224 | 0.2986 |
| 1.1249 | 13.83 | 21500 | 0.3227 | 0.2921 |
| 1.0968 | 14.15 | 22000 | 0.3033 | 0.2878 |
| 1.0851 | 14.47 | 22500 | 0.2996 | 0.2863 |
| 1.0985 | 14.79 | 23000 | 0.3011 | 0.2843 |
| 1.0808 | 15.11 | 23500 | 0.2932 | 0.2759 |
| 1.069 | 15.43 | 24000 | 0.2919 | 0.2750 |
| 1.0602 | 15.76 | 24500 | 0.2959 | 0.2713 |
| 1.0369 | 16.08 | 25000 | 0.2931 | 0.2754 |
| 1.0573 | 16.4 | 25500 | 0.2920 | 0.2722 |
| 1.051 | 16.72 | 26000 | 0.2855 | 0.2632 |
| 1.0279 | 17.04 | 26500 | 0.2850 | 0.2649 |
| 1.0496 | 17.36 | 27000 | 0.2817 | 0.2585 |
| 1.0516 | 17.68 | 27500 | 0.2961 | 0.2635 |
| 1.0244 | 18.01 | 28000 | 0.2781 | 0.2589 |
| 1.0099 | 18.33 | 28500 | 0.2783 | 0.2565 |
| 1.0016 | 18.65 | 29000 | 0.2719 | 0.2537 |
| 1.0157 | 18.97 | 29500 | 0.2621 | 0.2449 |
| 0.9572 | 19.29 | 30000 | 0.2582 | 0.2427 |
| 0.9802 | 19.61 | 30500 | 0.2707 | 0.2468 |
| 0.9577 | 19.94 | 31000 | 0.2563 | 0.2389 |
| 0.9562 | 20.26 | 31500 | 0.2592 | 0.2382 |
| 0.962 | 20.58 | 32000 | 0.2539 | 0.2341 |
| 0.9541 | 20.9 | 32500 | 0.2505 | 0.2288 |
| 0.9587 | 21.22 | 33000 | 0.2486 | 0.2302 |
| 0.9146 | 21.54 | 33500 | 0.2461 | 0.2269 |
| 0.9215 | 21.86 | 34000 | 0.2387 | 0.2228 |
| 0.9105 | 22.19 | 34500 | 0.2405 | 0.2222 |
| 0.8949 | 22.51 | 35000 | 0.2316 | 0.2191 |
| 0.9153 | 22.83 | 35500 | 0.2358 | 0.2180 |
| 0.8907 | 23.15 | 36000 | 0.2369 | 0.2168 |
| 0.8973 | 23.47 | 36500 | 0.2323 | 0.2120 |
| 0.8878 | 23.79 | 37000 | 0.2293 | 0.2104 |
| 0.8818 | 24.12 | 37500 | 0.2302 | 0.2132 |
| 0.8919 | 24.44 | 38000 | 0.2262 | 0.2083 |
| 0.8473 | 24.76 | 38500 | 0.2257 | 0.2040 |
| 0.8516 | 25.08 | 39000 | 0.2246 | 0.2031 |
| 0.8451 | 25.4 | 39500 | 0.2198 | 0.2000 |
| 0.8288 | 25.72 | 40000 | 0.2199 | 0.1990 |
| 0.8465 | 26.05 | 40500 | 0.2165 | 0.1972 |
| 0.8305 | 26.37 | 41000 | 0.2128 | 0.1957 |
| 0.8202 | 26.69 | 41500 | 0.2127 | 0.1937 |
| 0.8223 | 27.01 | 42000 | 0.2100 | 0.1934 |
| 0.8322 | 27.33 | 42500 | 0.2076 | 0.1905 |
| 0.8139 | 27.65 | 43000 | 0.2054 | 0.1880 |
| 0.8299 | 27.97 | 43500 | 0.2026 | 0.1868 |
| 0.7937 | 28.3 | 44000 | 0.2045 | 0.1872 |
| 0.7972 | 28.62 | 44500 | 0.2025 | 0.1861 |
| 0.809 | 28.94 | 45000 | 0.2026 | 0.1858 |
| 0.813 | 29.26 | 45500 | 0.2013 | 0.1838 |
| 0.7718 | 29.58 | 46000 | 0.2010 | 0.1837 |
| 0.7929 | 29.9 | 46500 | 0.2008 | 0.1840 |
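The Wer column is the word error rate on the validation set; a WER of 0.1840 corresponds to about 18.4 errors per 100 reference words. As a hedged sketch of how WER is typically computed with this stack (the two strings below are invented placeholders, not data from this run):

```python
# Hedged WER computation sketch using the datasets metric API.
# Requires the jiwer package; the strings are invented placeholders.
from datasets import load_metric

wer_metric = load_metric("wer")
predictions = ["the cat sat on mat"]     # hypothetical model transcript
references = ["the cat sat on the mat"]  # hypothetical ground truth
# One deletion against a 6-word reference -> WER of 1/6.
print(wer_metric.compute(predictions=predictions, references=references))
```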

### Framework versions

- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.3.dev0
- Tokenizers 0.11.0