---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- common_voice_8_0
metrics:
- wer
model-index:
- name: wav2vec2-large-xls-r-300m-frisian-cv-8
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: common_voice_8_0
      type: common_voice_8_0
      config: fy-NL
      split: validation
      args: fy-NL
    metrics:
    - name: Wer
      type: wer
      value: 0.07238251678331667
---

# wav2vec2-large-xls-r-300m-frisian-cv-8

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the Frisian (`fy-NL`) subset of the Common Voice 8.0 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0707
- WER: 0.0724
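
The evaluation metric is word error rate (WER): the word-level edit distance between the hypothesis and the reference transcript, divided by the number of reference words. A minimal self-contained sketch of the computation (illustrative only; the training run used the `wer` metric declared above, not this code):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # One-row dynamic-programming table: d[j] = edits to turn the
    # processed prefix of ref into hyp[:j].
    d = list(range(len(hyp) + 1))
    for i in range(1, len(ref) + 1):
        prev, d[0] = d[0], i  # prev holds the diagonal (i-1, j-1) value
        for j in range(1, len(hyp) + 1):
            cur = d[j]
            d[j] = min(
                d[j] + 1,                           # deletion
                d[j - 1] + 1,                       # insertion
                prev + (ref[i - 1] != hyp[j - 1]),  # substitution / match
            )
            prev = cur
    return d[len(hyp)] / max(len(ref), 1)
```

For example, dropping one word from a six-word reference yields a WER of 1/6 ≈ 0.167.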

## Model description

A fine-tune of the 300M-parameter multilingual [XLS-R](https://huggingface.co/facebook/wav2vec2-xls-r-300m) wav2vec 2.0 checkpoint for Frisian automatic speech recognition.

## Intended uses & limitations

Intended for transcribing spoken Frisian. The model has only been evaluated on the Common Voice 8.0 validation split, so accuracy on other domains, speakers, or recording conditions is untested.
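
As a sketch of intended use, the checkpoint can be loaded through the `transformers` ASR pipeline. The repository id below is an assumption based on this card's name and may need adjusting; the call downloads the model from the Hub, so it is shown without expected output:

```python
from transformers import pipeline

# Hypothetical Hub repo id inferred from the model name; adjust as needed.
asr = pipeline(
    "automatic-speech-recognition",
    model="greenw0lf/wav2vec2-large-xls-r-300m-frisian-cv-8",
)

# Transcribe a local audio file (16 kHz mono works best for wav2vec 2.0).
result = asr("clip.wav")
print(result["text"])
```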

## Training and evaluation data

Fine-tuned and evaluated on the Frisian (`fy-NL`) subset of Common Voice 8.0 (`common_voice_8_0`); the reported metrics are computed on the validation split.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 40
- mixed_precision_training: Native AMP
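
Concretely, the `linear` scheduler with `lr_scheduler_warmup_ratio: 0.1` ramps the learning rate from 0 up to the peak of 5e-05 over the first 10% of optimizer steps, then decays it linearly back to 0. A minimal sketch of that shape (mirroring, not reusing, the `transformers` scheduler):

```python
def linear_warmup_lr(step: int, total_steps: int, peak_lr: float = 5e-5,
                     warmup_ratio: float = 0.1) -> float:
    """Learning rate at a given optimizer step: linear warmup, linear decay."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        # Ramp linearly from 0 to peak_lr over the warmup phase.
        return peak_lr * step / max(warmup_steps, 1)
    # Decay linearly from peak_lr at the end of warmup to 0 at total_steps.
    return peak_lr * (total_steps - step) / max(total_steps - warmup_steps, 1)
```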

### Training results

| Training Loss | Epoch | Step  | Validation Loss | WER    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 14.7268       | 0.43  | 400   | 8.7389          | 1.0    |
| 5.3377        | 0.86  | 800   | 3.7016          | 1.0    |
| 3.343         | 1.29  | 1200  | 3.0984          | 1.0    |
| 3.0306        | 1.71  | 1600  | 2.9643          | 1.0    |
| 2.9511        | 2.14  | 2000  | 2.9273          | 1.0    |
| 2.9078        | 2.57  | 2400  | 2.8202          | 1.0    |
| 2.4965        | 3.0   | 2800  | 1.3805          | 0.8888 |
| 1.5378        | 3.43  | 3200  | 0.6556          | 0.5720 |
| 1.119         | 3.86  | 3600  | 0.4260          | 0.4077 |
| 0.9159        | 4.29  | 4000  | 0.3457          | 0.3322 |
| 0.8037        | 4.72  | 4400  | 0.2765          | 0.2850 |
| 0.7411        | 5.14  | 4800  | 0.2447          | 0.2473 |
| 0.6767        | 5.57  | 5200  | 0.2176          | 0.2234 |
| 0.6296        | 6.0   | 5600  | 0.1996          | 0.2078 |
| 0.6165        | 6.43  | 6000  | 0.1891          | 0.1977 |
| 0.5856        | 6.86  | 6400  | 0.1763          | 0.1855 |
| 0.5674        | 7.29  | 6800  | 0.1708          | 0.1797 |
| 0.5399        | 7.72  | 7200  | 0.1593          | 0.1694 |
| 0.5195        | 8.15  | 7600  | 0.1551          | 0.1660 |
| 0.4973        | 8.57  | 8000  | 0.1509          | 0.1583 |
| 0.4907        | 9.0   | 8400  | 0.1480          | 0.1525 |
| 0.4681        | 9.43  | 8800  | 0.1389          | 0.1494 |
| 0.4513        | 9.86  | 9200  | 0.1368          | 0.1414 |
| 0.4486        | 10.29 | 9600  | 0.1294          | 0.1390 |
| 0.4381        | 10.72 | 10000 | 0.1262          | 0.1354 |
| 0.443         | 11.15 | 10400 | 0.1234          | 0.1313 |
| 0.4182        | 11.58 | 10800 | 0.1196          | 0.1294 |
| 0.4036        | 12.0  | 11200 | 0.1194          | 0.1259 |
| 0.4027        | 12.43 | 11600 | 0.1170          | 0.1226 |
| 0.4066        | 12.86 | 12000 | 0.1156          | 0.1224 |
| 0.3885        | 13.29 | 12400 | 0.1136          | 0.1174 |
| 0.3859        | 13.72 | 12800 | 0.1121          | 0.1146 |
| 0.3812        | 14.15 | 13200 | 0.1097          | 0.1141 |
| 0.3774        | 14.58 | 13600 | 0.1059          | 0.1130 |
| 0.3678        | 15.01 | 14000 | 0.1058          | 0.1096 |
| 0.3586        | 15.43 | 14400 | 0.1026          | 0.1099 |
| 0.3612        | 15.86 | 14800 | 0.1010          | 0.1076 |
| 0.3626        | 16.29 | 15200 | 0.0993          | 0.1068 |
| 0.353         | 16.72 | 15600 | 0.0974          | 0.1046 |
| 0.3564        | 17.15 | 16000 | 0.0986          | 0.1037 |
| 0.3447        | 17.58 | 16400 | 0.0977          | 0.1041 |
| 0.3454        | 18.01 | 16800 | 0.0945          | 0.1023 |
| 0.3338        | 18.44 | 17200 | 0.0904          | 0.0996 |
| 0.3359        | 18.86 | 17600 | 0.0950          | 0.1002 |
| 0.3179        | 19.29 | 18000 | 0.0911          | 0.0977 |
| 0.3202        | 19.72 | 18400 | 0.0906          | 0.0979 |
| 0.3317        | 20.15 | 18800 | 0.0894          | 0.0963 |
| 0.3187        | 20.58 | 19200 | 0.0878          | 0.0938 |
| 0.3075        | 21.01 | 19600 | 0.0893          | 0.0937 |
| 0.3032        | 21.44 | 20000 | 0.0872          | 0.0923 |
| 0.3048        | 21.86 | 20400 | 0.0848          | 0.0921 |
| 0.3045        | 22.29 | 20800 | 0.0860          | 0.0887 |
| 0.316         | 22.72 | 21200 | 0.0841          | 0.0896 |
| 0.2986        | 23.15 | 21600 | 0.0840          | 0.0876 |
| 0.294         | 23.58 | 22000 | 0.0824          | 0.0862 |
| 0.313         | 24.01 | 22400 | 0.0814          | 0.0855 |
| 0.2864        | 24.44 | 22800 | 0.0816          | 0.0861 |
| 0.2927        | 24.87 | 23200 | 0.0807          | 0.0875 |
| 0.294         | 25.29 | 23600 | 0.0829          | 0.0826 |
| 0.2834        | 25.72 | 24000 | 0.0794          | 0.0823 |
| 0.2852        | 26.15 | 24400 | 0.0781          | 0.0815 |
| 0.2823        | 26.58 | 24800 | 0.0781          | 0.0821 |
| 0.2835        | 27.01 | 25200 | 0.0788          | 0.0826 |
| 0.2763        | 27.44 | 25600 | 0.0789          | 0.0823 |
| 0.2845        | 27.87 | 26000 | 0.0767          | 0.0803 |
| 0.2777        | 28.3  | 26400 | 0.0775          | 0.0809 |
| 0.275         | 28.72 | 26800 | 0.0758          | 0.0794 |
| 0.2707        | 29.15 | 27200 | 0.0745          | 0.0790 |
| 0.2734        | 29.58 | 27600 | 0.0765          | 0.0797 |
| 0.2716        | 30.01 | 28000 | 0.0746          | 0.0780 |
| 0.2626        | 30.44 | 28400 | 0.0756          | 0.0776 |
| 0.2671        | 30.87 | 28800 | 0.0742          | 0.0763 |
| 0.2592        | 31.3  | 29200 | 0.0730          | 0.0771 |
| 0.2685        | 31.73 | 29600 | 0.0733          | 0.0760 |
| 0.2727        | 32.15 | 30000 | 0.0738          | 0.0758 |
| 0.2564        | 32.58 | 30400 | 0.0731          | 0.0763 |
| 0.2528        | 33.01 | 30800 | 0.0730          | 0.0758 |
| 0.2573        | 33.44 | 31200 | 0.0717          | 0.0746 |
| 0.2597        | 33.87 | 31600 | 0.0718          | 0.0760 |
| 0.2511        | 34.3  | 32000 | 0.0737          | 0.0750 |
| 0.2551        | 34.73 | 32400 | 0.0732          | 0.0758 |
| 0.26          | 35.16 | 32800 | 0.0724          | 0.0746 |
| 0.2563        | 35.58 | 33200 | 0.0717          | 0.0730 |
| 0.2559        | 36.01 | 33600 | 0.0707          | 0.0734 |
| 0.2499        | 36.44 | 34000 | 0.0721          | 0.0729 |
| 0.252         | 36.87 | 34400 | 0.0716          | 0.0723 |
| 0.2448        | 37.3  | 34800 | 0.0711          | 0.0725 |
| 0.248         | 37.73 | 35200 | 0.0710          | 0.0727 |
| 0.2568        | 38.16 | 35600 | 0.0710          | 0.0720 |
| 0.2471        | 38.59 | 36000 | 0.0707          | 0.0725 |
| 0.2464        | 39.01 | 36400 | 0.0705          | 0.0719 |
| 0.2477        | 39.44 | 36800 | 0.0706          | 0.0727 |
| 0.2482        | 39.87 | 37200 | 0.0707          | 0.0724 |

### Framework versions

- Transformers 4.28.1
- Pytorch 2.0.0+cu117
- Datasets 2.11.0
- Tokenizers 0.13.3