shahukareem committed ab42dde (1 parent: c608ef5)

update model card README.md

Files changed (1): README.md (+82, −101)

README.md CHANGED
@@ -1,31 +1,12 @@
 ---
-language:
-- dv
 license: apache-2.0
 tags:
-- automatic-speech-recognition
-- mozilla-foundation/common_voice_8_0
 - generated_from_trainer
-- robust-speech-event
 datasets:
 - common_voice
 model-index:
-- name: XLS-R-300M - Dhivehi- CV8
-  results:
-  - task:
-      name: Automatic Speech Recognition
-      type: automatic-speech-recognition
-    dataset:
-      name: Common Voice 8
-      type: mozilla-foundation/common_voice_8_0
-      args: dv
-    metrics:
-    - name: Test WER
-      type: wer
-      value: 29.69
-    - name: Test CER
-      type: cer
-      value: 5.48
+- name: xls-r-300m-dv
+  results: []
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -35,8 +16,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.3149
-- Wer: 0.2947
+- Loss: 0.2855
+- Wer: 0.2665
 
 ## Model description
 
@@ -71,86 +52,86 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step  | Validation Loss | Wer    |
 |:-------------:|:-----:|:-----:|:---------------:|:------:|
-| 3.9617        | 0.66  | 400   | 1.4251          | 0.9768 |
-| 0.9081        | 1.33  | 800   | 0.6068          | 0.7290 |
-| 0.6575        | 1.99  | 1200  | 0.4700          | 0.6234 |
-| 0.548         | 2.65  | 1600  | 0.4158          | 0.5868 |
-| 0.5031        | 3.32  | 2000  | 0.4067          | 0.5728 |
-| 0.4792        | 3.98  | 2400  | 0.3965          | 0.5673 |
-| 0.4344        | 4.64  | 2800  | 0.3862          | 0.5383 |
-| 0.4237        | 5.31  | 3200  | 0.3794          | 0.5316 |
-| 0.3984        | 5.97  | 3600  | 0.3395          | 0.5177 |
-| 0.3788        | 6.63  | 4000  | 0.3528          | 0.5329 |
-| 0.3685        | 7.3   | 4400  | 0.3404          | 0.5060 |
-| 0.3535        | 7.96  | 4800  | 0.3425          | 0.5069 |
-| 0.3391        | 8.62  | 5200  | 0.3576          | 0.5118 |
-| 0.331         | 9.29  | 5600  | 0.3259          | 0.4783 |
-| 0.3192        | 9.95  | 6000  | 0.3145          | 0.4794 |
-| 0.2956        | 10.61 | 6400  | 0.3111          | 0.4650 |
-| 0.2936        | 11.28 | 6800  | 0.3303          | 0.4741 |
-| 0.2868        | 11.94 | 7200  | 0.3109          | 0.4597 |
-| 0.2743        | 12.6  | 7600  | 0.3191          | 0.4557 |
-| 0.2654        | 13.27 | 8000  | 0.3286          | 0.4570 |
-| 0.2556        | 13.93 | 8400  | 0.3186          | 0.4468 |
-| 0.2452        | 14.59 | 8800  | 0.3405          | 0.4582 |
-| 0.241         | 15.26 | 9200  | 0.3418          | 0.4533 |
-| 0.2313        | 15.92 | 9600  | 0.3388          | 0.4405 |
-| 0.2234        | 16.58 | 10000 | 0.3659          | 0.4421 |
-| 0.2194        | 17.25 | 10400 | 0.3559          | 0.4490 |
-| 0.2168        | 17.91 | 10800 | 0.3452          | 0.4355 |
-| 0.2036        | 18.57 | 11200 | 0.3496          | 0.4259 |
-| 0.2046        | 19.24 | 11600 | 0.3282          | 0.4245 |
-| 0.1917        | 19.9  | 12000 | 0.3201          | 0.4052 |
-| 0.1908        | 20.56 | 12400 | 0.3439          | 0.4165 |
-| 0.1838        | 21.23 | 12800 | 0.3165          | 0.3950 |
-| 0.1828        | 21.89 | 13200 | 0.3332          | 0.4079 |
-| 0.1774        | 22.55 | 13600 | 0.3485          | 0.4072 |
-| 0.1776        | 23.22 | 14000 | 0.3308          | 0.3868 |
-| 0.1693        | 23.88 | 14400 | 0.3153          | 0.3906 |
-| 0.1656        | 24.54 | 14800 | 0.3408          | 0.3899 |
-| 0.1629        | 25.21 | 15200 | 0.3333          | 0.3854 |
-| 0.164         | 25.87 | 15600 | 0.3172          | 0.3775 |
-| 0.1505        | 26.53 | 16000 | 0.3105          | 0.3777 |
-| 0.1524        | 27.2  | 16400 | 0.3136          | 0.3726 |
-| 0.1482        | 27.86 | 16800 | 0.3110          | 0.3710 |
-| 0.1423        | 28.52 | 17200 | 0.3299          | 0.3687 |
-| 0.1419        | 29.19 | 17600 | 0.3271          | 0.3645 |
-| 0.135         | 29.85 | 18000 | 0.3333          | 0.3638 |
-| 0.1319        | 30.51 | 18400 | 0.3272          | 0.3640 |
-| 0.131         | 31.18 | 18800 | 0.3438          | 0.3636 |
-| 0.1252        | 31.84 | 19200 | 0.3266          | 0.3557 |
-| 0.1238        | 32.5  | 19600 | 0.3195          | 0.3516 |
-| 0.1203        | 33.17 | 20000 | 0.3405          | 0.3534 |
-| 0.1159        | 33.83 | 20400 | 0.3287          | 0.3509 |
-| 0.115         | 34.49 | 20800 | 0.3474          | 0.3433 |
-| 0.108         | 35.16 | 21200 | 0.3245          | 0.3381 |
-| 0.1091        | 35.82 | 21600 | 0.3185          | 0.3448 |
-| 0.1043        | 36.48 | 22000 | 0.3309          | 0.3363 |
-| 0.1034        | 37.15 | 22400 | 0.3288          | 0.3349 |
-| 0.1015        | 37.81 | 22800 | 0.3222          | 0.3284 |
-| 0.0953        | 38.47 | 23200 | 0.3272          | 0.3315 |
-| 0.0966        | 39.14 | 23600 | 0.3196          | 0.3239 |
-| 0.0938        | 39.8  | 24000 | 0.3199          | 0.3280 |
-| 0.0905        | 40.46 | 24400 | 0.3193          | 0.3166 |
-| 0.0893        | 41.13 | 24800 | 0.3224          | 0.3222 |
-| 0.0858        | 41.79 | 25200 | 0.3216          | 0.3142 |
-| 0.0839        | 42.45 | 25600 | 0.3241          | 0.3135 |
-| 0.0819        | 43.12 | 26000 | 0.3260          | 0.3071 |
-| 0.0782        | 43.78 | 26400 | 0.3202          | 0.3075 |
-| 0.0775        | 44.44 | 26800 | 0.3140          | 0.3067 |
-| 0.0751        | 45.11 | 27200 | 0.3118          | 0.3020 |
-| 0.0736        | 45.77 | 27600 | 0.3155          | 0.2976 |
-| 0.071         | 46.43 | 28000 | 0.3105          | 0.2998 |
-| 0.0715        | 47.1  | 28400 | 0.3065          | 0.2993 |
-| 0.0668        | 47.76 | 28800 | 0.3161          | 0.2972 |
-| 0.0698        | 48.42 | 29200 | 0.3137          | 0.2967 |
-| 0.0681        | 49.09 | 29600 | 0.3130          | 0.2971 |
-| 0.0651        | 49.75 | 30000 | 0.3149          | 0.2947 |
+| 4.3386        | 0.66  | 400   | 1.1411          | 0.9432 |
+| 0.6543        | 1.33  | 800   | 0.5099          | 0.6749 |
+| 0.4646        | 1.99  | 1200  | 0.4133          | 0.5968 |
+| 0.3748        | 2.65  | 1600  | 0.3534          | 0.5515 |
+| 0.3323        | 3.32  | 2000  | 0.3635          | 0.5527 |
+| 0.3269        | 3.98  | 2400  | 0.3587          | 0.5423 |
+| 0.2984        | 4.64  | 2800  | 0.3340          | 0.5073 |
+| 0.2841        | 5.31  | 3200  | 0.3279          | 0.5004 |
+| 0.2664        | 5.97  | 3600  | 0.3114          | 0.4845 |
+| 0.2397        | 6.63  | 4000  | 0.3174          | 0.4920 |
+| 0.2332        | 7.3   | 4400  | 0.3110          | 0.4911 |
+| 0.2304        | 7.96  | 4800  | 0.3123          | 0.4785 |
+| 0.2134        | 8.62  | 5200  | 0.2984          | 0.4557 |
+| 0.2066        | 9.29  | 5600  | 0.3013          | 0.4723 |
+| 0.1951        | 9.95  | 6000  | 0.2934          | 0.4487 |
+| 0.1806        | 10.61 | 6400  | 0.2802          | 0.4547 |
+| 0.1727        | 11.28 | 6800  | 0.2842          | 0.4333 |
+| 0.1666        | 11.94 | 7200  | 0.2873          | 0.4272 |
+| 0.1562        | 12.6  | 7600  | 0.3042          | 0.4373 |
+| 0.1483        | 13.27 | 8000  | 0.3122          | 0.4313 |
+| 0.1465        | 13.93 | 8400  | 0.2760          | 0.4226 |
+| 0.1335        | 14.59 | 8800  | 0.3112          | 0.4243 |
+| 0.1293        | 15.26 | 9200  | 0.3002          | 0.4133 |
+| 0.1264        | 15.92 | 9600  | 0.2985          | 0.4145 |
+| 0.1179        | 16.58 | 10000 | 0.2925          | 0.4012 |
+| 0.1171        | 17.25 | 10400 | 0.3127          | 0.4012 |
+| 0.1141        | 17.91 | 10800 | 0.2980          | 0.3908 |
+| 0.108         | 18.57 | 11200 | 0.3108          | 0.3951 |
+| 0.1045        | 19.24 | 11600 | 0.3269          | 0.3908 |
+| 0.1047        | 19.9  | 12000 | 0.2998          | 0.3868 |
+| 0.0937        | 20.56 | 12400 | 0.2918          | 0.3875 |
+| 0.0949        | 21.23 | 12800 | 0.2906          | 0.3657 |
+| 0.0879        | 21.89 | 13200 | 0.2974          | 0.3731 |
+| 0.0854        | 22.55 | 13600 | 0.2943          | 0.3711 |
+| 0.0851        | 23.22 | 14000 | 0.2919          | 0.3580 |
+| 0.0789        | 23.88 | 14400 | 0.2983          | 0.3560 |
+| 0.0796        | 24.54 | 14800 | 0.3131          | 0.3544 |
+| 0.0761        | 25.21 | 15200 | 0.2996          | 0.3616 |
+| 0.0755        | 25.87 | 15600 | 0.2972          | 0.3506 |
+| 0.0726        | 26.53 | 16000 | 0.2902          | 0.3474 |
+| 0.0707        | 27.2  | 16400 | 0.3083          | 0.3480 |
+| 0.0669        | 27.86 | 16800 | 0.3035          | 0.3330 |
+| 0.0637        | 28.52 | 17200 | 0.2963          | 0.3370 |
+| 0.0596        | 29.19 | 17600 | 0.2830          | 0.3326 |
+| 0.0583        | 29.85 | 18000 | 0.2969          | 0.3287 |
+| 0.0566        | 30.51 | 18400 | 0.3002          | 0.3480 |
+| 0.0574        | 31.18 | 18800 | 0.2916          | 0.3296 |
+| 0.0536        | 31.84 | 19200 | 0.2933          | 0.3225 |
+| 0.0548        | 32.5  | 19600 | 0.2900          | 0.3179 |
+| 0.0506        | 33.17 | 20000 | 0.3073          | 0.3225 |
+| 0.0511        | 33.83 | 20400 | 0.2925          | 0.3275 |
+| 0.0483        | 34.49 | 20800 | 0.2919          | 0.3245 |
+| 0.0456        | 35.16 | 21200 | 0.2859          | 0.3105 |
+| 0.0445        | 35.82 | 21600 | 0.2864          | 0.3080 |
+| 0.0437        | 36.48 | 22000 | 0.2989          | 0.3084 |
+| 0.04          | 37.15 | 22400 | 0.2887          | 0.3060 |
+| 0.0406        | 37.81 | 22800 | 0.2870          | 0.3013 |
+| 0.0397        | 38.47 | 23200 | 0.2793          | 0.3020 |
+| 0.0383        | 39.14 | 23600 | 0.2955          | 0.2943 |
+| 0.0345        | 39.8  | 24000 | 0.2813          | 0.2905 |
+| 0.0331        | 40.46 | 24400 | 0.2845          | 0.2845 |
+| 0.0338        | 41.13 | 24800 | 0.2832          | 0.2925 |
+| 0.0333        | 41.79 | 25200 | 0.2889          | 0.2849 |
+| 0.0325        | 42.45 | 25600 | 0.2808          | 0.2847 |
+| 0.0314        | 43.12 | 26000 | 0.2867          | 0.2801 |
+| 0.0288        | 43.78 | 26400 | 0.2865          | 0.2834 |
+| 0.0291        | 44.44 | 26800 | 0.2863          | 0.2806 |
+| 0.0269        | 45.11 | 27200 | 0.2941          | 0.2736 |
+| 0.0275        | 45.77 | 27600 | 0.2897          | 0.2736 |
+| 0.0271        | 46.43 | 28000 | 0.2857          | 0.2695 |
+| 0.0251        | 47.1  | 28400 | 0.2881          | 0.2702 |
+| 0.0243        | 47.76 | 28800 | 0.2901          | 0.2684 |
+| 0.0244        | 48.42 | 29200 | 0.2849          | 0.2679 |
+| 0.0232        | 49.09 | 29600 | 0.2849          | 0.2677 |
+| 0.0224        | 49.75 | 30000 | 0.2855          | 0.2665 |
 
 
 ### Framework versions
 
-- Transformers 4.16.0.dev0
-- Pytorch 1.10.1+cu102
-- Datasets 1.17.1.dev0
+- Transformers 4.17.0.dev0
+- Pytorch 1.10.2+cu102
+- Datasets 1.18.3
 - Tokenizers 0.11.0
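
The `Wer` column in the updated card is word error rate: the word-level edit distance (substitutions + insertions + deletions) between the model's transcript and the reference, divided by the number of reference words. As a minimal illustration of the metric itself (not code from this repository), a dynamic-programming sketch:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i  # delete all i reference words
    for j in range(len(hyp) + 1):
        dp[0][j] = j  # insert all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(
                dp[i - 1][j] + 1,        # deletion
                dp[i][j - 1] + 1,        # insertion
                dp[i - 1][j - 1] + sub,  # match or substitution
            )
    return dp[len(ref)][len(hyp)] / len(ref)
```

Under this definition, the final eval Wer of 0.2665 means roughly 27 word errors per 100 reference words; libraries such as `jiwer` (used by the `evaluate` hub metric `wer`) compute the same quantity.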