lucio committed on
Commit
fcbf49b
1 Parent(s): 7c4a099

fix model card

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -34,7 +34,7 @@ should probably proofread and complete it, then remove this comment. -->
 # XLS-R-300M Kyrgiz CV8
 
 This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - KY dataset.
-It achieves the following results on the evaluation set:
+It achieves the following results on the validation set:
 - Loss: 0.5497
 - Wer: 0.2945
 - Cer: 0.0791
@@ -55,11 +55,11 @@ The model is not reliable enough to use as a substitute for live captions for ac
 
 ## Training and evaluation data
 
-The combination of `train` and `dev` of common voice official splits were used as training data. The half of the official `test` split was used as validation data as well as for final evaluation.
+The combination of the `train`, `dev` and `other` official Common Voice splits was used as training data. Half of the official `test` split was used as validation data, and the full `test` set was used for final evaluation.
 
 ## Training procedure
 
-The featurization layers of the XLS-R model are frozen while tuning a final CTC/LM layer on the Uyghur CV8 example sentences. A ramped learning rate is used with an initial warmup phase of 500 steps, a max of 0.0001, and cooling back towards 0 for the remainder of the 8100 steps (300 epochs).
+The featurization layers of the XLS-R model are frozen while tuning a final CTC/LM layer on the Kyrgiz CV8 example sentences. A ramped learning rate is used with an initial warmup phase of 500 steps, a max of 0.0001, and cooling back towards 0 for the remainder of the 8100 steps (300 epochs).
 
 ### Training hyperparameters
 
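The ramped learning-rate schedule the training procedure describes (warmup for 500 steps to a peak of 0.0001, then a linear cool-down toward 0 over the rest of the 8100 steps) can be sketched as a plain function. The exact shape used in training is not given in the card, so the linear warmup/decay below is an assumption; in practice it would typically be wired into an optimizer via something like PyTorch's `LambdaLR`.

```python
# Hypothetical sketch of the ramped LR schedule described in the model card:
# linear warmup to the peak, then linear decay back toward 0.
MAX_LR = 1e-4        # peak learning rate from the card
WARMUP_STEPS = 500   # warmup phase length from the card
TOTAL_STEPS = 8100   # total training steps from the card

def learning_rate(step: int) -> float:
    """Learning rate at a given optimizer step (assumed linear ramp)."""
    if step < WARMUP_STEPS:
        # warmup: 0 -> MAX_LR over the first 500 steps
        return MAX_LR * step / WARMUP_STEPS
    # cool-down: MAX_LR -> 0 over the remaining steps
    return MAX_LR * max(0.0, (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS))

print(learning_rate(0))     # 0.0
print(learning_rate(500))   # peak: 1e-4
print(learning_rate(8100))  # 0.0
```

The featurization-layer freezing mentioned above corresponds to calling `freeze_feature_encoder()` on a `Wav2Vec2ForCTC` model in the `transformers` library before training.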