---
language:
- pt
license: apache-2.0
tags:
- automatic-speech-recognition
- generated_from_trainer
- hf-asr-leaderboard
- mozilla-foundation/common_voice_8_0
- pt
- robust-speech-event
datasets:
- mozilla-foundation/common_voice_8_0
model-index:
- name: wav2vec2-xls-r-1b-cv8
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: Common Voice 8
      type: mozilla-foundation/common_voice_8_0
      args: pt
    metrics:
    - name: Test WER
      type: wer
      value: 17.7
    - name: Test CER
      type: cer
      value: 5.21
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: Robust Speech Event - Dev Data
      type: speech-recognition-community-v2/dev_data
      args: sv
    metrics:
    - name: Test WER
      type: wer
      value: 45.68
    - name: Test CER
      type: cer
      value: 18.67
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: Robust Speech Event - Dev Data
      type: speech-recognition-community-v2/dev_data
      args: pt
    metrics:
    - name: Test WER
      type: wer
      value: 45.29
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: Robust Speech Event - Test Data
      type: speech-recognition-community-v2/eval_data
      args: pt
    metrics:
    - name: Test WER
      type: wer
      value: 48.03
---

# wav2vec2-xls-r-1b-cv8

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - PT dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2007
- WER: 0.1838

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 7.5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- num_epochs: 30.0
- mixed_precision_training: Native AMP
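For reference, here is a minimal sketch of how the hyperparameters above would map onto `transformers.TrainingArguments`. The original fine-tuning script is not included in this card, so this is an assumed reconstruction; in particular, `output_dir` is a placeholder.

```python
from transformers import TrainingArguments

# Speculative reconstruction of the configuration listed above;
# the actual training script is not part of this model card.
training_args = TrainingArguments(
    output_dir="./wav2vec2-xls-r-1b-cv8",  # placeholder path (assumption)
    learning_rate=7.5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=4,  # 4 x 4 = total train batch size of 16
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=2000,
    num_train_epochs=30.0,
    fp16=True,  # "Native AMP" mixed-precision training
)
```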
### Training results

| Training Loss | Epoch | Step | Validation Loss | WER |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 2.1172 | 0.32 | 500 | 1.2852 | 0.9783 |
| 1.4152 | 0.64 | 1000 | 0.6434 | 0.6105 |
| 1.4342 | 0.96 | 1500 | 0.4844 | 0.3989 |
| 1.4657 | 1.29 | 2000 | 0.5080 | 0.4490 |
| 1.4961 | 1.61 | 2500 | 0.4764 | 0.4264 |
| 1.4515 | 1.93 | 3000 | 0.4519 | 0.4068 |
| 1.3924 | 2.25 | 3500 | 0.4472 | 0.4132 |
| 1.4524 | 2.57 | 4000 | 0.4455 | 0.3939 |
| 1.4328 | 2.89 | 4500 | 0.4369 | 0.4069 |
| 1.3456 | 3.22 | 5000 | 0.4234 | 0.3774 |
| 1.3725 | 3.54 | 5500 | 0.4387 | 0.3789 |
| 1.3812 | 3.86 | 6000 | 0.4298 | 0.3825 |
| 1.3282 | 4.18 | 6500 | 0.4025 | 0.3703 |
| 1.3326 | 4.5 | 7000 | 0.3917 | 0.3502 |
| 1.3028 | 4.82 | 7500 | 0.3889 | 0.3582 |
| 1.293 | 5.14 | 8000 | 0.3859 | 0.3496 |
| 1.321 | 5.47 | 8500 | 0.3875 | 0.3576 |
| 1.3165 | 5.79 | 9000 | 0.3927 | 0.3589 |
| 1.2701 | 6.11 | 9500 | 0.4058 | 0.3621 |
| 1.2718 | 6.43 | 10000 | 0.4211 | 0.3916 |
| 1.2683 | 6.75 | 10500 | 0.3968 | 0.3620 |
| 1.2643 | 7.07 | 11000 | 0.4128 | 0.3848 |
| 1.2485 | 7.4 | 11500 | 0.3849 | 0.3727 |
| 1.2608 | 7.72 | 12000 | 0.3770 | 0.3474 |
| 1.2388 | 8.04 | 12500 | 0.3774 | 0.3574 |
| 1.2524 | 8.36 | 13000 | 0.3789 | 0.3550 |
| 1.2458 | 8.68 | 13500 | 0.3770 | 0.3410 |
| 1.2505 | 9.0 | 14000 | 0.3638 | 0.3403 |
| 1.2254 | 9.32 | 14500 | 0.3770 | 0.3509 |
| 1.2459 | 9.65 | 15000 | 0.3592 | 0.3349 |
| 1.2049 | 9.97 | 15500 | 0.3600 | 0.3428 |
| 1.2097 | 10.29 | 16000 | 0.3626 | 0.3347 |
| 1.1988 | 10.61 | 16500 | 0.3740 | 0.3269 |
| 1.1671 | 10.93 | 17000 | 0.3548 | 0.3245 |
| 1.1532 | 11.25 | 17500 | 0.3394 | 0.3140 |
| 1.1459 | 11.58 | 18000 | 0.3349 | 0.3156 |
| 1.1511 | 11.9 | 18500 | 0.3272 | 0.3110 |
| 1.1465 | 12.22 | 19000 | 0.3348 | 0.3084 |
| 1.1426 | 12.54 | 19500 | 0.3193 | 0.3027 |
| 1.1278 | 12.86 | 20000 | 0.3318 | 0.3021 |
| 1.149 | 13.18 | 20500 | 0.3169 | 0.2947 |
| 1.114 | 13.5 | 21000 | 0.3224 | 0.2986 |
| 1.1249 | 13.83 | 21500 | 0.3227 | 0.2921 |
| 1.0968 | 14.15 | 22000 | 0.3033 | 0.2878 |
| 1.0851 | 14.47 | 22500 | 0.2996 | 0.2863 |
| 1.0985 | 14.79 | 23000 | 0.3011 | 0.2843 |
| 1.0808 | 15.11 | 23500 | 0.2932 | 0.2759 |
| 1.069 | 15.43 | 24000 | 0.2919 | 0.2750 |
| 1.0602 | 15.76 | 24500 | 0.2959 | 0.2713 |
| 1.0369 | 16.08 | 25000 | 0.2931 | 0.2754 |
| 1.0573 | 16.4 | 25500 | 0.2920 | 0.2722 |
| 1.051 | 16.72 | 26000 | 0.2855 | 0.2632 |
| 1.0279 | 17.04 | 26500 | 0.2850 | 0.2649 |
| 1.0496 | 17.36 | 27000 | 0.2817 | 0.2585 |
| 1.0516 | 17.68 | 27500 | 0.2961 | 0.2635 |
| 1.0244 | 18.01 | 28000 | 0.2781 | 0.2589 |
| 1.0099 | 18.33 | 28500 | 0.2783 | 0.2565 |
| 1.0016 | 18.65 | 29000 | 0.2719 | 0.2537 |
| 1.0157 | 18.97 | 29500 | 0.2621 | 0.2449 |
| 0.9572 | 19.29 | 30000 | 0.2582 | 0.2427 |
| 0.9802 | 19.61 | 30500 | 0.2707 | 0.2468 |
| 0.9577 | 19.94 | 31000 | 0.2563 | 0.2389 |
| 0.9562 | 20.26 | 31500 | 0.2592 | 0.2382 |
| 0.962 | 20.58 | 32000 | 0.2539 | 0.2341 |
| 0.9541 | 20.9 | 32500 | 0.2505 | 0.2288 |
| 0.9587 | 21.22 | 33000 | 0.2486 | 0.2302 |
| 0.9146 | 21.54 | 33500 | 0.2461 | 0.2269 |
| 0.9215 | 21.86 | 34000 | 0.2387 | 0.2228 |
| 0.9105 | 22.19 | 34500 | 0.2405 | 0.2222 |
| 0.8949 | 22.51 | 35000 | 0.2316 | 0.2191 |
| 0.9153 | 22.83 | 35500 | 0.2358 | 0.2180 |
| 0.8907 | 23.15 | 36000 | 0.2369 | 0.2168 |
| 0.8973 | 23.47 | 36500 | 0.2323 | 0.2120 |
| 0.8878 | 23.79 | 37000 | 0.2293 | 0.2104 |
| 0.8818 | 24.12 | 37500 | 0.2302 | 0.2132 |
| 0.8919 | 24.44 | 38000 | 0.2262 | 0.2083 |
| 0.8473 | 24.76 | 38500 | 0.2257 | 0.2040 |
| 0.8516 | 25.08 | 39000 | 0.2246 | 0.2031 |
| 0.8451 | 25.4 | 39500 | 0.2198 | 0.2000 |
| 0.8288 | 25.72 | 40000 | 0.2199 | 0.1990 |
| 0.8465 | 26.05 | 40500 | 0.2165 | 0.1972 |
| 0.8305 | 26.37 | 41000 | 0.2128 | 0.1957 |
| 0.8202 | 26.69 | 41500 | 0.2127 | 0.1937 |
| 0.8223 | 27.01 | 42000 | 0.2100 | 0.1934 |
| 0.8322 | 27.33 | 42500 | 0.2076 | 0.1905 |
| 0.8139 | 27.65 | 43000 | 0.2054 | 0.1880 |
| 0.8299 | 27.97 | 43500 | 0.2026 | 0.1868 |
| 0.7937 | 28.3 | 44000 | 0.2045 | 0.1872 |
| 0.7972 | 28.62 | 44500 | 0.2025 | 0.1861 |
| 0.809 | 28.94 | 45000 | 0.2026 | 0.1858 |
| 0.813 | 29.26 | 45500 | 0.2013 | 0.1838 |
| 0.7718 | 29.58 | 46000 | 0.2010 | 0.1837 |
| 0.7929 | 29.9 | 46500 | 0.2008 | 0.1840 |

### Framework versions

- Transformers 4.17.0.dev0
- PyTorch 1.10.2+cu102
- Datasets 1.18.3.dev0
- Tokenizers 0.11.0
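For completeness, below is a minimal inference sketch using the standard `transformers` CTC API. The model id and audio path are placeholders (the card does not state the full hub repository id), so treat this as an assumed usage pattern rather than official example code.

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

MODEL_ID = "wav2vec2-xls-r-1b-cv8"  # placeholder: replace with the full hub id

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)

# Load audio at the 16 kHz sampling rate wav2vec 2.0 expects.
speech, _ = librosa.load("sample.wav", sr=16_000)  # placeholder audio file

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the most probable token at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```

A transcription produced this way can be scored against a reference with a metric library such as `jiwer` (e.g. `jiwer.wer(reference, transcription)` for WER, `jiwer.cer(reference, transcription)` for CER), though this sketch does not reproduce the exact evaluation setup behind the figures reported above.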