# wav2vec2-large-xls-r-300m-spanish-small
This model is a fine-tuned version of [jhonparra18/wav2vec2-large-xls-r-300m-spanish-custom](https://huggingface.co/jhonparra18/wav2vec2-large-xls-r-300m-spanish-custom) on the common_voice dataset. It achieves the following results on the evaluation set:
- Loss: 0.3763
- Wer: 0.1791
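The checkpoint can be loaded with Hugging Face Transformers for Spanish transcription. The sketch below is illustrative rather than the author's script: the Hub repo id and the audio path are assumptions, and it uses plain greedy CTC decoding.

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Assumed Hub repo id for this card; adjust if the checkpoint lives elsewhere.
MODEL_ID = "jhonparra18/wav2vec2-large-xls-r-300m-spanish-small"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)

# "audio.wav" is a placeholder path; XLS-R expects 16 kHz mono input.
speech, _ = librosa.load("audio.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the most likely token per frame, then let
# batch_decode collapse repeats and blanks into text.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```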
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
- mixed_precision_training: Native AMP
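For reference, the sketch below shows how these values map onto `transformers.TrainingArguments`. It is an outline under assumptions, not the author's training script: the dataset pipeline, processor setup, CTC data collator, and `Trainer` call are omitted.

```python
from transformers import TrainingArguments

# Sketch only: the card's hyperparameters expressed as TrainingArguments.
# Adam betas=(0.9, 0.999) and epsilon=1e-08 match the optimizer defaults,
# so they need no explicit arguments here.
training_args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-300m-spanish-small",
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # total train batch size: 8 * 2 = 16
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
    fp16=True,                      # native AMP mixed precision
    evaluation_strategy="steps",
    eval_steps=400,                 # matches the 400-step cadence in the results table
)
```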
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer |
---|---|---|---|---|
0.2277 | 0.26 | 400 | 0.2601 | 0.2291 |
0.2932 | 0.53 | 800 | 0.2950 | 0.2670 |
0.3019 | 0.79 | 1200 | 0.3247 | 0.2766 |
0.2987 | 1.05 | 1600 | 0.3031 | 0.2606 |
0.261 | 1.32 | 2000 | 0.2994 | 0.2620 |
0.2651 | 1.58 | 2400 | 0.3134 | 0.2700 |
0.264 | 1.85 | 2800 | 0.3016 | 0.2641 |
0.2475 | 2.11 | 3200 | 0.3135 | 0.2661 |
0.2269 | 2.37 | 3600 | 0.3029 | 0.2562 |
0.2389 | 2.64 | 4000 | 0.3035 | 0.2549 |
0.2319 | 2.9 | 4400 | 0.3022 | 0.2551 |
0.2123 | 3.16 | 4800 | 0.3256 | 0.2638 |
0.2094 | 3.43 | 5200 | 0.3227 | 0.2712 |
0.2121 | 3.69 | 5600 | 0.3085 | 0.2596 |
0.207 | 3.96 | 6000 | 0.3041 | 0.2597 |
0.1809 | 4.22 | 6400 | 0.3122 | 0.2524 |
0.1846 | 4.48 | 6800 | 0.3254 | 0.2579 |
0.1885 | 4.75 | 7200 | 0.2958 | 0.2437 |
0.1923 | 5.01 | 7600 | 0.3136 | 0.2502 |
0.1626 | 5.27 | 8000 | 0.3059 | 0.2488 |
0.1704 | 5.54 | 8400 | 0.3082 | 0.2515 |
0.1674 | 5.8 | 8800 | 0.3196 | 0.2509 |
0.1691 | 6.06 | 9200 | 0.3193 | 0.25 |
0.1499 | 6.33 | 9600 | 0.3529 | 0.2635 |
0.1568 | 6.59 | 10000 | 0.3241 | 0.2481 |
0.1538 | 6.86 | 10400 | 0.3354 | 0.2476 |
0.1503 | 7.12 | 10800 | 0.3180 | 0.2402 |
0.136 | 7.38 | 11200 | 0.3230 | 0.2397 |
0.1413 | 7.65 | 11600 | 0.3178 | 0.2451 |
0.147 | 7.91 | 12000 | 0.3170 | 0.2389 |
0.1341 | 8.17 | 12400 | 0.3380 | 0.2501 |
0.1329 | 8.44 | 12800 | 0.3265 | 0.2414 |
0.1314 | 8.7 | 13200 | 0.3281 | 0.2482 |
0.1312 | 8.97 | 13600 | 0.3259 | 0.2539 |
0.12 | 9.23 | 14000 | 0.3291 | 0.2424 |
0.1193 | 9.49 | 14400 | 0.3302 | 0.2412 |
0.1189 | 9.76 | 14800 | 0.3376 | 0.2407 |
0.1217 | 10.02 | 15200 | 0.3334 | 0.2400 |
0.1118 | 10.28 | 15600 | 0.3359 | 0.2368 |
0.1139 | 10.55 | 16000 | 0.3239 | 0.2335 |
0.1106 | 10.81 | 16400 | 0.3374 | 0.2352 |
0.1081 | 11.07 | 16800 | 0.3585 | 0.2434 |
0.1063 | 11.34 | 17200 | 0.3639 | 0.2472 |
0.1041 | 11.6 | 17600 | 0.3399 | 0.2423 |
0.1062 | 11.87 | 18000 | 0.3410 | 0.2388 |
0.1012 | 12.13 | 18400 | 0.3597 | 0.2413 |
0.0953 | 12.39 | 18800 | 0.3440 | 0.2296 |
0.097 | 12.66 | 19200 | 0.3440 | 0.2269 |
0.0968 | 12.92 | 19600 | 0.3498 | 0.2333 |
0.0902 | 13.18 | 20000 | 0.3471 | 0.2290 |
0.0868 | 13.45 | 20400 | 0.3462 | 0.2266 |
0.0892 | 13.71 | 20800 | 0.3373 | 0.2227 |
0.0902 | 13.97 | 21200 | 0.3377 | 0.2240 |
0.0846 | 14.24 | 21600 | 0.3484 | 0.2237 |
0.0839 | 14.5 | 22000 | 0.3706 | 0.2260 |
0.0834 | 14.77 | 22400 | 0.3430 | 0.2268 |
0.0841 | 15.03 | 22800 | 0.3489 | 0.2259 |
0.076 | 15.29 | 23200 | 0.3626 | 0.2281 |
0.0771 | 15.56 | 23600 | 0.3624 | 0.2268 |
0.0773 | 15.82 | 24000 | 0.3440 | 0.2252 |
0.0759 | 16.08 | 24400 | 0.3532 | 0.2170 |
0.0745 | 16.35 | 24800 | 0.3686 | 0.2188 |
0.0713 | 16.61 | 25200 | 0.3691 | 0.2195 |
0.0718 | 16.88 | 25600 | 0.3470 | 0.2108 |
0.0685 | 17.14 | 26000 | 0.3756 | 0.2179 |
0.0689 | 17.4 | 26400 | 0.3542 | 0.2149 |
0.0671 | 17.67 | 26800 | 0.3461 | 0.2165 |
0.0737 | 17.93 | 27200 | 0.3473 | 0.2238 |
0.0669 | 18.19 | 27600 | 0.3441 | 0.2138 |
0.0629 | 18.46 | 28000 | 0.3721 | 0.2155 |
0.0632 | 18.72 | 28400 | 0.3667 | 0.2126 |
0.0647 | 18.98 | 28800 | 0.3579 | 0.2097 |
0.0603 | 19.25 | 29200 | 0.3670 | 0.2130 |
0.0604 | 19.51 | 29600 | 0.3750 | 0.2142 |
0.0619 | 19.78 | 30000 | 0.3804 | 0.2160 |
0.0603 | 20.04 | 30400 | 0.3764 | 0.2124 |
0.0577 | 20.3 | 30800 | 0.3858 | 0.2097 |
0.0583 | 20.57 | 31200 | 0.3520 | 0.2089 |
0.0561 | 20.83 | 31600 | 0.3615 | 0.2079 |
0.0545 | 21.09 | 32000 | 0.3824 | 0.2032 |
0.0525 | 21.36 | 32400 | 0.3858 | 0.2091 |
0.0524 | 21.62 | 32800 | 0.3956 | 0.2099 |
0.0527 | 21.89 | 33200 | 0.3667 | 0.2025 |
0.0514 | 22.15 | 33600 | 0.3708 | 0.2032 |
0.0506 | 22.41 | 34000 | 0.3815 | 0.2053 |
0.0478 | 22.68 | 34400 | 0.3671 | 0.2007 |
0.049 | 22.94 | 34800 | 0.3758 | 0.2003 |
0.0477 | 23.2 | 35200 | 0.3786 | 0.2014 |
0.045 | 23.47 | 35600 | 0.3732 | 0.1998 |
0.0426 | 23.73 | 36000 | 0.3737 | 0.2010 |
0.0444 | 23.99 | 36400 | 0.3600 | 0.1990 |
0.0433 | 24.26 | 36800 | 0.3689 | 0.1976 |
0.0442 | 24.52 | 37200 | 0.3787 | 0.1968 |
0.0419 | 24.79 | 37600 | 0.3652 | 0.1961 |
0.042 | 25.05 | 38000 | 0.3820 | 0.1964 |
0.0419 | 25.31 | 38400 | 0.3786 | 0.1919 |
0.0376 | 25.58 | 38800 | 0.3842 | 0.1934 |
0.0385 | 25.84 | 39200 | 0.3767 | 0.1900 |
0.0396 | 26.1 | 39600 | 0.3688 | 0.1888 |
0.0371 | 26.37 | 40000 | 0.3815 | 0.1894 |
0.0363 | 26.63 | 40400 | 0.3748 | 0.1878 |
0.0377 | 26.9 | 40800 | 0.3713 | 0.1852 |
0.0352 | 27.16 | 41200 | 0.3734 | 0.1851 |
0.0355 | 27.42 | 41600 | 0.3776 | 0.1874 |
0.0333 | 27.69 | 42000 | 0.3867 | 0.1841 |
0.0348 | 27.95 | 42400 | 0.3823 | 0.1839 |
0.0329 | 28.21 | 42800 | 0.3795 | 0.1822 |
0.0325 | 28.48 | 43200 | 0.3711 | 0.1813 |
0.0328 | 28.74 | 43600 | 0.3721 | 0.1781 |
0.0312 | 29.0 | 44000 | 0.3803 | 0.1816 |
0.0318 | 29.27 | 44400 | 0.3758 | 0.1794 |
0.0302 | 29.53 | 44800 | 0.3792 | 0.1784 |
0.0339 | 29.8 | 45200 | 0.3763 | 0.1791 |
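The Wer column is the word error rate on the validation set. As a minimal sketch for recomputing it from decoded predictions and reference transcripts, using the Datasets version listed below (`load_metric` later moved to the `evaluate` library):

```python
from datasets import load_metric  # newer stacks: `import evaluate; evaluate.load("wer")`

wer_metric = load_metric("wer")

# Placeholder strings; in practice these come from processor.batch_decode()
# and the dataset's reference transcripts.
predictions = ["hola como estas"]
references = ["hola cómo estás"]

print(wer_metric.compute(predictions=predictions, references=references))
```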
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.1.dev0
- Tokenizers 0.11.0