# wav2vec-turkish-300m-xls-2
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the common_voice_16_1 dataset. It achieves the following results on the evaluation set:
- Loss: 0.5268
- Wer: 0.4094
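The Wer figure above is the word error rate: the word-level edit distance (substitutions, insertions, deletions) between the model's transcript and the reference, divided by the number of reference words, so 0.4094 means roughly 41 word errors per 100 reference words. The training run presumably computed it with a library such as `evaluate` or `jiwer`; a minimal pure-Python sketch of the metric:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(
                dp[i - 1][j] + 1,         # deletion
                dp[i][j - 1] + 1,         # insertion
                dp[i - 1][j - 1] + cost,  # substitution (or match)
            )
    return dp[len(ref)][len(hyp)] / len(ref)

# One substituted word out of four reference words -> WER 0.25
print(wer("merhaba dünya nasılsın bugün", "merhaba dünyada nasılsın bugün"))
```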
## Model description
More information needed
## Intended uses & limitations
More information needed
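The card gives no usage example, but a fine-tuned wav2vec2 CTC checkpoint like this one is typically used for Turkish speech-to-text via the `transformers` ASR pipeline. The snippet below is a hedged sketch, not a tested recipe from this repository; the repo id is a placeholder you must replace with the model's actual path, and it assumes 16 kHz mono input audio (the sampling rate wav2vec2-XLS-R was pretrained on).

```python
from transformers import pipeline

# Placeholder repo id -- substitute the actual Hugging Face path of this model.
asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/wav2vec-turkish-300m-xls-2",
)

# wav2vec2 expects 16 kHz mono audio; the pipeline resamples common formats.
result = asr("sample_turkish.wav")
print(result["text"])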
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 0.1
- num_epochs: 30
- mixed_precision_training: Native AMP
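The hyperparameters above map onto a `transformers` `TrainingArguments` configuration roughly as sketched below. This is not the actual training script: the output directory is an assumption, and the listed `lr_scheduler_warmup_steps: 0.1` is interpreted here as a warmup *ratio* (0.1 warmup steps would be meaningless), which is a guess.

```python
from transformers import TrainingArguments

# Sketch of the reported configuration. Values not in the card
# (output_dir, the warmup_ratio interpretation) are assumptions.
training_args = TrainingArguments(
    output_dir="wav2vec-turkish-300m-xls-2",  # assumed
    learning_rate=3e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,        # card lists "lr_scheduler_warmup_steps: 0.1"
    num_train_epochs=30,
    fp16=True,               # "Native AMP" mixed precision
)
```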
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
---|---|---|---|---|
1.5798 | 0.29 | 400 | 1.0275 | 0.8828 |
0.8353 | 0.58 | 800 | 0.9331 | 0.8265 |
0.7665 | 0.88 | 1200 | 0.8985 | 0.8207 |
0.7054 | 1.17 | 1600 | 0.7412 | 0.7435 |
0.6731 | 1.46 | 2000 | 0.6856 | 0.7321 |
0.6646 | 1.75 | 2400 | 0.7127 | 0.7589 |
0.6425 | 2.05 | 2800 | 0.6633 | 0.7016 |
0.5738 | 2.34 | 3200 | 0.6461 | 0.6872 |
0.5851 | 2.63 | 3600 | 0.6337 | 0.6808 |
0.5839 | 2.92 | 4000 | 0.6459 | 0.6951 |
0.5458 | 3.21 | 4400 | 0.6234 | 0.6699 |
0.5359 | 3.51 | 4800 | 0.6429 | 0.6777 |
0.5408 | 3.8 | 5200 | 0.6547 | 0.6833 |
0.5169 | 4.09 | 5600 | 0.6038 | 0.6444 |
0.4805 | 4.38 | 6000 | 0.5888 | 0.6439 |
0.4892 | 4.67 | 6400 | 0.5840 | 0.6349 |
0.4795 | 4.97 | 6800 | 0.5705 | 0.6327 |
0.4497 | 5.26 | 7200 | 0.6103 | 0.6621 |
0.4506 | 5.55 | 7600 | 0.5813 | 0.6328 |
0.4513 | 5.84 | 8000 | 0.5776 | 0.6423 |
0.4254 | 6.14 | 8400 | 0.6039 | 0.6218 |
0.424 | 6.43 | 8800 | 0.6233 | 0.6208 |
0.4246 | 6.72 | 9200 | 0.5717 | 0.6248 |
0.4233 | 7.01 | 9600 | 0.5588 | 0.5968 |
0.3829 | 7.3 | 10000 | 0.5472 | 0.5922 |
0.397 | 7.6 | 10400 | 0.5176 | 0.5713 |
0.3813 | 7.89 | 10800 | 0.5004 | 0.5721 |
0.3623 | 8.18 | 11200 | 0.5643 | 0.5959 |
0.3551 | 8.47 | 11600 | 0.5771 | 0.5949 |
0.3685 | 8.77 | 12000 | 0.5878 | 0.6092 |
0.3562 | 9.06 | 12400 | 0.5197 | 0.5660 |
0.3275 | 9.35 | 12800 | 0.5242 | 0.5536 |
0.3378 | 9.64 | 13200 | 0.5141 | 0.5627 |
0.3476 | 9.93 | 13600 | 0.5140 | 0.5657 |
0.3272 | 10.23 | 14000 | 0.5235 | 0.5599 |
0.3152 | 10.52 | 14400 | 0.5018 | 0.5521 |
0.3119 | 10.81 | 14800 | 0.5034 | 0.5576 |
0.298 | 11.1 | 15200 | 0.5228 | 0.5649 |
0.2877 | 11.4 | 15600 | 0.5256 | 0.5592 |
0.2954 | 11.69 | 16000 | 0.5207 | 0.5513 |
0.2962 | 11.98 | 16400 | 0.4810 | 0.5348 |
0.2741 | 12.27 | 16800 | 0.4870 | 0.5278 |
0.2701 | 12.56 | 17200 | 0.4870 | 0.5366 |
0.2731 | 12.86 | 17600 | 0.4736 | 0.5274 |
0.2653 | 13.15 | 18000 | 0.4971 | 0.5340 |
0.252 | 13.44 | 18400 | 0.5104 | 0.5340 |
0.2579 | 13.73 | 18800 | 0.4838 | 0.5434 |
0.2457 | 14.02 | 19200 | 0.5106 | 0.5189 |
0.2403 | 14.32 | 19600 | 0.4655 | 0.5141 |
0.2335 | 14.61 | 20000 | 0.4887 | 0.5200 |
0.2414 | 14.9 | 20400 | 0.4792 | 0.5146 |
0.237 | 15.19 | 20800 | 0.4746 | 0.5063 |
0.2236 | 15.49 | 21200 | 0.4740 | 0.4985 |
0.2215 | 15.78 | 21600 | 0.4687 | 0.5031 |
0.2186 | 16.07 | 22000 | 0.5013 | 0.5106 |
0.2132 | 16.36 | 22400 | 0.4958 | 0.5095 |
0.2085 | 16.65 | 22800 | 0.4865 | 0.4841 |
0.2069 | 16.95 | 23200 | 0.4686 | 0.4924 |
0.1895 | 17.24 | 23600 | 0.4950 | 0.4862 |
0.2019 | 17.53 | 24000 | 0.4707 | 0.4875 |
0.1941 | 17.82 | 24400 | 0.4667 | 0.4838 |
0.1901 | 18.12 | 24800 | 0.4899 | 0.4846 |
0.179 | 18.41 | 25200 | 0.4840 | 0.4795 |
0.1817 | 18.7 | 25600 | 0.4878 | 0.4804 |
0.1771 | 18.99 | 26000 | 0.5001 | 0.4767 |
0.1674 | 19.28 | 26400 | 0.5023 | 0.4749 |
0.1716 | 19.58 | 26800 | 0.4710 | 0.4677 |
0.1723 | 19.87 | 27200 | 0.4855 | 0.4685 |
0.1647 | 20.16 | 27600 | 0.5059 | 0.4699 |
0.1557 | 20.45 | 28000 | 0.4829 | 0.4611 |
0.1621 | 20.75 | 28400 | 0.4833 | 0.4634 |
0.1543 | 21.04 | 28800 | 0.5226 | 0.4699 |
0.1463 | 21.33 | 29200 | 0.5186 | 0.4654 |
0.1513 | 21.62 | 29600 | 0.5028 | 0.4760 |
0.1474 | 21.91 | 30000 | 0.4965 | 0.4573 |
0.1405 | 22.21 | 30400 | 0.4657 | 0.4511 |
0.1385 | 22.5 | 30800 | 0.5062 | 0.4537 |
0.1351 | 22.79 | 31200 | 0.4843 | 0.4524 |
0.132 | 23.08 | 31600 | 0.4935 | 0.4484 |
0.1289 | 23.37 | 32000 | 0.5018 | 0.4491 |
0.1279 | 23.67 | 32400 | 0.4874 | 0.4432 |
0.1291 | 23.96 | 32800 | 0.4813 | 0.4404 |
0.1251 | 24.25 | 33200 | 0.4866 | 0.4412 |
0.1189 | 24.54 | 33600 | 0.4975 | 0.4420 |
0.1172 | 24.84 | 34000 | 0.4829 | 0.4363 |
0.1169 | 25.13 | 34400 | 0.4943 | 0.4333 |
0.1142 | 25.42 | 34800 | 0.5113 | 0.4326 |
0.1081 | 25.71 | 35200 | 0.5121 | 0.4314 |
0.1119 | 26.0 | 35600 | 0.5067 | 0.4316 |
0.1043 | 26.3 | 36000 | 0.5221 | 0.4292 |
0.1017 | 26.59 | 36400 | 0.5230 | 0.4264 |
0.1035 | 26.88 | 36800 | 0.5141 | 0.4267 |
0.0957 | 27.17 | 37200 | 0.5320 | 0.4231 |
0.0994 | 27.47 | 37600 | 0.5173 | 0.4180 |
0.0947 | 27.76 | 38000 | 0.5218 | 0.4162 |
0.0932 | 28.05 | 38400 | 0.5163 | 0.4181 |
0.0912 | 28.34 | 38800 | 0.5277 | 0.4151 |
0.0928 | 28.63 | 39200 | 0.5152 | 0.4136 |
0.0918 | 28.93 | 39600 | 0.5145 | 0.4125 |
0.0852 | 29.22 | 40000 | 0.5257 | 0.4108 |
0.0904 | 29.51 | 40400 | 0.5239 | 0.4092 |
0.0858 | 29.8 | 40800 | 0.5268 | 0.4094 |
### Framework versions
- Transformers 4.38.1
- Pytorch 2.2.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.2