---
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-1b
tags:
  - generated_from_trainer
datasets:
  - common_voice_16_1
metrics:
  - wer
model-index:
  - name: wav2vec-turkish-300m-xls
    results:
      - task:
          name: Automatic Speech Recognition
          type: automatic-speech-recognition
        dataset:
          name: common_voice_16_1
          type: common_voice_16_1
          config: tr
          split: test
          args: tr
        metrics:
          - name: Wer
            type: wer
            value: 0.4719196059396685
---

# wav2vec-turkish-300m-xls

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the common_voice_16_1 dataset. It achieves the following results on the evaluation set:

- Loss: 0.6040
- Wer: 0.4719
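The reported Wer of 0.4719 means that roughly 47% of the reference words require an edit (substitution, insertion, or deletion) to match the hypothesis. A minimal pure-Python sketch of the metric — word-level Levenshtein distance divided by the reference length:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref = reference.split()
    hyp = hypothesis.split()
    # d[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # i deletions
    for j in range(len(hyp) + 1):
        d[0][j] = j  # j insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            substitution = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            deletion = d[i - 1][j] + 1
            insertion = d[i][j - 1] + 1
            d[i][j] = min(substitution, deletion, insertion)
    return d[len(ref)][len(hyp)] / len(ref)

# One substitution out of three reference words -> 1/3
print(wer("merhaba dünya nasılsın", "merhaba dünyaa nasılsın"))
```

In practice the `evaluate` or `jiwer` packages compute the same quantity; this sketch only makes the definition concrete.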

## Model description

More information needed

## Intended uses & limitations

More information needed
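The intended use is Turkish speech-to-text. A minimal inference sketch using the Transformers ASR pipeline — the default `model_id` below is a placeholder based on this card's name, not a confirmed repo id; substitute the published id or a local checkpoint directory:

```python
# Hedged sketch: run the fine-tuned checkpoint through the Transformers
# automatic-speech-recognition pipeline.
def transcribe(audio_path: str, model_id: str = "wav2vec-turkish-300m-xls") -> str:
    # Imported lazily so the sketch stays lightweight until actually called.
    from transformers import pipeline

    asr = pipeline("automatic-speech-recognition", model=model_id)
    # wav2vec2 models expect 16 kHz mono input; the pipeline resamples
    # file inputs via ffmpeg when necessary.
    return asr(audio_path)["text"]
```

Note the usual CTC caveats apply: the raw output is lowercase, unpunctuated text, and quality degrades on domains far from Common Voice read speech.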

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0005
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
- mixed_precision_training: Native AMP
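The hyperparameters above map directly onto the Trainer API. A sketch of the corresponding configuration, assuming the standard `TrainingArguments` — `output_dir` and any bookkeeping arguments are illustrative placeholders, not taken from the original run:

```python
from transformers import TrainingArguments

# Configuration mirroring the hyperparameter list above.
training_args = TrainingArguments(
    output_dir="wav2vec-turkish-300m-xls",  # placeholder
    learning_rate=5e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
    fp16=True,  # "Native AMP" mixed precision
)
```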

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 2.2653        | 0.29  | 400   | 1.1675          | 0.9024 |
| 1.0108        | 0.58  | 800   | 1.2504          | 0.9083 |
| 0.9497        | 0.88  | 1200  | 1.2197          | 0.9001 |
| 0.8842        | 1.17  | 1600  | 0.8955          | 0.8195 |
| 0.8603        | 1.46  | 2000  | 0.8623          | 0.8267 |
| 0.8271        | 1.75  | 2400  | 0.8324          | 0.7950 |
| 0.8037        | 2.05  | 2800  | 0.7933          | 0.7751 |
| 0.7488        | 2.34  | 3200  | 0.8006          | 0.7841 |
| 0.7683        | 2.63  | 3600  | 0.8042          | 0.7925 |
| 0.7554        | 2.92  | 4000  | 0.7955          | 0.7728 |
| 0.7078        | 3.21  | 4400  | 0.8038          | 0.7781 |
| 0.7161        | 3.51  | 4800  | 0.8470          | 0.7967 |
| 0.7114        | 3.8   | 5200  | 0.8424          | 0.8074 |
| 0.696         | 4.09  | 5600  | 0.7482          | 0.7617 |
| 0.6581        | 4.38  | 6000  | 0.7177          | 0.7427 |
| 0.6448        | 4.67  | 6400  | 0.7201          | 0.7292 |
| 0.6343        | 4.97  | 6800  | 0.6995          | 0.7265 |
| 0.6072        | 5.26  | 7200  | 0.7234          | 0.7549 |
| 0.598         | 5.55  | 7600  | 0.6919          | 0.7215 |
| 0.5938        | 5.84  | 8000  | 0.7364          | 0.7310 |
| 0.5641        | 6.14  | 8400  | 0.7075          | 0.7074 |
| 0.5557        | 6.43  | 8800  | 0.6785          | 0.7069 |
| 0.5537        | 6.72  | 9200  | 0.6434          | 0.7019 |
| 0.5475        | 7.01  | 9600  | 0.6415          | 0.6797 |
| 0.5053        | 7.3   | 10000 | 0.6402          | 0.6751 |
| 0.518         | 7.6   | 10400 | 0.6214          | 0.6618 |
| 0.5041        | 7.89  | 10800 | 0.6156          | 0.6607 |
| 0.4853        | 8.18  | 11200 | 0.6600          | 0.6879 |
| 0.4738        | 8.47  | 11600 | 0.6359          | 0.6659 |
| 0.4766        | 8.77  | 12000 | 0.6674          | 0.6980 |
| 0.4822        | 9.06  | 12400 | 0.6189          | 0.6545 |
| 0.4421        | 9.35  | 12800 | 0.6090          | 0.6434 |
| 0.4562        | 9.64  | 13200 | 0.6099          | 0.6383 |
| 0.4559        | 9.93  | 13600 | 0.6009          | 0.6511 |
| 0.4456        | 10.23 | 14000 | 0.6090          | 0.6409 |
| 0.4277        | 10.52 | 14400 | 0.5940          | 0.6374 |
| 0.4228        | 10.81 | 14800 | 0.5992          | 0.6402 |
| 0.4139        | 11.1  | 15200 | 0.6287          | 0.6344 |
| 0.3918        | 11.4  | 15600 | 0.6134          | 0.6326 |
| 0.3957        | 11.69 | 16000 | 0.5952          | 0.6275 |
| 0.4022        | 11.98 | 16400 | 0.5957          | 0.6350 |
| 0.3733        | 12.27 | 16800 | 0.5466          | 0.6010 |
| 0.3749        | 12.56 | 17200 | 0.5566          | 0.6044 |
| 0.3736        | 12.86 | 17600 | 0.5453          | 0.5994 |
| 0.3701        | 13.15 | 18000 | 0.5846          | 0.6203 |
| 0.359         | 13.44 | 18400 | 0.5880          | 0.5997 |
| 0.3501        | 13.73 | 18800 | 0.5738          | 0.6153 |
| 0.3403        | 14.02 | 19200 | 0.5766          | 0.5855 |
| 0.3302        | 14.32 | 19600 | 0.5507          | 0.5954 |
| 0.3295        | 14.61 | 20000 | 0.5467          | 0.5899 |
| 0.3371        | 14.9  | 20400 | 0.5571          | 0.5907 |
| 0.3253        | 15.19 | 20800 | 0.5266          | 0.5745 |
| 0.3054        | 15.49 | 21200 | 0.5211          | 0.5681 |
| 0.3044        | 15.78 | 21600 | 0.5409          | 0.5698 |
| 0.2982        | 16.07 | 22000 | 0.5467          | 0.5827 |
| 0.293         | 16.36 | 22400 | 0.5426          | 0.5706 |
| 0.2901        | 16.65 | 22800 | 0.5404          | 0.5793 |
| 0.2913        | 16.95 | 23200 | 0.5342          | 0.5647 |
| 0.269         | 17.24 | 23600 | 0.5309          | 0.5623 |
| 0.2803        | 17.53 | 24000 | 0.5300          | 0.5637 |
| 0.2697        | 17.82 | 24400 | 0.5103          | 0.5539 |
| 0.2727        | 18.12 | 24800 | 0.5414          | 0.5607 |
| 0.2505        | 18.41 | 25200 | 0.5472          | 0.5508 |
| 0.2554        | 18.7  | 25600 | 0.5260          | 0.5511 |
| 0.2552        | 18.99 | 26000 | 0.5246          | 0.5389 |
| 0.2333        | 19.28 | 26400 | 0.5392          | 0.5497 |
| 0.2315        | 19.58 | 26800 | 0.5230          | 0.5395 |
| 0.2366        | 19.87 | 27200 | 0.5303          | 0.5344 |
| 0.2317        | 20.16 | 27600 | 0.5348          | 0.5350 |
| 0.2229        | 20.45 | 28000 | 0.5138          | 0.5328 |
| 0.2243        | 20.75 | 28400 | 0.5147          | 0.5235 |
| 0.2169        | 21.04 | 28800 | 0.5494          | 0.5266 |
| 0.2068        | 21.33 | 29200 | 0.5361          | 0.5266 |
| 0.2073        | 21.62 | 29600 | 0.5660          | 0.5346 |
| 0.2035        | 21.91 | 30000 | 0.5048          | 0.5196 |
| 0.1954        | 22.21 | 30400 | 0.5498          | 0.5137 |
| 0.1974        | 22.5  | 30800 | 0.5338          | 0.5157 |
| 0.1914        | 22.79 | 31200 | 0.5311          | 0.5080 |
| 0.1874        | 23.08 | 31600 | 0.5600          | 0.5020 |
| 0.1792        | 23.37 | 32000 | 0.5428          | 0.5012 |
| 0.182         | 23.67 | 32400 | 0.5237          | 0.5013 |
| 0.1825        | 23.96 | 32800 | 0.5383          | 0.4999 |
| 0.1723        | 24.25 | 33200 | 0.5690          | 0.5063 |
| 0.1673        | 24.54 | 33600 | 0.5525          | 0.5030 |
| 0.165         | 24.84 | 34000 | 0.5519          | 0.5001 |
| 0.162         | 25.13 | 34400 | 0.5553          | 0.4966 |
| 0.1597        | 25.42 | 34800 | 0.5614          | 0.4938 |
| 0.1505        | 25.71 | 35200 | 0.5569          | 0.4932 |
| 0.157         | 26.0  | 35600 | 0.5629          | 0.4931 |
| 0.1468        | 26.3  | 36000 | 0.5808          | 0.4879 |
| 0.1438        | 26.59 | 36400 | 0.5675          | 0.4871 |
| 0.1462        | 26.88 | 36800 | 0.5568          | 0.4852 |
| 0.138         | 27.17 | 37200 | 0.5995          | 0.4821 |
| 0.1394        | 27.47 | 37600 | 0.5810          | 0.4798 |
| 0.1363        | 27.76 | 38000 | 0.5776          | 0.4771 |
| 0.1318        | 28.05 | 38400 | 0.5909          | 0.4763 |
| 0.128         | 28.34 | 38800 | 0.5967          | 0.4782 |
| 0.1304        | 28.63 | 39200 | 0.5866          | 0.4758 |
| 0.1284        | 28.93 | 39600 | 0.5904          | 0.4747 |
| 0.1207        | 29.22 | 40000 | 0.6023          | 0.4739 |
| 0.1275        | 29.51 | 40400 | 0.6038          | 0.4733 |
| 0.1241        | 29.8  | 40800 | 0.6040          | 0.4719 |

### Framework versions

- Transformers 4.38.1
- Pytorch 2.2.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.2