wav2vec2-large-xls-r-300m-turkish-colab

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the common_voice dataset. It achieves the following results on the evaluation set:

  • Loss: 1.7126
  • WER: 0.8198
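
For illustration, the snippet below sketches how a checkpoint like this can be loaded for greedy CTC transcription with the transformers library. The model ID matches this repository; the audio file path and the resampling step are assumptions and not part of the original card.

```python
# Minimal inference sketch (file path and resampling are assumptions).
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "lilitket/wav2vec2-large-xls-r-300m-turkish-colab"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Load a recording and make sure it is 16 kHz, as XLS-R expects.
speech, sample_rate = torchaudio.load("example.wav")  # hypothetical local file
if sample_rate != 16_000:
    speech = torchaudio.functional.resample(speech, sample_rate, 16_000)

inputs = processor(speech.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the most likely token at each frame, then collapse.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```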

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 1
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 4
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 120
  • mixed_precision_training: Native AMP
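
The training script itself is not included in the card. As an illustration only, the sketch below shows how the hyperparameters listed above could be expressed with transformers.TrainingArguments; the output_dir is hypothetical, and eval_steps/logging_steps of 200 are inferred from the evaluation interval visible in the results table.

```python
# Sketch of the listed hyperparameters mapped onto TrainingArguments
# (not the author's original script; output_dir and step intervals are assumptions).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-300m-turkish-colab",  # hypothetical output path
    learning_rate=3e-4,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,   # effective train batch size: 1 * 4 = 4
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=120,
    fp16=True,                       # "Native AMP" mixed-precision training
    evaluation_strategy="steps",     # metrics in the table appear every 200 steps
    eval_steps=200,
    logging_steps=200,
)
```

The Adam betas (0.9, 0.999) and epsilon (1e-08) listed above are the Trainer defaults, so they need no explicit argument.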

Training results

Training Loss | Epoch | Step | Validation Loss | WER
------------- | ----- | ---- | --------------- | ---
6.7419 | 2.38 | 200 | 3.1913 | 1.0
3.0446 | 4.76 | 400 | 2.3247 | 1.0
1.3163 | 7.14 | 600 | 1.2629 | 0.9656
0.6058 | 9.52 | 800 | 1.2203 | 0.9343
0.3687 | 11.9 | 1000 | 1.2157 | 0.8849
0.2644 | 14.29 | 1200 | 1.3693 | 0.8992
0.2147 | 16.67 | 1400 | 1.3321 | 0.8623
0.1962 | 19.05 | 1600 | 1.3476 | 0.8886
0.1631 | 21.43 | 1800 | 1.3984 | 0.8755
0.15 | 23.81 | 2000 | 1.4602 | 0.8798
0.1311 | 26.19 | 2200 | 1.4727 | 0.8836
0.1174 | 28.57 | 2400 | 1.5257 | 0.8805
0.1155 | 30.95 | 2600 | 1.4697 | 0.9337
0.1046 | 33.33 | 2800 | 1.6076 | 0.8667
0.1063 | 35.71 | 3000 | 1.5012 | 0.8861
0.0996 | 38.1 | 3200 | 1.6204 | 0.8605
0.088 | 40.48 | 3400 | 1.4788 | 0.8586
0.089 | 42.86 | 3600 | 1.5983 | 0.8648
0.0805 | 45.24 | 3800 | 1.5045 | 0.8298
0.0718 | 47.62 | 4000 | 1.6361 | 0.8611
0.0718 | 50.0 | 4200 | 1.5088 | 0.8548
0.0649 | 52.38 | 4400 | 1.5491 | 0.8554
0.0685 | 54.76 | 4600 | 1.5939 | 0.8442
0.0588 | 57.14 | 4800 | 1.6321 | 0.8536
0.0591 | 59.52 | 5000 | 1.6468 | 0.8442
0.0529 | 61.9 | 5200 | 1.6086 | 0.8661
0.0482 | 64.29 | 5400 | 1.6622 | 0.8517
0.0396 | 66.67 | 5600 | 1.6191 | 0.8436
0.0463 | 69.05 | 5800 | 1.6231 | 0.8661
0.0415 | 71.43 | 6000 | 1.6874 | 0.8511
0.0383 | 73.81 | 6200 | 1.7054 | 0.8411
0.0411 | 76.19 | 6400 | 1.7073 | 0.8486
0.0346 | 78.57 | 6600 | 1.7137 | 0.8342
0.0318 | 80.95 | 6800 | 1.6523 | 0.8329
0.0299 | 83.33 | 7000 | 1.6893 | 0.8579
0.029 | 85.71 | 7200 | 1.7162 | 0.8429
0.025 | 88.1 | 7400 | 1.7589 | 0.8529
0.025 | 90.48 | 7600 | 1.7581 | 0.8398
0.0232 | 92.86 | 7800 | 1.8459 | 0.8442
0.0215 | 95.24 | 8000 | 1.7942 | 0.8448
0.0222 | 97.62 | 8200 | 1.6848 | 0.8442
0.0179 | 100.0 | 8400 | 1.7223 | 0.8298
0.0176 | 102.38 | 8600 | 1.7426 | 0.8404
0.016 | 104.76 | 8800 | 1.7501 | 0.8411
0.0153 | 107.14 | 9000 | 1.7185 | 0.8235
0.0136 | 109.52 | 9200 | 1.7250 | 0.8292
0.0117 | 111.9 | 9400 | 1.7159 | 0.8185
0.0123 | 114.29 | 9600 | 1.7135 | 0.8248
0.0121 | 116.67 | 9800 | 1.7189 | 0.8210
0.0116 | 119.05 | 10000 | 1.7126 | 0.8198
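
The WER column is the word error rate on the validation set, reported each time the model was evaluated (every 200 optimizer steps). A minimal sketch of computing a word error rate with the jiwer package follows; the card does not state which WER implementation was used, and the example strings are hypothetical.

```python
# Minimal word error rate check with jiwer (an assumption about the metric
# implementation; the card only reports the resulting WER values).
import jiwer

reference = "merhaba dünya"    # hypothetical ground-truth transcript
hypothesis = "merhaba dunya"   # hypothetical model output

# Fraction of word-level substitutions, insertions, and deletions.
wer = jiwer.wer(reference, hypothesis)
print(f"WER: {wer:.4f}")
```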

Framework versions

  • Transformers 4.11.3
  • Pytorch 1.10.0+cu113
  • Datasets 1.18.3
  • Tokenizers 0.10.3