
wav2vec2-burak-new-300-v2-2

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the common_voice dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6158
  • Wer: 0.3094
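A WER of 0.3094 means that roughly 31% of the reference words were substituted, inserted, or deleted in the model's transcripts. As a minimal sketch of how the metric is computed (word-level Levenshtein distance divided by the reference length; the example strings are hypothetical, not taken from the evaluation set):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # DP table: d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# one deleted word out of six reference words -> WER of 1/6
print(wer("the cat sat on the mat", "the cat sat on mat"))
```

In practice the Trainer computes this with a library metric (e.g. the `wer` metric from `jiwer`/`evaluate`), but the definition is the same.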

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 32
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 241
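With the linear scheduler and 500 warmup steps, the learning rate ramps from 0 up to the peak of 1e-4 over the first 500 steps, then decays linearly toward 0 for the rest of training. A sketch of that shape (not a call into the Transformers library; the total step count uses 13,500, the final logged step in the results table below, as an approximation):

```python
def linear_lr(step: int, peak_lr: float = 1e-4,
              warmup_steps: int = 500, total_steps: int = 13500) -> float:
    """Linear warmup to peak_lr, then linear decay to zero."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * max(0, total_steps - step) / (total_steps - warmup_steps)

print(linear_lr(250))    # halfway through warmup
print(linear_lr(500))    # peak learning rate, 0.0001
print(linear_lr(13500))  # end of training, 0.0
```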

Training results

| Training Loss | Epoch  | Step  | Validation Loss | Wer    |
|:-------------:|:------:|:-----:|:---------------:|:------:|
| 5.5201        | 8.62   | 500   | 3.1581          | 1.0    |
| 2.1532        | 17.24  | 1000  | 0.6883          | 0.5979 |
| 0.5465        | 25.86  | 1500  | 0.5028          | 0.4432 |
| 0.3287        | 34.48  | 2000  | 0.4986          | 0.4024 |
| 0.2571        | 43.1   | 2500  | 0.4920          | 0.3824 |
| 0.217         | 51.72  | 3000  | 0.5265          | 0.3724 |
| 0.1848        | 60.34  | 3500  | 0.5539          | 0.3714 |
| 0.1605        | 68.97  | 4000  | 0.5689          | 0.3670 |
| 0.1413        | 77.59  | 4500  | 0.5962          | 0.3501 |
| 0.1316        | 86.21  | 5000  | 0.5732          | 0.3494 |
| 0.1168        | 94.83  | 5500  | 0.5912          | 0.3461 |
| 0.1193        | 103.45 | 6000  | 0.5766          | 0.3378 |
| 0.0996        | 112.07 | 6500  | 0.5818          | 0.3403 |
| 0.0941        | 120.69 | 7000  | 0.5986          | 0.3315 |
| 0.0912        | 129.31 | 7500  | 0.5802          | 0.3280 |
| 0.0865        | 137.93 | 8000  | 0.5878          | 0.3290 |
| 0.0804        | 146.55 | 8500  | 0.5784          | 0.3228 |
| 0.0739        | 155.17 | 9000  | 0.5791          | 0.3180 |
| 0.0718        | 163.79 | 9500  | 0.5864          | 0.3146 |
| 0.0681        | 172.41 | 10000 | 0.6104          | 0.3178 |
| 0.0688        | 181.03 | 10500 | 0.5983          | 0.3160 |
| 0.0657        | 189.66 | 11000 | 0.6228          | 0.3203 |
| 0.0598        | 198.28 | 11500 | 0.6057          | 0.3122 |
| 0.0597        | 206.9  | 12000 | 0.6094          | 0.3129 |
| 0.0551        | 215.52 | 12500 | 0.6114          | 0.3127 |
| 0.0507        | 224.14 | 13000 | 0.6056          | 0.3094 |
| 0.0554        | 232.76 | 13500 | 0.6158          | 0.3094 |

Framework versions

  • Transformers 4.22.1
  • Pytorch 1.12.1+cu113
  • Datasets 2.5.1
  • Tokenizers 0.12.1