
wav2vec2-large-xls-r-300m-Arabic

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unspecified dataset (listed as "None"). It achieves the following results on the evaluation set:

  • Loss: 0.0028
  • Wer: 0.0691
  • Cer: 0.0318
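
The Wer and Cer figures above are word- and character-error rates. For reference, a minimal self-contained sketch of how these metrics are typically computed, using a plain Levenshtein edit distance (the actual evaluation most likely used a library such as `evaluate` or `jiwer`, not this code):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (insertions,
    deletions, substitutions), computed with a single rolling row."""
    n = len(hyp)
    dp = list(range(n + 1))
    for i in range(1, len(ref) + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,        # deletion
                        dp[j - 1] + 1,    # insertion
                        prev + (ref[i - 1] != hyp[j - 1]))  # substitution
            prev = cur
    return dp[n]

def wer(reference, hypothesis):
    """Word error rate: edit distance over whitespace-split tokens,
    normalized by the reference length."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference, hypothesis):
    """Character error rate: edit distance over characters,
    normalized by the reference length."""
    return edit_distance(reference, hypothesis) / len(reference)
```

For example, `wer("the cat sat", "the cat sit")` is 1/3: one substituted word out of three reference words.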

Model description

More information needed

Intended uses & limitations

More information needed
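
The intended use is Arabic automatic speech recognition. A minimal usage sketch with the `transformers` ASR pipeline follows; the Hub model id and audio path are placeholders, not taken from this card, and the diacritic-stripping normalizer is an illustrative preprocessing choice (Arabic hypotheses and references are often compared on base letters only), not necessarily what was used for the scores above:

```python
import unicodedata

def strip_diacritics(text: str) -> str:
    """Remove Arabic harakat (combining marks) so transcripts are
    compared on base letters only. Illustrative normalization choice."""
    decomposed = unicodedata.normalize("NFD", text)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

if __name__ == "__main__":
    from transformers import pipeline  # requires transformers + torch

    # Placeholder model id and audio path -- substitute your own.
    asr = pipeline("automatic-speech-recognition",
                   model="your-username/wav2vec2-large-xls-r-300m-Arabic")
    result = asr("sample.wav")
    print(strip_diacritics(result["text"]))
```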

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0005
  • train_batch_size: 16
  • eval_batch_size: 6
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 64
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 250
  • num_epochs: 30.0
  • mixed_precision_training: Native AMP
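
The effective batch size follows from the per-device batch size and gradient accumulation, and the step counts in the results table are consistent with it. A quick arithmetic check, assuming the 51 optimizer steps per epoch shown in the table:

```python
train_batch_size = 16
gradient_accumulation_steps = 4
num_epochs = 30
steps_per_epoch = 51  # taken from the training-results table

# Effective batch size per optimizer step.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
# Total optimizer steps over the whole run.
total_steps = steps_per_epoch * num_epochs

print(total_train_batch_size)  # 64, matching total_train_batch_size above
print(total_steps)             # 1530, matching the final step in the table
```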

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 17.5167       | 1.0   | 51   | 4.8815          | 1.0    | 1.0    |
| 4.0416        | 2.0   | 102  | 3.1278          | 1.0    | 1.0    |
| 3.1594        | 3.0   | 153  | 3.1154          | 1.0    | 1.0    |
| 3.1534        | 4.0   | 204  | 3.0524          | 1.0    | 1.0    |
| 3.0361        | 5.0   | 255  | 2.8827          | 1.0    | 1.0    |
| 2.799         | 6.0   | 306  | 2.4063          | 1.0    | 0.9571 |
| 1.9731        | 7.0   | 357  | 1.0794          | 0.9762 | 0.5933 |
| 0.7265        | 8.0   | 408  | 0.2350          | 0.4686 | 0.1378 |
| 0.2788        | 9.0   | 459  | 0.0527          | 0.1055 | 0.0265 |
| 0.1396        | 10.0  | 510  | 0.0263          | 0.1063 | 0.0386 |
| 0.0891        | 11.0  | 561  | 0.0245          | 0.0829 | 0.0247 |
| 0.0667        | 12.0  | 612  | 0.0119          | 0.0368 | 0.0091 |
| 0.0594        | 13.0  | 663  | 0.0154          | 0.0405 | 0.0113 |
| 0.0467        | 14.0  | 714  | 0.0109          | 0.0509 | 0.0203 |
| 0.0354        | 15.0  | 765  | 0.0129          | 0.0818 | 0.0320 |
| 0.0332        | 16.0  | 816  | 0.0086          | 0.0933 | 0.0486 |
| 0.0283        | 17.0  | 867  | 0.0135          | 0.0970 | 0.0431 |
| 0.0244        | 18.0  | 918  | 0.0083          | 0.0650 | 0.0276 |
| 0.0268        | 19.0  | 969  | 0.0071          | 0.0286 | 0.0101 |
| 0.0245        | 20.0  | 1020 | 0.0058          | 0.0320 | 0.0120 |
| 0.0206        | 21.0  | 1071 | 0.0031          | 0.1059 | 0.0455 |
| 0.0162        | 22.0  | 1122 | 0.0050          | 0.0699 | 0.0292 |
| 0.0152        | 23.0  | 1173 | 0.0054          | 0.0517 | 0.0192 |
| 0.014         | 24.0  | 1224 | 0.0037          | 0.0654 | 0.0266 |
| 0.0161        | 25.0  | 1275 | 0.0024          | 0.0320 | 0.0127 |
| 0.0156        | 26.0  | 1326 | 0.0027          | 0.0565 | 0.0244 |
| 0.0122        | 27.0  | 1377 | 0.0036          | 0.0587 | 0.0260 |
| 0.0105        | 28.0  | 1428 | 0.0026          | 0.0494 | 0.0216 |
| 0.0114        | 29.0  | 1479 | 0.0028          | 0.0613 | 0.0283 |
| 0.0087        | 30.0  | 1530 | 0.0028          | 0.0691 | 0.0318 |

Framework versions

  • Transformers 4.26.1
  • Pytorch 1.13.1+cu116
  • Datasets 2.10.1
  • Tokenizers 0.13.2