
wav2vec2-large-xls-r-300m-Arabic-phoneme

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0335
  • PER (phoneme error rate): 0.0199
  • WER (word error rate): 0.0225
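
The card does not include a usage snippet, so the following is a minimal inference sketch. It assumes the checkpoint and its processor are published together under a Hugging Face repo id (the id below is a placeholder) and that the model follows the standard Wav2Vec2 CTC interface, decoding audio into a phoneme sequence.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder repo id; replace with the actual "<user>/wav2vec2-large-xls-r-300m-Arabic-phoneme" path.
model_id = "wav2vec2-large-xls-r-300m-Arabic-phoneme"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# XLS-R models expect 16 kHz mono audio; resample if needed.
waveform, sample_rate = torchaudio.load("example.wav")
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding to the predicted phoneme string.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```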

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0005
  • train_batch_size: 8
  • eval_batch_size: 6
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 250
  • num_epochs: 30.0
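
The training script is not included in the card; the sketch below only mirrors the listed hyperparameters in a `transformers.TrainingArguments` object. The `output_dir`, evaluation strategy, and mixed-precision flag are assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-300m-Arabic-phoneme",  # assumed output directory
    learning_rate=5e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=6,
    gradient_accumulation_steps=4,   # gives the total train batch size of 32
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=250,
    num_train_epochs=30.0,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default optimizer.
    evaluation_strategy="epoch",     # assumption: the results table reports one row per epoch
    fp16=True,                       # assumption: mixed precision on a CUDA GPU
)
```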

Training results

| Training Loss | Epoch | Step | Validation Loss | PER    | WER    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 5.3718        | 1.0   | 102  | 2.1140          | 1.0    | 1.0    |
| 2.036         | 2.0   | 204  | 2.0637          | 1.0    | 1.0    |
| 2.0175        | 3.0   | 306  | 2.1252          | 1.0    | 1.0    |
| 1.9463        | 4.0   | 408  | 1.7014          | 0.9942 | 0.9887 |
| 1.702         | 5.0   | 510  | 1.7257          | 0.9944 | 0.9892 |
| 1.6475        | 6.0   | 612  | 1.5855          | 0.9897 | 0.9798 |
| 1.4766        | 7.0   | 714  | 1.2777          | 0.9787 | 0.9641 |
| 1.0363        | 8.0   | 816  | 0.7926          | 0.7738 | 0.7731 |
| 0.5964        | 9.0   | 918  | 0.4220          | 0.3994 | 0.4133 |
| 0.3437        | 10.0  | 1020 | 0.2307          | 0.1387 | 0.1549 |
| 0.2052        | 11.0  | 1122 | 0.1587          | 0.0645 | 0.0738 |
| 0.1509        | 12.0  | 1224 | 0.1314          | 0.0464 | 0.0544 |
| 0.1256        | 13.0  | 1326 | 0.1070          | 0.0448 | 0.0518 |
| 0.0935        | 14.0  | 1428 | 0.0854          | 0.0394 | 0.0452 |
| 0.0779        | 15.0  | 1530 | 0.0896          | 0.0376 | 0.0440 |
| 0.0674        | 16.0  | 1632 | 0.0625          | 0.0255 | 0.0306 |
| 0.0558        | 17.0  | 1734 | 0.0573          | 0.0270 | 0.0318 |
| 0.0492        | 18.0  | 1836 | 0.0542          | 0.0248 | 0.0288 |
| 0.0486        | 19.0  | 1938 | 0.0631          | 0.0336 | 0.0369 |
| 0.047         | 20.0  | 2040 | 0.0482          | 0.0255 | 0.0290 |
| 0.0432        | 21.0  | 2142 | 0.0470          | 0.0262 | 0.0307 |
| 0.0433        | 22.0  | 2244 | 0.0460          | 0.0250 | 0.0290 |
| 0.0367        | 23.0  | 2346 | 0.0450          | 0.0253 | 0.0295 |
| 0.0343        | 24.0  | 2448 | 0.0444          | 0.0254 | 0.0283 |
| 0.0292        | 25.0  | 2550 | 0.0427          | 0.0248 | 0.0283 |
| 0.0261        | 26.0  | 2652 | 0.0409          | 0.0220 | 0.0250 |
| 0.025         | 27.0  | 2754 | 0.0360          | 0.0221 | 0.0251 |
| 0.0236        | 28.0  | 2856 | 0.0350          | 0.0208 | 0.0231 |
| 0.0222        | 29.0  | 2958 | 0.0338          | 0.0199 | 0.0222 |
| 0.0202        | 30.0  | 3060 | 0.0335          | 0.0199 | 0.0225 |
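
The card does not state how PER and WER were computed. A common approach, sketched below under that assumption, is to use the edit-distance WER metric from the `evaluate` library: applied to space-separated words it gives WER, and applied to space-separated phoneme tokens it gives a phoneme error rate. The reference and prediction strings here are purely illustrative.

```python
import evaluate

wer_metric = evaluate.load("wer")

# Phoneme sequences as space-separated tokens (illustrative examples).
references = ["m a r h a b a"]
predictions = ["m a r h a b aa"]

per = wer_metric.compute(references=references, predictions=predictions)
print(f"PER: {per:.4f}")
```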

Framework versions

  • Transformers 4.33.1
  • Pytorch 2.0.1+cu118
  • Datasets 1.18.3
  • Tokenizers 0.13.3