
wav2vec2-base-960h-Arabic

This model is a fine-tuned version of facebook/wav2vec2-base-960h on an unspecified dataset (the training data is not recorded in this card). It achieves the following results on the evaluation set (a loading-and-inference sketch follows the list below):

  • Loss: 0.7209
  • WER: 1.0
  • CER: 1.0
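
For reference, here is a minimal sketch of loading a checkpoint like this one for CTC inference with Transformers. The Hub repo id and audio file name are placeholders (this card does not state them), and 16 kHz mono input is assumed, as for the base facebook/wav2vec2-base-960h model.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder repo id; replace with the actual Hub id of this checkpoint.
model_id = "<namespace>/wav2vec2-base-960h-Arabic"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load audio, downmix to mono, and resample to the expected 16 kHz.
waveform, sample_rate = torchaudio.load("example.wav")  # placeholder file
speech = waveform.mean(dim=0)
if sample_rate != 16_000:
    speech = torchaudio.functional.resample(speech, sample_rate, 16_000)

inputs = processor(speech.numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
print(transcription)
```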

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 0.0005
  • train_batch_size: 16
  • eval_batch_size: 6
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 64
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 250
  • num_epochs: 40
  • mixed_precision_training: Native AMP
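
The listed values map onto transformers.TrainingArguments roughly as in the sketch below. The output_dir is an assumed placeholder, and the Adam betas and epsilon above match the TrainingArguments defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-base-960h-Arabic",  # assumed output directory
    learning_rate=5e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=6,
    seed=42,
    gradient_accumulation_steps=4,   # 16 * 4 = 64 effective train batch size
    lr_scheduler_type="linear",
    warmup_steps=250,
    num_train_epochs=40,
    fp16=True,                       # Native AMP mixed-precision training
)
```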

Training results

| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
| 0.7909        | 1.0   | 51   | 0.8018          | 1.0 | 1.0 |
| 0.6736        | 2.0   | 102  | 1.1252          | 1.0 | 1.0 |
| 0.7037        | 3.0   | 153  | 0.7432          | 1.0 | 1.0 |
| 0.724         | 4.0   | 204  | 1.2762          | 1.0 | 1.0 |
| 0.8887        | 5.0   | 255  | 0.7064          | 1.0 | 1.0 |
| 0.835         | 6.0   | 306  | 1.0820          | 1.0 | 1.0 |
| 0.8042        | 7.0   | 357  | 1.0530          | 1.0 | 1.0 |
| 0.7475        | 8.0   | 408  | 0.6969          | 1.0 | 1.0 |
| 0.6998        | 9.0   | 459  | 0.7852          | 1.0 | 1.0 |
| 0.7048        | 10.0  | 510  | 0.6942          | 1.0 | 1.0 |
| 0.7883        | 11.0  | 561  | 0.7767          | 1.0 | 1.0 |
| 0.6773        | 12.0  | 612  | 0.7355          | 1.0 | 1.0 |
| 0.7421        | 13.0  | 663  | 1.5550          | 1.0 | 1.0 |
| 0.7573        | 14.0  | 714  | 0.8736          | 1.0 | 1.0 |
| 0.6911        | 15.0  | 765  | 1.3328          | 1.0 | 1.0 |
| 0.7129        | 16.0  | 816  | 0.8911          | 1.0 | 1.0 |
| 0.6619        | 17.0  | 867  | 1.0227          | 1.0 | 1.0 |
| 0.6807        | 18.0  | 918  | 0.7829          | 1.0 | 1.0 |
| 0.6409        | 19.0  | 969  | 0.9122          | 1.0 | 1.0 |
| 0.6588        | 20.0  | 1020 | 0.9179          | 1.0 | 1.0 |
| 0.6648        | 21.0  | 1071 | 0.8469          | 1.0 | 1.0 |
| 0.6521        | 22.0  | 1122 | 0.7197          | 1.0 | 1.0 |
| 0.7189        | 23.0  | 1173 | 0.7972          | 1.0 | 1.0 |
| 0.664         | 24.0  | 1224 | 1.0197          | 1.0 | 1.0 |
| 0.6611        | 25.0  | 1275 | 1.0786          | 1.0 | 1.0 |
| 0.6333        | 26.0  | 1326 | 0.8307          | 1.0 | 1.0 |
| 0.6407        | 27.0  | 1377 | 0.7388          | 1.0 | 1.0 |
| 0.6322        | 28.0  | 1428 | 0.6789          | 1.0 | 1.0 |
| 0.6283        | 29.0  | 1479 | 0.6917          | 1.0 | 1.0 |
| 0.6343        | 30.0  | 1530 | 0.7045          | 1.0 | 1.0 |
| 0.6292        | 31.0  | 1581 | 0.6717          | 1.0 | 1.0 |
| 0.6209        | 32.0  | 1632 | 0.7436          | 1.0 | 1.0 |
| 0.6088        | 33.0  | 1683 | 0.7392          | 1.0 | 1.0 |
| 0.6103        | 34.0  | 1734 | 0.7387          | 1.0 | 1.0 |
| 0.6084        | 35.0  | 1785 | 0.7647          | 1.0 | 1.0 |
| 0.6042        | 36.0  | 1836 | 0.8165          | 1.0 | 1.0 |
| 0.605         | 37.0  | 1887 | 0.7578          | 1.0 | 1.0 |
| 0.6042        | 38.0  | 1938 | 0.7359          | 1.0 | 1.0 |
| 0.6           | 39.0  | 1989 | 0.7287          | 1.0 | 1.0 |
| 0.5963        | 40.0  | 2040 | 0.7209          | 1.0 | 1.0 |
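
The WER and CER columns are word and character error rates (1.0 corresponds to a 100% error rate on the evaluation set). A sketch of how such metrics are commonly computed with the Hugging Face evaluate library is shown below; the predictions and references are made-up examples, not data from this run.

```python
import evaluate

# Load the word- and character-error-rate metrics from the Hub.
wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Hypothetical model outputs and reference transcripts for illustration only.
predictions = ["مرحبا بك", "مساء الخير"]
references = ["مرحبا بكم", "صباح الخير"]

wer = wer_metric.compute(predictions=predictions, references=references)
cer = cer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}, CER: {cer:.2f}")
```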

Framework versions

  • Transformers 4.26.1
  • Pytorch 1.13.1+cu116
  • Datasets 2.10.1
  • Tokenizers 0.13.2