---
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
  - generated_from_trainer
datasets:
  - common_voice_17_0
metrics:
  - wer
model-index:
  - name: xlsr-arabic
    results:
      - task:
          name: Automatic Speech Recognition
          type: automatic-speech-recognition
        dataset:
          name: common_voice_17_0
          type: common_voice_17_0
          config: ar
          split: validation
          args: ar
        metrics:
          - name: Wer
            type: wer
            value: 0.5205260783565256
---

# xlsr-arabic

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice_17_0 dataset. It achieves the following results on the evaluation set:

- Loss: 0.8019
- Wer: 0.5205
- Cer: 0.2091
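
Since the card does not yet include a usage snippet, here is a minimal inference sketch using the `transformers` ASR pipeline. The repository id and audio filename below are placeholders, not the actual location of this checkpoint:

```python
from transformers import pipeline

# Placeholder repo id; substitute the hub path where this checkpoint is hosted.
asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/xlsr-arabic",
)

# Wav2Vec2 models expect 16 kHz mono audio; the pipeline handles decoding
# and resampling of common audio formats when ffmpeg is available.
result = asr("example_arabic_clip.wav")  # hypothetical file
print(result["text"])
```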

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto `TrainingArguments` follows the list):

- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
- mixed_precision_training: Native AMP
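
As a rough guide, these settings correspond to a `transformers.TrainingArguments` configuration like the sketch below. The output directory is a placeholder, and the Adam betas/epsilon listed above are the `Trainer` defaults, so they need no explicit arguments:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlsr-arabic",       # placeholder output directory
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=10,
    fp16=True,                      # "Native AMP" mixed precision
)
```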

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|
| 9.6085        | 0.1127 | 100  | 8.2716          | 1.0    | 1.0    |
| 3.7055        | 0.2255 | 200  | 3.5635          | 1.0    | 1.0    |
| 3.6707        | 0.3382 | 300  | 3.4485          | 1.0    | 1.0    |
| 3.4696        | 0.4510 | 400  | 3.3925          | 1.0    | 0.9896 |
| 3.5699        | 0.5637 | 500  | 3.3520          | 0.9989 | 0.9810 |
| 2.5766        | 0.6764 | 600  | 2.5549          | 0.9997 | 0.8222 |
| 1.5035        | 0.7892 | 700  | 1.3485          | 0.9135 | 0.4360 |
| 1.508         | 0.9019 | 800  | 1.0530          | 0.8492 | 0.3821 |
| 1.094         | 1.0147 | 900  | 0.8783          | 0.7682 | 0.3325 |
| 0.9471        | 1.1274 | 1000 | 0.8124          | 0.7197 | 0.3019 |
| 0.8182        | 1.2401 | 1100 | 0.8092          | 0.7173 | 0.2788 |
| 0.8558        | 1.3529 | 1200 | 0.7785          | 0.7034 | 0.2655 |
| 0.7199        | 1.4656 | 1300 | 0.6794          | 0.6689 | 0.2554 |
| 0.7401        | 1.5784 | 1400 | 0.6849          | 0.6593 | 0.2509 |
| 0.8892        | 1.6911 | 1500 | 0.6862          | 0.6533 | 0.2456 |
| 0.6824        | 1.8038 | 1600 | 0.6654          | 0.6426 | 0.2455 |
| 0.5671        | 1.9166 | 1700 | 0.6851          | 0.6582 | 0.2378 |
| 0.5298        | 2.0293 | 1800 | 0.7284          | 0.6530 | 0.2476 |
| 0.5056        | 2.1421 | 1900 | 0.6453          | 0.6348 | 0.2311 |
| 0.4889        | 2.2548 | 2000 | 0.6641          | 0.6365 | 0.2342 |
| 0.6665        | 2.3675 | 2100 | 0.6607          | 0.6240 | 0.2305 |
| 0.4022        | 2.4803 | 2200 | 0.6029          | 0.6054 | 0.2255 |
| 0.5083        | 2.5930 | 2300 | 0.5710          | 0.5894 | 0.2159 |
| 0.5413        | 2.7057 | 2400 | 0.5762          | 0.5981 | 0.2219 |
| 0.5765        | 2.8185 | 2500 | 0.5684          | 0.5965 | 0.2232 |
| 0.6379        | 2.9312 | 2600 | 0.5478          | 0.5692 | 0.2121 |
| 0.388         | 3.0440 | 2700 | 0.5589          | 0.5971 | 0.2220 |
| 0.5047        | 3.1567 | 2800 | 0.5903          | 0.5882 | 0.2155 |
| 0.4911        | 3.2694 | 2900 | 0.5813          | 0.5838 | 0.2240 |
| 0.4059        | 3.3822 | 3000 | 0.5796          | 0.5884 | 0.2208 |
| 0.4182        | 3.4949 | 3100 | 0.6368          | 0.5939 | 0.2243 |
| 0.425         | 3.6077 | 3200 | 0.5325          | 0.5437 | 0.2093 |
| 0.5876        | 3.7204 | 3300 | 0.5463          | 0.5629 | 0.2091 |
| 0.3795        | 3.8331 | 3400 | 0.5265          | 0.5554 | 0.2090 |
| 0.5567        | 3.9459 | 3500 | 0.5372          | 0.5577 | 0.2108 |
| 0.4698        | 4.0586 | 3600 | 0.5723          | 0.5900 | 0.2197 |
| 0.3856        | 4.1714 | 3700 | 0.5992          | 0.5753 | 0.2168 |
| 0.427         | 4.2841 | 3800 | 0.5735          | 0.5790 | 0.2156 |
| 0.3449        | 4.3968 | 3900 | 0.5642          | 0.5750 | 0.2113 |
| 0.4049        | 4.5096 | 4000 | 0.5972          | 0.5825 | 0.2203 |
| 0.4687        | 4.6223 | 4100 | 0.5649          | 0.5612 | 0.2111 |
| 0.4301        | 4.7351 | 4200 | 0.5515          | 0.5622 | 0.2105 |
| 0.4429        | 4.8478 | 4300 | 0.5622          | 0.5709 | 0.2125 |
| 0.4234        | 4.9605 | 4400 | 0.5684          | 0.5496 | 0.2098 |
| 0.3361        | 5.0733 | 4500 | 0.6108          | 0.5541 | 0.2137 |
| 0.3547        | 5.1860 | 4600 | 0.5869          | 0.5508 | 0.2091 |
| 0.2801        | 5.2988 | 4700 | 0.6526          | 0.5586 | 0.2163 |
| 0.3237        | 5.4115 | 4800 | 0.6481          | 0.5576 | 0.2169 |
| 0.3366        | 5.5242 | 4900 | 0.5603          | 0.5345 | 0.2076 |
| 0.2724        | 5.6370 | 5000 | 0.6141          | 0.5491 | 0.2151 |
| 0.2845        | 5.7497 | 5100 | 0.7205          | 0.5605 | 0.2266 |
| 0.293         | 5.8625 | 5200 | 0.6246          | 0.5502 | 0.2142 |
| 0.2904        | 5.9752 | 5300 | 0.5936          | 0.5386 | 0.2097 |
| 0.3082        | 6.0879 | 5400 | 0.6173          | 0.5134 | 0.2032 |
| 0.35          | 6.2007 | 5500 | 0.6430          | 0.5158 | 0.2051 |
| 0.2101        | 6.3134 | 5600 | 0.5861          | 0.5110 | 0.1998 |
| 0.2822        | 6.4262 | 5700 | 0.6322          | 0.5269 | 0.2092 |
| 0.263         | 6.5389 | 5800 | 0.7677          | 0.5477 | 0.2231 |
| 0.2329        | 6.6516 | 5900 | 0.6837          | 0.5336 | 0.2129 |
| 0.2626        | 6.7644 | 6000 | 0.6350          | 0.5208 | 0.2075 |
| 0.2467        | 6.8771 | 6100 | 0.6082          | 0.5274 | 0.2060 |
| 0.3242        | 6.9899 | 6200 | 0.6619          | 0.5347 | 0.2098 |
| 0.3301        | 7.1026 | 6300 | 0.6798          | 0.5255 | 0.2107 |
| 0.3085        | 7.2153 | 6400 | 0.6934          | 0.5202 | 0.2076 |
| 0.3395        | 7.3281 | 6500 | 0.6981          | 0.5329 | 0.2125 |
| 0.2766        | 7.4408 | 6600 | 0.6886          | 0.5256 | 0.2091 |
| 0.2479        | 7.5536 | 6700 | 0.7543          | 0.5414 | 0.2148 |
| 0.18          | 7.6663 | 6800 | 0.7538          | 0.5198 | 0.2127 |
| 0.3539        | 7.7790 | 6900 | 0.6877          | 0.5290 | 0.2136 |
| 0.2759        | 7.8918 | 7000 | 0.6516          | 0.5110 | 0.2053 |
| 0.1152        | 8.0045 | 7100 | 0.7376          | 0.5293 | 0.2143 |
| 0.1814        | 8.1172 | 7200 | 0.7046          | 0.5156 | 0.2068 |
| 0.1829        | 8.2300 | 7300 | 0.7658          | 0.5190 | 0.2108 |
| 0.1165        | 8.3427 | 7400 | 0.8318          | 0.5210 | 0.2139 |
| 0.1255        | 8.4555 | 7500 | 0.7769          | 0.5188 | 0.2085 |
| 0.1013        | 8.5682 | 7600 | 0.7409          | 0.5153 | 0.2044 |
| 0.1273        | 8.6809 | 7700 | 0.7661          | 0.5181 | 0.2076 |
| 0.1178        | 8.7937 | 7800 | 0.8007          | 0.5218 | 0.2113 |
| 0.1028        | 8.9064 | 7900 | 0.7513          | 0.5137 | 0.2075 |
| 0.2003        | 9.0192 | 8000 | 0.7449          | 0.5133 | 0.2077 |
| 0.1495        | 9.1319 | 8100 | 0.8426          | 0.5140 | 0.2105 |
| 0.1283        | 9.2446 | 8200 | 0.7653          | 0.5112 | 0.2066 |
| 0.0585        | 9.3574 | 8300 | 0.7894          | 0.5176 | 0.2092 |
| 0.1543        | 9.4701 | 8400 | 0.7675          | 0.5147 | 0.2064 |
| 0.144         | 9.5829 | 8500 | 0.7927          | 0.5187 | 0.2096 |
| 0.1185        | 9.6956 | 8600 | 0.8045          | 0.5201 | 0.2101 |
| 0.1707        | 9.8083 | 8700 | 0.7941          | 0.5193 | 0.2089 |
| 0.0927        | 9.9211 | 8800 | 0.8019          | 0.5205 | 0.2091 |
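
The Wer and Cer columns are standard word and character error rates. A minimal sketch of computing both with the `evaluate` library, using a made-up reference/prediction pair for illustration:

```python
import evaluate  # also requires the jiwer package for these metrics

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Hypothetical example pair, for illustration only.
references = ["مرحبا بكم في النموذج"]
predictions = ["مرحبا بك في النموذج"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```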

### Framework versions

- Transformers 4.42.0.dev0
- Pytorch 2.3.1+cu121
- Datasets 2.19.2
- Tokenizers 0.19.1