---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-base-960h-Arabic
  results: []
---

# wav2vec2-base-960h-Arabic

This model is a fine-tuned version of [facebook/wav2vec2-base-960h](https://huggingface.co/facebook/wav2vec2-base-960h) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6190
- Wer: 1.0
- Cer: 1.0

## Model description

More information needed

## Intended uses & limitations

More information needed. A minimal inference sketch is given at the end of this card.

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of how they map onto `transformers.TrainingArguments` is given at the end of this card):
- learning_rate: 0.0005
- train_batch_size: 16
- eval_batch_size: 6
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 250
- num_epochs: 40
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
| 8.1025        | 1.0   | 51   | 4.9004          | 1.0 | 1.0 |
| 1.3634        | 2.0   | 102  | 1.2766          | 1.0 | 1.0 |
| 1.1559        | 3.0   | 153  | 1.1179          | 1.0 | 1.0 |
| 1.0742        | 4.0   | 204  | 1.4401          | 1.0 | 1.0 |
| 1.0112        | 5.0   | 255  | 0.7570          | 1.0 | 1.0 |
| 0.9263        | 6.0   | 306  | 0.6297          | 1.0 | 1.0 |
| 0.9983        | 7.0   | 357  | 1.7016          | 1.0 | 1.0 |
| 0.7949        | 8.0   | 408  | 0.6406          | 1.0 | 1.0 |
| 0.9205        | 9.0   | 459  | 1.0371          | 1.0 | 1.0 |
| 0.7783        | 10.0  | 510  | 0.8300          | 1.0 | 1.0 |
| 0.8202        | 11.0  | 561  | 0.7223          | 1.0 | 1.0 |
| 0.7737        | 12.0  | 612  | 0.7909          | 1.0 | 1.0 |
| 0.7426        | 13.0  | 663  | 0.7968          | 1.0 | 1.0 |
| 0.7211        | 14.0  | 714  | 0.7648          | 1.0 | 1.0 |
| 0.7526        | 15.0  | 765  | 0.6257          | 1.0 | 1.0 |
| 0.7361        | 16.0  | 816  | 0.6500          | 1.0 | 1.0 |
| 0.716         | 17.0  | 867  | 0.9519          | 1.0 | 1.0 |
| 0.7595        | 18.0  | 918  | 0.6324          | 1.0 | 1.0 |
| 0.8078        | 19.0  | 969  | 0.8474          | 1.0 | 1.0 |
| 0.7761        | 20.0  | 1020 | 0.6274          | 1.0 | 1.0 |
| 0.6514        | 21.0  | 1071 | 0.7698          | 1.0 | 1.0 |
| 0.8607        | 22.0  | 1122 | 0.6179          | 1.0 | 1.0 |
| 0.7999        | 23.0  | 1173 | 0.6416          | 1.0 | 1.0 |
| 0.7267        | 24.0  | 1224 | 0.6506          | 1.0 | 1.0 |
| 0.6705        | 25.0  | 1275 | 0.6232          | 1.0 | 1.0 |
| 0.6669        | 26.0  | 1326 | 0.6472          | 1.0 | 1.0 |
| 0.6731        | 27.0  | 1377 | 0.6190          | 1.0 | 1.0 |
| 0.6532        | 28.0  | 1428 | 0.6197          | 1.0 | 1.0 |
| 0.6423        | 29.0  | 1479 | 0.6608          | 1.0 | 1.0 |
| 0.6574        | 30.0  | 1530 | 0.6175          | 1.0 | 1.0 |
| 0.6586        | 31.0  | 1581 | 0.6320          | 1.0 | 1.0 |
| 0.6339        | 32.0  | 1632 | 0.6196          | 1.0 | 1.0 |
| 0.6628        | 33.0  | 1683 | 0.6176          | 1.0 | 1.0 |
| 0.6222        | 34.0  | 1734 | 0.6434          | 1.0 | 1.0 |
| 0.6293        | 35.0  | 1785 | 0.6301          | 1.0 | 1.0 |
| 0.6337        | 36.0  | 1836 | 0.6357          | 1.0 | 1.0 |
| 0.6168        | 37.0  | 1887 | 0.6179          | 1.0 | 1.0 |
| 0.6093        | 38.0  | 1938 | 0.6197          | 1.0 | 1.0 |
| 0.6053        | 39.0  | 1989 | 0.6188          | 1.0 | 1.0 |
| 0.6014        | 40.0  | 2040 | 0.6190          | 1.0 | 1.0 |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2
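
The hyperparameters listed under "Training hyperparameters" correspond to a standard 🤗 `Trainer` run. Below is a minimal, non-authoritative sketch of how they map onto `transformers.TrainingArguments`; the output directory and evaluation strategy are assumptions (they are not stated in this card), the Adam betas and epsilon are simply the optimizer defaults, and the total train batch size of 64 follows from 16 × 4 (per-device batch size × gradient accumulation steps) on a single device.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the training configuration described above.
training_args = TrainingArguments(
    output_dir="wav2vec2-base-960h-Arabic",  # assumed, not stated in the card
    learning_rate=5e-4,                      # 0.0005
    per_device_train_batch_size=16,
    per_device_eval_batch_size=6,
    gradient_accumulation_steps=4,           # 16 * 4 = 64 total train batch size
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=250,
    num_train_epochs=40,
    fp16=True,                               # Native AMP mixed-precision training
    evaluation_strategy="epoch",             # assumed from the per-epoch results table
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default optimizer setup.
)
```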
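
Since [facebook/wav2vec2-base-960h](https://huggingface.co/facebook/wav2vec2-base-960h) is a CTC acoustic model, transcription with this fine-tune follows the usual Wav2Vec2 pattern. The sketch below assumes audio sampled at 16 kHz; the checkpoint path and audio file name are placeholders, not values taken from this card.

```python
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "path/to/wav2vec2-base-960h-Arabic"  # placeholder: replace with the actual checkpoint
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Load audio at the 16 kHz sampling rate expected by wav2vec2-base-960h.
speech, _ = librosa.load("sample.wav", sr=16_000)  # placeholder audio file
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)

# Greedy CTC decoding: take the most likely token at each frame, then collapse with the tokenizer.
with torch.no_grad():
    logits = model(inputs.input_values).logits
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
print(transcription)
```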