---
license: apache-2.0
base_model: facebook/hubert-large-ls960-ft
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: hubert_arabic_mdd_50
  results: []
---

# hubert_arabic_mdd_50

This model is a fine-tuned version of [facebook/hubert-large-ls960-ft](https://huggingface.co/facebook/hubert-large-ls960-ft) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5409
- Wer: 0.0470
- Per: 0.0360

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 20

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    | Per    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|
| 8.874         | 1.0   | 818   | 2.7367          | 1.0    | 1.0    |
| 1.4558        | 2.0   | 1636  | 0.6219          | 0.2525 | 0.2344 |
| 0.3775        | 3.0   | 2454  | 0.4014          | 0.0885 | 0.0742 |
| 0.2126        | 4.0   | 3272  | 0.4332          | 0.0656 | 0.0500 |
| 0.1658        | 5.0   | 4090  | 0.4970          | 0.0608 | 0.0464 |
| 0.1308        | 6.0   | 4908  | 0.4520          | 0.0591 | 0.0445 |
| 0.1058        | 7.0   | 5726  | 0.4513          | 0.0531 | 0.0407 |
| 0.0968        | 8.0   | 6544  | 0.4253          | 0.0548 | 0.0413 |
| 0.0871        | 9.0   | 7362  | 0.4232          | 0.0524 | 0.0405 |
| 0.0681        | 10.0  | 8180  | 0.4612          | 0.0508 | 0.0392 |
| 0.0651        | 11.0  | 8998  | 0.4978          | 0.0516 | 0.0394 |
| 0.0587        | 12.0  | 9816  | 0.5069          | 0.0512 | 0.0389 |
| 0.046         | 13.0  | 10634 | 0.5170          | 0.0489 | 0.0370 |
| 0.0439        | 14.0  | 11452 | 0.5758          | 0.0536 | 0.0417 |
| 0.0401        | 15.0  | 12270 | 0.5238          | 0.0500 | 0.0384 |
| 0.0346        | 16.0  | 13088 | 0.5668          | 0.0483 | 0.0368 |
| 0.0334        | 17.0  | 13906 | 0.5343          | 0.0459 | 0.0350 |
| 0.0336        | 18.0  | 14724 | 0.5242          | 0.0466 | 0.0361 |
| 0.0266        | 19.0  | 15542 | 0.5379          | 0.0467 | 0.0359 |
| 0.0245        | 20.0  | 16360 | 0.5409          | 0.0470 | 0.0360 |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0
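
### Reproducing the configuration (sketch)

The original training script is not included with this card. As a rough, unverified sketch, the hyperparameters listed above map onto a `transformers.TrainingArguments` object (Transformers 4.35.x API) roughly as follows; the output directory and the per-epoch evaluation strategy are assumptions, and the Adam betas/epsilon are simply the Trainer defaults, which match the values listed.

```python
# Hypothetical mapping of the listed hyperparameters onto TrainingArguments;
# not the original training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hubert_arabic_mdd_50",  # assumed output directory
    learning_rate=1e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    num_train_epochs=20,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    evaluation_strategy="epoch",  # assumed: validation metrics above are reported once per epoch
    # Trainer defaults, equal to the optimizer settings listed above.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```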
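
### Example inference (sketch)

The card does not document an inference recipe. The snippet below is a minimal, hypothetical sketch of how a checkpoint like this could be loaded for CTC transcription, assuming the repository ships a compatible processor/tokenizer alongside the model weights; the repository id and audio file name are placeholders.

```python
# Hypothetical usage sketch: transcribe a 16 kHz Arabic recording with the
# fine-tuned checkpoint. Repo id, file name, and the presence of a CTC
# processor in the repo are assumptions, not confirmed by this card.
import torch
import torchaudio
from transformers import AutoProcessor, HubertForCTC

model_id = "hubert_arabic_mdd_50"  # replace with the actual Hub repository id
processor = AutoProcessor.from_pretrained(model_id)
model = HubertForCTC.from_pretrained(model_id).eval()

# HuBERT expects 16 kHz mono audio; resample and downmix if necessary.
waveform, sample_rate = torchaudio.load("speech.wav")
waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000).mean(dim=0)

inputs = processor(waveform.numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding of the predicted token ids.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids))
```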