---
license: apache-2.0
base_model: facebook/hubert-large-ls960-ft
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: hubert-large-ls960-ft-V2-50
  results: []
---

# hubert-large-ls960-ft-V2-50

This model is a fine-tuned version of [facebook/hubert-large-ls960-ft](https://huggingface.co/facebook/hubert-large-ls960-ft) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9222
- Wer: 0.0732
- Per: 0.0540

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 20

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    | Per    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|
| 12.8849       | 1.0   | 818   | 4.5925          | 0.9551 | 0.9649 |
| 2.7511        | 2.0   | 1636  | 1.7073          | 0.4693 | 0.4560 |
| 1.1653        | 3.0   | 2454  | 1.1204          | 0.1534 | 0.1317 |
| 0.7529        | 4.0   | 3272  | 1.0336          | 0.1055 | 0.0841 |
| 0.6309        | 5.0   | 4090  | 1.0015          | 0.1023 | 0.0817 |
| 0.5354        | 6.0   | 4908  | 1.0387          | 0.0992 | 0.0777 |
| 0.4907        | 7.0   | 5726  | 0.9957          | 0.1087 | 0.0893 |
| 0.4326        | 8.0   | 6544  | 0.8882          | 0.1091 | 0.0844 |
| 0.4148        | 9.0   | 7362  | 0.9542          | 0.0830 | 0.0638 |
| 0.3779        | 10.0  | 8180  | 0.9479          | 0.0690 | 0.0501 |
| 0.3502        | 11.0  | 8998  | 0.9840          | 0.0689 | 0.0491 |
| 0.3294        | 12.0  | 9816  | 1.0877          | 0.0694 | 0.0491 |
| 0.3239        | 13.0  | 10634 | 0.8955          | 0.0731 | 0.0534 |
| 0.3069        | 14.0  | 11452 | 0.8547          | 0.0776 | 0.0580 |
| 0.2689        | 15.0  | 12270 | 0.9683          | 0.0720 | 0.0525 |
| 0.2486        | 16.0  | 13088 | 0.9282          | 0.0704 | 0.0519 |
| 0.2291        | 17.0  | 13906 | 0.9004          | 0.0671 | 0.0481 |
| 0.2294        | 18.0  | 14724 | 0.9242          | 0.0747 | 0.0547 |
| 0.2151        | 19.0  | 15542 | 0.9400          | 0.0747 | 0.0554 |
| 0.2109        | 20.0  | 16360 | 0.9222          | 0.0732 | 0.0540 |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0
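
### Training configuration (sketch)

For reference, the hyperparameters listed under *Training hyperparameters* map onto `transformers.TrainingArguments` roughly as follows. This is a minimal sketch, not the actual training script: the dataset, model, data collator, and metric code are not part of this card, `output_dir` is a placeholder, and per-epoch evaluation/saving is an assumption inferred from the per-epoch validation results above.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hubert-large-ls960-ft-V2-50",  # placeholder output directory
    learning_rate=1e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=20,
    # Adam betas (0.9, 0.999) and epsilon 1e-08 are the TrainingArguments defaults.
    evaluation_strategy="epoch",  # assumption: validation was run once per epoch
    save_strategy="epoch",
)
```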
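
## How to use

The snippet below is a minimal inference sketch, not part of the original card. It assumes the checkpoint is published together with its processor files and that, like the base model, it is a CTC model operating on 16 kHz mono audio. The repository id `hubert-large-ls960-ft-V2-50` and the file `sample.wav` are placeholders.

```python
import soundfile as sf
import torch
from transformers import AutoProcessor, HubertForCTC

# Placeholder repository id; replace with wherever this checkpoint is hosted.
model_id = "hubert-large-ls960-ft-V2-50"

processor = AutoProcessor.from_pretrained(model_id)
model = HubertForCTC.from_pretrained(model_id)
model.eval()

# HuBERT expects 16 kHz mono audio; `sample.wav` is a placeholder file name.
speech, sampling_rate = sf.read("sample.wav")

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: take the most likely token at each frame, then collapse.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```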