---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- librispeech_asr
metrics:
- wer
model-index:
- name: whisper-ft-libri-en
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: librispeech_asr
      type: librispeech_asr
      config: clean
      split: test
      args: clean
    metrics:
    - name: Wer
      type: wer
      value: 31.616341030195382
---

# whisper-ft-libri-en

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on the librispeech_asr dataset (`clean` config, `test` split).
It achieves the following results on the evaluation set:
- Loss: 0.8069
- Wer: 31.6163

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training; a sketch of the equivalent `Seq2SeqTrainingArguments` follows the list.
- learning_rate: 7.740176574997311e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-06
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2
- training_steps: 400
- mixed_precision_training: Native AMP
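The block below is a minimal sketch of how these values map onto `transformers`' `Seq2SeqTrainingArguments`, not the original training script. The `output_dir`, the evaluation cadence (inferred from the every-5-steps log below), and the `predict_with_generate` flag are assumptions; the remaining values are taken directly from the list above.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of this card's hyperparameters as Seq2SeqTrainingArguments.
# output_dir, evaluation_strategy/eval_steps, and predict_with_generate are
# assumptions; all other values come from the hyperparameter list above.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-ft-libri-en",  # assumed
    learning_rate=7.740176574997311e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-06,
    lr_scheduler_type="linear",
    warmup_steps=2,
    max_steps=400,
    fp16=True,                         # "Native AMP" mixed precision
    evaluation_strategy="steps",       # assumed from the eval log below
    eval_steps=5,
    predict_with_generate=True,        # assumed; needed to compute WER
)
```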
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 2.1717        | 0.38  | 5    | 2.1709          | 98.0462 |
| 1.2371        | 0.77  | 10   | 1.2719          | 79.9290 |
| 0.7577        | 1.15  | 15   | 1.0510          | 35.3464 |
| 0.5325        | 1.54  | 20   | 0.9475          | 32.6821 |
| 0.5545        | 1.92  | 25   | 0.8607          | 30.3730 |
| 0.2957        | 2.31  | 30   | 0.8051          | 33.3925 |
| 0.1846        | 2.69  | 35   | 0.7487          | 30.1954 |
| 0.0748        | 3.08  | 40   | 0.6882          | 32.1492 |
| 0.0709        | 3.46  | 45   | 0.6692          | 31.2611 |
| 0.0908        | 3.85  | 50   | 0.6465          | 29.4849 |
| 0.0764        | 4.23  | 55   | 0.6578          | 28.9520 |
| 0.0259        | 4.62  | 60   | 0.6637          | 30.0178 |
| 0.0178        | 5.0   | 65   | 0.6955          | 30.3730 |
| 0.0131        | 5.38  | 70   | 0.6869          | 33.2149 |
| 0.0162        | 5.77  | 75   | 0.7000          | 32.3268 |
| 0.0081        | 6.15  | 80   | 0.6814          | 32.3268 |
| 0.0075        | 6.54  | 85   | 0.6897          | 31.0835 |
| 0.0069        | 6.92  | 90   | 0.7151          | 32.6821 |
| 0.0062        | 7.31  | 95   | 0.7181          | 30.3730 |
| 0.0056        | 7.69  | 100  | 0.7173          | 30.0178 |
| 0.0052        | 8.08  | 105  | 0.7411          | 31.9716 |
| 0.0073        | 8.46  | 110  | 0.7526          | 32.5044 |
| 0.0061        | 8.85  | 115  | 0.7467          | 32.8597 |
| 0.0034        | 9.23  | 120  | 0.7314          | 31.7940 |
| 0.0122        | 9.62  | 125  | 0.7276          | 31.7940 |
| 0.0429        | 10.0  | 130  | 0.7417          | 32.5044 |
| 0.0032        | 10.38 | 135  | 0.7555          | 31.9716 |
| 0.0141        | 10.77 | 140  | 0.7636          | 31.2611 |
| 0.0038        | 11.15 | 145  | 0.7607          | 31.9716 |
| 0.0038        | 11.54 | 150  | 0.7716          | 33.0373 |
| 0.0035        | 11.92 | 155  | 0.7985          | 34.2806 |
| 0.0038        | 12.31 | 160  | 0.7797          | 32.1492 |
| 0.0036        | 12.69 | 165  | 0.7767          | 31.4387 |
| 0.0022        | 13.08 | 170  | 0.7830          | 31.7940 |
| 0.0033        | 13.46 | 175  | 0.7992          | 30.7282 |
| 0.0019        | 13.85 | 180  | 0.7541          | 30.0178 |
| 0.0016        | 14.23 | 185  | 0.7587          | 30.0178 |
| 0.0027        | 14.62 | 190  | 0.7766          | 30.3730 |
| 0.0016        | 15.0  | 195  | 0.8056          | 32.8597 |
| 0.0015        | 15.38 | 200  | 0.8096          | 32.5044 |
| 0.0012        | 15.77 | 205  | 0.7931          | 32.6821 |
| 0.001         | 16.15 | 210  | 0.7829          | 31.6163 |
| 0.0045        | 16.54 | 215  | 0.7774          | 30.9059 |
| 0.0009        | 16.92 | 220  | 0.7750          | 30.1954 |
| 0.0009        | 17.31 | 225  | 0.7780          | 28.9520 |
| 0.0008        | 17.69 | 230  | 0.7803          | 29.1297 |
| 0.0007        | 18.08 | 235  | 0.7807          | 29.6625 |
| 0.0025        | 18.46 | 240  | 0.7813          | 30.1954 |
| 0.0007        | 18.85 | 245  | 0.7840          | 30.0178 |
| 0.0006        | 19.23 | 250  | 0.7860          | 30.0178 |
| 0.0007        | 19.62 | 255  | 0.7839          | 30.1954 |
| 0.0005        | 20.0  | 260  | 0.7834          | 30.1954 |
| 0.0006        | 20.38 | 265  | 0.7844          | 30.3730 |
| 0.0102        | 20.77 | 270  | 0.7859          | 30.7282 |
| 0.0006        | 21.15 | 275  | 0.7901          | 30.7282 |
| 0.0006        | 21.54 | 280  | 0.7950          | 30.7282 |
| 0.0006        | 21.92 | 285  | 0.7975          | 31.0835 |
| 0.0006        | 22.31 | 290  | 0.7984          | 30.7282 |
| 0.0006        | 22.69 | 295  | 0.7954          | 30.3730 |
| 0.0005        | 23.08 | 300  | 0.7935          | 31.0835 |
| 0.0005        | 23.46 | 305  | 0.7928          | 31.0835 |
| 0.0005        | 23.85 | 310  | 0.7933          | 31.2611 |
| 0.0038        | 24.23 | 315  | 0.7950          | 30.9059 |
| 0.0005        | 24.62 | 320  | 0.7976          | 31.6163 |
| 0.0004        | 25.0  | 325  | 0.7995          | 31.7940 |
| 0.0004        | 25.38 | 330  | 0.8006          | 31.4387 |
| 0.0004        | 25.77 | 335  | 0.8005          | 31.6163 |
| 0.0005        | 26.15 | 340  | 0.8011          | 31.4387 |
| 0.0004        | 26.54 | 345  | 0.8020          | 31.6163 |
| 0.0004        | 26.92 | 350  | 0.8024          | 31.4387 |
| 0.0017        | 27.31 | 355  | 0.8029          | 31.4387 |
| 0.0004        | 27.69 | 360  | 0.8035          | 31.4387 |
| 0.0004        | 28.08 | 365  | 0.8045          | 31.4387 |
| 0.0004        | 28.46 | 370  | 0.8049          | 31.4387 |
| 0.0004        | 28.85 | 375  | 0.8056          | 31.4387 |
| 0.0011        | 29.23 | 380  | 0.8060          | 31.4387 |
| 0.0004        | 29.62 | 385  | 0.8065          | 31.4387 |
| 0.0004        | 30.0  | 390  | 0.8065          | 31.4387 |
| 0.0004        | 30.38 | 395  | 0.8068          | 31.4387 |
| 0.0004        | 30.77 | 400  | 0.8069          | 31.6163 |

### Framework versions

- Transformers 4.26.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.7.1
- Tokenizers 0.13.2
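The snippet below is a hedged sketch of how the reported WER could be checked against this checkpoint, using the `pipeline` API with `datasets` streaming and the `evaluate` library. The repository id and the 100-sample slice are placeholders; the card's 31.62 WER was measured on the full `test` split, and the exact text normalization behind the reported score is not documented here, so numbers may differ.

```python
from itertools import islice

import evaluate
from datasets import load_dataset
from transformers import pipeline

# "whisper-ft-libri-en" is a placeholder; use the full
# "<user>/whisper-ft-libri-en" Hub id or a local checkpoint path.
asr = pipeline("automatic-speech-recognition", model="whisper-ft-libri-en")
wer_metric = evaluate.load("wer")

# Stream LibriSpeech test-clean so the full corpus is not downloaded;
# 100 samples is an arbitrary slice for a quick check.
dataset = load_dataset("librispeech_asr", "clean", split="test", streaming=True)
samples = list(islice(dataset, 100))

# Passing the raw array assumes 16 kHz audio, which holds for LibriSpeech.
# References are uppercase, so lowercase both sides before scoring.
predictions = [asr(s["audio"]["array"])["text"].lower().strip() for s in samples]
references = [s["text"].lower() for s in samples]

# compute() returns a fraction; scale to a percentage as reported in the card.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}")
```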