---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: whisper-base-nl-3
  results: []
---

# whisper-base-nl-3

This model is a fine-tuned version of [openai/whisper-base](https://huggingface.co/openai/whisper-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7910
- Wer: 31.4005
- Cer: 9.9570

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 35000
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step  | Cer     | Validation Loss | Wer     |
|:-------------:|:-----:|:-----:|:-------:|:---------------:|:-------:|
| 0.5761        | 0.06  | 1000  | 10.1154 | 0.5675          | 28.1532 |
| 0.48          | 0.13  | 2000  | 9.6911  | 0.5239          | 26.4364 |
| 0.4094        | 0.19  | 3000  | 9.1532  | 0.4925          | 24.8355 |
| 0.4792        | 0.26  | 4000  | 8.8414  | 0.4702          | 24.1105 |
| 0.3444        | 0.32  | 5000  | 8.8531  | 0.4544          | 23.9017 |
| 0.3943        | 0.39  | 6000  | 8.3602  | 0.4446          | 22.7353 |
| 0.4925        | 0.45  | 7000  | 8.3724  | 0.4348          | 22.1788 |
| 0.4455        | 0.52  | 8000  | 8.2989  | 0.4270          | 21.7549 |
| 0.3987        | 0.58  | 9000  | 7.9417  | 0.4139          | 20.8424 |
| 0.3373        | 0.65  | 10000 | 7.8871  | 0.4116          | 21.2144 |
| 0.3808        | 0.71  | 11000 | 7.6264  | 0.4016          | 20.5092 |
| 0.4214        | 0.78  | 12000 | 7.4153  | 0.3949          | 20.0938 |
| 0.3029        | 0.84  | 13000 | 7.3581  | 0.3902          | 19.7347 |
| 0.3549        | 1.66  | 14000 | 7.1195  | 0.3908          | 19.4115 |
| 0.3385        | 1.78  | 15000 | 7.7792  | 0.3906          | 20.2051 |
| 0.3282        | 1.9   | 16000 | 7.1081  | 0.3923          | 19.2651 |
| 0.3196        | 2.02  | 17000 | 7.2249  | 0.3923          | 19.3352 |
| 0.3251        | 2.14  | 18000 | 7.1761  | 0.3981          | 19.4831 |
| 0.4162        | 2.25  | 19000 | 7.0590  | 0.3958          | 19.0577 |
| 0.2851        | 2.37  | 20000 | 7.0167  | 0.3953          | 19.2095 |
| 0.2982        | 2.49  | 21000 | 6.8426  | 0.3929          | 18.8100 |
| 0.3642        | 2.61  | 22000 | 6.8867  | 0.3954          | 18.6972 |
| 0.2297        | 2.73  | 23000 | 6.9384  | 0.3916          | 18.7330 |
| 0.2313        | 2.85  | 24000 | 6.7785  | 0.3930          | 18.6034 |
| 0.2833        | 2.97  | 25000 | 6.8552  | 0.3910          | 18.5981 |
| 0.2509        | 3.09  | 26000 | 6.8165  | 0.3949          | 18.5180 |
| 0.2085        | 3.2   | 27000 | 6.8113  | 0.3985          | 18.6133 |
| 0.2055        | 3.32  | 28000 | 6.8624  | 0.3995          | 18.7612 |
| 0.175         | 3.44  | 29000 | 6.7727  | 0.4009          | 18.4814 |
| 0.1701        | 3.56  | 30000 | 7.0136  | 0.3998          | 18.8344 |
| 0.6832        | 33.81 | 31000 | 7.8509  | 0.5425          | 24.6216 |
| 0.5676        | 34.9  | 32000 | 7.3776  | 0.5141          | 23.6790 |
| 0.4863        | 35.99 | 33000 | 7.2441  | 0.5003          | 23.0542 |
| 0.5007        | 37.08 | 34000 | 7.1545  | 0.4948          | 22.9234 |
| 0.4519        | 38.17 | 35000 | 7.1257  | 0.4922          | 22.8248 |

### Framework versions

- Transformers 4.26.0.dev0
- Pytorch 1.13.1+cu117
- Datasets 2.8.1.dev0
- Tokenizers 0.13.2
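The Wer and Cer values reported above are word and character error rates. As a point of reference, WER is the Levenshtein edit distance between hypothesis and reference word sequences, divided by the reference length; a minimal self-contained sketch (Cer is the same ratio computed over characters instead of words):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: (substitutions + deletions + insertions) / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words.
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(
                dp[i - 1][j] + 1,         # deletion
                dp[i][j - 1] + 1,         # insertion
                dp[i - 1][j - 1] + cost,  # substitution / match
            )
    return dp[len(ref)][len(hyp)] / len(ref)
```

In practice the metric is usually computed with a library such as `evaluate` or `jiwer` rather than by hand; the sketch above only illustrates the definition behind the numbers in the table.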
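For readers reproducing this run, the hyperparameters listed above roughly correspond to a `Seq2SeqTrainingArguments` configuration like the following. This is a hedged sketch: the `output_dir` value and the exact set of arguments used for this run are assumptions, not taken from the card, and the Adam betas/epsilon shown in the card are the Transformers defaults.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch reconstructed from the listed hyperparameters; output_dir is a placeholder.
args = Seq2SeqTrainingArguments(
    output_dir="whisper-base-nl-3",
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",   # linear decay after warmup
    warmup_steps=500,
    max_steps=35000,              # training_steps above
    fp16=True,                    # "Native AMP" mixed precision
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the defaults.
)
```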