---
license: apache-2.0
tags:
  - generated_from_trainer
  - whisper-event
metrics:
  - wer
model-index:
  - name: whisper-small-nl
    results: []
---

# whisper-small-nl

This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the CGN (Corpus Gesproken Nederlands) dataset. It achieves the following results on the evaluation set:

- Wer: 15.8367
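
As a quick sanity check, the checkpoint can be loaded with the `transformers` ASR pipeline. The repository id `qmeeus/whisper-small-nl` and the audio file name below are assumptions for illustration, not part of the original card.

```python
# Minimal inference sketch; the repository id and audio path are assumptions.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="qmeeus/whisper-small-nl",  # assumed Hub id for this checkpoint
    chunk_length_s=30,                # Whisper operates on 30-second windows
)

# "sample.wav" is a placeholder for any Dutch recording on disk.
print(asr("sample.wav")["text"])
```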

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 512
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- training_steps: 6000
- mixed_precision_training: Native AMP
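
For reference, these settings roughly correspond to the following `Seq2SeqTrainingArguments`. This is a sketch rather than the original training script; the output directory, evaluation schedule, and how the batch size of 512 was split across devices or gradient-accumulation steps are assumptions.

```python
# Sketch of Seq2SeqTrainingArguments matching the hyperparameters above
# (assumed field mapping; the original fine-tuning script is not shown here).
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-nl",  # hypothetical output path
    learning_rate=1e-5,
    per_device_train_batch_size=512,  # card reports 512; may combine GPUs or accumulation
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=50,
    max_steps=6000,
    fp16=True,                        # "Native AMP" mixed precision
    evaluation_strategy="steps",      # evaluation cadence assumed from the results table
    predict_with_generate=True,
)
```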

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.8378        | 0.1   | 100  | 0.4933          | 23.8827 |
| 0.5547        | 0.2   | 200  | 0.4476          | 21.0578 |
| 0.3905        | 0.3   | 300  | 0.4335          | 21.1689 |
| 0.3766        | 0.4   | 400  | 0.4267          | 20.0528 |
| 0.4164        | 0.5   | 500  | 0.4139          | 21.4329 |
| 0.2939        | 0.6   | 600  | 0.3864          | 18.3671 |
| 0.2632        | 0.7   | 700  | 0.3864          | 18.4319 |
| 0.6066        | 0.8   | 800  | 0.3804          | 19.2748 |
| 0.2075        | 1.09  | 900  | 0.3794          | 18.8904 |
| 0.2102        | 1.19  | 1000 | 0.3777          | 19.8814 |
| 0.2045        | 2.49  | 2000 | 0.3194          | 16.1628 |
| 0.0652        | 4.97  | 3000 | 0.3425          | 16.3672 |
| 0.0167        | 7.46  | 4000 | 0.3915          | 15.8187 |
| 0.0064        | 9.95  | 5000 | 0.4190          | 15.7298 |
| 0.0041        | 12.44 | 6000 | 0.4315          | 15.8367 |
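
The WER values above can be recomputed with the `evaluate` library. The reference and prediction lists below are placeholders, since the CGN evaluation split is not included in this card.

```python
# WER computation sketch with the `evaluate` library; the transcripts below
# are placeholders, not CGN data.
import evaluate

wer_metric = evaluate.load("wer")
references = ["dit is een voorbeeldzin"]    # ground-truth transcripts (placeholder)
predictions = ["dit is een voorbeeld zin"]  # model outputs (placeholder)

# The card reports WER as a percentage, e.g. 15.8367.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```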

### Framework versions

- Transformers 4.26.0.dev0
- Pytorch 1.13.0+cu117
- Datasets 2.7.1.dev0
- Tokenizers 0.13.2