---
language:
  - nl
license: apache-2.0
base_model: openai/whisper-large-v2
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: Whisper Large V2
    results: []
---

# Whisper Large V2

This model is a fine-tuned version of [openai/whisper-large-v2](https://huggingface.co/openai/whisper-large-v2) on an unspecified Dutch dataset. It achieves the following results on the evaluation set:

- Loss: 0.2651
- Wer: 9.8186
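The Wer figure is the word error rate in percent (lower is better): word-level edit distance between reference and hypothesis transcripts, divided by the number of reference words. The training run most likely computed it with the `evaluate`/`jiwer` libraries; the pure-Python sketch below is only an illustration of the metric itself.

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate in percent: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming table of edit distances between word prefixes.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # delete all i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # insert all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution / match
    return 100.0 * d[len(ref)][len(hyp)] / len(ref)

# One substitution and one deletion against a 6-word reference -> 2/6 ~ 33.3%
print(round(wer("de kat zit op de mat", "de kat zat op mat"), 2))
```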

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 20
- num_epochs: 5
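The `linear` scheduler with 20 warmup steps ramps the learning rate from 0 up to 3e-05, then decays it linearly back toward 0 over the rest of training. A minimal sketch of that shape (the total step count of ~1170 is taken from the last row of the results table below, and is an approximation):

```python
def linear_lr(step: int, base_lr: float = 3e-5,
              warmup_steps: int = 20, total_steps: int = 1170) -> float:
    """Linear warmup to base_lr over warmup_steps, then linear decay to 0 at total_steps."""
    if step < warmup_steps:
        # Warmup phase: scale up proportionally with the step count.
        return base_lr * step / warmup_steps
    # Decay phase: interpolate from base_lr (at warmup_steps) down to 0 (at total_steps).
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

This mirrors what Transformers' `get_linear_schedule_with_warmup` produces when the trainer is configured with `lr_scheduler_type: linear`.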

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.5952        | 0.13  | 30   | 0.3084          | 14.4155 |
| 0.3011        | 0.25  | 60   | 0.2771          | 17.2921 |
| 0.278         | 0.38  | 90   | 0.2652          | 12.2835 |
| 0.2685        | 0.51  | 120  | 0.2496          | 16.0334 |
| 0.2858        | 0.63  | 150  | 0.2387          | 11.5359 |
| 0.2544        | 0.76  | 180  | 0.2352          | 12.7537 |
| 0.2445        | 0.89  | 210  | 0.2288          | 10.5050 |
| 0.2361        | 1.01  | 240  | 0.2276          | 12.7537 |
| 0.1265        | 1.14  | 270  | 0.2309          | 12.4792 |
| 0.1338        | 1.27  | 300  | 0.2316          | 12.7041 |
| 0.1392        | 1.39  | 330  | 0.2285          | 10.8437 |
| 0.1415        | 1.52  | 360  | 0.2284          | 11.8630 |
| 0.1283        | 1.65  | 390  | 0.2266          | 10.9430 |
| 0.1311        | 1.77  | 420  | 0.2288          | 12.4880 |
| 0.1222        | 1.9   | 450  | 0.2201          | 10.8145 |
| 0.1168        | 2.03  | 480  | 0.2257          | 13.6386 |
| 0.0552        | 2.15  | 510  | 0.2346          | 12.0908 |
| 0.0613        | 2.28  | 540  | 0.2244          | 13.8138 |
| 0.0569        | 2.41  | 570  | 0.2306          | 10.9197 |
| 0.0587        | 2.53  | 600  | 0.2332          | 9.7515  |
| 0.0558        | 2.66  | 630  | 0.2352          | 11.8075 |
| 0.0601        | 2.78  | 660  | 0.2295          | 10.7590 |
| 0.0536        | 2.91  | 690  | 0.2294          | 10.9021 |
| 0.051         | 3.04  | 720  | 0.2353          | 11.0394 |
| 0.0244        | 3.16  | 750  | 0.2439          | 10.4845 |
| 0.0218        | 3.29  | 780  | 0.2483          | 11.0511 |
| 0.0218        | 3.42  | 810  | 0.2434          | 10.5517 |
| 0.0222        | 3.54  | 840  | 0.2510          | 9.8741  |
| 0.0209        | 3.67  | 870  | 0.2436          | 10.8466 |
| 0.0219        | 3.8   | 900  | 0.2476          | 10.4465 |
| 0.0228        | 3.92  | 930  | 0.2433          | 11.1767 |
| 0.0149        | 4.05  | 960  | 0.2499          | 10.1808 |
| 0.0079        | 4.18  | 990  | 0.2625          | 10.4290 |
| 0.0083        | 4.3   | 1020 | 0.2650          | 9.9909  |
| 0.0085        | 4.43  | 1050 | 0.2641          | 10.1194 |
| 0.0085        | 4.56  | 1080 | 0.2637          | 10.5750 |
| 0.0077        | 4.68  | 1110 | 0.2649          | 10.0055 |
| 0.0077        | 4.81  | 1140 | 0.2654          | 9.9734  |
| 0.0085        | 4.94  | 1170 | 0.2651          | 9.8186  |

### Framework versions

- Transformers 4.38.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.14.6
- Tokenizers 0.15.0