---
language:
- nl
license: apache-2.0
base_model: openai/whisper-large-v2
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: Whisper Large V2
  results: []
---
# Whisper Large V2
This model is a fine-tuned version of [openai/whisper-large-v2](https://huggingface.co/openai/whisper-large-v2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2941
- Wer: 9.7158
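
Since the card tags the model's language as `nl`, the checkpoint can presumably be used for Dutch speech recognition. Below is a minimal inference sketch with the `transformers` pipeline; the repo id is a hypothetical placeholder, and the `generate_kwargs` assume the usual Whisper language/task prompts:

```python
from transformers import pipeline

# "your-username/whisper-large-v2-nl" is a placeholder repo id; substitute the
# actual model id once it is published.
asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-large-v2-nl",
    generate_kwargs={"language": "dutch", "task": "transcribe"},
)

# Transcribe a local audio file (any format ffmpeg can decode).
result = asr("audio.wav")
print(result["text"])
```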
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 12
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 20
- num_epochs: 5
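
For reference, these settings map onto `transformers.Seq2SeqTrainingArguments` roughly as follows. This is a reconstruction, not the actual training script; `output_dir` and every argument not listed above are assumptions:

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the hyperparameters above; output_dir is a hypothetical path and
# all omitted arguments keep their Trainer defaults.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-v2-finetuned",
    learning_rate=3e-5,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=20,
    num_train_epochs=5,
    adam_beta1=0.9,   # Adam betas=(0.9, 0.999) from the card
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```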
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---:|
| 0.6299 | 0.09 | 30 | 0.3564 | 16.3717 |
| 0.3398 | 0.19 | 60 | 0.3210 | 12.9819 |
| 0.3187 | 0.28 | 90 | 0.2997 | 19.9971 |
| 0.2773 | 0.38 | 120 | 0.2939 | 15.2908 |
| 0.2745 | 0.47 | 150 | 0.2780 | 15.0405 |
| 0.2677 | 0.57 | 180 | 0.2697 | 12.3840 |
| 0.2467 | 0.66 | 210 | 0.2698 | 13.6033 |
| 0.2467 | 0.76 | 240 | 0.2735 | 16.5749 |
| 0.2455 | 0.85 | 270 | 0.2639 | 12.0188 |
| 0.269 | 0.95 | 300 | 0.2597 | 13.3412 |
| 0.1851 | 1.04 | 330 | 0.2643 | 12.3428 |
| 0.1265 | 1.14 | 360 | 0.2561 | 13.4649 |
| 0.1377 | 1.23 | 390 | 0.2662 | 12.8081 |
| 0.134 | 1.33 | 420 | 0.2640 | 12.3310 |
| 0.1371 | 1.42 | 450 | 0.2630 | 11.8480 |
| 0.1307 | 1.52 | 480 | 0.2616 | 11.9187 |
| 0.1423 | 1.61 | 510 | 0.2535 | 11.3150 |
| 0.1406 | 1.71 | 540 | 0.2525 | 10.9675 |
| 0.1312 | 1.8 | 570 | 0.2483 | 13.9479 |
| 0.1214 | 1.9 | 600 | 0.2534 | 12.3192 |
| 0.1252 | 1.99 | 630 | 0.2531 | 11.7243 |
| 0.0657 | 2.09 | 660 | 0.2619 | 11.0558 |
| 0.0578 | 2.18 | 690 | 0.2698 | 12.2191 |
| 0.0548 | 2.28 | 720 | 0.2662 | 10.3667 |
| 0.0596 | 2.37 | 750 | 0.2685 | 12.3222 |
| 0.0573 | 2.47 | 780 | 0.2698 | 10.5581 |
| 0.0589 | 2.56 | 810 | 0.2661 | 11.7391 |
| 0.0554 | 2.66 | 840 | 0.2608 | 11.7332 |
| 0.0625 | 2.75 | 870 | 0.2622 | 10.7760 |
| 0.0586 | 2.85 | 900 | 0.2603 | 10.7201 |
| 0.0647 | 2.94 | 930 | 0.2576 | 10.5669 |
| 0.0486 | 3.04 | 960 | 0.2647 | 10.2518 |
| 0.0245 | 3.13 | 990 | 0.2749 | 10.6140 |
| 0.0256 | 3.23 | 1020 | 0.2707 | 10.2813 |
| 0.0242 | 3.32 | 1050 | 0.2724 | 11.6566 |
| 0.0225 | 3.42 | 1080 | 0.2699 | 10.6347 |
| 0.0205 | 3.51 | 1110 | 0.2748 | 10.0427 |
| 0.0217 | 3.61 | 1140 | 0.2747 | 10.0339 |
| 0.0216 | 3.7 | 1170 | 0.2775 | 9.9190 |
| 0.0222 | 3.8 | 1200 | 0.2770 | 10.2371 |
| 0.0204 | 3.89 | 1230 | 0.2722 | 10.1782 |
| 0.0185 | 3.99 | 1260 | 0.2725 | 9.7835 |
| 0.0111 | 4.08 | 1290 | 0.2834 | 9.8866 |
| 0.0085 | 4.18 | 1320 | 0.2854 | 9.7894 |
| 0.0082 | 4.27 | 1350 | 0.2868 | 9.7629 |
| 0.0075 | 4.37 | 1380 | 0.2906 | 9.7776 |
| 0.0079 | 4.46 | 1410 | 0.2918 | 9.7394 |
| 0.0071 | 4.56 | 1440 | 0.2902 | 9.6157 |
| 0.0076 | 4.65 | 1470 | 0.2921 | 9.5921 |
| 0.0071 | 4.75 | 1500 | 0.2940 | 9.5774 |
| 0.0069 | 4.84 | 1530 | 0.2936 | 9.7276 |
| 0.0071 | 4.94 | 1560 | 0.2941 | 9.7158 |
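
The `Wer` column is the word error rate expressed as a percentage. A minimal sketch of how such a score is typically computed with the `evaluate` library (the reference and prediction strings here are invented purely for illustration):

```python
import evaluate

wer_metric = evaluate.load("wer")

# Invented Dutch example pair: one substitution in six words -> WER ~16.67%.
references = ["de kat zit op de mat"]
predictions = ["de kat zat op de mat"]

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {100 * wer:.4f}%")  # compute() returns a fraction; the card reports percent
```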
### Framework versions
- Transformers 4.38.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.0