
Whisper Large V2

This model is a fine-tuned version of openai/whisper-large-v2 on an unknown dataset. It achieves the following results on the evaluation set (see the usage sketch after this list):

  • Loss: 0.2970
  • WER: 11.6236
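
Since the card does not yet document usage, below is a minimal inference sketch using the transformers automatic-speech-recognition pipeline. The repository ID and the audio file name are placeholders, not identifiers taken from this card.

```python
# Minimal inference sketch (assumed usage; repo ID and audio path are placeholders).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-large-v2-finetuned",  # placeholder repo ID
    chunk_length_s=30,  # Whisper processes audio in 30-second windows
)

result = asr("sample.wav")  # placeholder audio file
print(result["text"])
```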

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):

  • learning_rate: 3e-05
  • train_batch_size: 12
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 20
  • num_epochs: 5
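
As an illustration only (the training script is not included in this card), the listed hyperparameters would map onto transformers Seq2SeqTrainingArguments roughly as follows; the output directory and anything not listed above are assumptions.

```python
# Hedged sketch: mapping the listed hyperparameters onto Seq2SeqTrainingArguments.
# Only the values listed above come from this card; everything else is an assumption.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-v2-finetuned",  # assumed output path
    learning_rate=3e-5,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=20,
    num_train_epochs=5,
)
```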

Training results

| Training Loss | Epoch | Step | Validation Loss | WER     |
|---------------|-------|------|-----------------|---------|
| 0.6426        | 0.09  | 30   | 0.3506          | 16.5079 |
| 0.3086        | 0.19  | 60   | 0.3201          | 12.7211 |
| 0.3121        | 0.28  | 90   | 0.2968          | 11.7123 |
| 0.2956        | 0.38  | 120  | 0.2937          | 13.6116 |
| 0.3067        | 0.47  | 150  | 0.2769          | 14.4193 |
| 0.2787        | 0.57  | 180  | 0.2717          | 14.5051 |
| 0.2382        | 0.66  | 210  | 0.2732          | 17.2889 |
| 0.232         | 0.76  | 240  | 0.2742          | 19.2148 |
| 0.2653        | 0.85  | 270  | 0.2632          | 18.9604 |
| 0.2726        | 0.95  | 300  | 0.2497          | 15.6174 |
| 0.1879        | 1.04  | 330  | 0.2603          | 12.4549 |
| 0.124         | 1.14  | 360  | 0.2590          | 11.1769 |
| 0.1244        | 1.23  | 390  | 0.2679          | 17.9486 |
| 0.1482        | 1.33  | 420  | 0.2590          | 16.1263 |
| 0.1312        | 1.42  | 450  | 0.2628          | 15.2595 |
| 0.1358        | 1.52  | 480  | 0.2550          | 13.0347 |
| 0.1302        | 1.61  | 510  | 0.2545          | 15.1648 |
| 0.132         | 1.71  | 540  | 0.2508          | 15.3127 |
| 0.1402        | 1.8   | 570  | 0.2418          | 12.2330 |
| 0.137         | 1.9   | 600  | 0.2444          | 13.2329 |
| 0.1346        | 1.99  | 630  | 0.2432          | 13.1649 |
| 0.0664        | 2.09  | 660  | 0.2594          | 11.6058 |
| 0.0562        | 2.18  | 690  | 0.2655          | 10.9431 |
| 0.0551        | 2.28  | 720  | 0.2613          | 13.3690 |
| 0.0625        | 2.37  | 750  | 0.2555          | 20.2769 |
| 0.0627        | 2.47  | 780  | 0.2602          | 17.7268 |
| 0.0586        | 2.56  | 810  | 0.2647          | 11.5319 |
| 0.0604        | 2.66  | 840  | 0.2615          | 11.0378 |
| 0.062         | 2.75  | 870  | 0.2570          | 12.0111 |
| 0.0548        | 2.85  | 900  | 0.2575          | 14.5317 |
| 0.0576        | 2.94  | 930  | 0.2585          | 12.2182 |
| 0.0448        | 3.04  | 960  | 0.2619          | 13.0406 |
| 0.023         | 3.13  | 990  | 0.2730          | 12.9578 |
| 0.0241        | 3.23  | 1020 | 0.2773          | 11.9667 |
| 0.023         | 3.32  | 1050 | 0.2738          | 11.7656 |
| 0.0222        | 3.42  | 1080 | 0.2767          | 11.8602 |
| 0.0201        | 3.51  | 1110 | 0.2723          | 11.3455 |
| 0.0195        | 3.61  | 1140 | 0.2803          | 10.6946 |
| 0.0221        | 3.7   | 1170 | 0.2744          | 11.3899 |
| 0.0202        | 3.8   | 1200 | 0.2764          | 11.3070 |
| 0.0223        | 3.89  | 1230 | 0.2725          | 11.2567 |
| 0.021         | 3.99  | 1260 | 0.2781          | 10.6148 |
| 0.01          | 4.08  | 1290 | 0.2854          | 10.7508 |
| 0.0081        | 4.18  | 1320 | 0.2914          | 10.5970 |
| 0.0086        | 4.27  | 1350 | 0.2918          | 11.0408 |
| 0.0073        | 4.37  | 1380 | 0.2946          | 11.2301 |
| 0.0085        | 4.46  | 1410 | 0.2950          | 10.8721 |
| 0.007         | 4.56  | 1440 | 0.2957          | 11.4224 |
| 0.0063        | 4.65  | 1470 | 0.2965          | 11.4431 |
| 0.0073        | 4.75  | 1500 | 0.2976          | 11.0970 |
| 0.0078        | 4.84  | 1530 | 0.2972          | 11.5289 |
| 0.0068        | 4.94  | 1560 | 0.2970          | 11.6236 |
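
The WER column is a word error rate reported as a percentage. One common way to compute it (not necessarily the exact method used for this card) is the evaluate library's wer metric; the strings in the sketch below are illustrative only.

```python
# Sketch of computing word error rate with the evaluate library.
# The example strings are made up; they are not this card's evaluation data.
import evaluate

wer_metric = evaluate.load("wer")
predictions = ["the cat sat on the mat"]
references = ["the cat sat on a mat"]

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}")  # expressed as a percentage, like the values in the table
```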

Framework versions

  • Transformers 4.38.0.dev0
  • Pytorch 2.1.0+cu121
  • Datasets 2.17.0
  • Tokenizers 0.15.0
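
When reproducing these results, it can help to confirm that the local environment matches the versions above; a small, assumed verification snippet:

```python
# Quick check that installed versions match the ones listed in this card.
import transformers, torch, datasets, tokenizers

print("Transformers:", transformers.__version__)  # expected 4.38.0.dev0
print("PyTorch:", torch.__version__)              # expected 2.1.0+cu121
print("Datasets:", datasets.__version__)          # expected 2.17.0
print("Tokenizers:", tokenizers.__version__)      # expected 0.15.0
```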