---
license: apache-2.0
base_model: openai/whisper-base.en
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: abbenedekwhisper-base.en-finetuned-D3K
    results: []
---

# abbenedekwhisper-base.en-finetuned-D3K

This model is a fine-tuned version of [openai/whisper-base.en](https://huggingface.co/openai/whisper-base.en) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 1.5119
- Cer: 21.9254
- Wer: 27.8146
- Ser: 16.6667

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-06
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 40
- training_steps: 400
- mixed_precision_training: Native AMP
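The `linear` scheduler with these settings ramps the learning rate up to 1e-06 over the first 40 steps, then decays it linearly to zero by step 400. A plain-Python sketch of that behaviour (the function name and defaults are illustrative; in training this schedule comes from transformers' `get_linear_schedule_with_warmup`):

```python
def lr_at_step(step, base_lr=1e-6, warmup_steps=40, total_steps=400):
    """Linear warmup to base_lr, then linear decay to 0 at total_steps."""
    if step < warmup_steps:
        # Warmup phase: LR grows proportionally from 0 to base_lr.
        return base_lr * step / warmup_steps
    # Decay phase: LR shrinks linearly from base_lr to 0.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

For example, the peak rate of 1e-06 is reached exactly at step 40, and step 220 (the midpoint of the decay phase) gives half the peak rate.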

### Training results

| Training Loss | Epoch | Step | Validation Loss | Cer     | Wer     | Ser     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|
| 7.7093        | 0.21  | 20   | 6.8991          | 20.3673 | 29.4702 | 28.9474 |
| 6.8825        | 0.43  | 40   | 5.5765          | 25.0417 | 35.0993 | 30.7018 |
| 5.2019        | 0.64  | 60   | 4.4037          | 30.4953 | 41.7219 | 34.2105 |
| 3.8746        | 0.85  | 80   | 3.3761          | 33.5559 | 45.0331 | 32.4561 |
| 2.9591        | 1.06  | 100  | 2.7073          | 32.1091 | 43.0464 | 30.7018 |
| 2.2502        | 1.28  | 120  | 2.2711          | 33.3890 | 44.3709 | 30.7018 |
| 1.8478        | 1.49  | 140  | 1.9790          | 31.4413 | 41.0596 | 27.1930 |
| 1.6338        | 1.7   | 160  | 1.8207          | 32.2204 | 42.0530 | 26.3158 |
| 1.5065        | 1.91  | 180  | 1.7257          | 26.4329 | 34.4371 | 20.1754 |
| 1.3667        | 2.13  | 200  | 1.6645          | 24.5965 | 31.4570 | 19.2982 |
| 1.2807        | 2.34  | 220  | 1.6155          | 24.8191 | 31.7881 | 20.1754 |
| 1.2212        | 2.55  | 240  | 1.5851          | 26.2104 | 33.4437 | 21.0526 |
| 1.2299        | 2.77  | 260  | 1.5669          | 24.4853 | 31.4570 | 19.2982 |
| 1.1623        | 2.98  | 280  | 1.5543          | 23.0940 | 29.8013 | 18.4211 |
| 1.1275        | 3.19  | 300  | 1.5421          | 23.0940 | 29.8013 | 18.4211 |
| 1.1304        | 3.4   | 320  | 1.5320          | 23.5392 | 29.8013 | 18.4211 |
| 1.1113        | 3.62  | 340  | 1.5236          | 21.9254 | 27.8146 | 16.6667 |
| 1.1189        | 3.83  | 360  | 1.5168          | 21.9254 | 27.8146 | 16.6667 |
| 1.1134        | 4.04  | 380  | 1.5134          | 21.9254 | 27.8146 | 16.6667 |
| 1.097         | 4.26  | 400  | 1.5119          | 21.9254 | 27.8146 | 16.6667 |
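The Wer column above is the word error rate as a percentage: the edit distance (substitutions, insertions, deletions) between the reference transcript and the model's hypothesis, divided by the number of reference words. A minimal pure-Python sketch of that computation (the training run itself would typically use a metrics library such as `evaluate` or `jiwer`; this version is only illustrative):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two token sequences."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,               # deletion
                           cur[j - 1] + 1,            # insertion
                           prev[j - 1] + (r != h)))   # substitution
        prev = cur
    return prev[-1]

def wer(reference, hypothesis):
    """Word error rate in percent over whitespace-split words."""
    ref_words = reference.split()
    return 100.0 * edit_distance(ref_words, hypothesis.split()) / len(ref_words)
```

The Cer column is the same computation over characters instead of words, and Ser is the fraction of utterances with at least one error.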

### Framework versions

- Transformers 4.39.3
- Pytorch 2.2.2+cu121
- Datasets 2.14.5
- Tokenizers 0.15.2