# whisper-tiny.en-finetuned-D3K
This model is a fine-tuned version of [openai/whisper-tiny.en](https://huggingface.co/openai/whisper-tiny.en) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.7482
- CER (character error rate): 20.4786
- WER (word error rate): 36.7550
- SER (sentence error rate): 32.4561
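The card does not state which library computed these metrics (common choices are `jiwer` or `evaluate`), so as a reference point, here is a minimal pure-Python sketch of how WER and CER are defined: the Levenshtein edit distance between hypothesis and reference, normalized by the reference length, measured over words for WER and over characters for CER.

```python
# Minimal WER/CER sketch: Levenshtein distance over tokens, normalized
# by reference length. Illustrative only -- the exact implementation
# used for this model card is not specified.

def edit_distance(ref, hyp):
    """Levenshtein distance between two token sequences (one-row DP)."""
    dp = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev = dp[0]          # dp value from the previous row, column j-1
        dp[0] = i
        for j, h in enumerate(hyp, 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,          # deletion
                        dp[j - 1] + 1,      # insertion
                        prev + (r != h))    # substitution (0 if match)
            prev = cur
    return dp[-1]

def wer(reference, hypothesis):
    """Word error rate: edit distance over word tokens / #reference words."""
    ref = reference.split()
    return edit_distance(ref, hypothesis.split()) / len(ref)

def cer(reference, hypothesis):
    """Character error rate: edit distance over characters / #reference chars."""
    return edit_distance(reference, hypothesis) / len(reference)
```

For example, `wer("the cat sat on the mat", "the cat sat on mat")` is 1/6: one deleted word against a six-word reference.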
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 40
- training_steps: 400
- mixed_precision_training: Native AMP
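The hyperparameters above can be sketched as a Hugging Face `Seq2SeqTrainingArguments` configuration. This is a reconstruction under assumptions: the output directory name is hypothetical, and the Adam betas/epsilon listed above are simply the optimizer defaults.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the run configuration; output_dir is an assumed name.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-tiny.en-finetuned-D3K",  # hypothetical
    learning_rate=1e-6,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=64,
    seed=42,
    # Adam betas=(0.9, 0.999), epsilon=1e-8 are the defaults
    lr_scheduler_type="linear",
    warmup_steps=40,
    max_steps=400,
    fp16=True,  # "Native AMP" mixed-precision training
)
```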
### Training results
| Training Loss | Epoch | Step | Validation Loss | CER     | WER     | SER     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|
| 6.5477        | 0.21  | 20   | 6.1679          | 28.8815 | 65.5629 | 47.3684 |
| 5.8649        | 0.43  | 40   | 5.1645          | 26.6555 | 59.6026 | 42.1053 |
| 4.4955        | 0.64  | 60   | 3.9916          | 28.6032 | 57.6159 | 40.3509 |
| 3.2461        | 0.85  | 80   | 3.0695          | 25.2087 | 49.0066 | 35.0877 |
| 2.464         | 1.06  | 100  | 2.4087          | 22.6489 | 42.0530 | 33.3333 |
| 2.0699        | 1.28  | 120  | 2.1494          | 22.7045 | 40.0662 | 32.4561 |
| 1.9909        | 1.49  | 140  | 2.0336          | 22.5932 | 39.0728 | 32.4561 |
| 1.8902        | 1.7   | 160  | 1.9646          | 21.5915 | 38.0795 | 33.3333 |
| 1.8438        | 1.91  | 180  | 1.9208          | 23.0384 | 38.7417 | 34.2105 |
| 1.7684        | 2.13  | 200  | 1.8850          | 22.2037 | 38.4106 | 34.2105 |
| 1.729         | 2.34  | 220  | 1.8487          | 21.9254 | 37.4172 | 33.3333 |
| 1.6901        | 2.55  | 240  | 1.8251          | 21.9254 | 37.0861 | 32.4561 |
| 1.7107        | 2.77  | 260  | 1.8077          | 21.9811 | 37.7483 | 33.3333 |
| 1.665         | 2.98  | 280  | 1.7931          | 22.0924 | 39.4040 | 34.2105 |
| 1.6214        | 3.19  | 300  | 1.7799          | 22.0924 | 39.4040 | 34.2105 |
| 1.6124        | 3.4   | 320  | 1.7692          | 21.2020 | 37.4172 | 33.3333 |
| 1.6064        | 3.62  | 340  | 1.7602          | 21.2020 | 37.4172 | 33.3333 |
| 1.6154        | 3.83  | 360  | 1.7538          | 20.4786 | 36.7550 | 32.4561 |
| 1.6167        | 4.04  | 380  | 1.7497          | 20.4786 | 36.7550 | 32.4561 |
| 1.5972        | 4.26  | 400  | 1.7482          | 20.4786 | 36.7550 | 32.4561 |
### Framework versions
- Transformers 4.39.3
- PyTorch 2.2.2+cu121
- Datasets 2.14.5
- Tokenizers 0.15.2