---
language:
  - fa
license: apache-2.0
base_model: makhataei/Whisper-Small-Common-Voice
tags:
  - fa-asr
  - generated_from_trainer
datasets:
  - mozilla-foundation/common_voice_15_0
metrics:
  - wer
model-index:
  - name: Whisper Small Persian
    results: []
---

Whisper Small Persian

This model is a fine-tuned version of makhataei/Whisper-Small-Common-Voice on the Common Voice 15.0 dataset. It achieves the following results on the evaluation set (a sketch of the WER computation follows the list):

  • Loss: 0.8695
  • WER (word error rate): 50.4804
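The WER values in this card are percentages. As a point of reference, here is a minimal sketch of computing WER with the Hugging Face `evaluate` library; this is an assumption for illustration, since the card does not include the actual evaluation code:

```python
# Hypothetical illustration: computing WER with the `evaluate` library
# (not necessarily the exact evaluation code used for this model).
import evaluate

wer_metric = evaluate.load("wer")

# Placeholder hypothesis/reference pairs; in practice these would be
# model transcriptions and Common Voice reference sentences.
predictions = ["the transcribed hypothesis"]
references = ["the reference transcript"]

# evaluate's "wer" returns a fraction; multiply by 100 to match the
# percentage-style numbers reported in this card (e.g. 50.4804).
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```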

Model description

This model is the small variant of OpenAI's Whisper, fine-tuned from the makhataei/Whisper-Small-Common-Voice checkpoint for Persian (fa) automatic speech recognition on Common Voice 15.0.
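A minimal inference sketch using the transformers `pipeline` API is shown below. The repository id and the audio path are assumptions for illustration:

```python
# Minimal inference sketch with the transformers ASR pipeline.
# The repository id "makhataei/Whisper-Small-Persian" is assumed from
# this card's title; substitute the actual checkpoint path.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="makhataei/Whisper-Small-Persian",
)

# "speech.wav" is a placeholder path to a Persian audio file.
result = asr("speech.wav")
print(result["text"])
```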

Intended uses & limitations

The model is intended for transcribing Persian speech. Note that the evaluation WER of roughly 50% indicates limited transcription accuracy, so outputs should be reviewed before downstream use; no other limitations are documented.

Training and evaluation data

Training and evaluation used the mozilla-foundation/common_voice_15_0 dataset, presumably its Persian (fa) subset given the target language; see the loading sketch below.
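A sketch of loading this data with the `datasets` library, assuming the Persian ("fa") configuration and that you have accepted the dataset's terms on the Hub:

```python
# Sketch of loading the Persian split of Common Voice 15.0 with the
# datasets library. The dataset is gated on the Hub, so this assumes
# you have accepted its terms and are logged in (huggingface-cli login).
from datasets import load_dataset

common_voice = load_dataset(
    "mozilla-foundation/common_voice_15_0",
    "fa",
    split="train",
)

# Each example carries an "audio" field and its reference "sentence".
print(common_voice[0]["sentence"])
```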

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto transformers training arguments follows the list):

  • learning_rate: 1e-08
  • train_batch_size: 10
  • eval_batch_size: 10
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 40
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 10000
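For reference, here is a hedged sketch of how these values would map onto transformers' `Seq2SeqTrainingArguments` (assumed here because Whisper fine-tuning is sequence-to-sequence; the actual training script is not included in this card). The listed Adam betas and epsilon match the transformers defaults, so they are not set explicitly:

```python
# Hedged sketch: the hyperparameters above expressed as
# Seq2SeqTrainingArguments. This is not the published training script.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-persian",  # placeholder output path
    learning_rate=1e-8,
    per_device_train_batch_size=10,
    per_device_eval_batch_size=10,
    seed=42,
    gradient_accumulation_steps=4,  # 10 x 4 = 40 effective train batch
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=10000,
)
```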

Training results

| Training Loss | Epoch | Step  | Validation Loss | WER     |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|
| 0.0002 | 0.14 | 100 | 0.8627 | 50.5962 |
| 0.0002 | 0.28 | 200 | 0.8627 | 50.5945 |
| 0.0002 | 0.42 | 300 | 0.8628 | 50.5945 |
| 0.0001 | 0.56 | 400 | 0.8631 | 50.2720 |
| 0.0001 | 0.7 | 500 | 0.8634 | 50.2869 |
| 0.0002 | 0.83 | 600 | 0.8637 | 50.2638 |
| 0.0002 | 0.97 | 700 | 0.8638 | 50.2704 |
| 0.0002 | 1.11 | 800 | 0.8639 | 50.2935 |
| 0.0002 | 1.25 | 900 | 0.8640 | 50.2704 |
| 0.0002 | 1.39 | 1000 | 0.8641 | 50.2340 |
| 0.0002 | 1.53 | 1100 | 0.8644 | 50.2538 |
| 0.0002 | 1.67 | 1200 | 0.8645 | 50.2522 |
| 0.0002 | 1.81 | 1300 | 0.8646 | 50.2671 |
| 0.0002 | 1.95 | 1400 | 0.8648 | 50.2241 |
| 0.0002 | 2.09 | 1500 | 0.8650 | 50.2390 |
| 0.0001 | 2.23 | 1600 | 0.8653 | 50.2274 |
| 0.0001 | 2.36 | 1700 | 0.8653 | 50.2257 |
| 0.0002 | 2.5 | 1800 | 0.8653 | 50.2290 |
| 0.0002 | 2.64 | 1900 | 0.8653 | 50.2373 |
| 0.0002 | 2.78 | 2000 | 0.8653 | 50.2307 |
| 0.0001 | 2.92 | 2100 | 0.8655 | 50.2158 |
| 0.0002 | 3.06 | 2200 | 0.8656 | 50.2175 |
| 0.0002 | 3.2 | 2300 | 0.8658 | 50.2108 |
| 0.0002 | 3.34 | 2400 | 0.8659 | 50.2175 |
| 0.0001 | 3.48 | 2500 | 0.8660 | 50.2274 |
| 0.0001 | 3.62 | 2600 | 0.8661 | 50.2257 |
| 0.0002 | 3.76 | 2700 | 0.8662 | 50.2323 |
| 0.0002 | 3.89 | 2800 | 0.8663 | 50.2108 |
| 0.0002 | 4.03 | 2900 | 0.8665 | 50.1827 |
| 0.0002 | 4.17 | 3000 | 0.8666 | 50.2158 |
| 0.0001 | 4.31 | 3100 | 0.8668 | 50.2191 |
| 0.0002 | 4.45 | 3200 | 0.8668 | 50.2224 |
| 0.0002 | 4.59 | 3300 | 0.8669 | 50.1976 |
| 0.0002 | 4.73 | 3400 | 0.8668 | 50.1893 |
| 0.0002 | 4.87 | 3500 | 0.8668 | 50.1976 |
| 0.0001 | 5.01 | 3600 | 0.8670 | 50.1976 |
| 0.0002 | 5.15 | 3700 | 0.8671 | 50.1893 |
| 0.0001 | 5.29 | 3800 | 0.8672 | 50.1893 |
| 0.0002 | 5.42 | 3900 | 0.8673 | 50.1860 |
| 0.0001 | 5.56 | 4000 | 0.8674 | 50.1728 |
| 0.0002 | 5.7 | 4100 | 0.8674 | 50.1927 |
| 0.0002 | 5.84 | 4200 | 0.8675 | 50.2935 |
| 0.0001 | 5.98 | 4300 | 0.8676 | 50.1943 |
| 0.0002 | 6.12 | 4400 | 0.8676 | 50.2009 |
| 0.0002 | 6.26 | 4500 | 0.8677 | 50.2605 |
| 0.0002 | 6.4 | 4600 | 0.8677 | 50.2737 |
| 0.0002 | 6.54 | 4700 | 0.8679 | 50.2638 |
| 0.0001 | 6.68 | 4800 | 0.8681 | 50.2621 |
| 0.0002 | 6.82 | 4900 | 0.8681 | 50.2654 |
| 0.0001 | 6.95 | 5000 | 0.8681 | 50.2770 |
| 0.0001 | 7.09 | 5100 | 0.8682 | 50.2638 |
| 0.0002 | 7.23 | 5200 | 0.8682 | 50.2737 |
| 0.0002 | 7.37 | 5300 | 0.8682 | 50.2886 |
| 0.0002 | 7.51 | 5400 | 0.8683 | 50.2820 |
| 0.0002 | 7.65 | 5500 | 0.8683 | 50.3084 |
| 0.0001 | 7.79 | 5600 | 0.8684 | 50.2803 |
| 0.0001 | 7.93 | 5700 | 0.8685 | 50.2952 |
| 0.0001 | 8.07 | 5800 | 0.8686 | 50.2770 |
| 0.0001 | 8.21 | 5900 | 0.8687 | 50.2803 |
| 0.0001 | 8.34 | 6000 | 0.8688 | 50.2820 |
| 0.0001 | 8.48 | 6100 | 0.8689 | 50.3018 |
| 0.0002 | 8.62 | 6200 | 0.8689 | 50.2853 |
| 0.0001 | 8.76 | 6300 | 0.8689 | 50.2886 |
| 0.0002 | 8.9 | 6400 | 0.8689 | 50.2753 |
| 0.0001 | 9.04 | 6500 | 0.8689 | 50.4606 |
| 0.0001 | 9.18 | 6600 | 0.8690 | 50.4721 |
| 0.0001 | 9.32 | 6700 | 0.8690 | 50.4754 |
| 0.0002 | 9.46 | 6800 | 0.8690 | 50.4738 |
| 0.0002 | 9.6 | 6900 | 0.8691 | 50.4655 |
| 0.0001 | 9.74 | 7000 | 0.8692 | 50.4672 |
| 0.0002 | 9.87 | 7100 | 0.8692 | 50.4705 |
| 0.0002 | 10.01 | 7200 | 0.8692 | 50.4688 |
| 0.0001 | 10.15 | 7300 | 0.8692 | 50.4771 |
| 0.0001 | 10.29 | 7400 | 0.8692 | 50.4771 |
| 0.0001 | 10.43 | 7500 | 0.8692 | 50.4771 |
| 0.0001 | 10.57 | 7600 | 0.8692 | 50.4837 |
| 0.0001 | 10.71 | 7700 | 0.8693 | 50.4820 |
| 0.0001 | 10.85 | 7800 | 0.8693 | 50.4887 |
| 0.0001 | 10.99 | 7900 | 0.8693 | 50.4820 |
| 0.0001 | 11.13 | 8000 | 0.8694 | 50.4887 |
| 0.0001 | 11.27 | 8100 | 0.8694 | 50.4804 |
| 0.0002 | 11.4 | 8200 | 0.8694 | 50.4754 |
| 0.0001 | 11.54 | 8300 | 0.8694 | 50.4721 |
| 0.0002 | 11.68 | 8400 | 0.8694 | 50.4771 |
| 0.0001 | 11.82 | 8500 | 0.8694 | 50.4738 |
| 0.0002 | 11.96 | 8600 | 0.8695 | 50.4771 |
| 0.0002 | 12.1 | 8700 | 0.8695 | 50.4787 |
| 0.0001 | 12.24 | 8800 | 0.8695 | 50.4787 |
| 0.0001 | 12.38 | 8900 | 0.8695 | 50.4754 |
| 0.0001 | 12.52 | 9000 | 0.8695 | 50.4787 |
| 0.0001 | 12.66 | 9100 | 0.8695 | 50.4787 |
| 0.0001 | 12.8 | 9200 | 0.8695 | 50.4804 |
| 0.0001 | 12.93 | 9300 | 0.8695 | 50.4804 |
| 0.0001 | 13.07 | 9400 | 0.8695 | 50.4804 |
| 0.0002 | 13.21 | 9500 | 0.8695 | 50.4804 |
| 0.0002 | 13.35 | 9600 | 0.8695 | 50.4804 |
| 0.0001 | 13.49 | 9700 | 0.8695 | 50.4804 |
| 0.0002 | 13.63 | 9800 | 0.8695 | 50.4804 |
| 0.0002 | 13.77 | 9900 | 0.8695 | 50.4804 |
| 0.0001 | 13.91 | 10000 | 0.8695 | 50.4804 |

Framework versions

  • Transformers 4.35.2
  • PyTorch 2.0.1+cu117
  • Datasets 2.15.0
  • Tokenizers 0.15.0