---
language:
  - fa
license: apache-2.0
base_model: makhataei/Whisper-Small-Common-Voice
tags:
  - fa-asr
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: Whisper Small Persian
    results: []
---

# Whisper Small Persian

This model is a fine-tuned version of [makhataei/Whisper-Small-Common-Voice](https://huggingface.co/makhataei/Whisper-Small-Common-Voice) on the Ctejarat dataset. It achieves the following results on the evaluation set:

- Loss: 0.4755
- WER: 26.8240
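The snippet below is a minimal transcription sketch using the Transformers ASR pipeline; the repository id and the audio file name are assumptions for illustration.

```python
# Minimal inference sketch. Assumptions: the repo id and audio file name below
# are illustrative; any 16 kHz mono Persian recording should work.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="makhataei/Whisper-Small-Persian",  # assumed repository id
    generate_kwargs={"language": "persian", "task": "transcribe"},
)

print(asr("sample_fa.wav")["text"])  # hypothetical audio file
```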

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of how they map onto training arguments follows the list):

- learning_rate: 1e-07
- train_batch_size: 11
- eval_batch_size: 11
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 88
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 10000
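For reference, here is a sketch of how these settings map onto `Seq2SeqTrainingArguments`. `output_dir`, the evaluation cadence, and `predict_with_generate` are assumptions (the results table below suggests evaluation every 100 steps); the Adam betas and epsilon listed above are the `Trainer` defaults.

```python
# Sketch of Seq2SeqTrainingArguments matching the hyperparameters above.
# output_dir, evaluation_strategy/eval_steps, and predict_with_generate are
# assumptions; the Adam betas/epsilon above are the Trainer defaults.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-persian",  # hypothetical
    learning_rate=1e-7,
    per_device_train_batch_size=11,
    per_device_eval_batch_size=11,
    gradient_accumulation_steps=8,         # 11 * 8 = 88 effective batch size
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=10000,
    evaluation_strategy="steps",           # assumed from the 100-step eval cadence
    eval_steps=100,
    predict_with_generate=True,            # assumed; needed to score WER on text
)
```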

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | WER     |
|:-------------:|:-------:|:-----:|:---------------:|:-------:|
| 0.9585        | 47.06   | 100   | 0.7643          | 39.2704 |
| 0.8272        | 94.12   | 200   | 0.7201          | 39.0558 |
| 0.6499        | 141.18  | 300   | 0.6593          | 39.6996 |
| 0.4717        | 188.24  | 400   | 0.5965          | 36.2661 |
| 0.2999        | 235.29  | 500   | 0.5380          | 34.3348 |
| 0.1746        | 282.35  | 600   | 0.4967          | 34.9785 |
| 0.0995        | 329.41  | 700   | 0.4795          | 35.8369 |
| 0.0509        | 376.47  | 800   | 0.4714          | 33.0472 |
| 0.0248        | 423.53  | 900   | 0.4669          | 30.9013 |
| 0.0144        | 470.59  | 1000  | 0.4643          | 30.6867 |
| 0.0093        | 517.65  | 1100  | 0.4625          | 29.8283 |
| 0.0066        | 564.71  | 1200  | 0.4618          | 29.3991 |
| 0.0051        | 611.76  | 1300  | 0.4616          | 29.6137 |
| 0.004         | 658.82  | 1400  | 0.4615          | 29.3991 |
| 0.0034        | 705.88  | 1500  | 0.4616          | 28.7554 |
| 0.0026        | 752.94  | 1600  | 0.4618          | 29.1845 |
| 0.0022        | 800.0   | 1700  | 0.4620          | 28.7554 |
| 0.0019        | 847.06  | 1800  | 0.4622          | 28.7554 |
| 0.0017        | 894.12  | 1900  | 0.4623          | 28.7554 |
| 0.0015        | 941.18  | 2000  | 0.4626          | 28.7554 |
| 0.0013        | 988.24  | 2100  | 0.4628          | 28.7554 |
| 0.0012        | 1035.29 | 2200  | 0.4630          | 28.3262 |
| 0.0011        | 1082.35 | 2300  | 0.4633          | 28.3262 |
| 0.001         | 1129.41 | 2400  | 0.4634          | 28.3262 |
| 0.0009        | 1176.47 | 2500  | 0.4636          | 28.3262 |
| 0.0008        | 1223.53 | 2600  | 0.4638          | 28.3262 |
| 0.0008        | 1270.59 | 2700  | 0.4640          | 27.8970 |
| 0.0007        | 1317.65 | 2800  | 0.4641          | 28.3262 |
| 0.0007        | 1364.71 | 2900  | 0.4644          | 28.3262 |
| 0.0006        | 1411.76 | 3000  | 0.4645          | 28.1116 |
| 0.0006        | 1458.82 | 3100  | 0.4647          | 27.8970 |
| 0.0005        | 1505.88 | 3200  | 0.4648          | 27.8970 |
| 0.0005        | 1552.94 | 3300  | 0.4650          | 28.1116 |
| 0.0005        | 1600.0  | 3400  | 0.4652          | 28.1116 |
| 0.0005        | 1647.06 | 3500  | 0.4654          | 27.8970 |
| 0.0004        | 1694.12 | 3600  | 0.4656          | 27.8970 |
| 0.0004        | 1741.18 | 3700  | 0.4657          | 27.8970 |
| 0.0004        | 1788.24 | 3800  | 0.4659          | 27.8970 |
| 0.0004        | 1835.29 | 3900  | 0.4661          | 27.4678 |
| 0.0004        | 1882.35 | 4000  | 0.4662          | 27.4678 |
| 0.0003        | 1929.41 | 4100  | 0.4664          | 27.4678 |
| 0.0003        | 1976.47 | 4200  | 0.4666          | 27.4678 |
| 0.0003        | 2023.53 | 4300  | 0.4668          | 27.4678 |
| 0.0003        | 2070.59 | 4400  | 0.4670          | 27.4678 |
| 0.0003        | 2117.65 | 4500  | 0.4672          | 27.4678 |
| 0.0003        | 2164.71 | 4600  | 0.4674          | 27.4678 |
| 0.0002        | 2211.76 | 4700  | 0.4676          | 27.4678 |
| 0.0002        | 2258.82 | 4800  | 0.4678          | 27.2532 |
| 0.0002        | 2305.88 | 4900  | 0.4680          | 27.2532 |
| 0.0002        | 2352.94 | 5000  | 0.4682          | 27.0386 |
| 0.0002        | 2400.0  | 5100  | 0.4684          | 27.0386 |
| 0.0002        | 2447.06 | 5200  | 0.4685          | 27.0386 |
| 0.0002        | 2494.12 | 5300  | 0.4688          | 27.0386 |
| 0.0002        | 2541.18 | 5400  | 0.4689          | 27.0386 |
| 0.0002        | 2588.24 | 5500  | 0.4691          | 27.0386 |
| 0.0002        | 2635.29 | 5600  | 0.4693          | 27.0386 |
| 0.0002        | 2682.35 | 5700  | 0.4695          | 27.0386 |
| 0.0002        | 2729.41 | 5800  | 0.4697          | 27.0386 |
| 0.0002        | 2776.47 | 5900  | 0.4699          | 27.0386 |
| 0.0002        | 2823.53 | 6000  | 0.4700          | 27.0386 |
| 0.0001        | 2870.59 | 6100  | 0.4702          | 27.0386 |
| 0.0001        | 2917.65 | 6200  | 0.4704          | 27.0386 |
| 0.0001        | 2964.71 | 6300  | 0.4706          | 27.0386 |
| 0.0001        | 3011.76 | 6400  | 0.4708          | 27.0386 |
| 0.0001        | 3058.82 | 6500  | 0.4710          | 27.0386 |
| 0.0001        | 3105.88 | 6600  | 0.4712          | 27.2532 |
| 0.0001        | 3152.94 | 6700  | 0.4714          | 27.2532 |
| 0.0001        | 3200.0  | 6800  | 0.4716          | 27.0386 |
| 0.0001        | 3247.06 | 6900  | 0.4718          | 27.0386 |
| 0.0001        | 3294.12 | 7000  | 0.4720          | 27.0386 |
| 0.0001        | 3341.18 | 7100  | 0.4721          | 27.0386 |
| 0.0001        | 3388.24 | 7200  | 0.4723          | 26.8240 |
| 0.0001        | 3435.29 | 7300  | 0.4725          | 26.8240 |
| 0.0001        | 3482.35 | 7400  | 0.4727          | 26.6094 |
| 0.0001        | 3529.41 | 7500  | 0.4728          | 26.6094 |
| 0.0001        | 3576.47 | 7600  | 0.4730          | 26.6094 |
| 0.0001        | 3623.53 | 7700  | 0.4732          | 26.6094 |
| 0.0001        | 3670.59 | 7800  | 0.4733          | 26.6094 |
| 0.0001        | 3717.65 | 7900  | 0.4735          | 26.6094 |
| 0.0001        | 3764.71 | 8000  | 0.4736          | 26.6094 |
| 0.0001        | 3811.76 | 8100  | 0.4738          | 26.6094 |
| 0.0001        | 3858.82 | 8200  | 0.4739          | 26.6094 |
| 0.0001        | 3905.88 | 8300  | 0.4741          | 26.6094 |
| 0.0001        | 3952.94 | 8400  | 0.4742          | 26.6094 |
| 0.0001        | 4000.0  | 8500  | 0.4744          | 26.6094 |
| 0.0001        | 4047.06 | 8600  | 0.4745          | 26.6094 |
| 0.0001        | 4094.12 | 8700  | 0.4746          | 26.6094 |
| 0.0001        | 4141.18 | 8800  | 0.4748          | 26.6094 |
| 0.0001        | 4188.24 | 8900  | 0.4749          | 26.6094 |
| 0.0001        | 4235.29 | 9000  | 0.4750          | 26.6094 |
| 0.0001        | 4282.35 | 9100  | 0.4751          | 26.6094 |
| 0.0001        | 4329.41 | 9200  | 0.4751          | 27.0386 |
| 0.0001        | 4376.47 | 9300  | 0.4752          | 27.0386 |
| 0.0001        | 4423.53 | 9400  | 0.4753          | 27.0386 |
| 0.0001        | 4470.59 | 9500  | 0.4754          | 27.0386 |
| 0.0001        | 4517.65 | 9600  | 0.4754          | 27.0386 |
| 0.0001        | 4564.71 | 9700  | 0.4755          | 26.8240 |
| 0.0001        | 4611.76 | 9800  | 0.4755          | 26.8240 |
| 0.0001        | 4658.82 | 9900  | 0.4755          | 26.8240 |
| 0.0001        | 4705.88 | 10000 | 0.4755          | 26.8240 |
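The WER column above is reported as a percentage. Below is a minimal sketch of how such scores are commonly computed in Whisper fine-tuning runs with the Hugging Face `evaluate` library; the prediction and reference strings are illustrative placeholders, not real data.

```python
# Minimal WER computation sketch with the `evaluate` library; the prediction
# and reference strings are illustrative placeholders.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["این یک نمونه است"]       # hypothetical model transcriptions
references = ["این یک نمونه ساده است"]   # hypothetical ground-truth transcripts

# evaluate returns a fraction; the table above reports WER as a percentage.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```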

### Framework versions

- Transformers 4.35.2
- PyTorch 2.0.1+cu117
- Datasets 2.15.0
- Tokenizers 0.15.0