---
language:
- fa
license: apache-2.0
base_model: makhataei/Whisper-Small-Common-Voice
tags:
- fa-asr
- generated_from_trainer
datasets:
- mozilla-foundation/common_voice_15_0
metrics:
- wer
model-index:
- name: Whisper Small Persian
  results: []
---

# Whisper Small Persian

This model is a fine-tuned version of [makhataei/Whisper-Small-Common-Voice](https://huggingface.co/makhataei/Whisper-Small-Common-Voice) on the Common Voice 15.0 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9637
- Wer: 53.0122

## Model description

A Whisper Small checkpoint fine-tuned for Persian (fa) automatic speech recognition, continuing from [makhataei/Whisper-Small-Common-Voice](https://huggingface.co/makhataei/Whisper-Small-Common-Voice).

## Intended uses & limitations

Intended for transcribing Persian speech. Note that the word error rate on the evaluation set is roughly 53%, so transcripts should be reviewed before downstream use.

## Training and evaluation data

Fine-tuned and evaluated on the Persian portion of Common Voice 15.0 ([mozilla-foundation/common_voice_15_0](https://huggingface.co/datasets/mozilla-foundation/common_voice_15_0)).

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 10
- eval_batch_size: 10
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 40
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 10000
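These settings map onto 🤗 Transformers' `Seq2SeqTrainingArguments` roughly as sketched below. This is a reconstruction, not the original training script: the output directory and the evaluation cadence (every 100 steps, matching the results table that follows) are assumptions.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the configuration implied by the hyperparameters above.
# output_dir and the eval cadence are assumptions, not from the card.
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-persian",  # assumed
    learning_rate=1e-6,
    per_device_train_batch_size=10,
    per_device_eval_batch_size=10,
    gradient_accumulation_steps=4,   # 10 x 4 = total train batch size of 40
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=10000,
    seed=42,
    evaluation_strategy="steps",     # assumed; the table below reports eval every 100 steps
    eval_steps=100,
    predict_with_generate=True,      # generate transcripts during eval so WER can be computed
)
```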
### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer     |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|
| 0.0003        | 0.14  | 100   | 0.8127          | 50.1960 |
| 0.0003        | 0.28  | 200   | 0.8106          | 50.8591 |
| 0.0003        | 0.42  | 300   | 0.8138          | 50.2935 |
| 0.0005        | 0.56  | 400   | 0.8216          | 51.2345 |
| 0.0003        | 0.7   | 500   | 0.8295          | 50.0918 |
| 0.0003        | 0.83  | 600   | 0.8331          | 53.4124 |
| 0.0003        | 0.97  | 700   | 0.8269          | 54.9288 |
| 0.0003        | 1.11  | 800   | 0.8295          | 51.0178 |
| 0.0005        | 1.25  | 900   | 0.8341          | 50.1149 |
| 0.0007        | 1.39  | 1000  | 0.8423          | 51.9819 |
| 0.0005        | 1.53  | 1100  | 0.8324          | 52.3706 |
| 0.0003        | 1.67  | 1200  | 0.8411          | 51.8662 |
| 0.0002        | 1.81  | 1300  | 0.8545          | 52.8402 |
| 0.0004        | 1.95  | 1400  | 0.8619          | 54.0242 |
| 0.0002        | 2.09  | 1500  | 0.8556          | 54.8296 |
| 0.0004        | 2.23  | 1600  | 0.8291          | 53.9581 |
| 0.0003        | 2.36  | 1700  | 0.8633          | 51.2047 |
| 0.0003        | 2.5   | 1800  | 0.8557          | 53.7249 |
| 0.0005        | 2.64  | 1900  | 0.8551          | 51.7190 |
| 0.0003        | 2.78  | 2000  | 0.8418          | 52.9030 |
| 0.0002        | 2.92  | 2100  | 0.8522          | 50.9467 |
| 0.0002        | 3.06  | 2200  | 0.8798          | 51.2047 |
| 0.0003        | 3.2   | 2300  | 0.8545          | 51.4395 |
| 0.0002        | 3.34  | 2400  | 0.8633          | 51.0212 |
| 0.0007        | 3.48  | 2500  | 0.8644          | 53.8440 |
| 0.0002        | 3.62  | 2600  | 0.8598          | 52.5029 |
| 0.0002        | 3.76  | 2700  | 0.8578          | 52.0679 |
| 0.0002        | 3.89  | 2800  | 0.8672          | 52.1027 |
| 0.0001        | 4.03  | 2900  | 0.8655          | 52.3706 |
| 0.0001        | 4.17  | 3000  | 0.8741          | 52.2350 |
| 0.0001        | 4.31  | 3100  | 0.8716          | 53.0056 |
| 0.0001        | 4.45  | 3200  | 0.8758          | 51.0327 |
| 0.0005        | 4.59  | 3300  | 0.8636          | 51.8662 |
| 0.0001        | 4.73  | 3400  | 0.8725          | 51.0807 |
| 0.0001        | 4.87  | 3500  | 0.8781          | 51.1700 |
| 0.0001        | 5.01  | 3600  | 0.8806          | 50.7450 |
| 0.0001        | 5.15  | 3700  | 0.8835          | 50.6210 |
| 0.0001        | 5.29  | 3800  | 0.8852          | 51.1121 |
| 0.0001        | 5.42  | 3900  | 0.8874          | 51.1700 |
| 0.0001        | 5.56  | 4000  | 0.8894          | 51.3998 |
| 0.0002        | 5.7   | 4100  | 0.8899          | 51.4246 |
| 0.0001        | 5.84  | 4200  | 0.8927          | 51.6992 |
| 0.0001        | 5.98  | 4300  | 0.8933          | 51.8993 |
| 0.0001        | 6.12  | 4400  | 0.8966          | 51.7835 |
| 0.0001        | 6.26  | 4500  | 0.8980          | 51.8381 |
| 0.0001        | 6.4   | 4600  | 0.8973          | 51.7107 |
| 0.0001        | 6.54  | 4700  | 0.9008          | 51.5553 |
| 0.0001        | 6.68  | 4800  | 0.9029          | 51.1220 |
| 0.0001        | 6.82  | 4900  | 0.9030          | 51.3221 |
| 0.0001        | 6.95  | 5000  | 0.9039          | 52.1605 |
| 0.0001        | 7.09  | 5100  | 0.9084          | 52.1440 |
| 0.0001        | 7.23  | 5200  | 0.9106          | 51.9505 |
| 0.0001        | 7.37  | 5300  | 0.9117          | 52.6219 |
| 0.0001        | 7.51  | 5400  | 0.9133          | 52.4830 |
| 0.0002        | 7.65  | 5500  | 0.9187          | 51.3320 |
| 0.0001        | 7.79  | 5600  | 0.9184          | 52.3954 |
| 0.0001        | 7.93  | 5700  | 0.9185          | 52.5392 |
| 0.0001        | 8.07  | 5800  | 0.9209          | 53.1263 |
| 0.0001        | 8.21  | 5900  | 0.9232          | 53.0965 |
| 0.0001        | 8.34  | 6000  | 0.9242          | 53.6737 |
| 0.0001        | 8.48  | 6100  | 0.9220          | 52.6996 |
| 0.0001        | 8.62  | 6200  | 0.9228          | 52.6500 |
| 0.0001        | 8.76  | 6300  | 0.9255          | 52.3838 |
| 0.0001        | 8.9   | 6400  | 0.9269          | 53.0138 |
| 0.0001        | 9.04  | 6500  | 0.9298          | 52.9345 |
| 0.0001        | 9.18  | 6600  | 0.9317          | 53.2222 |
| 0.0001        | 9.32  | 6700  | 0.9337          | 53.1974 |
| 0.0001        | 9.46  | 6800  | 0.9354          | 52.9130 |
| 0.0001        | 9.6   | 6900  | 0.9379          | 52.8865 |
| 0.0001        | 9.74  | 7000  | 0.9407          | 52.9560 |
| 0.0001        | 9.87  | 7100  | 0.9399          | 52.5045 |
| 0.0001        | 10.01 | 7200  | 0.9394          | 52.9113 |
| 0.0001        | 10.15 | 7300  | 0.9423          | 52.9064 |
| 0.0001        | 10.29 | 7400  | 0.9422          | 52.9477 |
| 0.0001        | 10.43 | 7500  | 0.9445          | 53.2305 |
| 0.0001        | 10.57 | 7600  | 0.9452          | 53.1842 |
| 0.0001        | 10.71 | 7700  | 0.9478          | 53.3562 |
| 0.0001        | 10.85 | 7800  | 0.9451          | 52.9113 |
| 0.0001        | 10.99 | 7900  | 0.9476          | 52.6616 |
| 0.0           | 11.13 | 8000  | 0.9502          | 52.3606 |
| 0.0           | 11.27 | 8100  | 0.9518          | 52.7294 |
| 0.0           | 11.4  | 8200  | 0.9523          | 52.8799 |
| 0.0           | 11.54 | 8300  | 0.9540          | 52.8419 |
| 0.0001        | 11.68 | 8400  | 0.9542          | 53.0486 |
| 0.0           | 11.82 | 8500  | 0.9569          | 53.0453 |
| 0.0           | 11.96 | 8600  | 0.9576          | 52.9576 |
| 0.0           | 12.1  | 8700  | 0.9589          | 53.2371 |
| 0.0           | 12.24 | 8800  | 0.9599          | 53.2057 |
| 0.0           | 12.38 | 8900  | 0.9605          | 53.3165 |
| 0.0           | 12.52 | 9000  | 0.9603          | 52.9576 |
| 0.0           | 12.66 | 9100  | 0.9608          | 52.5789 |
| 0.0           | 12.8  | 9200  | 0.9609          | 53.2288 |
| 0.0           | 12.93 | 9300  | 0.9611          | 53.1759 |
| 0.0           | 13.07 | 9400  | 0.9618          | 53.1296 |
| 0.0001        | 13.21 | 9500  | 0.9632          | 53.0618 |
| 0.0           | 13.35 | 9600  | 0.9632          | 52.9593 |
| 0.0           | 13.49 | 9700  | 0.9633          | 52.9923 |
| 0.0           | 13.63 | 9800  | 0.9635          | 53.1379 |
| 0.0           | 13.77 | 9900  | 0.9637          | 53.0122 |
| 0.0           | 13.91 | 10000 | 0.9637          | 53.0122 |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.0.1+cu117
- Datasets 2.15.0
- Tokenizers 0.15.0
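## How to use

A minimal transcription sketch with the 🤗 Transformers pipeline. The repo id `makhataei/Whisper-Small-Persian` is an assumption based on this card's title, and `"sample.wav"` is a placeholder; substitute the actual checkpoint path and a real audio file.

```python
from transformers import pipeline

# Minimal inference sketch; the model id below is assumed from the card title.
asr = pipeline(
    "automatic-speech-recognition",
    model="makhataei/Whisper-Small-Persian",  # assumed repo id
    chunk_length_s=30,  # Whisper processes audio in 30-second windows
)

# Force Persian transcription so the model skips language detection.
result = asr(
    "sample.wav",  # placeholder path
    generate_kwargs={"language": "fa", "task": "transcribe"},
)
print(result["text"])
```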
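The Wer values in this card are percentages. To score transcripts against references the same way, one option is the `evaluate` library; the transcript lists below are placeholders:

```python
import evaluate

# WER as reported in this card is a percentage (e.g. 53.0122).
wer_metric = evaluate.load("wer")

predictions = ["..."]  # model transcripts (placeholders)
references = ["..."]   # ground-truth transcripts (placeholders)

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```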