
whisper-kor_noising_5

This model is a fine-tuned version of openai/whisper-small on the whisper-kor_noising_5 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0488
  • WER: 3.6157
  • CER: 1.8298
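
A minimal usage sketch with the Transformers ASR pipeline is shown below. The Hub repository id and the audio file name are placeholders; replace them with the actual checkpoint path and your own recording.

```python
# Minimal inference sketch using the transformers ASR pipeline.
# "your-namespace/whisper-kor_noising_5" and "sample.wav" are placeholders.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-namespace/whisper-kor_noising_5",
)

# Whisper is multilingual; pinning language/task avoids language auto-detection errors.
result = asr(
    "sample.wav",
    generate_kwargs={"language": "korean", "task": "transcribe"},
)
print(result["text"])
```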

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of how they map onto `Seq2SeqTrainingArguments` follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 4000
  • mixed_precision_training: Native AMP
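
The optimizer and scheduler values above match the Trainer defaults for Adam and a linear schedule. Below is a minimal, hedged sketch of how they translate into `Seq2SeqTrainingArguments`; `output_dir`, `evaluation_strategy`, `eval_steps`, and `predict_with_generate` are assumptions inferred from the 100-step evaluation interval in the results table, not values stated in the original card.

```python
# Sketch only: maps the listed hyperparameters onto Seq2SeqTrainingArguments.
# output_dir, evaluation_strategy, eval_steps and predict_with_generate are
# assumptions, not values taken from the card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-kor_noising_5",  # placeholder output directory
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=4000,
    fp16=True,                    # "Native AMP" mixed precision
    evaluation_strategy="steps",  # assumption: evaluate every eval_steps
    eval_steps=100,               # assumption: matches the 100-step interval in the table
    predict_with_generate=True,   # generate text so WER/CER can be computed
    # adam_beta1=0.9, adam_beta2=0.999 and adam_epsilon=1e-8 are the defaults,
    # matching the optimizer settings listed above.
)
```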

Training results

| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|
| 0.4197 | 0.19 | 100  | 0.4320 | 27.3865 | 13.2459 |
| 0.4509 | 0.37 | 200  | 0.3964 | 26.1729 | 12.8052 |
| 0.3414 | 0.56 | 300  | 0.3536 | 23.3204 | 10.5281 |
| 0.358  | 0.75 | 400  | 0.3171 | 22.9576 | 11.1100 |
| 0.2625 | 0.93 | 500  | 0.2747 | 20.5930 | 9.9462  |
| 0.1761 | 1.12 | 600  | 0.2419 | 17.1400 | 7.7968  |
| 0.1427 | 1.31 | 700  | 0.2118 | 14.9756 | 6.9055  |
| 0.1444 | 1.5  | 800  | 0.1740 | 12.4984 | 5.7619  |
| 0.126  | 1.68 | 900  | 0.1509 | 12.2732 | 6.4951  |
| 0.1047 | 1.87 | 1000 | 0.1282 | 10.3340 | 5.4255  |
| 0.0328 | 2.06 | 1100 | 0.1073 | 7.7318  | 3.8547  |
| 0.0321 | 2.24 | 1200 | 0.1017 | 7.3564  | 3.7437  |
| 0.0314 | 2.43 | 1300 | 0.0934 | 6.6058  | 3.2728  |
| 0.0271 | 2.62 | 1400 | 0.0824 | 6.1804  | 3.2089  |
| 0.0272 | 2.8  | 1500 | 0.0728 | 5.6174  | 2.8725  |
| 0.0213 | 2.99 | 1600 | 0.0690 | 5.9427  | 3.3434  |
| 0.0106 | 3.18 | 1700 | 0.0649 | 4.8668  | 2.5732  |
| 0.0087 | 3.36 | 1800 | 0.0619 | 4.8417  | 2.4992  |
| 0.0087 | 3.55 | 1900 | 0.0599 | 4.4414  | 2.2704  |
| 0.0086 | 3.74 | 2000 | 0.0563 | 4.5165  | 2.3646  |
| 0.0059 | 3.93 | 2100 | 0.0542 | 4.4039  | 2.2570  |
| 0.0033 | 4.11 | 2200 | 0.0540 | 4.0786  | 2.0350  |
| 0.0029 | 4.3  | 2300 | 0.0536 | 4.1787  | 2.1090  |
| 0.0028 | 4.49 | 2400 | 0.0514 | 3.9660  | 1.9946  |
| 0.003  | 4.67 | 2500 | 0.0511 | 3.9034  | 1.9475  |
| 0.003  | 4.86 | 2600 | 0.0505 | 3.6032  | 1.8937  |
| 0.0021 | 5.05 | 2700 | 0.0493 | 3.6157  | 1.8903  |
| 0.0018 | 5.23 | 2800 | 0.0495 | 3.5781  | 1.8130  |
| 0.0025 | 5.42 | 2900 | 0.0496 | 3.8033  | 1.9307  |
| 0.0018 | 5.61 | 3000 | 0.0495 | 3.6407  | 1.8500  |
| 0.0017 | 5.79 | 3100 | 0.0495 | 3.7408  | 1.8903  |
| 0.0017 | 5.98 | 3200 | 0.0491 | 3.6782  | 1.8903  |
| 0.0015 | 6.17 | 3300 | 0.0493 | 3.6532  | 1.8702  |
| 0.0015 | 6.36 | 3400 | 0.0490 | 3.6532  | 1.8702  |
| 0.0017 | 6.54 | 3500 | 0.0490 | 3.6282  | 1.8567  |
| 0.0014 | 6.73 | 3600 | 0.0490 | 3.7408  | 1.9273  |
| 0.0014 | 6.92 | 3700 | 0.0488 | 3.6282  | 1.8332  |
| 0.0014 | 7.1  | 3800 | 0.0488 | 3.6282  | 1.8365  |
| 0.0014 | 7.29 | 3900 | 0.0488 | 3.6157  | 1.8332  |
| 0.0013 | 7.48 | 4000 | 0.0488 | 3.6157  | 1.8298  |
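
The WER and CER columns appear to be reported as percentages. A hedged sketch of how such scores are typically computed with the `evaluate` library is shown below; the prediction and reference strings are illustrative only.

```python
# Illustrative only: computing WER/CER with the evaluate library.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["안녕하세요 반갑습니다"]    # model transcriptions (example data)
references = ["안녕하세요, 반갑습니다"]   # ground-truth transcripts (example data)

# Both metrics return error rates as fractions; multiply by 100 to report
# percentages on the same scale as the table above.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
cer = 100 * cer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}, CER: {cer:.4f}")
```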

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.1+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0