whisper-small-emergency

This model is a fine-tuned version of openai/whisper-small on the whisper-small-kor dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2212
  • WER: 21.7895%
  • CER: 10.3463%
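
WER and CER are reported on a 0-100 percentage scale. As a reference, here is a minimal sketch of how these metrics are typically computed with the Hugging Face `evaluate` library; the example strings are hypothetical placeholders, not samples from the evaluation set.

```python
# Minimal WER/CER computation sketch with the `evaluate` library
# (requires: pip install evaluate jiwer). The strings below are
# hypothetical placeholders, not samples from this model's eval set.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["응급 환자가 발생했습니다"]    # model transcriptions
references = ["응급 환자가 발생했었습니다"]   # ground-truth transcripts

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
cer = 100 * cer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}%, CER: {cer:.4f}%")
```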

Model description

More information needed

Intended uses & limitations

More information needed
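
Pending details from the authors, a hedged inference sketch using the transformers ASR pipeline is shown below; the repo id is a placeholder for wherever this checkpoint is hosted on the Hub.

```python
# Minimal inference sketch. "your-namespace/whisper-small-emergency" is a
# hypothetical repo id; substitute the actual Hub path of this checkpoint.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-namespace/whisper-small-emergency",  # hypothetical repo id
    generate_kwargs={"language": "korean", "task": "transcribe"},
)

result = asr("sample.wav")  # audio file; the pipeline resamples to 16 kHz
print(result["text"])
```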

Training and evaluation data

More information needed
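
The fine-tuning corpus is referred to above only as whisper-small-kor. Assuming it is hosted on the Hub, loading it with the `datasets` library might look like the following sketch; the repo id and the `audio` column name are assumptions.

```python
# Hypothetical loading sketch for the fine-tuning corpus; the repo id and
# column name are assumptions, not confirmed by this card.
from datasets import Audio, load_dataset

ds = load_dataset("your-namespace/whisper-small-kor")      # hypothetical repo id
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))  # Whisper expects 16 kHz
print(ds)
```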

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • training_steps: 10000
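
A sketch reconstructing this configuration as `Seq2SeqTrainingArguments`: only the values listed above come from this card, while `output_dir`, the evaluation cadence, and generation-based evaluation are assumptions (the results table suggests an eval every 100 steps).

```python
# Configuration sketch from the hyperparameter list above. Lines marked
# "assumed" are illustrative guesses, not documented settings.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-emergency",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    max_steps=10000,
    evaluation_strategy="steps",   # assumed; the table logs an eval every 100 steps
    eval_steps=100,                # assumed from the results table
    predict_with_generate=True,    # assumed; needed for WER/CER during eval
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default
# optimizer configuration, so no explicit override is required.
```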

Training results

| Training Loss | Epoch | Step | Validation Loss | WER (%) | CER (%) |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|
| 2.2586 | 0.02 | 100 | 2.0061 | 38.3870 | 19.3958 |
| 0.9821 | 0.05 | 200 | 0.8927 | 37.3486 | 16.9619 |
| 0.7244 | 0.07 | 300 | 0.6577 | 32.9353 | 15.2939 |
| 0.505 | 0.1 | 400 | 0.4395 | 33.8006 | 16.8397 |
| 0.4397 | 0.12 | 500 | 0.3908 | 29.2489 | 13.3393 |
| 0.3602 | 0.15 | 600 | 0.3710 | 27.7259 | 12.8741 |
| 0.4321 | 0.17 | 700 | 0.3558 | 27.7778 | 12.8224 |
| 0.3979 | 0.19 | 800 | 0.3504 | 27.5528 | 12.6063 |
| 0.2614 | 0.22 | 900 | 0.3434 | 28.2451 | 13.5601 |
| 0.3725 | 0.24 | 1000 | 0.3362 | 26.8086 | 12.8177 |
| 0.4098 | 0.27 | 1100 | 0.3329 | 26.8086 | 13.1466 |
| 0.3083 | 0.29 | 1200 | 0.3240 | 25.6663 | 12.0566 |
| 0.324 | 0.32 | 1300 | 0.3169 | 24.7490 | 11.3659 |
| 0.3437 | 0.34 | 1400 | 0.3090 | 24.2471 | 10.9383 |
| 0.3719 | 0.36 | 1500 | 0.3064 | 24.4548 | 11.2155 |
| 0.3563 | 0.39 | 1600 | 0.3013 | 24.0222 | 11.0887 |
| 0.3493 | 0.41 | 1700 | 0.3036 | 24.1606 | 11.1779 |
| 0.3132 | 0.44 | 1800 | 0.3011 | 24.0741 | 11.1685 |
| 0.3024 | 0.46 | 1900 | 0.2920 | 24.4202 | 11.2014 |
| 0.2982 | 0.49 | 2000 | 0.2873 | 22.9664 | 10.4262 |
| 0.3309 | 0.51 | 2100 | 0.2880 | 23.3991 | 10.8208 |
| 0.3209 | 0.53 | 2200 | 0.2811 | 21.9280 | 10.2288 |
| 0.2778 | 0.56 | 2300 | 0.2883 | 22.6895 | 10.5060 |
| 0.3391 | 0.58 | 2400 | 0.2796 | 21.9280 | 10.1818 |
| 0.3261 | 0.61 | 2500 | 0.2757 | 22.3607 | 10.1865 |
| 0.2711 | 0.63 | 2600 | 0.2746 | 22.9491 | 10.4356 |
| 0.2723 | 0.66 | 2700 | 0.2708 | 22.3088 | 10.5624 |
| 0.3152 | 0.68 | 2800 | 0.2681 | 21.8934 | 10.0127 |
| 0.248 | 0.7 | 2900 | 0.2679 | 22.2568 | 10.0644 |
| 0.2354 | 0.73 | 3000 | 0.2665 | 21.7203 | 9.8576 |
| 0.2828 | 0.75 | 3100 | 0.2628 | 21.4261 | 9.9422 |
| 0.2759 | 0.78 | 3200 | 0.2652 | 21.2703 | 9.8623 |
| 0.2904 | 0.8 | 3300 | 0.2606 | 21.2876 | 9.8388 |
| 0.2844 | 0.83 | 3400 | 0.2600 | 21.8761 | 10.0362 |
| 0.2815 | 0.85 | 3500 | 0.2554 | 20.9069 | 9.5992 |
| 0.2713 | 0.87 | 3600 | 0.2573 | 20.8550 | 9.5334 |
| 0.2748 | 0.9 | 3700 | 0.2566 | 21.5126 | 9.8811 |
| 0.2447 | 0.92 | 3800 | 0.2526 | 20.5088 | 9.3455 |
| 0.3255 | 0.95 | 3900 | 0.2517 | 20.3358 | 11.3048 |
| 0.2786 | 0.97 | 4000 | 0.2489 | 20.8030 | 9.5898 |
| 0.245 | 1.0 | 4100 | 0.2523 | 21.4607 | 9.7167 |
| 0.1655 | 1.02 | 4200 | 0.2470 | 20.4396 | 9.5287 |
| 0.1898 | 1.04 | 4300 | 0.2422 | 19.9550 | 9.0871 |
| 0.1394 | 1.07 | 4400 | 0.2429 | 20.0242 | 9.2750 |
| 0.1592 | 1.09 | 4500 | 0.2433 | 19.9896 | 9.0824 |
| 0.1542 | 1.12 | 4600 | 0.2428 | 20.2492 | 9.3126 |
| 0.1296 | 1.14 | 4700 | 0.2437 | 19.4531 | 8.9038 |
| 0.1477 | 1.17 | 4800 | 0.2432 | 19.7300 | 11.0605 |
| 0.1551 | 1.19 | 4900 | 0.2436 | 20.0762 | 11.3236 |
| 0.1581 | 1.21 | 5000 | 0.2435 | 19.7992 | 10.9994 |
| 0.2033 | 1.24 | 5100 | 0.2434 | 19.8339 | 9.1763 |
| 0.1444 | 1.26 | 5200 | 0.2399 | 19.8165 | 10.9806 |
| 0.1543 | 1.29 | 5300 | 0.2371 | 19.1762 | 10.8913 |
| 0.1735 | 1.31 | 5400 | 0.2350 | 19.4185 | 9.0166 |
| 0.1552 | 1.34 | 5500 | 0.2363 | 19.0897 | 8.8098 |
| 0.1495 | 1.36 | 5600 | 0.2332 | 19.1070 | 8.8145 |
| 0.1636 | 1.38 | 5700 | 0.2350 | 18.6051 | 10.5718 |
| 0.1827 | 1.41 | 5800 | 0.2333 | 18.4493 | 8.5091 |
| 0.1464 | 1.43 | 5900 | 0.2344 | 19.2454 | 8.8850 |
| 0.1999 | 1.46 | 6000 | 0.2325 | 23.1222 | 10.9900 |
| 0.1547 | 1.48 | 6100 | 0.2318 | 19.3839 | 8.8709 |
| 0.1296 | 1.51 | 6200 | 0.2339 | 19.3146 | 8.9085 |
| 0.1535 | 1.53 | 6300 | 0.2317 | 22.5684 | 10.8302 |
| 0.1467 | 1.55 | 6400 | 0.2310 | 19.1070 | 8.7958 |
| 0.1709 | 1.58 | 6500 | 0.2338 | 18.9685 | 8.7441 |
| 0.1359 | 1.6 | 6600 | 0.2295 | 19.0550 | 8.6548 |
| 0.1611 | 1.63 | 6700 | 0.2293 | 18.5877 | 8.5608 |
| 0.1232 | 1.65 | 6800 | 0.2309 | 19.4012 | 8.9273 |
| 0.1692 | 1.68 | 6900 | 0.2288 | 18.6224 | 8.8756 |
| 0.1544 | 1.7 | 7000 | 0.2265 | 18.3454 | 8.5467 |
| 0.1282 | 1.72 | 7100 | 0.2256 | 18.6570 | 8.6642 |
| 0.1414 | 1.75 | 7200 | 0.2258 | 22.1011 | 10.2993 |
| 0.157 | 1.77 | 7300 | 0.2259 | 18.8474 | 8.6501 |
| 0.1592 | 1.8 | 7400 | 0.2249 | 18.6570 | 8.5702 |
| 0.0998 | 1.82 | 7500 | 0.2246 | 18.8127 | 8.6125 |
| 0.1486 | 1.85 | 7600 | 0.2225 | 18.3281 | 8.3024 |
| 0.1336 | 1.87 | 7700 | 0.2221 | 18.5704 | 8.4387 |
| 0.1388 | 1.9 | 7800 | 0.2222 | 18.5531 | 8.5044 |
| 0.1341 | 1.92 | 7900 | 0.2212 | 22.0665 | 10.4215 |
| 0.1548 | 1.94 | 8000 | 0.2215 | 21.8588 | 10.3275 |
| 0.1276 | 1.97 | 8100 | 0.2182 | 21.8069 | 10.3040 |
| 0.1567 | 1.99 | 8200 | 0.2200 | 18.1031 | 8.3541 |
| 0.1054 | 2.02 | 8300 | 0.2201 | 21.5646 | 10.2335 |
| 0.0793 | 2.04 | 8400 | 0.2219 | 21.1838 | 10.1161 |
| 0.0944 | 2.07 | 8500 | 0.2225 | 21.5819 | 10.3510 |
| 0.0824 | 2.09 | 8600 | 0.2230 | 21.7203 | 10.2476 |
| 0.0863 | 2.11 | 8700 | 0.2222 | 21.6684 | 10.2241 |
| 0.1102 | 2.14 | 8800 | 0.2233 | 21.5819 | 10.3228 |
| 0.0852 | 2.16 | 8900 | 0.2226 | 21.8588 | 10.2946 |
| 0.0796 | 2.19 | 9000 | 0.2227 | 21.9626 | 10.3651 |
| 0.1023 | 2.21 | 9100 | 0.2223 | 21.7722 | 10.4309 |
| 0.08 | 2.24 | 9200 | 0.2216 | 21.4780 | 10.2664 |
| 0.0703 | 2.26 | 9300 | 0.2218 | 21.5992 | 10.2429 |
| 0.0923 | 2.28 | 9400 | 0.2212 | 21.4434 | 10.2006 |
| 0.0694 | 2.31 | 9500 | 0.2217 | 21.4780 | 10.2194 |
| 0.1033 | 2.33 | 9600 | 0.2216 | 21.5126 | 10.2382 |
| 0.0913 | 2.36 | 9700 | 0.2214 | 21.5299 | 10.2194 |
| 0.0882 | 2.38 | 9800 | 0.2212 | 21.7376 | 10.2758 |
| 0.0852 | 2.41 | 9900 | 0.2212 | 21.7203 | 10.3087 |
| 0.0862 | 2.43 | 10000 | 0.2212 | 21.7895 | 10.3463 |

Framework versions

  • Transformers 4.33.2
  • Pytorch 2.0.1+cu117
  • Datasets 2.14.5
  • Tokenizers 0.13.3