# whisper-kor_noising_6
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the whisper-kor_noising_6 dataset. It achieves the following results on the evaluation set:
- Loss: 0.2294
- WER: 0
- CER: 6.9785
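
To transcribe Korean audio with this checkpoint, the standard `transformers` pipeline is enough. A minimal sketch follows; the Hub repo id and audio file name are placeholders, not taken from this card:

```python
# Minimal inference sketch for a fine-tuned Whisper checkpoint.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-kor_noising_6",  # placeholder repo id
)

# The pipeline resamples file input to the 16 kHz rate Whisper expects.
result = asr("sample_korean_audio.wav")  # placeholder file
print(result["text"])
```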
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a matching `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 8000
- mixed_precision_training: Native AMP
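
For reference, here is a minimal sketch of how these settings map onto `transformers` training arguments. The output directory is a placeholder and the 200-step evaluation cadence is inferred from the results table below; everything else mirrors the list above (the Adam betas and epsilon listed there are the `transformers` defaults):

```python
# Sketch only: reconstructs the listed hyperparameters as training arguments.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-kor_noising_6",  # placeholder path, not from the card
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=8000,
    fp16=True,  # "Native AMP" mixed precision
    evaluation_strategy="steps",
    eval_steps=200,  # inferred from the 200-step cadence in the results table
    predict_with_generate=True,  # assumption: typical for Whisper seq2seq eval
)
```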
### Training results

| Training Loss | Epoch | Step | Validation Loss | WER | CER    |
|:-------------:|:-----:|:----:|:---------------:|:---:|:------:|
| 0.1548        | 0.08  | 200  | 0.1871          | 0   | 6.7119 |
| 0.1861        | 0.16  | 400  | 0.2027          | 0   | 8.5976 |
| 0.2078        | 0.24  | 600  | 0.2138          | 0   | 6.9863 |
| 0.2083        | 0.31  | 800  | 0.2229          | 0   | 9.1818 |
| 0.194         | 0.39  | 1000 | 0.2235          | 0   | 7.3078 |
| 0.1953        | 0.47  | 1200 | 0.2284          | 0   | 7.4411 |
| 0.2188        | 0.55  | 1400 | 0.2253          | 0   | 8.4369 |
| 0.172         | 0.63  | 1600 | 0.2283          | 0   | 7.2804 |
| 0.1764        | 0.71  | 1800 | 0.2243          | 0   | 7.2804 |
| 0.1821        | 0.79  | 2000 | 0.2221          | 0   | 7.1784 |
| 0.1671        | 0.87  | 2200 | 0.2249          | 0   | 7.0138 |
| 0.1786        | 0.94  | 2400 | 0.2242          | 0   | 7.2019 |
| 0.0852        | 1.02  | 2600 | 0.2250          | 0   | 7.3039 |
| 0.0919        | 1.1   | 2800 | 0.2242          | 0   | 6.9981 |
| 0.0887        | 1.18  | 3000 | 0.2236          | 0   | 7.4293 |
| 0.0937        | 1.26  | 3200 | 0.2249          | 0   | 7.0843 |
| 0.0943        | 1.34  | 3400 | 0.2242          | 0   | 7.0647 |
| 0.0912        | 1.42  | 3600 | 0.2237          | 0   | 7.2176 |
| 0.086         | 1.5   | 3800 | 0.2259          | 0   | 7.6058 |
| 0.0879        | 1.57  | 4000 | 0.2242          | 0   | 7.2725 |
| 0.0881        | 1.65  | 4200 | 0.2216          | 0   | 6.8961 |
| 0.088         | 1.73  | 4400 | 0.2197          | 0   | 7.1431 |
| 0.0746        | 1.81  | 4600 | 0.2200          | 0   | 7.2215 |
| 0.0776        | 1.89  | 4800 | 0.2194          | 0   | 7.0334 |
| 0.0821        | 1.97  | 5000 | 0.2209          | 0   | 7.3392 |
| 0.0372        | 2.05  | 5200 | 0.2306          | 0   | 7.0843 |
| 0.033         | 2.13  | 5400 | 0.2257          | 0   | 7.1431 |
| 0.0425        | 2.2   | 5600 | 0.2252          | 0   | 6.9589 |
| 0.0367        | 2.28  | 5800 | 0.2263          | 0   | 8.1546 |
| 0.0352        | 2.36  | 6000 | 0.2265          | 0   | 7.4215 |
| 0.0339        | 2.44  | 6200 | 0.2268          | 0   | 7.0804 |
| 0.0409        | 2.52  | 6400 | 0.2265          | 0   | 7.0255 |
| 0.0364        | 2.6   | 6600 | 0.2260          | 0   | 6.9079 |
| 0.039         | 2.68  | 6800 | 0.2261          | 0   | 6.8177 |
| 0.0375        | 2.75  | 7000 | 0.2266          | 0   | 6.9902 |
| 0.0381        | 2.83  | 7200 | 0.2256          | 0   | 6.8648 |
| 0.0291        | 2.91  | 7400 | 0.2252          | 0   | 7.1000 |
| 0.0342        | 2.99  | 7600 | 0.2236          | 0   | 6.9393 |
| 0.019         | 3.07  | 7800 | 0.2282          | 0   | 6.9942 |
| 0.0188        | 3.15  | 8000 | 0.2294          | 0   | 6.9785 |
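
The WER/CER values above appear to be percentages (0-100 scale). For illustration, here is how such scores are typically computed with the `evaluate` library; the strings below are illustrative, not drawn from the evaluation set:

```python
# Illustration of computing WER/CER on the 0-100 scale used above.
import evaluate  # pip install evaluate jiwer

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["안녕하세요 반갑습니다"]  # illustrative model output
references = ["안녕하세요 반갑습니다"]   # illustrative ground truth

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
cer = 100 * cer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```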
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.1+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0