# whisper-small-ko-E10_Yfreq
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the AI Hub elderly (over 70) Korean speech dataset. It achieves the following results on the evaluation set:
- Loss: 0.2085
- CER: 6.3029
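A minimal inference sketch using the transformers ASR pipeline; the repository id below is a placeholder for wherever this checkpoint is published on the Hub:

```python
from transformers import pipeline

# Placeholder repo id: substitute the actual namespace this model is hosted under.
asr = pipeline(
    "automatic-speech-recognition",
    model="<namespace>/whisper-small-ko-E10_Yfreq",
)

# Whisper expects 16 kHz audio; the pipeline decodes and resamples file inputs.
result = asr("sample.wav")
print(result["text"])
```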
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- num_epochs: 2
- mixed_precision_training: Native AMP
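The list above maps directly onto transformers' `Seq2SeqTrainingArguments`. A minimal sketch, assuming an output directory (not recorded in this card) and omitting the optimizer settings because betas=(0.9, 0.999) and epsilon=1e-08 are already the AdamW defaults:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-ko-E10_Yfreq",  # assumed path
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    lr_scheduler_type="linear",
    warmup_steps=50,
    num_train_epochs=2,
    fp16=True,  # Native AMP mixed-precision training
)
```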
### Training results
| Training Loss | Epoch | Step | Validation Loss | CER (%) |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.2909        | 0.13  | 100  | 0.2830          | 7.4307  |
| 0.1788        | 0.26  | 200  | 0.2478          | 6.5378  |
| 0.1644        | 0.39  | 300  | 0.2375          | 6.4967  |
| 0.1614        | 0.52  | 400  | 0.2265          | 6.3675  |
| 0.1458        | 0.64  | 500  | 0.2243          | 6.1971  |
| 0.1368        | 0.77  | 600  | 0.2217          | 7.0665  |
| 0.1226        | 0.90  | 700  | 0.2216          | 6.3029  |
| 0.0553        | 1.03  | 800  | 0.2162          | 5.9563  |
| 0.0499        | 1.16  | 900  | 0.2187          | 5.9680  |
| 0.0597        | 1.29  | 1000 | 0.2153          | 5.9211  |
| 0.0456        | 1.42  | 1100 | 0.2121          | 6.5789  |
| 0.0495        | 1.55  | 1200 | 0.2128          | 6.6024  |
| 0.0558        | 1.68  | 1300 | 0.2095          | 6.3675  |
| 0.0440        | 1.81  | 1400 | 0.2081          | 6.3969  |
| 0.0424        | 1.93  | 1500 | 0.2085          | 6.3029  |
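The CER column is in percent. A small sketch of how such a score can be computed with the evaluate library, whose `cer` metric returns a fraction; the transcripts below are hypothetical:

```python
# pip install evaluate jiwer
import evaluate

cer_metric = evaluate.load("cer")

predictions = ["안녕하세요"]  # hypothetical model outputs
references = ["안녕하세요"]   # hypothetical ground-truth transcripts

# Multiply by 100 to match the percent scale used in the table above.
cer = 100 * cer_metric.compute(predictions=predictions, references=references)
print(f"CER: {cer:.4f}")
```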
### Framework versions
- Transformers 4.37.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0