Whisper Small Hungarian (training in progress)
This model is a fine-tuned version of openai/whisper-small on the Mozilla Foundation Common Voice 16.0 dataset (Hungarian). It achieves the following results on the evaluation set:
Intermediate results at step 3500:
- Wer: 18.8314
Unfortunately, the Colab session disconnected and training stopped at this point; it may be resumed later.
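A minimal transcription sketch with the `transformers` pipeline is shown below; the audio file path and the `generate_kwargs` language/task settings are illustrative assumptions, not part of the original card.

```python
# Hedged usage sketch: load this checkpoint for Hungarian speech recognition.
# The audio path below is a placeholder.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="Hungarians/whisper-small-cv16-hu",
)

# Force Hungarian transcription (rather than translation or language auto-detection).
result = asr(
    "sample_hu.wav",
    generate_kwargs={"language": "hungarian", "task": "transcribe"},
)
print(result["text"])
```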
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
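As noted above, the model was fine-tuned on Common Voice 16.0 (Hungarian). A rough sketch of loading that data with `datasets` follows; the exact configuration, split, and preprocessing used for this run are not documented, so the values below are assumptions.

```python
# Hedged sketch of loading the Common Voice 16.0 Hungarian split.
# Access to Common Voice on the Hub requires accepting its terms and being
# logged in with a Hugging Face token.
from datasets import load_dataset

common_voice_hu = load_dataset(
    "mozilla-foundation/common_voice_16_0",
    "hu",          # Hungarian configuration
    split="test",
)
print(common_voice_hu[0]["sentence"])
```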
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 1.25e-05
- train_batch_size: 8
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant_with_warmup
- lr_scheduler_warmup_steps: 400
- planned training_steps: 6000
- executed steps: 3500 (training interrupted by a Colab disconnect)
- mixed_precision_training: Native AMP
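A sketch of how these hyperparameters might map onto `transformers.Seq2SeqTrainingArguments` is given below; the output directory and the evaluation/save/logging cadence are assumptions (the 500-step cadence matches the results table below), not values taken from the original training script.

```python
# Hedged configuration sketch; only the values listed above come from the card,
# everything else (output_dir, eval/save/logging cadence) is illustrative.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-cv16-hu",   # placeholder path
    learning_rate=1.25e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="constant_with_warmup",
    warmup_steps=400,
    max_steps=6000,            # planned; the run stopped at step 3500
    fp16=True,                 # native AMP mixed-precision training
    # Adam betas (0.9, 0.999) and epsilon 1e-8 are the optimizer defaults.
    evaluation_strategy="steps",
    eval_steps=500,
    save_steps=500,
    logging_steps=500,
    predict_with_generate=True,
)
```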
Training results
| Step | Training Loss | Validation Loss | Wer Ortho | Wer |
|---|---|---|---|---|
| 500 | 0.354600 | 0.349688 | 34.385555 | 31.246555 |
| 1000 | 0.283800 | 0.290485 | 29.696507 | 26.625776 |
| 1500 | 0.248800 | 0.255122 | 26.360826 | 23.300925 |
| 2000 | 0.198300 | 0.234539 | 24.557530 | 21.714145 |
| 2500 | 0.196300 | 0.224310 | 23.557423 | 20.698512 |
| 3000 | 0.153000 | 0.210894 | 22.088291 | 19.231356 |
| 3500 | 0.109100 | 0.210817 | 21.465313 | 18.831435 |
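The "Wer Ortho" column is presumably computed on the orthographic (un-normalized) text, while "Wer" applies text normalization; both can be computed with the Hugging Face `evaluate` library. The sketch below uses placeholder Hungarian strings rather than the actual evaluation data.

```python
# Hedged sketch of the WER computation; the strings are made-up examples.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["jó napot kívánok"]               # model transcriptions
references = ["jó napot kívánok mindenkinek"]    # ground-truth transcripts

# WER = (substitutions + deletions + insertions) / number of reference words
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}")
```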
Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0
Evaluation results
- Wer on the Common Voice 16.0 Hungarian test set (self-reported): 18.831