Whisper Small Hi - Gopika Krishnan

This model is a fine-tuned version of openai/whisper-small on the Konnakol dataset. It achieves the following results on the evaluation set:

  • Loss: 2.2352
  • WER: 87.6457
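
For reference, a minimal sketch of running inference with this checkpoint via the transformers ASR pipeline. The repository id and audio file name below are placeholders (the card does not state the Hub id), so substitute the actual values:

```python
# Minimal inference sketch. The model id and audio path are illustrative
# placeholders, not confirmed by this card.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-small-hi",  # placeholder Hub id
)

# Transcribe a local audio file (Whisper expects 16 kHz mono input).
result = asr("sample_konnakol.wav")  # illustrative file name
print(result["text"])
```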

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 32
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 5000
  • mixed_precision_training: Native AMP
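
As a sketch, the list above maps onto a transformers Seq2SeqTrainingArguments configuration roughly as follows; output_dir and the per-device reading of the batch sizes are assumptions not stated in the card:

```python
# Hedged reconstruction of the training configuration from the list above.
# output_dir and the per-device batch-size interpretation are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-hi",   # assumed; not stated in the card
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=5000,
    fp16=True,                         # "Native AMP" mixed precision
    # The Adam betas=(0.9, 0.999) and epsilon=1e-08 above match the
    # transformers default optimizer settings, so nothing extra is needed.
)
```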

Training results

| Training Loss | Epoch    | Step | Validation Loss | WER     |
|:-------------:|:--------:|:----:|:---------------:|:-------:|
| 0.1507        | 21.7391  | 500  | 1.2891          | 87.7622 |
| 0.0428        | 43.4783  | 1000 | 1.4133          | 93.7063 |
| 0.0111        | 65.2174  | 1500 | 1.7252          | 89.3939 |
| 0.0063        | 86.9565  | 2000 | 1.8134          | 85.8974 |
| 0.0035        | 108.6957 | 2500 | 2.0195          | 85.7809 |
| 0.003         | 130.4348 | 3000 | 2.0771          | 87.8788 |
| 0.0027        | 152.1739 | 3500 | 2.1378          | 87.5291 |
| 0.0025        | 173.9130 | 4000 | 2.1730          | 86.4802 |
| 0.0025        | 195.6522 | 4500 | 2.2126          | 87.8788 |
| 0.0025        | 217.3913 | 5000 | 2.2352          | 87.6457 |
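
The WER figures above are percentages. A minimal sketch of computing WER with the evaluate library, using illustrative placeholder strings rather than the actual evaluation data:

```python
# Minimal WER computation sketch with the evaluate library.
# The strings below are hypothetical examples, not from the real eval set.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["tha ka dhi mi tha ka jha nu"]  # hypothetical model output
references = ["tha ka dhi mi tha ka jho nu"]   # hypothetical ground truth

# Scale by 100 to match the percentage values reported in the table above.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```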

Framework versions

  • Transformers 4.41.1
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1