
Whisper Tiny Bengali - Raiyan Ahmed

This model is a fine-tuned version of openai/whisper-tiny on the Common Voice 16.1 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1982
  • WER: 49.3515
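
As a quick way to try the checkpoint, the sketch below loads it with the Transformers automatic-speech-recognition pipeline. The repository ID `raiyan007/whisper-tiny-common16.1`, the sample file name, and the language/task settings are assumptions based on this card, not taken from a published inference script.

```python
# Minimal inference sketch, assuming the checkpoint is published as
# "raiyan007/whisper-tiny-common16.1" (repo ID inferred from this card).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="raiyan007/whisper-tiny-common16.1",
)

# Force Bengali transcription; otherwise Whisper tries to detect the language.
result = asr(
    "sample_bengali.wav",  # hypothetical local audio file
    generate_kwargs={"language": "bengali", "task": "transcribe"},
)
print(result["text"])
```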

Model description

This model is openai/whisper-tiny fine-tuned for Bengali automatic speech recognition on the Common Voice 16.1 dataset.

Intended uses & limitations

The model is intended for transcribing Bengali speech to text. With an evaluation WER of roughly 49%, transcriptions can contain frequent word-level errors, so outputs should be reviewed before downstream use.

Training and evaluation data

The model was fine-tuned and evaluated on the Bengali portion of the Mozilla Common Voice 16.1 dataset.

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 5
  • mixed_precision_training: Native AMP
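
For reference, these values map onto Transformers `Seq2SeqTrainingArguments` roughly as sketched below. The original training script is not included in the card, so `output_dir` and any settings not listed above are placeholders.

```python
# Hedged sketch of training arguments matching the hyperparameters listed above.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-tiny-bn",   # hypothetical output directory
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=5,
    fp16=True,                        # "Native AMP" mixed precision
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the default optimizer.
)
```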

Training results

| Training Loss | Epoch  | Step | Validation Loss | WER     |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 0.5615        | 0.3021 | 200  | 0.5681          | 92.0473 |
| 0.4104        | 0.6042 | 400  | 0.4525          | 83.0059 |
| 0.336         | 0.9063 | 600  | 0.3315          | 74.3195 |
| 0.257         | 1.2085 | 800  | 0.3217          | 75.2095 |
| 0.2262        | 1.5106 | 1000 | 0.2550          | 65.7941 |
| 0.1906        | 1.8127 | 1200 | 0.2147          | 59.0769 |
| 0.1924        | 2.1148 | 1400 | 0.2816          | 67.6071 |
| 0.1886        | 2.4169 | 1600 | 0.2658          | 68.2982 |
| 0.175         | 2.7190 | 1800 | 0.2401          | 65.5598 |
| 0.1268        | 3.0211 | 2000 | 0.2279          | 57.7160 |
| 0.1206        | 3.3233 | 2200 | 0.2190          | 58.5680 |
| 0.1085        | 3.6254 | 2400 | 0.2048          | 54.5160 |
| 0.1049        | 3.9275 | 2600 | 0.1929          | 53.0769 |
| 0.047         | 4.2296 | 2800 | 0.2100          | 52.4805 |
| 0.0452        | 4.5317 | 3000 | 0.2054          | 50.8852 |
| 0.0411        | 4.8338 | 3200 | 0.1982          | 49.3515 |
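
The WER column reports word error rate in percent. The sketch below shows how such a score can be computed with the `evaluate` library; the prediction and reference strings are illustrative placeholders, not samples from the actual evaluation set.

```python
# Illustrative WER computation with the `evaluate` library; the strings
# below are placeholders, not samples from Common Voice 16.1.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["transcribed hypothesis text"]
references = ["reference transcript text"]

# compute() returns a fraction; multiply by 100 to match the table's scale.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```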

Framework versions

  • Transformers 4.41.0.dev0
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1