
workstation_whisper_base_finetune_teacher__babble_noise_mozilla_100_epochs_batch_4

This model is a fine-tuned version of openai/whisper-base.en. The training dataset is not recorded in the card metadata (it appears as "None"); the model name suggests Mozilla (Common Voice) speech with added babble noise. It achieves the following results on the evaluation set (an inference sketch follows the list):

  • Loss: 1.3964
  • WER: 36.5051
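A minimal sketch of transcribing audio with this checkpoint via the transformers ASR pipeline. The hub namespace ("your-username") and the audio path are placeholder assumptions:

```python
from transformers import pipeline

# Hypothetical repository ID -- replace the namespace with the checkpoint's actual owner.
MODEL_ID = "your-username/workstation_whisper_base_finetune_teacher__babble_noise_mozilla_100_epochs_batch_4"

# Build an automatic-speech-recognition pipeline around the fine-tuned Whisper model.
asr = pipeline("automatic-speech-recognition", model=MODEL_ID)

# Transcribe a local audio file (path is illustrative).
result = asr("sample.wav")
print(result["text"])
```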

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reconstruction as Seq2SeqTrainingArguments follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 4
  • eval_batch_size: 1
  • seed: 42
  • gradient_accumulation_steps: 256
  • total_train_batch_size: 1024
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine_with_restarts
  • lr_scheduler_warmup_ratio: 0.2
  • num_epochs: 100
  • mixed_precision_training: Native AMP
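
As referenced above, a sketch reconstructing these settings with transformers' Seq2SeqTrainingArguments (the API as of Transformers 4.24.0). The output directory and evaluation cadence are assumptions not stated in the card; the Adam values are the optimizer settings listed above:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-base-finetuned",    # assumed; not stated in the card
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=256,          # 4 * 256 = total train batch size of 1024
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine_with_restarts",
    warmup_ratio=0.2,
    num_train_epochs=100,
    fp16=True,                                # "Native AMP" mixed-precision training
    evaluation_strategy="steps",              # assumed from the 500-step eval cadence below
    eval_steps=500,
)
```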

Training results

| Training Loss | Epoch | Step | Validation Loss | WER     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 1.0214        | 7.35  | 500  | 0.8448          | 36.1291 |
| 0.3301        | 14.70 | 1000 | 0.9065          | 35.5511 |
| 0.0745        | 22.06 | 1500 | 1.1071          | 36.1535 |
| 0.0089        | 29.41 | 2000 | 1.2245          | 36.1082 |
| 0.0026        | 36.76 | 2500 | 1.3039          | 36.3171 |
| 0.0015        | 44.12 | 3000 | 1.3551          | 36.4216 |
| 0.0010        | 51.47 | 3500 | 1.3964          | 36.5051 |
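
The WER values above are percentages. A minimal sketch of the metric using the evaluate library; the prediction and reference strings are illustrative only:

```python
import evaluate

# Load the word-error-rate metric.
wer_metric = evaluate.load("wer")

# Illustrative strings; in the real evaluation these come from model transcripts
# and ground-truth references.
predictions = ["the cat sat on the mat"]
references = ["the cat sat on a mat"]

# evaluate returns WER as a fraction; scale by 100 to match the table above.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```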

Framework versions

  • Transformers 4.24.0
  • PyTorch 1.12.1
  • Datasets 2.7.1
  • Tokenizers 0.11.0