
pratyush_whisper_small_distil_libri360_enc_8_dec_6_batch_2_epoch_50

This model appears to be a distilled Whisper-small variant (8 encoder layers, 6 decoder layers) fine-tuned on LibriSpeech train-clean-360, as the model name suggests. It achieves the following results on the evaluation set (final logged checkpoint, step 3900):

  • Loss: 1.9043
  • WER: 13.0427
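
The WER above is on the usual 0-100 scale. A minimal sketch of how such a score can be computed with the `evaluate` library (an assumption; the original evaluation script is not included in the card, and the strings below are placeholders):

```python
# Sketch: computing word error rate (WER) with the `evaluate` library.
# The predictions/references are placeholders, not this model's outputs.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["the cat sat on the mat"]
references = ["the cat sat on a mat"]

# `evaluate` returns WER as a fraction; the card reports it scaled by 100.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```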

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

Not documented; the model name suggests LibriSpeech train-clean-360 ("libri360") was used for training.

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent Seq2SeqTrainingArguments follows the list):

  • learning_rate: 0.0005
  • train_batch_size: 2
  • eval_batch_size: 1
  • seed: 42
  • gradient_accumulation_steps: 512
  • total_train_batch_size: 1024
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.2
  • num_epochs: 50
  • mixed_precision_training: Native AMP
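
For reference, a hedged sketch of `Seq2SeqTrainingArguments` matching the list above (the `output_dir` and the evaluation cadence are assumptions, not taken from the original run):

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper_small_distil_libri360",  # assumed name
    learning_rate=5e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=512,  # effective batch size: 2 * 512 = 1024
    lr_scheduler_type="linear",
    warmup_ratio=0.2,
    num_train_epochs=50,
    fp16=True,  # "Native AMP" mixed-precision training
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the optimizer defaults.
    evaluation_strategy="steps",  # assumed; the results table logs every 100 steps
    eval_steps=100,
)
```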

Training results

| Training Loss | Epoch | Step | Validation Loss | WER     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 6.8528        | 0.49  | 100  | 5.4707          | 94.5857 |
| 5.4115        | 0.98  | 200  | 4.7525          | 89.6667 |
| 4.44          | 1.48  | 300  | 2.5672          | 47.3154 |
| 2.71          | 1.97  | 400  | 2.0272          | 26.9788 |
| 2.2003        | 2.46  | 500  | 1.8737          | 20.2713 |
| 2.0566        | 2.95  | 600  | 1.8204          | 17.6620 |
| 1.9829        | 3.45  | 700  | 1.7948          | 16.2944 |
| 1.9501        | 3.94  | 800  | 1.7809          | 15.3891 |
| 1.9173        | 4.43  | 900  | 1.7755          | 15.0537 |
| 1.9025        | 4.93  | 1000 | 1.7754          | 14.7302 |
| 1.8847        | 5.42  | 1100 | 1.7820          | 14.6116 |
| 1.8776        | 5.91  | 1200 | 1.7795          | 14.1585 |
| 1.8661        | 6.4   | 1300 | 1.7807          | 13.9664 |
| 1.8647        | 6.9   | 1400 | 1.7841          | 13.9940 |
| 1.858         | 7.39  | 1500 | 1.7921          | 13.9489 |
| 1.8608        | 7.88  | 1600 | 1.7997          | 13.9269 |
| 1.858         | 8.37  | 1700 | 1.8084          | 13.9370 |
| 1.8621        | 8.87  | 1800 | 1.8160          | 13.8414 |
| 1.8633        | 9.36  | 1900 | 1.8221          | 13.9627 |
| 1.8663        | 9.85  | 2000 | 1.8259          | 14.0013 |
| 1.8667        | 10.34 | 2100 | 1.8429          | 13.9379 |
| 1.865         | 10.84 | 2200 | 1.8406          | 13.9011 |
| 1.8614        | 11.33 | 2300 | 1.8401          | 13.5887 |
| 1.8564        | 11.82 | 2400 | 1.8587          | 13.5739 |
| 1.8552        | 12.32 | 2500 | 1.8514          | 13.5620 |
| 1.8523        | 12.81 | 2600 | 1.8561          | 13.3295 |
| 1.8551        | 13.3  | 2700 | 1.8581          | 13.3148 |
| 1.8521        | 13.79 | 2800 | 1.8650          | 13.1594 |
| 1.8522        | 14.29 | 2900 | 1.8729          | 13.2385 |
| 1.8513        | 14.78 | 3000 | 1.8754          | 13.1778 |
| 1.8524        | 15.27 | 3100 | 1.8814          | 13.0611 |
| 1.8495        | 15.76 | 3200 | 1.8867          | 13.2504 |
| 1.8553        | 16.26 | 3300 | 1.8860          | 13.0942 |
| 1.8531        | 16.75 | 3400 | 1.8884          | 12.8175 |
| 1.8545        | 17.24 | 3500 | 1.9003          | 12.8598 |
| 1.8533        | 17.73 | 3600 | 1.8982          | 13.0381 |
| 1.8548        | 18.23 | 3700 | 1.9005          | 13.0299 |
| 1.8542        | 18.72 | 3800 | 1.9067          | 13.0289 |
| 1.8552        | 19.21 | 3900 | 1.9043          | 13.0427 |

Framework versions

  • Transformers 4.24.0
  • Pytorch 1.12.1
  • Datasets 2.7.0
  • Tokenizers 0.11.0
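
With these versions (Whisper support landed in Transformers 4.23, so 4.24 suffices), a minimal inference sketch; the repository id and the 16 kHz placeholder input are assumptions:

```python
import torch
from transformers import WhisperProcessor, WhisperForConditionalGeneration

# Assumed repository id, taken from the model card title.
model_id = "pratyush_whisper_small_distil_libri360_enc_8_dec_6_batch_2_epoch_50"
processor = WhisperProcessor.from_pretrained(model_id)
model = WhisperForConditionalGeneration.from_pretrained(model_id)
model.eval()

# Whisper expects 16 kHz mono audio; one second of silence as a placeholder.
audio = [0.0] * 16000
inputs = processor(audio, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    generated_ids = model.generate(inputs.input_features)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```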