---
license: apache-2.0
tags:
  - automatic-speech-recognition
  - timit_asr
  - generated_from_trainer
datasets:
  - timit_asr
model-index:
  - name: sew-d-small-100k-timit
    results: []
---

# sew-d-small-100k-timit

This model is a fine-tuned version of [asapp/sew-d-small-100k](https://huggingface.co/asapp/sew-d-small-100k) on the TIMIT_ASR - NA dataset. It achieves the following results on the evaluation set (a short inference sketch follows the results):

- Loss: 1.7561
- Wer: 0.7971
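
The card does not ship usage code, but the checkpoint can be loaded like any CTC speech model in `transformers`. The snippet below is a minimal sketch, not the author's own inference script; the repo id placeholder and the audio path are assumptions you need to fill in, and TIMIT audio itself requires an LDC license.

```python
import soundfile as sf
import torch
from transformers import AutoModelForCTC, AutoProcessor

# Replace with the actual Hub repo id of this checkpoint.
model_id = "sew-d-small-100k-timit"

processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)
model.eval()

# Any 16 kHz mono waveform works; path is a placeholder.
speech, sampling_rate = sf.read("path/to/utterance.wav")

inputs = processor(speech, sampling_rate=sampling_rate, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: take the most likely token at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```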

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows this list):

- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 20.0
- mixed_precision_training: Native AMP
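
As a rough sketch of how these values map onto `transformers`, the configuration above could be expressed as the following `TrainingArguments`. The `output_dir` is a placeholder and `fp16=True` stands in for "Native AMP"; the exact script flags used for this run are not recorded in the card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="sew-d-small-100k-timit",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=1,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=20.0,
    fp16=True,  # assumed equivalent of "Native AMP" mixed precision
)
```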

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 4.2068        | 0.69  | 100  | 4.0802          | 1.0    |
| 2.9806        | 1.38  | 200  | 2.9792          | 1.0    |
| 2.9781        | 2.07  | 300  | 2.9408          | 1.0    |
| 2.9655        | 2.76  | 400  | 2.9143          | 1.0    |
| 2.8953        | 3.45  | 500  | 2.8774          | 1.0    |
| 2.7712        | 4.14  | 600  | 2.7769          | 0.9999 |
| 2.6662        | 4.83  | 700  | 2.6425          | 0.9789 |
| 2.632         | 5.52  | 800  | 2.5142          | 1.0318 |
| 2.3794        | 6.21  | 900  | 2.4360          | 1.1475 |
| 2.1406        | 6.9   | 1000 | 2.2932          | 0.9962 |
| 2.223         | 7.59  | 1100 | 2.1590          | 0.9281 |
| 2.3607        | 8.28  | 1200 | 2.0553          | 0.8682 |
| 2.1058        | 8.97  | 1300 | 2.0443          | 0.8902 |
| 1.8191        | 9.66  | 1400 | 1.9586          | 0.8237 |
| 1.7013        | 10.34 | 1500 | 1.9586          | 0.8689 |
| 2.2289        | 11.03 | 1600 | 1.9082          | 0.8611 |
| 1.9125        | 11.72 | 1700 | 1.8772          | 0.8150 |
| 1.6424        | 12.41 | 1800 | 1.8671          | 0.7871 |
| 1.6553        | 13.1  | 1900 | 1.8192          | 0.8121 |
| 2.0382        | 13.79 | 2000 | 1.8146          | 0.8440 |
| 1.8785        | 14.48 | 2100 | 1.8094          | 0.8202 |
| 1.6148        | 15.17 | 2200 | 1.8131          | 0.8234 |
| 1.4948        | 15.86 | 2300 | 1.7969          | 0.8256 |
| 1.8844        | 16.55 | 2400 | 1.7790          | 0.8067 |
| 1.8099        | 17.24 | 2500 | 1.7783          | 0.8073 |
| 1.5488        | 17.93 | 2600 | 1.7668          | 0.7971 |
| 1.458         | 18.62 | 2700 | 1.7623          | 0.7973 |
| 1.7656        | 19.31 | 2800 | 1.7574          | 0.8013 |
| 1.7583        | 20.0  | 2900 | 1.7561          | 0.7971 |

### Framework versions

- Transformers 4.12.0.dev0
- Pytorch 1.8.1
- Datasets 1.14.1.dev0
- Tokenizers 0.10.3