
wav2vec2-base-timit-demo-google-colab

This model is a fine-tuned version of facebook/wav2vec2-base on the TIMIT dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5320
  • WER: 0.3362
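
As a quick usage illustration (not part of the original card), the sketch below loads a fine-tuned wav2vec2 CTC checkpoint with the Transformers API and greedily decodes a 16 kHz waveform. The hub id and the silent input waveform are placeholders, not confirmed details of this checkpoint.

```python
# Minimal inference sketch; the model id below is a placeholder, not a
# confirmed hub location for this checkpoint.
import numpy as np
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "wav2vec2-base-timit-demo-google-colab"  # placeholder hub id

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# wav2vec2-base expects mono 16 kHz float audio; one second of silence
# stands in for a real utterance here.
speech = np.zeros(16_000, dtype=np.float32)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax over the vocabulary at each frame;
# batch_decode collapses repeated tokens and removes CTC blanks.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```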

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 30
  • mixed_precision_training: Native AMP
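
For reference, here is a hedged sketch of how these values map onto transformers TrainingArguments. The output_dir and the surrounding Trainer setup are assumptions, and the Adam betas and epsilon listed above are the optimizer defaults, so they need no explicit arguments.

```python
# Sketch of TrainingArguments matching the hyperparameters above;
# assumes a CTC data collator and tokenized TIMIT splits already exist.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-base-timit-demo-google-colab",  # assumed name
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=30,
    fp16=True,  # "Native AMP" mixed-precision training
)
```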

Training results

Training Loss   Epoch   Step    Validation Loss   WER
3.5752          1.0     500     2.2162            1.0392
0.9682          2.01    1000    0.5208            0.5191
0.4414          3.01    1500    0.4623            0.4528
0.2971          4.02    2000    0.4553            0.4421
0.2328          5.02    2500    0.4960            0.4208
0.1891          6.02    3000    0.4213            0.3812
0.1571          7.03    3500    0.4863            0.3865
0.1544          8.03    4000    0.4677            0.3910
0.1302          9.04    4500    0.6305            0.4042
0.1111          10.04   5000    0.5104            0.3830
0.0986          11.04   5500    0.5332            0.3808
0.0880          12.05   6000    0.4494            0.3674
0.0827          13.05   6500    0.4779            0.3748
0.0740          14.06   7000    0.5315            0.3738
0.0664          15.06   7500    0.5367            0.3661
0.0568          16.06   8000    0.5707            0.3817
0.0570          17.07   8500    0.5381            0.3719
0.0561          18.07   9000    0.5353            0.3705
0.0487          19.08   9500    0.5087            0.3579
0.0444          20.08   10000   0.4910            0.3596
0.0433          21.08   10500   0.4931            0.3497
0.0363          22.09   11000   0.5414            0.3488
0.0318          23.09   11500   0.5405            0.3472
0.0330          24.10   12000   0.5476            0.3449
0.0262          25.10   12500   0.5529            0.3443
0.0255          26.10   13000   0.5299            0.3417
0.0252          27.11   13500   0.5092            0.3363
0.0221          28.11   14000   0.5309            0.3357
0.0223          29.12   14500   0.5320            0.3362
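
The WER column can be reproduced with the metric API available in the Datasets 1.18 line; a minimal sketch with placeholder strings:

```python
# Hedged sketch of computing word error rate with datasets.load_metric;
# the prediction/reference strings are placeholders, not model outputs.
from datasets import load_metric

wer_metric = load_metric("wer")
predictions = ["the cat sat on the mat"]  # model transcriptions (placeholder)
references = ["the cat sat on a mat"]     # ground-truth transcripts (placeholder)
print(wer_metric.compute(predictions=predictions, references=references))
```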

Framework versions

  • Transformers 4.17.0
  • PyTorch 1.12.1+cu113
  • Datasets 1.18.3
  • Tokenizers 0.12.1