
wav2vec2-demo-F03-2

This model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53 (the training dataset is not specified in the card metadata). It achieves the following results on the evaluation set:

  • Loss: 1.4472
  • WER: 0.8797
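WER here is word error rate: the word-level edit distance between the model transcript and the reference, divided by the number of reference words. As a minimal sketch (a plain-Python edit-distance implementation, not the metric code used for this card, which libraries such as `jiwer` provide):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref = reference.split()
    hyp = hypothesis.split()
    # d[j] holds the edit distance between the first i reference words
    # and the first j hypothesis words (rolling single-row DP).
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev = d[0]          # d[i-1][0]
        d[0] = i
        for j, h in enumerate(hyp, 1):
            cur = d[j]       # d[i-1][j]
            d[j] = min(d[j] + 1,            # delete a reference word
                       d[j - 1] + 1,        # insert a hypothesis word
                       prev + (r != h))     # substitute (cost 0 if equal)
            prev = cur
    return d[len(hyp)] / len(ref)
```

Note that WER can exceed 1.0 when the hypothesis contains more errors than the reference has words, which is why some intermediate checkpoints below report WER above 1.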

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 30
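The linear scheduler with warmup ramps the learning rate from 0 to the base rate over the first 1000 steps, then decays it linearly toward 0 at the end of training. A minimal sketch of that schedule (the total-step count of 15450 is an estimate from the table below, roughly 515 steps/epoch × 30 epochs, not a value stated in the card):

```python
def linear_warmup_lr(step: int,
                     base_lr: float = 1e-4,
                     warmup_steps: int = 1000,
                     total_steps: int = 15450) -> float:
    """Linear warmup to base_lr, then linear decay to zero."""
    if step < warmup_steps:
        # Ramp up: 0 at step 0, base_lr at warmup_steps.
        return base_lr * step / warmup_steps
    # Decay: base_lr at warmup_steps, 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

In Transformers this behavior corresponds to `lr_scheduler_type="linear"` with `warmup_steps=1000` in `TrainingArguments`.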

Training results

Training Loss  Epoch  Step   Validation Loss  WER
24.7815        0.97   500    3.3881           1.0
3.3791         1.94   1000   3.2550           1.0
2.9748         2.91   1500   2.8719           1.0
2.8305         3.88   2000   2.7878           1.0
2.6289         4.85   2500   2.5009           1.2082
2.1553         5.83   3000   1.8680           1.3270
1.4669         6.8    3500   1.5138           1.3266
1.0475         7.77   4000   1.3531           1.2078
0.8132         8.74   4500   1.2666           1.1926
0.665          9.71   5000   1.2461           1.0980
0.5538         10.68  5500   1.3152           1.0719
0.4759         11.65  6000   1.3190           1.0902
0.4221         12.62  6500   1.3077           1.0184
0.3836         13.59  7000   1.3410           1.0645
0.3409         14.56  7500   1.2378           1.0246
0.3068         15.53  8000   1.3002           1.0152
0.2911         16.5   8500   1.3603           1.0074
0.2647         17.48  9000   1.3479           0.9375
0.2446         18.45  9500   1.3462           0.9187
0.2208         19.42  10000  1.4029           0.9109
0.2203         20.39  10500  1.4036           0.8977
0.204          21.36  11000  1.3755           0.8668
0.191          22.33  11500  1.3042           0.8691
0.1784         23.3   12000  1.4343           0.8758
0.1692         24.27  12500  1.4351           0.8492
0.1527         25.24  13000  1.4415           0.8645
0.1444         26.21  13500  1.4684           0.8812
0.1432         27.18  14000  1.4406           0.8906
0.1385         28.16  14500  1.4900           0.8840
0.1392         29.13  15000  1.4472           0.8797

Framework versions

  • Transformers 4.23.1
  • Pytorch 1.12.1+cu113
  • Datasets 1.18.3
  • Tokenizers 0.13.2