---
license: apache-2.0
base_model: jadasdn/wav2vec2-2
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: wav2vec2-3
    results: []
---

# wav2vec2-3

This model is a fine-tuned version of [jadasdn/wav2vec2-2](https://huggingface.co/jadasdn/wav2vec2-2) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.9689
- Wer: 0.3821

## Model description

More information needed

## Intended uses & limitations

More information needed
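
The card does not document intended uses, but as a wav2vec2 CTC checkpoint evaluated with WER, the model is presumably meant for automatic speech recognition. A minimal inference sketch, assuming the checkpoint is hosted as `jadasdn/wav2vec2-3` with a bundled processor (the audio path is a placeholder):

```python
from transformers import pipeline

# ASR inference sketch; assumes a standard wav2vec2 CTC checkpoint
# with a bundled processor. The audio path is a placeholder.
asr = pipeline("automatic-speech-recognition", model="jadasdn/wav2vec2-3")

# wav2vec2 models expect 16 kHz audio; for file inputs the pipeline
# decodes and resamples via ffmpeg.
print(asr("audio.wav")["text"])
```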

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):

- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
- mixed_precision_training: Native AMP
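
For reference, these settings map onto `transformers.TrainingArguments` roughly as follows. This is a reconstruction from the list above, not the original training script; `output_dir` and the 500-step evaluation cadence are inferred, the latter from the results table:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-3",       # assumed; not stated in the card
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=30,
    fp16=True,                     # "Native AMP" mixed precision
    evaluation_strategy="steps",   # inferred from the 500-step eval rows
    eval_steps=500,
)
```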

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 0.5494        | 0.5   | 500   | 0.4696          | 0.3879 |
| 0.4979        | 1.0   | 1000  | 0.5284          | 0.4156 |
| 0.438         | 1.5   | 1500  | 0.5127          | 0.4121 |
| 0.4577        | 2.0   | 2000  | 0.5421          | 0.4162 |
| 0.3528        | 2.5   | 2500  | 0.5667          | 0.4117 |
| 0.3685        | 3.0   | 3000  | 0.5354          | 0.4109 |
| 0.3047        | 3.5   | 3500  | 0.5899          | 0.4244 |
| 0.3103        | 4.0   | 4000  | 0.5587          | 0.4107 |
| 0.2626        | 4.5   | 4500  | 0.6265          | 0.4255 |
| 0.2735        | 5.0   | 5000  | 0.6153          | 0.4104 |
| 0.2244        | 5.5   | 5500  | 0.5916          | 0.4058 |
| 0.2418        | 6.0   | 6000  | 0.5921          | 0.4134 |
| 0.2013        | 6.5   | 6500  | 0.6632          | 0.4079 |
| 0.211         | 7.0   | 7000  | 0.6563          | 0.4206 |
| 0.182         | 7.5   | 7500  | 0.6546          | 0.4096 |
| 0.1912        | 8.0   | 8000  | 0.6429          | 0.4043 |
| 0.1621        | 8.5   | 8500  | 0.7105          | 0.4060 |
| 0.1695        | 9.0   | 9000  | 0.6985          | 0.4027 |
| 0.1512        | 9.5   | 9500  | 0.7729          | 0.4060 |
| 0.1521        | 10.0  | 10000 | 0.7261          | 0.4047 |
| 0.1355        | 10.5  | 10500 | 0.7665          | 0.4097 |
| 0.1385        | 11.0  | 11000 | 0.7366          | 0.4044 |
| 0.1275        | 11.5  | 11500 | 0.7494          | 0.4046 |
| 0.1278        | 12.0  | 12000 | 0.7756          | 0.4028 |
| 0.1165        | 12.5  | 12500 | 0.8292          | 0.4064 |
| 0.1176        | 13.0  | 13000 | 0.7808          | 0.4053 |
| 0.1091        | 13.5  | 13500 | 0.8157          | 0.4052 |
| 0.1092        | 14.0  | 14000 | 0.8295          | 0.4071 |
| 0.1024        | 14.5  | 14500 | 0.8617          | 0.4032 |
| 0.1005        | 15.0  | 15000 | 0.8066          | 0.4042 |
| 0.0932        | 15.5  | 15500 | 0.8588          | 0.4022 |
| 0.0932        | 16.0  | 16000 | 0.8429          | 0.4036 |
| 0.087         | 16.5  | 16500 | 0.8975          | 0.4037 |
| 0.0866        | 17.0  | 17000 | 0.8916          | 0.3976 |
| 0.0835        | 17.5  | 17500 | 0.8566          | 0.3955 |
| 0.0828        | 18.0  | 18000 | 0.8581          | 0.3993 |
| 0.0756        | 18.5  | 18500 | 0.8827          | 0.3962 |
| 0.0789        | 19.0  | 19000 | 0.9072          | 0.3957 |
| 0.071         | 19.5  | 19500 | 0.9349          | 0.3920 |
| 0.0719        | 20.0  | 20000 | 0.9064          | 0.3931 |
| 0.0698        | 20.5  | 20500 | 0.9208          | 0.3932 |
| 0.0671        | 21.0  | 21000 | 0.8935          | 0.3927 |
| 0.0656        | 21.5  | 21500 | 0.9271          | 0.3936 |
| 0.0612        | 22.0  | 22000 | 0.9792          | 0.3979 |
| 0.0576        | 22.5  | 22500 | 0.9530          | 0.3917 |
| 0.0588        | 23.0  | 23000 | 0.9617          | 0.3928 |
| 0.0532        | 23.5  | 23500 | 0.9754          | 0.3854 |
| 0.0563        | 24.0  | 24000 | 0.9559          | 0.3915 |
| 0.053         | 24.5  | 24500 | 0.9845          | 0.3894 |
| 0.0512        | 25.0  | 25000 | 0.9629          | 0.3876 |
| 0.0516        | 25.5  | 25500 | 0.9565          | 0.3873 |
| 0.0491        | 26.0  | 26000 | 0.9726          | 0.3877 |
| 0.0473        | 26.5  | 26500 | 0.9831          | 0.3877 |
| 0.0441        | 27.0  | 27000 | 0.9645          | 0.3839 |
| 0.042         | 27.5  | 27500 | 1.0040          | 0.3851 |
| 0.0458        | 28.0  | 28000 | 0.9685          | 0.3839 |
| 0.043         | 28.5  | 28500 | 0.9687          | 0.3841 |
| 0.0399        | 29.0  | 29000 | 0.9764          | 0.3831 |
| 0.0421        | 29.5  | 29500 | 0.9691          | 0.3828 |
| 0.041         | 30.0  | 30000 | 0.9689          | 0.3821 |
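
Wer above is word error rate: the fraction of reference words that must be substituted, inserted, or deleted to match the prediction. It can be computed with the `evaluate` library (a toy sketch, not the card's own evaluation code):

```python
import evaluate

wer = evaluate.load("wer")

# One deletion ("the") against a 6-word reference -> WER = 1/6 ≈ 0.167
print(wer.compute(
    predictions=["the cat sat on mat"],
    references=["the cat sat on the mat"],
))
```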

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0