wav2vec2-timit-xls-r-53-wandb-colab

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the TIMIT dataset. It achieves the following results on the evaluation set (an inference sketch follows the metrics):

  • Loss: 0.3325
  • Wer: 0.2897
  • Cer: 0.0940
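
For transcription, the checkpoint can be loaded with the standard Wav2Vec2 CTC classes from Transformers. A minimal inference sketch, assuming the model is published under a hypothetical hub id (`your-username/wav2vec2-timit-xls-r-53-wandb-colab`) and that the input audio is mono:

```python
import torch
import torchaudio
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

# Hypothetical hub id; replace with the actual repository path.
model_id = "your-username/wav2vec2-timit-xls-r-53-wandb-colab"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Load an audio file and resample to the 16 kHz rate wav2vec2 expects.
waveform, sample_rate = torchaudio.load("sample.wav")
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```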

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a matching TrainingArguments sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 30
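
These settings map directly onto `transformers.TrainingArguments` (the listed Adam betas and epsilon are the Trainer defaults). A sketch of the corresponding configuration, assuming the Trainer API was used; the exact training script is not part of this card, and `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; paths and logging
# settings are assumptions, not taken from the original run.
training_args = TrainingArguments(
    output_dir="wav2vec2-timit-xls-r-53-wandb-colab",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=30,
    report_to="wandb",  # the model name suggests Weights & Biases logging
)
```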

Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|
| No log        | 0.69  | 400   | 3.1507          | 1.0    | 0.9806 |
| 4.3857        | 1.38  | 800   | 3.0109          | 1.0    | 0.9806 |
| 2.6835        | 2.08  | 1200  | 0.6181          | 0.5756 | 0.1795 |
| 0.9327        | 2.77  | 1600  | 0.4239          | 0.4718 | 0.1456 |
| 0.5602        | 3.46  | 2000  | 0.3691          | 0.4141 | 0.1301 |
| 0.5602        | 4.15  | 2400  | 0.3386          | 0.3894 | 0.1231 |
| 0.4407        | 4.84  | 2800  | 0.3122          | 0.3676 | 0.1177 |
| 0.3437        | 5.54  | 3200  | 0.3149          | 0.3601 | 0.1152 |
| 0.3154        | 6.23  | 3600  | 0.3146          | 0.3495 | 0.1119 |
| 0.267         | 6.92  | 4000  | 0.3039          | 0.3427 | 0.1089 |
| 0.267         | 7.61  | 4400  | 0.3313          | 0.3409 | 0.1092 |
| 0.2354        | 8.3   | 4800  | 0.2986          | 0.3365 | 0.1064 |
| 0.2191        | 9.0   | 5200  | 0.3235          | 0.3353 | 0.1074 |
| 0.1937        | 9.69  | 5600  | 0.3117          | 0.3320 | 0.1071 |
| 0.1803        | 10.38 | 6000  | 0.3102          | 0.3233 | 0.1040 |
| 0.1803        | 11.07 | 6400  | 0.3176          | 0.3196 | 0.1030 |
| 0.1635        | 11.76 | 6800  | 0.3166          | 0.3220 | 0.1036 |
| 0.1551        | 12.46 | 7200  | 0.2836          | 0.3160 | 0.1021 |
| 0.1566        | 13.15 | 7600  | 0.3146          | 0.3186 | 0.1032 |
| 0.1424        | 13.84 | 8000  | 0.3392          | 0.3167 | 0.1036 |
| 0.1424        | 14.53 | 8400  | 0.3254          | 0.3109 | 0.1001 |
| 0.1379        | 15.22 | 8800  | 0.3249          | 0.3127 | 0.1009 |
| 0.1192        | 15.92 | 9200  | 0.3408          | 0.3119 | 0.1010 |
| 0.1178        | 16.61 | 9600  | 0.3551          | 0.3061 | 0.0997 |
| 0.1112        | 17.3  | 10000 | 0.3250          | 0.3059 | 0.0991 |
| 0.1112        | 17.99 | 10400 | 0.3127          | 0.3037 | 0.0983 |
| 0.1022        | 18.69 | 10800 | 0.3370          | 0.3067 | 0.0994 |
| 0.1031        | 19.38 | 11200 | 0.3351          | 0.3048 | 0.0991 |
| 0.0926        | 20.07 | 11600 | 0.3433          | 0.2994 | 0.0974 |
| 0.0861        | 20.76 | 12000 | 0.3145          | 0.3003 | 0.0971 |
| 0.0861        | 21.45 | 12400 | 0.3367          | 0.2980 | 0.0973 |
| 0.0935        | 22.15 | 12800 | 0.3139          | 0.3016 | 0.0986 |
| 0.0784        | 22.84 | 13200 | 0.3181          | 0.2990 | 0.0972 |
| 0.078         | 23.53 | 13600 | 0.3347          | 0.2938 | 0.0961 |
| 0.0761        | 24.22 | 14000 | 0.3371          | 0.2921 | 0.0949 |
| 0.0761        | 24.91 | 14400 | 0.3274          | 0.2916 | 0.0952 |
| 0.0784        | 25.61 | 14800 | 0.3152          | 0.2927 | 0.0942 |
| 0.0714        | 26.3  | 15200 | 0.3237          | 0.2924 | 0.0943 |
| 0.0671        | 26.99 | 15600 | 0.3183          | 0.2914 | 0.0945 |
| 0.0684        | 27.68 | 16000 | 0.3307          | 0.2931 | 0.0950 |
| 0.0684        | 28.37 | 16400 | 0.3383          | 0.2913 | 0.0940 |
| 0.07          | 29.07 | 16800 | 0.3318          | 0.2901 | 0.0940 |
| 0.0624        | 29.76 | 17200 | 0.3325          | 0.2897 | 0.0940 |
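
The Wer and Cer columns report word and character error rates. A brief sketch of computing them with the evaluate library (with the older Datasets version listed below, `datasets.load_metric("wer")` works the same way); the strings here are placeholders, not outputs from this model:

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Placeholder data; in practice these come from decoding the eval split.
references = ["she had your dark suit in greasy wash water all year"]
predictions = ["she had your dark suit in greasy wash water all year"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```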

Framework versions

  • Transformers 4.32.0.dev0
  • Pytorch 2.0.1+cu118
  • Datasets 1.18.3
  • Tokenizers 0.13.3