---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: wav2vec2-xls-r-300m-lg-CV-Fleurs-200hrs-v11
    results: []
---

# wav2vec2-xls-r-300m-lg-CV-Fleurs-200hrs-v11

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m). The training data is not recorded in this card, although the model name suggests roughly 200 hours of Luganda (lg) speech from Common Voice and FLEURS. It achieves the following results on the evaluation set (a short sketch of how the WER/CER metrics are computed follows the list):

- Loss: 0.5459
- Wer: 0.2673
- Cer: 0.0600
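
WER and CER here are presumably the standard word- and character-error rates. A minimal sketch of computing them with the `evaluate` library (the reference/prediction strings below are illustrative, not from the evaluation set):

```python
# Hedged sketch: WER/CER as computed by the `evaluate` library (jiwer-backed).
# The example sentence pair is made up for illustration.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["the quick brown fox jumps over the lazy dog"]
predictions = ["the quick brown box jumps over the lazy dog"]

# One substituted word out of nine -> WER ~0.111; one wrong character -> low CER.
print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```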

## Model description

More information needed

## Intended uses & limitations

More information needed
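
Although usage is not documented in this card, a minimal inference sketch with the standard Wav2Vec2 CTC API follows. The repo id is inferred from this card's name and owner, and `audio.wav` is a placeholder for a 16 kHz mono recording:

```python
# Hedged inference sketch: transcribe one audio file with this checkpoint.
# Assumptions: repo id inferred from the card; "audio.wav" is a placeholder.
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "sulaimank/wav2vec2-xls-r-300m-lg-CV-Fleurs-200hrs-v11"  # inferred
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# XLS-R expects 16 kHz input; librosa resamples on load.
speech, _ = librosa.load("audio.wav", sr=16_000)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: argmax over the vocabulary, then collapse repeats/blanks.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```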

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):

- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 80
- mixed_precision_training: Native AMP
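
As a hedged reconstruction, these settings map onto `transformers.TrainingArguments` roughly as follows. Only the values listed above come from the card; `output_dir` and everything else is a placeholder:

```python
# Hedged sketch: the reported hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-lg-CV-Fleurs-200hrs-v11",  # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=80,
    fp16=True,  # "Native AMP" mixed precision
)
```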

### Training results

| Training Loss | Epoch   | Step   | Validation Loss | Wer    | Cer    |
|:-------------:|:-------:|:------:|:---------------:|:------:|:------:|
| 0.5847        | 1.0     | 9604   | 0.3651          | 0.4380 | 0.1030 |
| 0.3444        | 2.0     | 19208  | 0.3452          | 0.4003 | 0.0948 |
| 0.3045        | 3.0     | 28812  | 0.3180          | 0.3849 | 0.0894 |
| 0.2762        | 4.0     | 38416  | 0.3008          | 0.3744 | 0.0859 |
| 0.2549        | 5.0     | 48020  | 0.3076          | 0.3745 | 0.0862 |
| 0.2392        | 6.0     | 57624  | 0.2865          | 0.3569 | 0.0814 |
| 0.225         | 7.0     | 67228  | 0.2839          | 0.3475 | 0.0799 |
| 0.2137        | 8.0     | 76832  | 0.2730          | 0.3496 | 0.0793 |
| 0.2031        | 9.0     | 86436  | 0.2895          | 0.3487 | 0.0803 |
| 0.1922        | 10.0    | 96040  | 0.2667          | 0.3519 | 0.0801 |
| 0.184         | 11.0    | 105644 | 0.2853          | 0.3340 | 0.0762 |
| 0.1748        | 12.0    | 115248 | 0.2582          | 0.3318 | 0.0756 |
| 0.1666        | 13.0    | 124852 | 0.2601          | 0.3324 | 0.0743 |
| 0.1605        | 14.0    | 134456 | 0.2627          | 0.3282 | 0.0752 |
| 0.1532        | 15.0    | 144060 | 0.2682          | 0.3248 | 0.0748 |
| 0.1462        | 16.0    | 153664 | 0.2782          | 0.3255 | 0.0745 |
| 0.139         | 17.0    | 163268 | 0.2798          | 0.3255 | 0.0737 |
| 0.1328        | 18.0    | 172872 | 0.2794          | 0.3212 | 0.0725 |
| 0.1267        | 19.0    | 182476 | 0.2855          | 0.3158 | 0.0723 |
| 0.121         | 20.0    | 192080 | 0.2801          | 0.3108 | 0.0702 |
| 0.1151        | 21.0    | 201684 | 0.2737          | 0.3060 | 0.0689 |
| 0.1104        | 22.0    | 211288 | 0.2774          | 0.3211 | 0.0727 |
| 0.1052        | 23.0    | 220892 | 0.2842          | 0.3125 | 0.0716 |
| 0.1001        | 24.0    | 230496 | 0.2951          | 0.3120 | 0.0698 |
| 0.095         | 25.0    | 240100 | 0.2926          | 0.3101 | 0.0703 |
| 0.0912        | 26.0    | 249704 | 0.2863          | 0.3042 | 0.0701 |
| 0.0881        | 27.0    | 259308 | 0.3057          | 0.3106 | 0.0701 |
| 0.0838        | 28.0    | 268912 | 0.3083          | 0.3099 | 0.0704 |
| 0.0811        | 29.0    | 278516 | 0.3272          | 0.3107 | 0.0701 |
| 0.0779        | 30.0    | 288120 | 0.3337          | 0.3110 | 0.0696 |
| 0.0746        | 31.0    | 297724 | 0.3389          | 0.3006 | 0.0679 |
| 0.0716        | 32.0    | 307328 | 0.3380          | 0.3021 | 0.0686 |
| 0.0691        | 33.0    | 316932 | 0.3334          | 0.3036 | 0.0676 |
| 0.0667        | 34.0    | 326536 | 0.3280          | 0.3033 | 0.0676 |
| 0.0646        | 35.0    | 336140 | 0.3451          | 0.3076 | 0.0682 |
| 0.0623        | 36.0    | 345744 | 0.3544          | 0.3012 | 0.0677 |
| 0.0604        | 37.0    | 355348 | 0.3688          | 0.3036 | 0.0686 |
| 0.0581        | 38.0    | 364952 | 0.3706          | 0.3040 | 0.0683 |
| 0.0567        | 39.0    | 374556 | 0.3936          | 0.2999 | 0.0679 |
| 0.0552        | 40.0    | 384160 | 0.3663          | 0.3034 | 0.0678 |
| 0.0529        | 41.0    | 393764 | 0.3894          | 0.3009 | 0.0681 |
| 0.052         | 42.0    | 403368 | 0.3807          | 0.2945 | 0.0669 |
| 0.0498        | 43.0    | 412972 | 0.3960          | 0.2945 | 0.0668 |
| 0.0487        | 44.0    | 422576 | 0.4331          | 0.2949 | 0.0677 |
| 0.0471        | 45.0    | 432180 | 0.4023          | 0.2926 | 0.0663 |
| 0.0458        | 46.0    | 441784 | 0.3923          | 0.2919 | 0.0660 |
| 0.0447        | 47.0    | 451388 | 0.4166          | 0.2957 | 0.0659 |
| 0.0428        | 48.0    | 460992 | 0.4066          | 0.2932 | 0.0656 |
| 0.0417        | 49.0    | 470596 | 0.4177          | 0.2929 | 0.0671 |
| 0.0409        | 50.0    | 480200 | 0.4262          | 0.2909 | 0.0656 |
| 0.0391        | 51.0    | 489804 | 0.4366          | 0.2875 | 0.0655 |
| 0.0381        | 52.0    | 499408 | 0.4492          | 0.2916 | 0.0647 |
| 0.0374        | 53.0    | 509012 | 0.4400          | 0.2822 | 0.0639 |
| 0.0362        | 54.0    | 518616 | 0.4480          | 0.2856 | 0.0644 |
| 0.0349        | 55.0    | 528220 | 0.4560          | 0.2845 | 0.0647 |
| 0.0339        | 56.0    | 537824 | 0.4718          | 0.2838 | 0.0642 |
| 0.0332        | 57.0    | 547428 | 0.4646          | 0.2854 | 0.0642 |
| 0.0324        | 58.0    | 557032 | 0.4635          | 0.2835 | 0.0639 |
| 0.0316        | 59.0    | 566636 | 0.4836          | 0.2847 | 0.0638 |
| 0.0302        | 60.0    | 576240 | 0.4745          | 0.2814 | 0.0634 |
| 0.0296        | 61.0    | 585844 | 0.4663          | 0.2786 | 0.0630 |
| 0.0285        | 62.0    | 595448 | 0.4630          | 0.2768 | 0.0625 |
| 0.028         | 63.0    | 605052 | 0.4861          | 0.2763 | 0.0626 |
| 0.027         | 64.0    | 614656 | 0.5029          | 0.2800 | 0.0632 |
| 0.0261        | 65.0    | 624260 | 0.4905          | 0.2791 | 0.0625 |
| 0.0253        | 66.0    | 633864 | 0.4920          | 0.2783 | 0.0621 |
| 0.0247        | 67.0    | 643468 | 0.4926          | 0.2796 | 0.0620 |
| 0.0238        | 68.0    | 653072 | 0.5030          | 0.2752 | 0.0619 |
| 0.0234        | 69.0    | 662676 | 0.4909          | 0.2734 | 0.0614 |
| 0.0229        | 70.0    | 672280 | 0.5069          | 0.2731 | 0.0612 |
| 0.022         | 71.0    | 681884 | 0.5141          | 0.2680 | 0.0607 |
| 0.0214        | 72.0    | 691488 | 0.5336          | 0.2692 | 0.0605 |
| 0.021         | 73.0    | 701092 | 0.5030          | 0.2676 | 0.0606 |
| 0.0204        | 74.0    | 710696 | 0.5245          | 0.2654 | 0.0599 |
| 0.0196        | 75.0    | 720300 | 0.5345          | 0.2653 | 0.0602 |
| 0.0194        | 76.0    | 729904 | 0.5288          | 0.2694 | 0.0607 |
| 0.0186        | 77.0    | 739508 | 0.5339          | 0.2666 | 0.0599 |
| 0.0182        | 78.0    | 749112 | 0.5432          | 0.2675 | 0.0602 |
| 0.018         | 79.0    | 758716 | 0.5457          | 0.2660 | 0.0600 |
| 0.018         | 79.9917 | 768240 | 0.5459          | 0.2673 | 0.0600 |

### Framework versions

- Transformers 4.47.0
- Pytorch 2.1.0+cu118
- Datasets 3.2.0
- Tokenizers 0.21.0
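
To approximate this environment, the pinned installs would presumably be as follows (the CUDA 11.8 wheel index is the standard one for `+cu118` PyTorch builds):

```bash
# Pinned to the framework versions listed above.
pip install transformers==4.47.0 datasets==3.2.0 tokenizers==0.21.0
pip install torch==2.1.0 --index-url https://download.pytorch.org/whl/cu118
```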