wav2vec2-xls-r-300m-lg-CV-Fleurs-100hrs-v10

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on Luganda (lg) speech data; the model name indicates 100 hours drawn from Common Voice and FLEURS. It achieves the following results on the evaluation set:

  • Loss: 0.6365
  • WER: 0.2931
  • CER: 0.0656

Model description

More information needed

Intended uses & limitations

More information needed
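
In the meantime, here is a minimal transcription sketch. Assumptions: the Hub repo id taken from the model tree at the end of this card, a placeholder audio file `sample.wav`, and that you have access to the (gated) repository.

```python
# Minimal sketch: greedy CTC transcription with this checkpoint.
# Assumptions: repo id per the model tree below; "sample.wav" is a placeholder path.
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "asr-africa/wav2vec2-xls-r-300m-lg-CV-Fleurs-100hrs-v10"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id).eval()

waveform, sr = torchaudio.load("sample.wav")      # placeholder audio path
if sr != 16_000:                                  # XLS-R expects 16 kHz input
    waveform = torchaudio.functional.resample(waveform, sr, 16_000)

inputs = processor(waveform.squeeze(0).numpy(),
                   sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits    # (batch, frames, vocab)

pred_ids = torch.argmax(logits, dim=-1)           # greedy per-frame argmax
print(processor.batch_decode(pred_ids)[0])        # collapses repeats and blanks
```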

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
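
In Trainer terms, these settings correspond roughly to the sketch below. The `output_dir` is an assumption, and the dataset, data collator, and metric wiring are omitted.

```python
# Hedged sketch: a TrainingArguments configuration mirroring the list above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-lg-CV-Fleurs-100hrs-v10",  # assumption
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    optim="adamw_torch",            # betas=(0.9, 0.999), epsilon=1e-08 are the defaults
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                      # "Native AMP" mixed-precision training
)
```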

Training results

| Training Loss | Epoch   | Step   | Validation Loss | WER    | CER    |
|:-------------:|:-------:|:------:|:---------------:|:------:|:------:|
| 0.8123        | 0.9999  | 4598   | 0.4324          | 0.4954 | 0.1197 |
| 0.4178        | 2.0     | 9197   | 0.3691          | 0.4337 | 0.1018 |
| 0.3645        | 2.9999  | 13795  | 0.3475          | 0.4050 | 0.0953 |
| 0.3323        | 4.0     | 18394  | 0.3526          | 0.3988 | 0.0928 |
| 0.3081        | 4.9999  | 22992  | 0.3250          | 0.3959 | 0.0905 |
| 0.2869        | 6.0     | 27591  | 0.3213          | 0.3985 | 0.0920 |
| 0.2708        | 6.9999  | 32189  | 0.3122          | 0.3849 | 0.0877 |
| 0.2574        | 8.0     | 36788  | 0.3229          | 0.3790 | 0.0875 |
| 0.2424        | 8.9999  | 41386  | 0.3118          | 0.3654 | 0.0855 |
| 0.2311        | 10.0    | 45985  | 0.3190          | 0.3533 | 0.0830 |
| 0.2198        | 10.9999 | 50583  | 0.3085          | 0.3568 | 0.0831 |
| 0.2115        | 12.0    | 55182  | 0.3070          | 0.3592 | 0.0822 |
| 0.2027        | 12.9999 | 59780  | 0.3034          | 0.3586 | 0.0833 |
| 0.1934        | 14.0    | 64379  | 0.3135          | 0.3612 | 0.0824 |
| 0.1848        | 14.9999 | 68977  | 0.3126          | 0.3521 | 0.0803 |
| 0.1763        | 16.0    | 73576  | 0.3159          | 0.3577 | 0.0807 |
| 0.1693        | 16.9999 | 78174  | 0.3276          | 0.3602 | 0.0813 |
| 0.1626        | 18.0    | 82773  | 0.3286          | 0.3453 | 0.0804 |
| 0.156         | 18.9999 | 87371  | 0.3176          | 0.3476 | 0.0819 |
| 0.1488        | 20.0    | 91970  | 0.3171          | 0.3379 | 0.0782 |
| 0.1417        | 20.9999 | 96568  | 0.3363          | 0.3377 | 0.0769 |
| 0.1363        | 22.0    | 101167 | 0.3244          | 0.3484 | 0.0790 |
| 0.1307        | 22.9999 | 105765 | 0.3232          | 0.3336 | 0.0769 |
| 0.1244        | 24.0    | 110364 | 0.3464          | 0.3400 | 0.0776 |
| 0.1198        | 24.9999 | 114962 | 0.3575          | 0.3364 | 0.0760 |
| 0.1154        | 26.0    | 119561 | 0.3416          | 0.3421 | 0.0785 |
| 0.1113        | 26.9999 | 124159 | 0.3718          | 0.3414 | 0.0789 |
| 0.1075        | 28.0    | 128758 | 0.3571          | 0.3384 | 0.0760 |
| 0.1037        | 28.9999 | 133356 | 0.3716          | 0.3409 | 0.0781 |
| 0.1016        | 30.0    | 137955 | 0.3909          | 0.3414 | 0.0768 |
| 0.0976        | 30.9999 | 142553 | 0.3745          | 0.3383 | 0.0760 |
| 0.0946        | 32.0    | 147152 | 0.4116          | 0.3380 | 0.0775 |
| 0.0913        | 32.9999 | 151750 | 0.3901          | 0.3319 | 0.0759 |
| 0.0881        | 34.0    | 156349 | 0.3900          | 0.3401 | 0.0762 |
| 0.0853        | 34.9999 | 160947 | 0.4045          | 0.3300 | 0.0749 |
| 0.0828        | 36.0    | 165546 | 0.4353          | 0.3320 | 0.0760 |
| 0.0805        | 36.9999 | 170144 | 0.4063          | 0.3356 | 0.0770 |
| 0.0789        | 38.0    | 174743 | 0.4147          | 0.3272 | 0.0757 |
| 0.0769        | 38.9999 | 179341 | 0.4324          | 0.3215 | 0.0744 |
| 0.0744        | 40.0    | 183940 | 0.4317          | 0.3433 | 0.0759 |
| 0.072         | 40.9999 | 188538 | 0.4173          | 0.3273 | 0.0748 |
| 0.0706        | 42.0    | 193137 | 0.4326          | 0.3283 | 0.0749 |
| 0.0681        | 42.9999 | 197735 | 0.4483          | 0.3212 | 0.0738 |
| 0.0671        | 44.0    | 202334 | 0.4612          | 0.3296 | 0.0750 |
| 0.0648        | 44.9999 | 206932 | 0.4639          | 0.3244 | 0.0747 |
| 0.0634        | 46.0    | 211531 | 0.4635          | 0.3247 | 0.0736 |
| 0.0609        | 46.9999 | 216129 | 0.4613          | 0.3184 | 0.0731 |
| 0.0598        | 48.0    | 220728 | 0.4987          | 0.3308 | 0.0739 |
| 0.059         | 48.9999 | 225326 | 0.4680          | 0.3237 | 0.0730 |
| 0.0585        | 50.0    | 229925 | 0.4722          | 0.3265 | 0.0737 |
| 0.0565        | 50.9999 | 234523 | 0.4736          | 0.3160 | 0.0724 |
| 0.0548        | 52.0    | 239122 | 0.4904          | 0.3226 | 0.0725 |
| 0.0533        | 52.9999 | 243720 | 0.5052          | 0.3189 | 0.0721 |
| 0.0522        | 54.0    | 248319 | 0.4949          | 0.3169 | 0.0713 |
| 0.0516        | 54.9999 | 252917 | 0.4909          | 0.3132 | 0.0713 |
| 0.0498        | 56.0    | 257516 | 0.5333          | 0.3131 | 0.0708 |
| 0.0484        | 56.9999 | 262114 | 0.5058          | 0.3182 | 0.0718 |
| 0.048         | 58.0    | 266713 | 0.5239          | 0.3180 | 0.0721 |
| 0.0466        | 58.9999 | 271311 | 0.4904          | 0.3157 | 0.0717 |
| 0.0458        | 60.0    | 275910 | 0.5162          | 0.3148 | 0.0708 |
| 0.0446        | 60.9999 | 280508 | 0.4864          | 0.3154 | 0.0703 |
| 0.0435        | 62.0    | 285107 | 0.5206          | 0.3157 | 0.0707 |
| 0.0427        | 62.9999 | 289705 | 0.5272          | 0.3092 | 0.0699 |
| 0.042         | 64.0    | 294304 | 0.5192          | 0.3160 | 0.0706 |
| 0.0405        | 64.9999 | 298902 | 0.5194          | 0.3070 | 0.0691 |
| 0.0401        | 66.0    | 303501 | 0.5382          | 0.3144 | 0.0707 |
| 0.0387        | 66.9999 | 308099 | 0.5159          | 0.3069 | 0.0692 |
| 0.0385        | 68.0    | 312698 | 0.5353          | 0.3138 | 0.0708 |
| 0.0374        | 68.9999 | 317296 | 0.4952          | 0.3070 | 0.0696 |
| 0.0368        | 70.0    | 321895 | 0.5551          | 0.3076 | 0.0691 |
| 0.0358        | 70.9999 | 326493 | 0.5521          | 0.3083 | 0.0690 |
| 0.0347        | 72.0    | 331092 | 0.5671          | 0.3054 | 0.0686 |
| 0.0336        | 72.9999 | 335690 | 0.5652          | 0.3028 | 0.0691 |
| 0.0327        | 74.0    | 340289 | 0.5574          | 0.3027 | 0.0688 |
| 0.0321        | 74.9999 | 344887 | 0.5515          | 0.2998 | 0.0678 |
| 0.0313        | 76.0    | 349486 | 0.5528          | 0.3015 | 0.0681 |
| 0.0307        | 76.9999 | 354084 | 0.5727          | 0.3008 | 0.0677 |
| 0.0302        | 78.0    | 358683 | 0.5684          | 0.3017 | 0.0677 |
| 0.03          | 78.9999 | 363281 | 0.5654          | 0.3028 | 0.0679 |
| 0.0285        | 80.0    | 367880 | 0.5822          | 0.3049 | 0.0678 |
| 0.0282        | 80.9999 | 372478 | 0.5999          | 0.3042 | 0.0676 |
| 0.0275        | 82.0    | 377077 | 0.5716          | 0.3048 | 0.0679 |
| 0.027         | 82.9999 | 381675 | 0.6061          | 0.2974 | 0.0673 |
| 0.0261        | 84.0    | 386274 | 0.5713          | 0.3069 | 0.0681 |
| 0.0252        | 84.9999 | 390872 | 0.6035          | 0.3054 | 0.0681 |
| 0.0248        | 86.0    | 395471 | 0.6045          | 0.2998 | 0.0675 |
| 0.0238        | 86.9999 | 400069 | 0.6126          | 0.3003 | 0.0678 |
| 0.0239        | 88.0    | 404668 | 0.6153          | 0.2965 | 0.0665 |
| 0.0235        | 88.9999 | 409266 | 0.6171          | 0.2966 | 0.0667 |
| 0.0227        | 90.0    | 413865 | 0.6168          | 0.2967 | 0.0665 |
| 0.0219        | 90.9999 | 418463 | 0.6202          | 0.2948 | 0.0663 |
| 0.0222        | 92.0    | 423062 | 0.6212          | 0.2935 | 0.0660 |
| 0.0215        | 92.9999 | 427660 | 0.6165          | 0.2937 | 0.0660 |
| 0.0208        | 94.0    | 432259 | 0.6102          | 0.2952 | 0.0661 |
| 0.0204        | 94.9999 | 436857 | 0.6251          | 0.2932 | 0.0659 |
| 0.0204        | 96.0    | 441456 | 0.6254          | 0.2923 | 0.0657 |
| 0.0193        | 96.9999 | 446054 | 0.6297          | 0.2939 | 0.0658 |
| 0.0195        | 98.0    | 450653 | 0.6331          | 0.2939 | 0.0657 |
| 0.0193        | 98.9999 | 455251 | 0.6314          | 0.2933 | 0.0657 |
| 0.0189        | 99.9891 | 459800 | 0.6365          | 0.2931 | 0.0656 |
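
Note that validation loss bottoms out around epoch 13 (0.3034) and drifts upward afterwards, while WER and CER keep improving slowly through epoch 100; the headline numbers above come from the final row. To recompute WER and CER against your own references, a minimal sketch with the `evaluate` library (the transcripts below are placeholders, not data from this model):

```python
# Hedged sketch: computing WER/CER with the `evaluate` library.
# Requires: pip install evaluate jiwer
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["ekibuga kya kampala"]   # placeholder model transcripts
references  = ["ekibuga kya Kampala"]   # placeholder ground-truth transcripts

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```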

Framework versions

  • Transformers 4.46.3
  • PyTorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.3

Model tree for asr-africa/wav2vec2-xls-r-300m-lg-CV-Fleurs-100hrs-v10

Fine-tuned from facebook/wav2vec2-xls-r-300m (one of 524 fine-tunes of that base model).