
wav2vec2-xls-r-300m-Fleurs_AMMI_AFRIVOICE_LRSC-ln-5hrs-v2

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m. The training dataset is not documented in this card, though the model name suggests roughly 5 hours of Lingala (`ln`) speech drawn from the FLEURS, AMMI, AFRIVOICE, and LRSC corpora. It achieves the following results on the evaluation set:

  • Loss: 0.4635
  • Wer: 0.3074
  • Cer: 0.0975
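WER and CER are word- and character-level edit-distance rates: the Levenshtein distance between reference and hypothesis, divided by the reference length. A minimal self-contained sketch (not the evaluation script used for this card, which is more likely `jiwer`/`evaluate`):

```python
def edit_distance(ref, hyp):
    # Classic Levenshtein DP over two token sequences, one rolling row.
    dp = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, dp[0] = dp[0], i
        for j, h in enumerate(hyp, 1):
            # dp[j] is still the value from the previous row (deletion),
            # dp[j-1] is already updated (insertion), prev is the diagonal.
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (r != h))
    return dp[-1]

def wer(ref: str, hyp: str) -> float:
    """Word error rate: word-level edit distance / reference word count."""
    r, h = ref.split(), hyp.split()
    return edit_distance(r, h) / len(r)

def cer(ref: str, hyp: str) -> float:
    """Character error rate: character-level edit distance / reference length."""
    return edit_distance(list(ref), list(hyp)) / len(ref)
```

So a WER of 0.3074 means roughly three word-level edits per ten reference words.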

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 2
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
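With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate presumably decays linearly from 1e-05 toward zero over the planned run (394 optimizer steps per epoch in the log below × 100 epochs). A sketch of that schedule, assuming zero warmup and no gradient accumulation:

```python
BASE_LR = 1e-05
STEPS_PER_EPOCH = 394                        # from the training log below
NUM_EPOCHS = 100
TOTAL_STEPS = STEPS_PER_EPOCH * NUM_EPOCHS   # 39,400 planned optimizer steps

def linear_lr(step: int, warmup: int = 0) -> float:
    """Linear schedule: ramp up over `warmup` steps, then decay to zero."""
    if step < warmup:
        return BASE_LR * step / max(1, warmup)
    remaining = max(0, TOTAL_STEPS - step)
    return BASE_LR * remaining / max(1, TOTAL_STEPS - warmup)
```

Note the run stopped at epoch 77 (step 30,338), so the final learning rate was still above zero under this schedule.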

Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|
| 10.1117       | 1.0   | 394   | 4.3490          | 1.0    | 1.0    |
| 3.7471        | 2.0   | 788   | 3.1184          | 1.0    | 1.0    |
| 3.0204        | 3.0   | 1182  | 2.9072          | 1.0    | 1.0    |
| 2.9064        | 4.0   | 1576  | 2.8651          | 1.0    | 1.0    |
| 2.8666        | 5.0   | 1970  | 2.8332          | 1.0    | 1.0    |
| 2.8284        | 6.0   | 2364  | 2.8052          | 1.0    | 1.0    |
| 2.7745        | 7.0   | 2758  | 2.6265          | 1.0    | 0.9998 |
| 2.4996        | 8.0   | 3152  | 2.0214          | 1.0    | 0.5429 |
| 1.9287        | 9.0   | 3546  | 1.4032          | 0.9991 | 0.3316 |
| 1.5136        | 10.0  | 3940  | 1.1202          | 0.9125 | 0.2549 |
| 1.2999        | 11.0  | 4334  | 0.9748          | 0.7513 | 0.2049 |
| 1.1427        | 12.0  | 4728  | 0.8717          | 0.6135 | 0.1733 |
| 1.0469        | 13.0  | 5122  | 0.8070          | 0.5762 | 0.1658 |
| 0.9806        | 14.0  | 5516  | 0.7431          | 0.5455 | 0.1568 |
| 0.9187        | 15.0  | 5910  | 0.7146          | 0.5244 | 0.1512 |
| 0.8663        | 16.0  | 6304  | 0.6732          | 0.4882 | 0.1419 |
| 0.8084        | 17.0  | 6698  | 0.6555          | 0.4738 | 0.1390 |
| 0.7852        | 18.0  | 7092  | 0.6295          | 0.4550 | 0.1350 |
| 0.7505        | 19.0  | 7486  | 0.6140          | 0.4433 | 0.1311 |
| 0.7193        | 20.0  | 7880  | 0.6098          | 0.4250 | 0.1268 |
| 0.6927        | 21.0  | 8274  | 0.5722          | 0.4215 | 0.1260 |
| 0.6564        | 22.0  | 8668  | 0.5742          | 0.4074 | 0.1217 |
| 0.6699        | 23.0  | 9062  | 0.5864          | 0.4032 | 0.1220 |
| 0.6269        | 24.0  | 9456  | 0.5620          | 0.3918 | 0.1189 |
| 0.6142        | 25.0  | 9850  | 0.5484          | 0.3901 | 0.1177 |
| 0.6001        | 26.0  | 10244 | 0.5475          | 0.3829 | 0.1164 |
| 0.5953        | 27.0  | 10638 | 0.5361          | 0.3700 | 0.1132 |
| 0.584         | 28.0  | 11032 | 0.5283          | 0.3718 | 0.1136 |
| 0.5555        | 29.0  | 11426 | 0.5323          | 0.3693 | 0.1129 |
| 0.5453        | 30.0  | 11820 | 0.5048          | 0.3665 | 0.1126 |
| 0.5363        | 31.0  | 12214 | 0.5200          | 0.3542 | 0.1093 |
| 0.5196        | 32.0  | 12608 | 0.5118          | 0.3512 | 0.1086 |
| 0.5098        | 33.0  | 13002 | 0.5059          | 0.3514 | 0.1087 |
| 0.5042        | 34.0  | 13396 | 0.5109          | 0.3531 | 0.1094 |
| 0.4955        | 35.0  | 13790 | 0.4946          | 0.3456 | 0.1080 |
| 0.4958        | 36.0  | 14184 | 0.4943          | 0.3399 | 0.1069 |
| 0.4743        | 37.0  | 14578 | 0.4953          | 0.3395 | 0.1065 |
| 0.4739        | 38.0  | 14972 | 0.4996          | 0.3432 | 0.1077 |
| 0.4616        | 39.0  | 15366 | 0.4896          | 0.3393 | 0.1051 |
| 0.4509        | 40.0  | 15760 | 0.4793          | 0.3336 | 0.1042 |
| 0.4523        | 41.0  | 16154 | 0.4955          | 0.3316 | 0.1037 |
| 0.4301        | 42.0  | 16548 | 0.4827          | 0.3384 | 0.1047 |
| 0.4312        | 43.0  | 16942 | 0.4763          | 0.3247 | 0.1016 |
| 0.4157        | 44.0  | 17336 | 0.4786          | 0.3244 | 0.1016 |
| 0.4177        | 45.0  | 17730 | 0.4948          | 0.3273 | 0.1025 |
| 0.4207        | 46.0  | 18124 | 0.4869          | 0.3202 | 0.1016 |
| 0.4028        | 47.0  | 18518 | 0.4852          | 0.3259 | 0.1030 |
| 0.41          | 48.0  | 18912 | 0.4745          | 0.3231 | 0.1015 |
| 0.4042        | 49.0  | 19306 | 0.4866          | 0.3191 | 0.1003 |
| 0.3953        | 50.0  | 19700 | 0.4785          | 0.3111 | 0.0975 |
| 0.3889        | 51.0  | 20094 | 0.4798          | 0.3196 | 0.0992 |
| 0.3853        | 52.0  | 20488 | 0.4858          | 0.3182 | 0.0993 |
| 0.3819        | 53.0  | 20882 | 0.4848          | 0.3179 | 0.1005 |
| 0.3761        | 54.0  | 21276 | 0.4759          | 0.3134 | 0.0992 |
| 0.3652        | 55.0  | 21670 | 0.4643          | 0.3160 | 0.0996 |
| 0.3734        | 56.0  | 22064 | 0.4868          | 0.3062 | 0.0975 |
| 0.3574        | 57.0  | 22458 | 0.4636          | 0.3105 | 0.0989 |
| 0.3538        | 58.0  | 22852 | 0.4596          | 0.3139 | 0.1002 |
| 0.3731        | 59.0  | 23246 | 0.4738          | 0.3150 | 0.0996 |
| 0.3497        | 60.0  | 23640 | 0.4609          | 0.3152 | 0.0994 |
| 0.3393        | 61.0  | 24034 | 0.4627          | 0.3063 | 0.0975 |
| 0.3557        | 62.0  | 24428 | 0.4860          | 0.3032 | 0.0959 |
| 0.3461        | 63.0  | 24822 | 0.4818          | 0.3059 | 0.0975 |
| 0.3395        | 64.0  | 25216 | 0.4809          | 0.3049 | 0.0969 |
| 0.3314        | 65.0  | 25610 | 0.4736          | 0.3039 | 0.0958 |
| 0.3372        | 66.0  | 26004 | 0.4698          | 0.3066 | 0.0974 |
| 0.3356        | 67.0  | 26398 | 0.4809          | 0.2984 | 0.0949 |
| 0.3269        | 68.0  | 26792 | 0.4627          | 0.3074 | 0.0969 |
| 0.3206        | 69.0  | 27186 | 0.4681          | 0.3005 | 0.0966 |
| 0.3282        | 70.0  | 27580 | 0.4734          | 0.3031 | 0.0965 |
| 0.3256        | 71.0  | 27974 | 0.4781          | 0.3006 | 0.0952 |
| 0.3187        | 72.0  | 28368 | 0.4684          | 0.3061 | 0.0973 |
| 0.3229        | 73.0  | 28762 | 0.4793          | 0.2988 | 0.0953 |
| 0.3068        | 74.0  | 29156 | 0.4662          | 0.3010 | 0.0961 |
| 0.318         | 75.0  | 29550 | 0.4826          | 0.2996 | 0.0957 |
| 0.3053        | 76.0  | 29944 | 0.4695          | 0.2986 | 0.0946 |
| 0.3117        | 77.0  | 30338 | 0.4635          | 0.3074 | 0.0975 |
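To try the final checkpoint, a usage sketch with the Hugging Face `pipeline` API (the model ID is from this card; the audio path is a placeholder, and the pipeline expects 16 kHz audio to match wav2vec 2.0 pretraining):

```python
MODEL_ID = "asr-africa/wav2vec2-xls-r-300m-Fleurs_AMMI_AFRIVOICE_LRSC-ln-5hrs-v2"

def transcribe(audio_path: str) -> str:
    """Transcribe an audio file with the fine-tuned checkpoint."""
    # Import deferred so this sketch loads even without transformers installed;
    # the first call downloads ~1.2 GB of weights from the Hub.
    from transformers import pipeline
    asr = pipeline("automatic-speech-recognition", model=MODEL_ID)
    return asr(audio_path)["text"]

# transcribe("sample_lingala_16khz.wav")  # placeholder path
```

Note that the best validation WER in the log (0.2984 at epoch 67) is slightly better than the final epoch's 0.3074, so selecting by best checkpoint rather than last may help.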

Framework versions

  • Transformers 4.46.1
  • PyTorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.1