
wav2vec2-large-xlsr-sw-common-voice-16

This model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53 on the Swahili (sw) subset of the common_voice_16_0 dataset. It achieves the following results on the evaluation set:

  • Loss: inf
  • Wer: 0.3082

The validation loss is reported as inf throughout training (see the table below), most likely a numerical overflow under mixed-precision (Native AMP) evaluation; the WER column is the meaningful evaluation signal.
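
A WER (word error rate) of 0.3082 means roughly 31% of reference words are substituted, deleted, or inserted in the model's transcripts. Below is a minimal sketch of computing the same metric with the Hugging Face `evaluate` library; the example strings are illustrative, not drawn from the actual evaluation split:

```python
# Word error rate sketch using the Hugging Face `evaluate` library.
import evaluate

wer_metric = evaluate.load("wer")

# Illustrative strings; in practice these come from the eval split and model output.
references = ["habari ya asubuhi"]
predictions = ["habari za asubuhi"]

wer = wer_metric.compute(references=references, predictions=predictions)
print(f"WER: {wer:.4f}")  # word-level (substitutions + deletions + insertions) / N
```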

Model description

A Swahili automatic speech recognition model: the ~316M-parameter (float32) wav2vec2-large-xlsr-53 encoder fine-tuned with a CTC head on Common Voice 16.0 Swahili speech. No further description was provided.

Intended uses & limitations

More information needed
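
The card does not include a usage example, so here is a minimal transcription sketch. The repository id is a placeholder (the card gives no namespace), the audio path is illustrative, and 16 kHz mono input is assumed per the wav2vec2 XLSR convention:

```python
# Minimal inference sketch; repo id and audio path are placeholders.
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

MODEL_ID = "your-username/wav2vec2-large-xlsr-sw-common-voice-16"  # placeholder repo id

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# wav2vec2 XLSR models expect 16 kHz mono input.
speech, _ = librosa.load("sample.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values).logits

predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```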

Training and evaluation data

More information needed
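
The splits and preprocessing are not documented. A hedged sketch of loading the dataset this checkpoint was presumably trained on follows; the `mozilla-foundation/common_voice_16_0` dataset id and the `sw` config are assumptions based on the model name, and Common Voice datasets on the Hub are gated, so authentication (e.g. `huggingface-cli login`) may be required:

```python
# Dataset-loading sketch; dataset id and "sw" config are assumptions from the model name.
from datasets import Audio, load_dataset

common_voice = load_dataset(
    "mozilla-foundation/common_voice_16_0",  # assumed dataset id (gated)
    "sw",                                    # assumed Swahili config
    split="train",
)

# wav2vec2 expects 16 kHz audio, so resample the audio column accordingly.
common_voice = common_voice.cast_column("audio", Audio(sampling_rate=16_000))
print(common_voice[0]["sentence"])
```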

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged TrainingArguments sketch follows the list):

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 30
  • mixed_precision_training: Native AMP
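
A sketch of `TrainingArguments` mirroring the values listed above; the output directory is a placeholder, and anything not listed on the card (saving/logging cadence, etc.) is left at its default:

```python
# TrainingArguments mirroring the listed hyperparameters; output_dir is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-sw-common-voice-16",  # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 8 * 2 = 16
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
    fp16=True,                       # "Native AMP" mixed-precision training
    # Adam betas/epsilon below match the card (they are also the defaults).
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```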

Training results

| Training Loss | Epoch | Step   | Validation Loss | Wer    |
|:-------------:|:-----:|:------:|:---------------:|:------:|
| 5.8691 | 0.11 | 400 | inf | 1.0 |
| 1.673 | 0.22 | 800 | inf | 0.7178 |
| 0.6454 | 0.33 | 1200 | inf | 0.6274 |
| 0.5527 | 0.44 | 1600 | inf | 0.5747 |
| 0.4989 | 0.55 | 2000 | inf | 0.5174 |
| 0.4827 | 0.66 | 2400 | inf | 0.5302 |
| 0.4462 | 0.77 | 2800 | inf | 0.4916 |
| 0.4374 | 0.88 | 3200 | inf | 0.4769 |
| 0.4183 | 0.99 | 3600 | inf | 0.4687 |
| 0.3854 | 1.1 | 4000 | inf | 0.4669 |
| 0.3802 | 1.2 | 4400 | inf | 0.4513 |
| 0.3727 | 1.31 | 4800 | inf | 0.4505 |
| 0.3694 | 1.42 | 5200 | inf | 0.4405 |
| 0.3709 | 1.53 | 5600 | inf | 0.4364 |
| 0.363 | 1.64 | 6000 | inf | 0.4318 |
| 0.3669 | 1.75 | 6400 | inf | 0.4398 |
| 0.3597 | 1.86 | 6800 | inf | 0.4353 |
| 0.3541 | 1.97 | 7200 | inf | 0.4251 |
| 0.3277 | 2.08 | 7600 | inf | 0.4153 |
| 0.3211 | 2.19 | 8000 | inf | 0.4178 |
| 0.3225 | 2.3 | 8400 | inf | 0.4267 |
| 0.3215 | 2.41 | 8800 | inf | 0.4139 |
| 0.3224 | 2.52 | 9200 | inf | 0.4054 |
| 0.3106 | 2.63 | 9600 | inf | 0.4155 |
| 0.3141 | 2.74 | 10000 | inf | 0.4188 |
| 0.3189 | 2.85 | 10400 | inf | 0.4036 |
| 0.3213 | 2.96 | 10800 | inf | 0.4071 |
| 0.3005 | 3.07 | 11200 | inf | 0.3954 |
| 0.2872 | 3.18 | 11600 | inf | 0.3974 |
| 0.2855 | 3.29 | 12000 | inf | 0.3982 |
| 0.2898 | 3.39 | 12400 | inf | 0.3987 |
| 0.288 | 3.5 | 12800 | inf | 0.4021 |
| 0.2941 | 3.61 | 13200 | inf | 0.3955 |
| 0.2951 | 3.72 | 13600 | inf | 0.4022 |
| 0.2916 | 3.83 | 14000 | inf | 0.3960 |
| 0.2896 | 3.94 | 14400 | inf | 0.3903 |
| 0.2794 | 4.05 | 14800 | inf | 0.3918 |
| 0.2707 | 4.16 | 15200 | inf | 0.3873 |
| 0.2682 | 4.27 | 15600 | inf | 0.3927 |
| 0.2654 | 4.38 | 16000 | inf | 0.3844 |
| 0.2699 | 4.49 | 16400 | inf | 0.3909 |
| 0.2723 | 4.6 | 16800 | inf | 0.3904 |
| 0.2762 | 4.71 | 17200 | inf | 0.3857 |
| 0.2621 | 4.82 | 17600 | inf | 0.3795 |
| 0.26 | 4.93 | 18000 | inf | 0.3764 |
| 0.2659 | 5.04 | 18400 | inf | 0.3842 |
| 0.2479 | 5.15 | 18800 | inf | 0.3719 |
| 0.2518 | 5.26 | 19200 | inf | 0.3822 |
| 0.2591 | 5.37 | 19600 | inf | 0.3837 |
| 0.2491 | 5.48 | 20000 | inf | 0.3871 |
| 0.2466 | 5.59 | 20400 | inf | 0.3747 |
| 0.2519 | 5.69 | 20800 | inf | 0.3788 |
| 0.2516 | 5.8 | 21200 | inf | 0.3781 |
| 0.2422 | 5.91 | 21600 | inf | 0.3751 |
| 0.2439 | 6.02 | 22000 | inf | 0.3693 |
| 0.2327 | 6.13 | 22400 | inf | 0.3752 |
| 0.2327 | 6.24 | 22800 | inf | 0.3706 |
| 0.2302 | 6.35 | 23200 | inf | 0.3687 |
| 0.2313 | 6.46 | 23600 | inf | 0.3690 |
| 0.2363 | 6.57 | 24000 | inf | 0.3686 |
| 0.2329 | 6.68 | 24400 | inf | 0.3681 |
| 0.2328 | 6.79 | 24800 | inf | 0.3626 |
| 0.2329 | 6.9 | 25200 | inf | 0.3652 |
| 0.2254 | 7.01 | 25600 | inf | 0.3606 |
| 0.2124 | 7.12 | 26000 | inf | 0.3648 |
| 0.2206 | 7.23 | 26400 | inf | 0.3686 |
| 0.2151 | 7.34 | 26800 | inf | 0.3646 |
| 0.2167 | 7.45 | 27200 | inf | 0.3630 |
| 0.2196 | 7.56 | 27600 | inf | 0.3597 |
| 0.2089 | 7.67 | 28000 | inf | 0.3561 |
| 0.2183 | 7.78 | 28400 | inf | 0.3593 |
| 0.2148 | 7.89 | 28800 | inf | 0.3580 |
| 0.2232 | 7.99 | 29200 | inf | 0.3597 |
| 0.2002 | 8.1 | 29600 | inf | 0.3581 |
| 0.1924 | 8.21 | 30000 | inf | 0.3585 |
| 0.2046 | 8.32 | 30400 | inf | 0.3606 |
| 0.2057 | 8.43 | 30800 | inf | 0.3611 |
| 0.2042 | 8.54 | 31200 | inf | 0.3618 |
| 0.21 | 8.65 | 31600 | inf | 0.3599 |
| 0.2076 | 8.76 | 32000 | inf | 0.3568 |
| 0.208 | 8.87 | 32400 | inf | 0.3564 |
| 0.2154 | 8.98 | 32800 | inf | 0.3566 |
| 0.1991 | 9.09 | 33200 | inf | 0.3621 |
| 0.1986 | 9.2 | 33600 | inf | 0.3571 |
| 0.1898 | 9.31 | 34000 | inf | 0.3515 |
| 0.1961 | 9.42 | 34400 | inf | 0.3559 |
| 0.1947 | 9.53 | 34800 | inf | 0.3521 |
| 0.1886 | 9.64 | 35200 | inf | 0.3500 |
| 0.1901 | 9.75 | 35600 | inf | 0.3557 |
| 0.1998 | 9.86 | 36000 | inf | 0.3547 |
| 0.1873 | 9.97 | 36400 | inf | 0.3498 |
| 0.1858 | 10.08 | 36800 | inf | 0.3552 |
| 0.1804 | 10.18 | 37200 | inf | 0.3518 |
| 0.18 | 10.29 | 37600 | inf | 0.3504 |
| 0.1777 | 10.4 | 38000 | inf | 0.3532 |
| 0.1777 | 10.51 | 38400 | inf | 0.3530 |
| 0.1801 | 10.62 | 38800 | inf | 0.3515 |
| 0.1839 | 10.73 | 39200 | inf | 0.3538 |
| 0.1913 | 10.84 | 39600 | inf | 0.3554 |
| 0.1909 | 10.95 | 40000 | inf | 0.3479 |
| 0.1812 | 11.06 | 40400 | inf | 0.3467 |
| 0.1664 | 11.17 | 40800 | inf | 0.3491 |
| 0.175 | 11.28 | 41200 | inf | 0.3446 |
| 0.1733 | 11.39 | 41600 | inf | 0.3464 |
| 0.1709 | 11.5 | 42000 | inf | 0.3467 |
| 0.1777 | 11.61 | 42400 | inf | 0.3469 |
| 0.1735 | 11.72 | 42800 | inf | 0.3452 |
| 0.1765 | 11.83 | 43200 | inf | 0.3471 |
| 0.1738 | 11.94 | 43600 | inf | 0.3496 |
| 0.1649 | 12.05 | 44000 | inf | 0.3445 |
| 0.1601 | 12.16 | 44400 | inf | 0.3464 |
| 0.1603 | 12.27 | 44800 | inf | 0.3416 |
| 0.1634 | 12.38 | 45200 | inf | 0.3445 |
| 0.1628 | 12.48 | 45600 | inf | 0.3452 |
| 0.1621 | 12.59 | 46000 | inf | 0.3403 |
| 0.1596 | 12.7 | 46400 | inf | 0.3394 |
| 0.1589 | 12.81 | 46800 | inf | 0.3401 |
| 0.1632 | 12.92 | 47200 | inf | 0.3403 |
| 0.163 | 13.03 | 47600 | inf | 0.3429 |
| 0.1516 | 13.14 | 48000 | inf | 0.3417 |
| 0.1506 | 13.25 | 48400 | inf | 0.3417 |
| 0.1568 | 13.36 | 48800 | inf | 0.3410 |
| 0.1543 | 13.47 | 49200 | inf | 0.3409 |
| 0.1574 | 13.58 | 49600 | inf | 0.3408 |
| 0.1555 | 13.69 | 50000 | inf | 0.3424 |
| 0.1535 | 13.8 | 50400 | inf | 0.3395 |
| 0.1539 | 13.91 | 50800 | inf | 0.3409 |
| 0.1528 | 14.02 | 51200 | inf | 0.3406 |
| 0.1411 | 14.13 | 51600 | inf | 0.3366 |
| 0.1413 | 14.24 | 52000 | inf | 0.3402 |
| 0.1477 | 14.35 | 52400 | inf | 0.3386 |
| 0.1433 | 14.46 | 52800 | inf | 0.3356 |
| 0.1446 | 14.57 | 53200 | inf | 0.3357 |
| 0.1427 | 14.67 | 53600 | inf | 0.3378 |
| 0.1462 | 14.78 | 54000 | inf | 0.3328 |
| 0.1436 | 14.89 | 54400 | inf | 0.3358 |
| 0.1434 | 15.0 | 54800 | inf | 0.3366 |
| 0.135 | 15.11 | 55200 | inf | 0.3354 |
| 0.1375 | 15.22 | 55600 | inf | 0.3355 |
| 0.1366 | 15.33 | 56000 | inf | 0.3356 |
| 0.1389 | 15.44 | 56400 | inf | 0.3336 |
| 0.1378 | 15.55 | 56800 | inf | 0.3364 |
| 0.1362 | 15.66 | 57200 | inf | 0.3325 |
| 0.1376 | 15.77 | 57600 | inf | 0.3361 |
| 0.1323 | 15.88 | 58000 | inf | 0.3364 |
| 0.1343 | 15.99 | 58400 | inf | 0.3332 |
| 0.1257 | 16.1 | 58800 | inf | 0.3339 |
| 0.1239 | 16.21 | 59200 | inf | 0.3316 |
| 0.1292 | 16.32 | 59600 | inf | 0.3313 |
| 0.1297 | 16.43 | 60000 | inf | 0.3332 |
| 0.1265 | 16.54 | 60400 | inf | 0.3313 |
| 0.1271 | 16.65 | 60800 | inf | 0.3310 |
| 0.1315 | 16.76 | 61200 | inf | 0.3307 |
| 0.1271 | 16.87 | 61600 | inf | 0.3337 |
| 0.1298 | 16.97 | 62000 | inf | 0.3318 |
| 0.1211 | 17.08 | 62400 | inf | 0.3326 |
| 0.1192 | 17.19 | 62800 | inf | 0.3290 |
| 0.1232 | 17.3 | 63200 | inf | 0.3291 |
| 0.1229 | 17.41 | 63600 | inf | 0.3349 |
| 0.1162 | 17.52 | 64000 | inf | 0.3281 |
| 0.1207 | 17.63 | 64400 | inf | 0.3308 |
| 0.1179 | 17.74 | 64800 | inf | 0.3257 |
| 0.1207 | 17.85 | 65200 | inf | 0.3290 |
| 0.1256 | 17.96 | 65600 | inf | 0.3297 |
| 0.119 | 18.07 | 66000 | inf | 0.3279 |
| 0.1111 | 18.18 | 66400 | inf | 0.3302 |
| 0.1086 | 18.29 | 66800 | inf | 0.3285 |
| 0.1179 | 18.4 | 67200 | inf | 0.3274 |
| 0.1099 | 18.51 | 67600 | inf | 0.3281 |
| 0.1141 | 18.62 | 68000 | inf | 0.3281 |
| 0.1091 | 18.73 | 68400 | inf | 0.3301 |
| 0.1147 | 18.84 | 68800 | inf | 0.3270 |
| 0.1158 | 18.95 | 69200 | inf | 0.3246 |
| 0.1111 | 19.06 | 69600 | inf | 0.3227 |
| 0.1075 | 19.16 | 70000 | inf | 0.3249 |
| 0.1051 | 19.27 | 70400 | inf | 0.3253 |
| 0.1029 | 19.38 | 70800 | inf | 0.3252 |
| 0.1039 | 19.49 | 71200 | inf | 0.3264 |
| 0.1063 | 19.6 | 71600 | inf | 0.3242 |
| 0.1071 | 19.71 | 72000 | inf | 0.3250 |
| 0.1063 | 19.82 | 72400 | inf | 0.3248 |
| 0.1085 | 19.93 | 72800 | inf | 0.3247 |
| 0.1038 | 20.04 | 73200 | inf | 0.3242 |
| 0.1017 | 20.15 | 73600 | inf | 0.3255 |
| 0.099 | 20.26 | 74000 | inf | 0.3247 |
| 0.0971 | 20.37 | 74400 | inf | 0.3258 |
| 0.1002 | 20.48 | 74800 | inf | 0.3223 |
| 0.1013 | 20.59 | 75200 | inf | 0.3230 |
| 0.1018 | 20.7 | 75600 | inf | 0.3232 |
| 0.0967 | 20.81 | 76000 | inf | 0.3215 |
| 0.1008 | 20.92 | 76400 | inf | 0.3212 |
| 0.0975 | 21.03 | 76800 | inf | 0.3191 |
| 0.0893 | 21.14 | 77200 | inf | 0.3210 |
| 0.0911 | 21.25 | 77600 | inf | 0.3206 |
| 0.0959 | 21.36 | 78000 | inf | 0.3211 |
| 0.094 | 21.46 | 78400 | inf | 0.3198 |
| 0.0939 | 21.57 | 78800 | inf | 0.3202 |
| 0.0936 | 21.68 | 79200 | inf | 0.3202 |
| 0.0938 | 21.79 | 79600 | inf | 0.3195 |
| 0.0938 | 21.9 | 80000 | inf | 0.3184 |
| 0.0916 | 22.01 | 80400 | inf | 0.3185 |
| 0.0858 | 22.12 | 80800 | inf | 0.3177 |
| 0.0909 | 22.23 | 81200 | inf | 0.3211 |
| 0.0915 | 22.34 | 81600 | inf | 0.3222 |
| 0.088 | 22.45 | 82000 | inf | 0.3194 |
| 0.0902 | 22.56 | 82400 | inf | 0.3199 |
| 0.0868 | 22.67 | 82800 | inf | 0.3174 |
| 0.0871 | 22.78 | 83200 | inf | 0.3201 |
| 0.0908 | 22.89 | 83600 | inf | 0.3177 |
| 0.0842 | 23.0 | 84000 | inf | 0.3187 |
| 0.0842 | 23.11 | 84400 | inf | 0.3168 |
| 0.0815 | 23.22 | 84800 | inf | 0.3187 |
| 0.084 | 23.33 | 85200 | inf | 0.3201 |
| 0.0835 | 23.44 | 85600 | inf | 0.3185 |
| 0.0821 | 23.55 | 86000 | inf | 0.3189 |
| 0.0836 | 23.66 | 86400 | inf | 0.3179 |
| 0.0816 | 23.76 | 86800 | inf | 0.3174 |
| 0.0847 | 23.87 | 87200 | inf | 0.3172 |
| 0.0828 | 23.98 | 87600 | inf | 0.3178 |
| 0.0796 | 24.09 | 88000 | inf | 0.3144 |
| 0.0793 | 24.2 | 88400 | inf | 0.3149 |
| 0.0773 | 24.31 | 88800 | inf | 0.3165 |
| 0.0808 | 24.42 | 89200 | inf | 0.3154 |
| 0.0743 | 24.53 | 89600 | inf | 0.3159 |
| 0.078 | 24.64 | 90000 | inf | 0.3145 |
| 0.0792 | 24.75 | 90400 | inf | 0.3170 |
| 0.0775 | 24.86 | 90800 | inf | 0.3134 |
| 0.0763 | 24.97 | 91200 | inf | 0.3144 |
| 0.0705 | 25.08 | 91600 | inf | 0.3138 |
| 0.0724 | 25.19 | 92000 | inf | 0.3156 |
| 0.0732 | 25.3 | 92400 | inf | 0.3158 |
| 0.0743 | 25.41 | 92800 | inf | 0.3144 |
| 0.0729 | 25.52 | 93200 | inf | 0.3133 |
| 0.071 | 25.63 | 93600 | inf | 0.3139 |
| 0.0764 | 25.74 | 94000 | inf | 0.3122 |
| 0.0726 | 25.85 | 94400 | inf | 0.3128 |
| 0.0714 | 25.95 | 94800 | inf | 0.3135 |
| 0.0725 | 26.06 | 95200 | inf | 0.3147 |
| 0.0711 | 26.17 | 95600 | inf | 0.3130 |
| 0.0684 | 26.28 | 96000 | inf | 0.3125 |
| 0.0683 | 26.39 | 96400 | inf | 0.3144 |
| 0.0698 | 26.5 | 96800 | inf | 0.3135 |
| 0.0687 | 26.61 | 97200 | inf | 0.3131 |
| 0.0675 | 26.72 | 97600 | inf | 0.3119 |
| 0.0678 | 26.83 | 98000 | inf | 0.3105 |
| 0.0677 | 26.94 | 98400 | inf | 0.3102 |
| 0.068 | 27.05 | 98800 | inf | 0.3128 |
| 0.0694 | 27.16 | 99200 | inf | 0.3111 |
| 0.0681 | 27.27 | 99600 | inf | 0.3118 |
| 0.0656 | 27.38 | 100000 | inf | 0.3110 |
| 0.065 | 27.49 | 100400 | inf | 0.3113 |
| 0.0649 | 27.6 | 100800 | inf | 0.3113 |
| 0.0643 | 27.71 | 101200 | inf | 0.3107 |
| 0.0651 | 27.82 | 101600 | inf | 0.3102 |
| 0.0643 | 27.93 | 102000 | inf | 0.3109 |
| 0.063 | 28.04 | 102400 | inf | 0.3110 |
| 0.0604 | 28.15 | 102800 | inf | 0.3108 |
| 0.062 | 28.25 | 103200 | inf | 0.3110 |
| 0.0623 | 28.36 | 103600 | inf | 0.3106 |
| 0.063 | 28.47 | 104000 | inf | 0.3102 |
| 0.0619 | 28.58 | 104400 | inf | 0.3101 |
| 0.0636 | 28.69 | 104800 | inf | 0.3108 |
| 0.0636 | 28.8 | 105200 | inf | 0.3099 |
| 0.0643 | 28.91 | 105600 | inf | 0.3089 |
| 0.0607 | 29.02 | 106000 | inf | 0.3094 |
| 0.0597 | 29.13 | 106400 | inf | 0.3091 |
| 0.0616 | 29.24 | 106800 | inf | 0.3087 |
| 0.0594 | 29.35 | 107200 | inf | 0.3087 |
| 0.0614 | 29.46 | 107600 | inf | 0.3087 |
| 0.06 | 29.57 | 108000 | inf | 0.3082 |
| 0.0617 | 29.68 | 108400 | inf | 0.3085 |
| 0.0574 | 29.79 | 108800 | inf | 0.3082 |
| 0.06 | 29.9 | 109200 | inf | 0.3082 |

Framework versions

  • Transformers 4.37.1
  • Pytorch 2.2.0+cu121
  • Datasets 2.16.1
  • Tokenizers 0.15.0
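
For reproducibility, a small sketch that checks a local environment against the versions listed above:

```python
# Quick environment check against the framework versions listed above.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.37.1",
    "torch": "2.2.0+cu121",
    "datasets": "2.16.1",
    "tokenizers": "0.15.0",
}
actual = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, version in expected.items():
    status = "OK" if actual[name] == version else f"got {actual[name]}"
    print(f"{name}=={version}: {status}")
```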