
wav2vec2-common_voice13

This model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53 on the Turkish (tr) subset of the mozilla-foundation/common_voice_13_0 dataset. It achieves the following results on the evaluation set (a usage sketch follows the list below):

  • Loss: 0.3366
  • WER: 0.2937
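
The card does not include usage code; below is a minimal inference sketch, assuming the checkpoint is published on the Hub (the repo id `your-username/wav2vec2-common_voice13` is a placeholder, not from the card).

```python
# Minimal sketch: transcribing Turkish speech with this checkpoint via the
# transformers ASR pipeline. The repo id below is a placeholder.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/wav2vec2-common_voice13",
)

# Input can be a path to a local audio file; 16 kHz mono is the natural
# choice, since wav2vec2-large-xlsr-53 was pretrained on 16 kHz audio.
result = asr("sample_tr.wav")
print(result["text"])
```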

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list for how they map onto the transformers Trainer API):

  • learning_rate: 0.0003
  • train_batch_size: 32
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 20.0
  • mixed_precision_training: Native AMP
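
The exact training script is not included in the card; the following is a minimal sketch, assuming the standard transformers Trainer setup, of how the listed settings map onto TrainingArguments (the output_dir is a placeholder).

```python
# Hedged sketch (an assumption, not the author's exact script): mapping the
# hyperparameters listed above onto transformers.TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-common_voice13",  # placeholder path
    learning_rate=3e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=20.0,
    fp16=True,  # "Native AMP" mixed-precision training
)
```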

Training results

Training Loss  Epoch  Step  Validation Loss  WER
4.0301 0.08 100 4.0286 1.0
3.1668 0.15 200 3.2323 1.0
2.3413 0.23 300 2.1300 0.9986
1.4546 0.31 400 0.8731 0.7629
1.4595 0.38 500 0.7366 0.7386
1.1903 0.46 600 0.6131 0.6645
1.1586 0.53 700 0.5491 0.6195
0.8275 0.61 800 0.5159 0.5923
1.0042 0.69 900 0.5153 0.6040
0.9428 0.76 1000 0.4629 0.5602
0.7592 0.84 1100 0.4670 0.5520
0.8284 0.92 1200 0.4455 0.5760
0.7736 0.99 1300 0.4571 0.5480
0.4047 1.07 1400 0.3962 0.4940
0.3543 1.14 1500 0.4018 0.4969
0.3898 1.22 1600 0.3901 0.4862
0.3827 1.3 1700 0.3982 0.4954
0.3316 1.37 1800 0.4139 0.5032
0.3365 1.45 1900 0.3964 0.4878
0.251 1.53 2000 0.4028 0.4899
0.2419 1.6 2100 0.3991 0.5190
0.3094 1.68 2200 0.3700 0.4865
0.3459 1.75 2300 0.3652 0.4850
0.3085 1.83 2400 0.3806 0.4742
0.4463 1.91 2500 0.3804 0.4729
0.2359 1.98 2600 0.3696 0.4635
0.1502 2.06 2700 0.3764 0.4602
0.2819 2.14 2800 0.3740 0.4499
0.22 2.21 2900 0.3811 0.4597
0.287 2.29 3000 0.3562 0.4334
0.2531 2.36 3100 0.3700 0.4442
0.3143 2.44 3200 0.3548 0.4333
0.203 2.52 3300 0.3659 0.4558
0.2609 2.59 3400 0.3557 0.4468
0.191 2.67 3500 0.3476 0.4281
0.1354 2.75 3600 0.3650 0.4354
0.2345 2.82 3700 0.3479 0.4385
0.1951 2.9 3800 0.3508 0.4489
0.2991 2.97 3900 0.3585 0.4356
0.1579 3.05 4000 0.3603 0.4326
0.2319 3.13 4100 0.3442 0.4201
0.1941 3.2 4200 0.3344 0.4116
0.2561 3.28 4300 0.3475 0.4200
0.3208 3.36 4400 0.3505 0.4089
0.2555 3.43 4500 0.3593 0.4271
0.1927 3.51 4600 0.3536 0.4299
0.1994 3.59 4700 0.3672 0.4400
0.1357 3.66 4800 0.3433 0.4223
0.2043 3.74 4900 0.3471 0.4226
0.194 3.81 5000 0.3380 0.4230
0.1779 3.89 5100 0.3400 0.4130
0.1934 3.97 5200 0.3438 0.4104
0.1432 4.04 5300 0.3632 0.4254
0.1642 4.12 5400 0.3425 0.4237
0.2208 4.2 5500 0.3580 0.4132
0.1923 4.27 5600 0.3469 0.4143
0.2084 4.35 5700 0.3619 0.4252
0.2484 4.42 5800 0.3452 0.4210
0.1899 4.5 5900 0.3465 0.4136
0.1253 4.58 6000 0.3625 0.4150
0.1353 4.65 6100 0.3415 0.4182
0.2264 4.73 6200 0.3446 0.4153
0.2016 4.81 6300 0.3343 0.4087
0.1634 4.88 6400 0.3500 0.4253
0.2517 4.96 6500 0.3453 0.4291
0.1826 5.03 6600 0.3442 0.4106
0.174 5.11 6700 0.3478 0.3999
0.271 5.19 6800 0.3423 0.4023
0.1812 5.26 6900 0.3679 0.4200
0.3 5.34 7000 0.3583 0.4191
0.2678 5.42 7100 0.3534 0.4141
0.236 5.49 7200 0.3361 0.4041
0.1558 5.57 7300 0.3495 0.4126
0.2603 5.64 7400 0.3359 0.3969
0.1285 5.72 7500 0.3296 0.3994
0.4608 5.8 7600 0.3453 0.3933
0.1516 5.87 7700 0.3509 0.4028
0.2655 5.95 7800 0.3607 0.4109
0.22 6.03 7900 0.3392 0.3850
0.0787 6.1 8000 0.3395 0.3842
0.1297 6.18 8100 0.3356 0.3822
0.1747 6.25 8200 0.3275 0.3874
0.1647 6.33 8300 0.3554 0.3941
0.1314 6.41 8400 0.3287 0.3826
0.1264 6.48 8500 0.3122 0.3876
0.1229 6.56 8600 0.3525 0.3994
0.108 6.64 8700 0.3387 0.3968
0.185 6.71 8800 0.3333 0.3840
0.0924 6.79 8900 0.3366 0.3827
0.1226 6.86 9000 0.3243 0.3788
0.2005 6.94 9100 0.3324 0.3765
0.133 7.02 9200 0.3294 0.3688
0.0633 7.09 9300 0.3279 0.3738
0.0593 7.17 9400 0.3311 0.3639
0.088 7.25 9500 0.3221 0.3765
0.1489 7.32 9600 0.3421 0.3788
0.1175 7.4 9700 0.3191 0.3786
0.0983 7.48 9800 0.3303 0.3764
0.1493 7.55 9900 0.3371 0.3836
0.1091 7.63 10000 0.3410 0.3739
0.1058 7.7 10100 0.3262 0.3730
0.0849 7.78 10200 0.3379 0.3812
0.1362 7.86 10300 0.3291 0.3781
0.1227 7.93 10400 0.3235 0.3760
0.1647 8.01 10500 0.3285 0.3686
0.1013 8.09 10600 0.3319 0.3729
0.1432 8.16 10700 0.3280 0.3731
0.1345 8.24 10800 0.3237 0.3707
0.0813 8.31 10900 0.3285 0.3748
0.1063 8.39 11000 0.3321 0.3748
0.1342 8.47 11100 0.3171 0.3647
0.1202 8.54 11200 0.3209 0.3636
0.0987 8.62 11300 0.3224 0.3625
0.1357 8.7 11400 0.3245 0.3646
0.1038 8.77 11500 0.3172 0.3702
0.0961 8.85 11600 0.3080 0.3611
0.1836 8.92 11700 0.3112 0.3681
0.0951 9.0 11800 0.3157 0.3649
0.1162 9.08 11900 0.3188 0.3714
0.1157 9.15 12000 0.3383 0.3775
0.1268 9.23 12100 0.3204 0.3752
0.1402 9.31 12200 0.3441 0.3707
0.1094 9.38 12300 0.3415 0.3675
0.1122 9.46 12400 0.3150 0.3596
0.0932 9.53 12500 0.3195 0.3561
0.1176 9.61 12600 0.3250 0.3675
0.1287 9.69 12700 0.3253 0.3615
0.0886 9.76 12800 0.3276 0.3636
0.1016 9.84 12900 0.3185 0.3592
0.0902 9.92 13000 0.3177 0.3643
0.1304 9.99 13100 0.3131 0.3530
0.099 10.07 13200 0.3094 0.3525
0.1142 10.14 13300 0.3298 0.3609
0.1836 10.22 13400 0.3213 0.3526
0.1533 10.3 13500 0.3163 0.3579
0.1436 10.37 13600 0.3352 0.3543
0.1215 10.45 13700 0.3355 0.3458
0.0971 10.53 13800 0.3232 0.3579
0.1215 10.6 13900 0.3168 0.3441
0.0906 10.68 14000 0.3266 0.3498
0.125 10.76 14100 0.3318 0.3414
0.0831 10.83 14200 0.3030 0.3480
0.1588 10.91 14300 0.3155 0.3455
0.1191 10.98 14400 0.3287 0.3487
0.074 11.06 14500 0.3176 0.3431
0.1075 11.14 14600 0.3219 0.3446
0.0679 11.21 14700 0.3158 0.3414
0.0789 11.29 14800 0.3305 0.3491
0.1426 11.37 14900 0.3281 0.3485
0.1154 11.44 15000 0.3368 0.3482
0.1313 11.52 15100 0.3285 0.3415
0.0786 11.59 15200 0.3138 0.3439
0.0595 11.67 15300 0.3135 0.3431
0.0868 11.75 15400 0.3049 0.3396
0.0812 11.82 15500 0.3050 0.3373
0.1199 11.9 15600 0.3238 0.3392
0.1243 11.98 15700 0.3123 0.3368
0.0663 12.05 15800 0.3226 0.3373
0.0285 12.13 15900 0.3260 0.3367
0.0607 12.2 16000 0.3236 0.3406
0.064 12.28 16100 0.3297 0.3357
0.0554 12.36 16200 0.3357 0.3383
0.0561 12.43 16300 0.3211 0.3387
0.0785 12.51 16400 0.3140 0.3386
0.0539 12.59 16500 0.3130 0.3361
0.0873 12.66 16600 0.3244 0.3344
0.0774 12.74 16700 0.3128 0.3274
0.0853 12.81 16800 0.3185 0.3395
0.0701 12.89 16900 0.3244 0.3327
0.0486 12.97 17000 0.3100 0.3317
0.1087 13.04 17100 0.3351 0.3327
0.0716 13.12 17200 0.3474 0.3383
0.0653 13.2 17300 0.3361 0.3364
0.0936 13.27 17400 0.3423 0.3352
0.0761 13.35 17500 0.3261 0.3304
0.0723 13.42 17600 0.3298 0.3333
0.0756 13.5 17700 0.3282 0.3367
0.058 13.58 17800 0.3386 0.3303
0.0619 13.65 17900 0.3354 0.3306
0.081 13.73 18000 0.3413 0.3317
0.0893 13.81 18100 0.3257 0.3278
0.0858 13.88 18200 0.3312 0.3255
0.0756 13.96 18300 0.3279 0.3326
0.0946 14.04 18400 0.3412 0.3272
0.1452 14.11 18500 0.3394 0.3266
0.0772 14.19 18600 0.3271 0.3261
0.0748 14.26 18700 0.3338 0.3272
0.0789 14.34 18800 0.3461 0.3254
0.0967 14.42 18900 0.3163 0.3250
0.0938 14.49 19000 0.3273 0.3261
0.1134 14.57 19100 0.3301 0.3284
0.1051 14.65 19200 0.3187 0.3215
0.0936 14.72 19300 0.3211 0.3197
0.0528 14.8 19400 0.3381 0.3270
0.1497 14.87 19500 0.3291 0.3235
0.1168 14.95 19600 0.3290 0.3238
0.028 15.03 19700 0.3333 0.3209
0.0773 15.1 19800 0.3359 0.3206
0.0972 15.18 19900 0.3262 0.3163
0.0391 15.26 20000 0.3335 0.3180
0.0571 15.33 20100 0.3445 0.3198
0.0365 15.41 20200 0.3318 0.3170
0.0535 15.48 20300 0.3257 0.3147
0.0739 15.56 20400 0.3359 0.3136
0.0753 15.64 20500 0.3216 0.3195
0.1507 15.71 20600 0.3326 0.3154
0.062 15.79 20700 0.3310 0.3164
0.0595 15.87 20800 0.3134 0.3162
0.0456 15.94 20900 0.3146 0.3127
0.0977 16.02 21000 0.3328 0.3117
0.036 16.09 21100 0.3266 0.3134
0.0308 16.17 21200 0.3306 0.3136
0.0612 16.25 21300 0.3207 0.3160
0.0269 16.32 21400 0.3429 0.3143
0.0897 16.4 21500 0.3355 0.3111
0.0458 16.48 21600 0.3238 0.3065
0.0155 16.55 21700 0.3167 0.3042
0.0519 16.63 21800 0.3296 0.3099
0.0807 16.7 21900 0.3250 0.3048
0.0406 16.78 22000 0.3283 0.3087
0.0773 16.86 22100 0.3217 0.3047
0.1027 16.93 22200 0.3279 0.3108
0.0315 17.01 22300 0.3173 0.3058
0.0457 17.09 22400 0.3387 0.3085
0.0516 17.16 22500 0.3309 0.3050
0.0413 17.24 22600 0.3363 0.3067
0.0601 17.32 22700 0.3325 0.3048
0.0435 17.39 22800 0.3298 0.3058
0.0571 17.47 22900 0.3244 0.3033
0.0656 17.54 23000 0.3350 0.3056
0.0485 17.62 23100 0.3406 0.3051
0.0619 17.7 23200 0.3268 0.3033
0.0495 17.77 23300 0.3268 0.3031
0.0416 17.85 23400 0.3268 0.3038
0.0646 17.93 23500 0.3314 0.3009
0.0294 18.0 23600 0.3251 0.3028
0.0372 18.08 23700 0.3364 0.2962
0.04 18.15 23800 0.3358 0.2967
0.0367 18.23 23900 0.3317 0.3031
0.0312 18.31 24000 0.3272 0.2998
0.0419 18.38 24100 0.3358 0.2996
0.0477 18.46 24200 0.3283 0.2996
0.0256 18.54 24300 0.3310 0.2995
0.0269 18.61 24400 0.3325 0.2997
0.0309 18.69 24500 0.3345 0.2974
0.0441 18.76 24600 0.3345 0.3003
0.0496 18.84 24700 0.3396 0.2985
0.0425 18.92 24800 0.3425 0.2965
0.0196 18.99 24900 0.3373 0.2964
0.0348 19.07 25000 0.3361 0.2955
0.0466 19.15 25100 0.3328 0.2959
0.0422 19.22 25200 0.3343 0.2964
0.0271 19.3 25300 0.3369 0.2945
0.053 19.37 25400 0.3330 0.2953
0.0662 19.45 25500 0.3343 0.2958
0.0718 19.53 25600 0.3330 0.2952
0.0212 19.6 25700 0.3352 0.2940
0.0971 19.68 25800 0.3374 0.2935
0.0413 19.76 25900 0.3362 0.2933
0.0477 19.83 26000 0.3356 0.2940
0.1068 19.91 26100 0.3365 0.2937
0.108 19.98 26200 0.3366 0.2935
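
For reference, here is a minimal sketch, assuming the `evaluate` library, of how WER values like those in the table above are typically computed from hypothesis and reference transcripts; the example strings are illustrative only.

```python
# Word error rate (WER) = word-level edit distance / reference word count.
# Illustrative example, not data from this model's evaluation run.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["merhaba dünya"]            # model hypothesis
references = ["merhaba dünya nasılsın"]    # ground-truth transcript

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # one deleted word out of three -> 0.3333
```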

Framework versions

  • Transformers 4.29.2
  • PyTorch 2.0.1
  • Datasets 2.13.1
  • Tokenizers 0.13.2
