
# wav2vec2-turkish-300m-2

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice_16_1 dataset. It achieves the following results on the evaluation set:

- Loss: 0.3293
- Wer: 0.2786
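
Since the card does not yet include usage instructions, here is a minimal inference sketch using the `transformers` pipeline API. The repo id and audio path are placeholders, not values taken from this card:

```python
# Minimal inference sketch; the repo id and audio file are hypothetical.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="<user>/wav2vec2-turkish-300m-2",  # hypothetical Hub repo id
)

# Input should be a speech recording; the pipeline decodes and resamples
# common audio formats to the model's expected sampling rate (16 kHz).
result = asr("sample_turkish.wav")
print(result["text"])
```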

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 20
- mixed_precision_training: Native AMP
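
As a rough illustration, these hyperparameters map onto the Hugging Face `TrainingArguments` as sketched below. This is an assumption-laden reconstruction, not the author's actual training script; in particular, the warmup value of 0.1 is read as a ratio rather than a step count:

```python
# Hypothetical reconstruction of the training configuration above.
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the transformers defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-turkish-300m-2",
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,   # assumed: 0.1 reads as a ratio, not a step count
    num_train_epochs=20,
    fp16=True,          # native AMP mixed-precision training
)
```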

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---:|
| 2.234 | 0.1461 | 400 | 0.7428 | 0.7198 |
| 0.6888 | 0.2922 | 800 | 0.7178 | 0.7417 |
| 0.5978 | 0.4383 | 1200 | 0.5479 | 0.6138 |
| 0.5608 | 0.5844 | 1600 | 0.5362 | 0.5827 |
| 0.52 | 0.7305 | 2000 | 0.6510 | 0.6688 |
| 0.5019 | 0.8766 | 2400 | 0.5023 | 0.5676 |
| 0.4791 | 1.0226 | 2800 | 0.4218 | 0.5065 |
| 0.4217 | 1.1687 | 3200 | 0.4133 | 0.4860 |
| 0.416 | 1.3148 | 3600 | 0.4329 | 0.4964 |
| 0.4067 | 1.4609 | 4000 | 0.4084 | 0.4871 |
| 0.4046 | 1.6070 | 4400 | 0.4238 | 0.5113 |
| 0.4108 | 1.7531 | 4800 | 0.4063 | 0.4854 |
| 0.4018 | 1.8992 | 5200 | 0.4123 | 0.4861 |
| 0.3878 | 2.0453 | 5600 | 0.3969 | 0.4763 |
| 0.351 | 2.1914 | 6000 | 0.4012 | 0.4690 |
| 0.3454 | 2.3375 | 6400 | 0.4011 | 0.4741 |
| 0.3487 | 2.4836 | 6800 | 0.3908 | 0.4816 |
| 0.3417 | 2.6297 | 7200 | 0.3884 | 0.4578 |
| 0.3408 | 2.7757 | 7600 | 0.4002 | 0.4674 |
| 0.3374 | 2.9218 | 8000 | 0.3914 | 0.4524 |
| 0.3121 | 3.0679 | 8400 | 0.4217 | 0.4763 |
| 0.3025 | 3.2140 | 8800 | 0.3905 | 0.4598 |
| 0.3016 | 3.3601 | 9200 | 0.3831 | 0.4505 |
| 0.2974 | 3.5062 | 9600 | 0.3948 | 0.4568 |
| 0.3032 | 3.6523 | 10000 | 0.3888 | 0.4512 |
| 0.2929 | 3.7984 | 10400 | 0.3900 | 0.4461 |
| 0.2922 | 3.9445 | 10800 | 0.3846 | 0.4540 |
| 0.2745 | 4.0906 | 11200 | 0.3748 | 0.4374 |
| 0.2665 | 4.2367 | 11600 | 0.3993 | 0.4336 |
| 0.2696 | 4.3828 | 12000 | 0.3657 | 0.4333 |
| 0.2622 | 4.5289 | 12400 | 0.3628 | 0.4341 |
| 0.2585 | 4.6749 | 12800 | 0.3802 | 0.4288 |
| 0.2604 | 4.8210 | 13200 | 0.3604 | 0.4236 |
| 0.2575 | 4.9671 | 13600 | 0.3650 | 0.4322 |
| 0.2313 | 5.1132 | 14000 | 0.3521 | 0.4021 |
| 0.2323 | 5.2593 | 14400 | 0.3513 | 0.4009 |
| 0.2351 | 5.4054 | 14800 | 0.3265 | 0.3963 |
| 0.2229 | 5.5515 | 15200 | 0.3523 | 0.3978 |
| 0.234 | 5.6976 | 15600 | 0.3375 | 0.3931 |
| 0.229 | 5.8437 | 16000 | 0.3380 | 0.3945 |
| 0.2313 | 5.9898 | 16400 | 0.3403 | 0.3957 |
| 0.2069 | 6.1359 | 16800 | 0.3522 | 0.3979 |
| 0.2162 | 6.2820 | 17200 | 0.3685 | 0.4061 |
| 0.2144 | 6.4280 | 17600 | 0.3308 | 0.3878 |
| 0.2115 | 6.5741 | 18000 | 0.3530 | 0.3974 |
| 0.2108 | 6.7202 | 18400 | 0.3191 | 0.3802 |
| 0.2107 | 6.8663 | 18800 | 0.3313 | 0.3818 |
| 0.1977 | 7.0124 | 19200 | 0.3454 | 0.3807 |
| 0.1903 | 7.1585 | 19600 | 0.3386 | 0.3785 |
| 0.1924 | 7.3046 | 20000 | 0.3369 | 0.3841 |
| 0.1912 | 7.4507 | 20400 | 0.3385 | 0.3782 |
| 0.1879 | 7.5968 | 20800 | 0.3302 | 0.3728 |
| 0.1903 | 7.7429 | 21200 | 0.3254 | 0.3636 |
| 0.1828 | 7.8890 | 21600 | 0.3499 | 0.3723 |
| 0.1803 | 8.0351 | 22000 | 0.3371 | 0.3834 |
| 0.1711 | 8.1812 | 22400 | 0.3498 | 0.3879 |
| 0.169 | 8.3272 | 22800 | 0.3332 | 0.3731 |
| 0.1657 | 8.4733 | 23200 | 0.3223 | 0.3665 |
| 0.1682 | 8.6194 | 23600 | 0.3386 | 0.3696 |
| 0.1732 | 8.7655 | 24000 | 0.3564 | 0.3726 |
| 0.1723 | 8.9116 | 24400 | 0.3336 | 0.3685 |
| 0.1681 | 9.0577 | 24800 | 0.3328 | 0.3543 |
| 0.1547 | 9.2038 | 25200 | 0.3358 | 0.3533 |
| 0.1572 | 9.3499 | 25600 | 0.3088 | 0.3563 |
| 0.1518 | 9.4960 | 26000 | 0.3219 | 0.3513 |
| 0.1532 | 9.6421 | 26400 | 0.3060 | 0.3491 |
| 0.154 | 9.7882 | 26800 | 0.3091 | 0.3457 |
| 0.1478 | 9.9343 | 27200 | 0.3159 | 0.3401 |
| 0.1499 | 10.0804 | 27600 | 0.3219 | 0.3485 |
| 0.1337 | 10.2264 | 28000 | 0.3109 | 0.3443 |
| 0.1364 | 10.3725 | 28400 | 0.3281 | 0.3456 |
| 0.1329 | 10.5186 | 28800 | 0.3143 | 0.3408 |
| 0.146 | 10.6647 | 29200 | 0.3285 | 0.3383 |
| 0.1403 | 10.8108 | 29600 | 0.3180 | 0.3387 |
| 0.14 | 10.9569 | 30000 | 0.3086 | 0.3350 |
| 0.1258 | 11.1030 | 30400 | 0.3253 | 0.3345 |
| 0.1229 | 11.2491 | 30800 | 0.3236 | 0.3392 |
| 0.1241 | 11.3952 | 31200 | 0.3257 | 0.3349 |
| 0.1224 | 11.5413 | 31600 | 0.3260 | 0.3287 |
| 0.1218 | 11.6874 | 32000 | 0.3153 | 0.3330 |
| 0.1267 | 11.8335 | 32400 | 0.3141 | 0.3298 |
| 0.1246 | 11.9795 | 32800 | 0.3144 | 0.3281 |
| 0.113 | 12.1256 | 33200 | 0.3415 | 0.3367 |
| 0.1121 | 12.2717 | 33600 | 0.3262 | 0.3294 |
| 0.1147 | 12.4178 | 34000 | 0.3378 | 0.3287 |
| 0.114 | 12.5639 | 34400 | 0.3121 | 0.3240 |
| 0.1054 | 12.7100 | 34800 | 0.3288 | 0.3199 |
| 0.1081 | 12.8561 | 35200 | 0.3010 | 0.3220 |
| 0.1137 | 13.0022 | 35600 | 0.3261 | 0.3229 |
| 0.102 | 13.1483 | 36000 | 0.3168 | 0.3177 |
| 0.1 | 13.2944 | 36400 | 0.3224 | 0.3173 |
| 0.1003 | 13.4405 | 36800 | 0.3175 | 0.3205 |
| 0.098 | 13.5866 | 37200 | 0.3021 | 0.3158 |
| 0.0974 | 13.7327 | 37600 | 0.3057 | 0.3154 |
| 0.0952 | 13.8787 | 38000 | 0.3257 | 0.3155 |
| 0.095 | 14.0248 | 38400 | 0.3229 | 0.3097 |
| 0.0902 | 14.1709 | 38800 | 0.3285 | 0.3152 |
| 0.0917 | 14.3170 | 39200 | 0.3279 | 0.3160 |
| 0.0905 | 14.4631 | 39600 | 0.3278 | 0.3111 |
| 0.092 | 14.6092 | 40000 | 0.3209 | 0.3105 |
| 0.0862 | 14.7553 | 40400 | 0.3109 | 0.3064 |
| 0.0912 | 14.9014 | 40800 | 0.3116 | 0.3056 |
| 0.086 | 15.0475 | 41200 | 0.3383 | 0.3038 |
| 0.0832 | 15.1936 | 41600 | 0.3189 | 0.3018 |
| 0.0773 | 15.3397 | 42000 | 0.3150 | 0.3033 |
| 0.0817 | 15.4858 | 42400 | 0.3253 | 0.3040 |
| 0.0775 | 15.6318 | 42800 | 0.3223 | 0.3030 |
| 0.0767 | 15.7779 | 43200 | 0.3225 | 0.2970 |
| 0.0796 | 15.9240 | 43600 | 0.3368 | 0.3047 |
| 0.0763 | 16.0701 | 44000 | 0.3252 | 0.2971 |
| 0.075 | 16.2162 | 44400 | 0.3188 | 0.3002 |
| 0.0744 | 16.3623 | 44800 | 0.3207 | 0.2947 |
| 0.0729 | 16.5084 | 45200 | 0.3214 | 0.2956 |
| 0.0732 | 16.6545 | 45600 | 0.3278 | 0.2927 |
| 0.0694 | 16.8006 | 46000 | 0.3364 | 0.2924 |
| 0.0753 | 16.9467 | 46400 | 0.3263 | 0.2881 |
| 0.072 | 17.0928 | 46800 | 0.3317 | 0.2909 |
| 0.0658 | 17.2389 | 47200 | 0.3376 | 0.2902 |
| 0.07 | 17.3850 | 47600 | 0.3282 | 0.2902 |
| 0.0636 | 17.5310 | 48000 | 0.3321 | 0.2920 |
| 0.0655 | 17.6771 | 48400 | 0.3274 | 0.2884 |
| 0.0623 | 17.8232 | 48800 | 0.3316 | 0.2890 |
| 0.0621 | 17.9693 | 49200 | 0.3209 | 0.2872 |
| 0.0637 | 18.1154 | 49600 | 0.3281 | 0.2830 |
| 0.0616 | 18.2615 | 50000 | 0.3393 | 0.2852 |
| 0.0605 | 18.4076 | 50400 | 0.3371 | 0.2849 |
| 0.0614 | 18.5537 | 50800 | 0.3277 | 0.2836 |
| 0.0571 | 18.6998 | 51200 | 0.3317 | 0.2821 |
| 0.0555 | 18.8459 | 51600 | 0.3364 | 0.2816 |
| 0.061 | 18.9920 | 52000 | 0.3251 | 0.2797 |
| 0.0571 | 19.1381 | 52400 | 0.3343 | 0.2811 |
| 0.0614 | 19.2841 | 52800 | 0.3300 | 0.2810 |
| 0.0534 | 19.4302 | 53200 | 0.3324 | 0.2795 |
| 0.0562 | 19.5763 | 53600 | 0.3322 | 0.2789 |
| 0.0545 | 19.7224 | 54000 | 0.3310 | 0.2789 |
| 0.0596 | 19.8685 | 54400 | 0.3293 | 0.2786 |
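
For reference, the Wer column can be reproduced with the `evaluate` library. The transcripts below are hypothetical stand-ins; the real computation would run over decoded model outputs and reference sentences from the common_voice_16_1 evaluation split:

```python
# Sketch of the WER computation; the two lists are hypothetical placeholders.
import evaluate

wer_metric = evaluate.load("wer")
predictions = ["merhaba dünya"]   # hypothetical decoded model outputs
references = ["merhaba dünyaya"]  # hypothetical reference transcripts
print(wer_metric.compute(predictions=predictions, references=references))
```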

### Framework versions

- Transformers 4.40.0
- Pytorch 2.2.2+cu121
- Datasets 2.17.1
- Tokenizers 0.19.1