
wav2vec-turkish

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the common_voice_16_1 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3312
  • WER: 0.2795
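
For quick inference, the checkpoint can be loaded through the transformers ASR pipeline. A minimal sketch, assuming the model is published on the Hub; the repo id and audio file name are placeholders to replace with real ones:

```python
# Minimal inference sketch; the repo id and audio path are placeholders.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/wav2vec-turkish",  # hypothetical repo id
)

# wav2vec2 models expect 16 kHz mono input; the pipeline decodes and
# resamples audio files it loads itself, so a plain WAV/MP3 path works.
result = asr("sample_turkish.wav")
print(result["text"])
```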

Model description

wav2vec-turkish is a Turkish automatic speech recognition (ASR) model created by fine-tuning facebook/wav2vec2-xls-r-300m on the Common Voice 16.1 corpus. The published checkpoint contains roughly 315M parameters stored as F32 safetensors.

Intended uses & limitations

More information needed

Training and evaluation data

The model was fine-tuned and evaluated on the common_voice_16_1 dataset; the model name suggests the Turkish ("tr") configuration. A loading sketch follows.
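
```python
# Loading sketch; assumes the gated Hub dataset
# mozilla-foundation/common_voice_16_1 and its Turkish ("tr") configuration.
# Accept the dataset terms on the Hub and run `huggingface-cli login` first.
from datasets import Audio, load_dataset

common_voice = load_dataset(
    "mozilla-foundation/common_voice_16_1", "tr", split="train"
)

# wav2vec2 expects 16 kHz input; Common Voice audio ships at 48 kHz.
common_voice = common_voice.cast_column("audio", Audio(sampling_rate=16_000))
print(common_voice[0]["sentence"])
```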

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 0.0003
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 30
  • mixed_precision_training: Native AMP
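
Expressed as transformers TrainingArguments, the list above maps onto a configuration like the following sketch; the output directory is a placeholder, and options the card does not mention (logging, saving, evaluation cadence) are left at their defaults:

```python
# Sketch of the listed hyperparameters as TrainingArguments; output_dir is a
# placeholder and unlisted options keep their defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec-turkish",   # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
    fp16=True,                      # "Native AMP" mixed precision
    # The default optimizer (AdamW) uses betas=(0.9, 0.999) and epsilon=1e-8,
    # matching the Adam settings listed above.
)
```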

Training results

| Training Loss | Epoch | Step  | Validation Loss | WER    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 5.433         | 0.29  | 400   | 1.1787          | 0.9025 |
| 0.7193        | 0.58  | 800   | 0.6239          | 0.6505 |
| 0.5243        | 0.88  | 1200  | 0.5098          | 0.5901 |
| 0.4514        | 1.17  | 1600  | 0.4618          | 0.5131 |
| 0.419         | 1.46  | 2000  | 0.4341          | 0.4990 |
| 0.3975        | 1.75  | 2400  | 0.4016          | 0.4809 |
| 0.3756        | 2.05  | 2800  | 0.3926          | 0.4684 |
| 0.3421        | 2.34  | 3200  | 0.3841          | 0.4639 |
| 0.3418        | 2.63  | 3600  | 0.3889          | 0.4551 |
| 0.3409        | 2.92  | 4000  | 0.3615          | 0.4295 |
| 0.3039        | 3.21  | 4400  | 0.3939          | 0.4562 |
| 0.2934        | 3.51  | 4800  | 0.3866          | 0.4531 |
| 0.2971        | 3.8   | 5200  | 0.3891          | 0.4497 |
| 0.2953        | 4.09  | 5600  | 0.3694          | 0.4405 |
| 0.2836        | 4.38  | 6000  | 0.3583          | 0.4252 |
| 0.2721        | 4.67  | 6400  | 0.3562          | 0.4164 |
| 0.2685        | 4.97  | 6800  | 0.3574          | 0.4215 |
| 0.251         | 5.26  | 7200  | 0.3660          | 0.4239 |
| 0.2537        | 5.55  | 7600  | 0.3723          | 0.4308 |
| 0.2629        | 5.84  | 8000  | 0.3758          | 0.4359 |
| 0.2469        | 6.14  | 8400  | 0.3799          | 0.4295 |
| 0.2342        | 6.43  | 8800  | 0.3453          | 0.3947 |
| 0.2306        | 6.72  | 9200  | 0.3361          | 0.3977 |
| 0.2284        | 7.01  | 9600  | 0.3592          | 0.3970 |
| 0.213         | 7.3   | 10000 | 0.3451          | 0.3904 |
| 0.2188        | 7.6   | 10400 | 0.3426          | 0.3828 |
| 0.2239        | 7.89  | 10800 | 0.3392          | 0.3878 |
| 0.205         | 8.18  | 11200 | 0.3729          | 0.4021 |
| 0.2049        | 8.47  | 11600 | 0.3511          | 0.3981 |
| 0.2082        | 8.77  | 12000 | 0.3719          | 0.4143 |
| 0.2047        | 9.06  | 12400 | 0.3569          | 0.3984 |
| 0.1895        | 9.35  | 12800 | 0.3416          | 0.3798 |
| 0.1935        | 9.64  | 13200 | 0.3378          | 0.3793 |
| 0.1963        | 9.93  | 13600 | 0.3301          | 0.3883 |
| 0.1889        | 10.23 | 14000 | 0.3577          | 0.3881 |
| 0.182         | 10.52 | 14400 | 0.3281          | 0.3776 |
| 0.1794        | 10.81 | 14800 | 0.3368          | 0.3780 |
| 0.1736        | 11.1  | 15200 | 0.3275          | 0.3664 |
| 0.1659        | 11.4  | 15600 | 0.3504          | 0.3753 |
| 0.1651        | 11.69 | 16000 | 0.3343          | 0.3733 |
| 0.1735        | 11.98 | 16400 | 0.3510          | 0.3750 |
| 0.1569        | 12.27 | 16800 | 0.3243          | 0.3558 |
| 0.1535        | 12.56 | 17200 | 0.3239          | 0.3603 |
| 0.1588        | 12.86 | 17600 | 0.3372          | 0.3655 |
| 0.1524        | 13.15 | 18000 | 0.3453          | 0.3709 |
| 0.1453        | 13.44 | 18400 | 0.3301          | 0.3590 |
| 0.1483        | 13.73 | 18800 | 0.3443          | 0.3597 |
| 0.1432        | 14.02 | 19200 | 0.3401          | 0.3584 |
| 0.1374        | 14.32 | 19600 | 0.3357          | 0.3618 |
| 0.1399        | 14.61 | 20000 | 0.3386          | 0.3621 |
| 0.142         | 14.9  | 20400 | 0.3136          | 0.3547 |
| 0.1307        | 15.19 | 20800 | 0.3328          | 0.3501 |
| 0.1299        | 15.49 | 21200 | 0.3346          | 0.3458 |
| 0.1301        | 15.78 | 21600 | 0.3188          | 0.3473 |
| 0.1285        | 16.07 | 22000 | 0.3323          | 0.3522 |
| 0.1197        | 16.36 | 22400 | 0.3333          | 0.3392 |
| 0.1225        | 16.65 | 22800 | 0.3545          | 0.3590 |
| 0.1263        | 16.95 | 23200 | 0.3360          | 0.3410 |
| 0.1134        | 17.24 | 23600 | 0.3204          | 0.3332 |
| 0.114         | 17.53 | 24000 | 0.3264          | 0.3349 |
| 0.1165        | 17.82 | 24400 | 0.3160          | 0.3323 |
| 0.1134        | 18.12 | 24800 | 0.3479          | 0.3377 |
| 0.1066        | 18.41 | 25200 | 0.3306          | 0.3378 |
| 0.1027        | 18.7  | 25600 | 0.3286          | 0.3286 |
| 0.1083        | 18.99 | 26000 | 0.3285          | 0.3227 |
| 0.0937        | 19.28 | 26400 | 0.3240          | 0.3259 |
| 0.1007        | 19.58 | 26800 | 0.3286          | 0.3283 |
| 0.0996        | 19.87 | 27200 | 0.3278          | 0.3277 |
| 0.0972        | 20.16 | 27600 | 0.3171          | 0.3212 |
| 0.0927        | 20.45 | 28000 | 0.3426          | 0.3283 |
| 0.0932        | 20.75 | 28400 | 0.3418          | 0.3215 |
| 0.0932        | 21.04 | 28800 | 0.3246          | 0.3192 |
| 0.086         | 21.33 | 29200 | 0.3385          | 0.3201 |
| 0.0868        | 21.62 | 29600 | 0.3441          | 0.3164 |
| 0.0875        | 21.91 | 30000 | 0.3246          | 0.3161 |
| 0.0815        | 22.21 | 30400 | 0.3303          | 0.3105 |
| 0.0832        | 22.5  | 30800 | 0.3288          | 0.3062 |
| 0.0781        | 22.79 | 31200 | 0.3411          | 0.3098 |
| 0.077         | 23.08 | 31600 | 0.3343          | 0.3146 |
| 0.0755        | 23.37 | 32000 | 0.3211          | 0.3093 |
| 0.0742        | 23.67 | 32400 | 0.3268          | 0.3044 |
| 0.0721        | 23.96 | 32800 | 0.3222          | 0.3045 |
| 0.0699        | 24.25 | 33200 | 0.3266          | 0.2993 |
| 0.0663        | 24.54 | 33600 | 0.3410          | 0.3008 |
| 0.0719        | 24.84 | 34000 | 0.3221          | 0.3014 |
| 0.0682        | 25.13 | 34400 | 0.3290          | 0.2976 |
| 0.0674        | 25.42 | 34800 | 0.3356          | 0.2967 |
| 0.0661        | 25.71 | 35200 | 0.3181          | 0.2964 |
| 0.0681        | 26.0  | 35600 | 0.3318          | 0.2964 |
| 0.0619        | 26.3  | 36000 | 0.3220          | 0.2945 |
| 0.0617        | 26.59 | 36400 | 0.3270          | 0.2913 |
| 0.0592        | 26.88 | 36800 | 0.3391          | 0.2909 |
| 0.0569        | 27.17 | 37200 | 0.3394          | 0.2900 |
| 0.0557        | 27.47 | 37600 | 0.3359          | 0.2877 |
| 0.0555        | 27.76 | 38000 | 0.3306          | 0.2847 |
| 0.055         | 28.05 | 38400 | 0.3344          | 0.2827 |
| 0.0516        | 28.34 | 38800 | 0.3389          | 0.2845 |
| 0.0544        | 28.63 | 39200 | 0.3360          | 0.2840 |
| 0.0542        | 28.93 | 39600 | 0.3366          | 0.2828 |
| 0.0524        | 29.22 | 40000 | 0.3343          | 0.2819 |
| 0.0527        | 29.51 | 40400 | 0.3319          | 0.2803 |
| 0.0503        | 29.8  | 40800 | 0.3312          | 0.2795 |
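
The WER column is word error rate: the fraction of words that must be substituted, inserted, or deleted to turn a prediction into its reference. A minimal sketch of computing it with the evaluate library; the example strings are illustrative, not drawn from the actual evaluation set:

```python
# WER computation sketch with the `evaluate` library; strings are illustrative.
import evaluate

wer_metric = evaluate.load("wer")
wer = wer_metric.compute(
    predictions=["merhaba dünya"],          # hypothesis transcript
    references=["merhaba dünya nasılsın"],  # ground-truth transcript
)
print(f"WER: {wer:.4f}")  # 0.3333: one of three reference words is missing
```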

Framework versions

  • Transformers 4.38.1
  • PyTorch 2.2.0+cu121
  • Datasets 2.17.0
  • Tokenizers 0.15.2