
wav2vec2-turkish-300m-7

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the FLEURS dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3236
  • WER: 0.1668

Model description

More information needed

Intended uses & limitations

More information needed
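
Pending a fuller description, here is a minimal sketch of how such a checkpoint is typically used for Turkish speech recognition via the transformers ASR pipeline. The repository id below is a placeholder, not the confirmed hub path of this model:

```python
from transformers import pipeline

# Placeholder repo id -- substitute the actual hub path of this checkpoint.
asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/wav2vec2-turkish-300m-7",
)

# wav2vec2-xls-r models expect 16 kHz audio; resample other inputs first.
result = asr("sample_turkish.wav")
print(result["text"])
```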

Training and evaluation data

More information needed
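
For reference, the Turkish split of FLEURS can be loaded with the datasets library; the "tr_tr" config name is assumed from the google/fleurs naming scheme:

```python
from datasets import load_dataset

# "tr_tr" is the Turkish config per the google/fleurs naming scheme (assumed).
fleurs_tr = load_dataset("google/fleurs", "tr_tr", split="train")

# Each example carries 16 kHz audio plus a normalized transcription.
print(fleurs_tr[0]["transcription"])
```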

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a matching TrainingArguments sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 4
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 35
  • mixed_precision_training: Native AMP
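
A minimal TrainingArguments sketch reproducing the values above, assuming the standard Hugging Face Trainer was used; the output directory is a placeholder, and the warmup value of 0.1 is read as a ratio:

```python
from transformers import TrainingArguments

# Sketch only: mirrors the reported hyperparameters; not the confirmed script.
training_args = TrainingArguments(
    output_dir="wav2vec2-turkish-300m-7",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,      # reported value interpreted as a warmup ratio
    num_train_epochs=35,
    fp16=True,             # native AMP mixed-precision training
)
```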

Training results

| Training Loss | Epoch   | Step  | Validation Loss | WER    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|
| 3.7291        | 0.6983  | 500   | 1.2114          | 0.8908 |
| 1.1707        | 1.3966  | 1000  | 0.3888          | 0.4555 |
| 0.5042        | 2.0950  | 1500  | 0.2879          | 0.3270 |
| 0.2623        | 2.7933  | 2000  | 0.2653          | 0.3265 |
| 0.2012        | 3.4916  | 2500  | 0.2405          | 0.2778 |
| 0.1817        | 4.1899  | 3000  | 0.2555          | 0.2704 |
| 0.1394        | 4.8883  | 3500  | 0.2452          | 0.2647 |
| 0.1112        | 5.5866  | 4000  | 0.2426          | 0.2458 |
| 0.1047        | 6.2849  | 4500  | 0.2520          | 0.2634 |
| 0.0916        | 6.9832  | 5000  | 0.2417          | 0.2443 |
| 0.0902        | 7.6816  | 5500  | 0.2627          | 0.2427 |
| 0.075         | 8.3799  | 6000  | 0.2551          | 0.2320 |
| 0.0716        | 9.0782  | 6500  | 0.2607          | 0.2221 |
| 0.0661        | 9.7765  | 7000  | 0.2504          | 0.2338 |
| 0.0634        | 10.4749 | 7500  | 0.2552          | 0.2229 |
| 0.0583        | 11.1732 | 8000  | 0.2637          | 0.2249 |
| 0.0537        | 11.8715 | 8500  | 0.2627          | 0.2122 |
| 0.0535        | 12.5698 | 9000  | 0.2654          | 0.2148 |
| 0.0521        | 13.2682 | 9500  | 0.2665          | 0.2123 |
| 0.0491        | 13.9665 | 10000 | 0.2814          | 0.2176 |
| 0.0466        | 14.6648 | 10500 | 0.2785          | 0.2138 |
| 0.0445        | 15.3631 | 11000 | 0.2856          | 0.2075 |
| 0.0415        | 16.0615 | 11500 | 0.2750          | 0.2076 |
| 0.0405        | 16.7598 | 12000 | 0.2743          | 0.2045 |
| 0.0368        | 17.4581 | 12500 | 0.2770          | 0.2013 |
| 0.0374        | 18.1564 | 13000 | 0.2961          | 0.2043 |
| 0.0374        | 18.8547 | 13500 | 0.2851          | 0.2028 |
| 0.0322        | 19.5531 | 14000 | 0.2955          | 0.1961 |
| 0.0317        | 20.2514 | 14500 | 0.3053          | 0.1998 |
| 0.0306        | 20.9497 | 15000 | 0.2988          | 0.1960 |
| 0.0328        | 21.6480 | 15500 | 0.2873          | 0.1949 |
| 0.0299        | 22.3464 | 16000 | 0.3030          | 0.1921 |
| 0.0272        | 23.0447 | 16500 | 0.2902          | 0.1866 |
| 0.0286        | 23.7430 | 17000 | 0.2962          | 0.1879 |
| 0.0288        | 24.4413 | 17500 | 0.3114          | 0.1871 |
| 0.0253        | 25.1397 | 18000 | 0.3203          | 0.1844 |
| 0.0262        | 25.8380 | 18500 | 0.2993          | 0.1861 |
| 0.0238        | 26.5363 | 19000 | 0.3108          | 0.1812 |
| 0.0228        | 27.2346 | 19500 | 0.3143          | 0.1759 |
| 0.0235        | 27.9330 | 20000 | 0.3077          | 0.1780 |
| 0.0227        | 28.6313 | 20500 | 0.3099          | 0.1739 |
| 0.0212        | 29.3296 | 21000 | 0.3144          | 0.1730 |
| 0.0212        | 30.0279 | 21500 | 0.3165          | 0.1726 |
| 0.0211        | 30.7263 | 22000 | 0.3178          | 0.1708 |
| 0.0192        | 31.4246 | 22500 | 0.3172          | 0.1682 |
| 0.0193        | 32.1229 | 23000 | 0.3188          | 0.1693 |
| 0.0195        | 32.8212 | 23500 | 0.3255          | 0.1661 |
| 0.0179        | 33.5196 | 24000 | 0.3248          | 0.1668 |
| 0.0166        | 34.2179 | 24500 | 0.3261          | 0.1668 |
| 0.018         | 34.9162 | 25000 | 0.3236          | 0.1668 |
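
The WER values above can be recomputed with the evaluate library's wer metric; the strings below are illustrative placeholders, not actual model outputs:

```python
import evaluate

wer_metric = evaluate.load("wer")

# Illustrative placeholders; in practice, compare model transcriptions
# against the FLEURS reference transcriptions.
predictions = ["merhaba dünya", "bugün hava güzel"]
references = ["merhaba dünya", "bugün hava çok güzel"]

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```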

Framework versions

  • Transformers 4.40.0
  • PyTorch 2.2.2+cu121
  • Datasets 2.17.1
  • Tokenizers 0.19.1