torgo_xlsr_finetune_M02

This model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.4526
  • Wer: 0.2394
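
The reported Wer of 0.2394 is the word error rate: the word-level edit distance (substitutions + insertions + deletions) between the hypothesis and reference transcripts, divided by the number of reference words. A minimal pure-Python sketch of the metric (for illustration only; it is not the exact implementation used during evaluation):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref = reference.split()
    hyp = hypothesis.split()
    # Dynamic-programming (Levenshtein) edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

print(wer("the quick brown fox", "the quick brown box"))  # 1 substitution / 4 words = 0.25
```

A Wer of 0.2394 therefore means roughly one word-level error per four reference words.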

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 8
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 20
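
Two of the values above are derived: total_train_batch_size (8) is train_batch_size × gradient_accumulation_steps (4 × 2), and the linear scheduler ramps the learning rate from 0 to 0.0001 over the first 1000 warmup steps, then decays it linearly toward 0 by the end of training. A minimal sketch of that schedule, assuming the standard linear-with-warmup formula; the total step count (23000) is approximated from the results table below:

```python
def linear_warmup_lr(step: int, base_lr: float = 1e-4,
                     warmup_steps: int = 1000, total_steps: int = 23000) -> float:
    """Linear warmup to base_lr, then linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps  # ramp up from 0
    # Decay from base_lr at warmup_steps down to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_warmup_lr(500))    # mid-warmup, half of base_lr
print(linear_warmup_lr(1000))   # peak: the configured learning_rate
print(linear_warmup_lr(23000))  # end of training: 0.0
```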

Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 3.5091        | 0.85  | 1000  | 3.2824          | 1.0    |
| 1.6986        | 1.7   | 2000  | 1.6782          | 0.7538 |
| 0.8367        | 2.55  | 3000  | 1.1374          | 0.5772 |
| 0.5881        | 3.4   | 4000  | 1.2548          | 0.4567 |
| 0.5024        | 4.26  | 5000  | 1.1946          | 0.3905 |
| 0.4208        | 5.11  | 6000  | 1.3830          | 0.3939 |
| 0.3647        | 5.96  | 7000  | 1.2722          | 0.3404 |
| 0.3357        | 6.81  | 8000  | 1.1639          | 0.3506 |
| 0.2972        | 7.66  | 9000  | 1.3775          | 0.3192 |
| 0.27          | 8.51  | 10000 | 1.2345          | 0.3200 |
| 0.2532        | 9.36  | 11000 | 1.1760          | 0.3158 |
| 0.2171        | 10.21 | 12000 | 1.3310          | 0.2980 |
| 0.2367        | 11.06 | 13000 | 1.2634          | 0.2912 |
| 0.2086        | 11.91 | 14000 | 1.3451          | 0.2742 |
| 0.2023        | 12.77 | 15000 | 1.4799          | 0.2912 |
| 0.1862        | 13.62 | 16000 | 1.7212          | 0.2869 |
| 0.1698        | 14.47 | 17000 | 1.4304          | 0.2725 |
| 0.1489        | 15.32 | 18000 | 1.6233          | 0.2759 |
| 0.1489        | 16.17 | 19000 | 1.6320          | 0.2657 |
| 0.1552        | 17.02 | 20000 | 1.3667          | 0.2377 |
| 0.1414        | 17.87 | 21000 | 1.5199          | 0.2470 |
| 0.1195        | 18.72 | 22000 | 1.4951          | 0.2368 |
| 0.1303        | 19.57 | 23000 | 1.4526          | 0.2394 |

Framework versions

  • Transformers 4.26.1
  • Pytorch 2.2.1
  • Datasets 2.18.0
  • Tokenizers 0.13.3