
wav2vec2-large-xls-r-300m-kr-jw4169

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the fleurs dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9752
  • Wer: 0.5196

Model description

More information needed

Intended uses & limitations

More information needed
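The card leaves this section empty, so the following is only a minimal inference sketch. The repo id is assumed from the model name, and the language is presumed to be Korean from the `kr` suffix; adjust both as needed.

```python
# Minimal ASR inference sketch (repo id and audio file are assumptions, not from the card).
import torch
import torchaudio
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "jw4169/wav2vec2-large-xls-r-300m-kr-jw4169"  # hypothetical repo id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

waveform, sr = torchaudio.load("sample.wav")  # any mono speech clip
if sr != 16_000:
    # XLS-R checkpoints expect 16 kHz input
    waveform = torchaudio.functional.resample(waveform, sr, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```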

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 30
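The original training script is not included in the card; the list above maps roughly onto the `TrainingArguments` below. This is a sketch, and `output_dir` and `fp16` are assumptions.

```python
# Hedged reconstruction of the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-300m-kr-jw4169",  # assumed
    learning_rate=3e-4,
    per_device_train_batch_size=4,   # effective batch size 16 with accumulation (4 x 4)
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    warmup_steps=500,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,  # assumption: mixed precision is common for XLS-R fine-tuning, not stated in the card
)
```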

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 35.084        | 1.39  | 200  | 6.8536          | 1.0    |
| 4.853         | 2.78  | 400  | 4.6246          | 1.0    |
| 4.5491        | 4.17  | 600  | 4.3815          | 1.0    |
| 2.799         | 5.55  | 800  | 1.7402          | 0.8642 |
| 1.3872        | 6.94  | 1000 | 1.2019          | 0.7448 |
| 0.9599        | 8.33  | 1200 | 1.0594          | 0.7134 |
| 0.675         | 9.72  | 1400 | 0.9321          | 0.6404 |
| 0.4775        | 11.11 | 1600 | 0.9088          | 0.5911 |
| 0.3479        | 12.5  | 1800 | 0.9430          | 0.6010 |
| 0.2712        | 13.89 | 2000 | 0.8948          | 0.5854 |
| 0.2283        | 15.28 | 2200 | 0.9009          | 0.5495 |
| 0.1825        | 16.67 | 2400 | 0.9079          | 0.5501 |
| 0.161         | 18.06 | 2600 | 0.9518          | 0.5390 |
| 0.1394        | 19.44 | 2800 | 0.9529          | 0.5399 |
| 0.1266        | 20.83 | 3000 | 0.9505          | 0.5283 |
| 0.1102        | 22.22 | 3200 | 0.9748          | 0.5328 |
| 0.101         | 23.61 | 3400 | 0.9593          | 0.5316 |
| 0.0907        | 25.0  | 3600 | 0.9832          | 0.5292 |
| 0.0833        | 26.39 | 3800 | 0.9773          | 0.5181 |
| 0.0781        | 27.78 | 4000 | 0.9736          | 0.5163 |
| 0.0744        | 29.17 | 4200 | 0.9752          | 0.5196 |
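The Wer column is the word error rate on the evaluation set. To score transcriptions the same way, a sketch using the `evaluate` library (the card does not say which implementation was used; the example strings are hypothetical):

```python
# Word error rate, as reported in the table above.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["안녕하세요 만나서 반갑습니다"]  # hypothetical model transcription
references = ["안녕하세요 만나서 반가워요"]    # hypothetical ground-truth transcript
print(wer_metric.compute(predictions=predictions, references=references))  # 1 of 3 words wrong -> ~0.3333
```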

Framework versions

  • Transformers 4.25.0.dev0
  • Pytorch 1.10.0+cu102
  • Datasets 2.6.1
  • Tokenizers 0.13.1