---
license: apache-2.0
tags:
  - generated_from_trainer
model-index:
  - name: korean-aihub-learning-math-1-test
    results: []
---

# korean-aihub-learning-math-1-test

This model is a fine-tuned version of [kresnik/wav2vec2-large-xlsr-korean](https://huggingface.co/kresnik/wav2vec2-large-xlsr-korean) on an unspecified dataset.
It achieves the following results on the evaluation set:

- Loss: 1.2537
- Wer: 0.4765
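
The snippet below is a minimal inference sketch for this checkpoint, assuming 16 kHz mono speech input (the rate wav2vec2 XLSR models expect). The audio path and the Hub repository id are illustrative placeholders, not values taken from this card.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Illustrative repo id -- replace with the full Hub id of this model.
MODEL_ID = "korean-aihub-learning-math-1-test"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# Load an audio clip and resample it to the 16 kHz rate the model expects.
waveform, sample_rate = torchaudio.load("sample.wav")  # illustrative path
waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000).squeeze(0)

inputs = processor(waveform.numpy(), sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding to text.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```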

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):

- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
- mixed_precision_training: Native AMP
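
For reference, here is a sketch of Hugging Face `TrainingArguments` matching the values above. `output_dir` and any argument not listed in this card (e.g. `evaluation_strategy`) are illustrative assumptions, not the exact training script.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="korean-aihub-learning-math-1-test",  # illustrative
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 4 * 2 = 8
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
    fp16=True,                       # Native AMP mixed-precision training
    evaluation_strategy="epoch",     # assumption: the results table logs once per epoch
)
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the TrainingArguments defaults.
```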

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| No log        | 1.0   | 35   | 29.8031         | 1.0    |
| No log        | 2.0   | 70   | 5.7158          | 1.0    |
| 19.8789       | 3.0   | 105  | 4.5005          | 1.0    |
| 19.8789       | 4.0   | 140  | 4.3677          | 0.9984 |
| 19.8789       | 5.0   | 175  | 3.8013          | 0.9882 |
| 3.9785        | 6.0   | 210  | 2.4132          | 0.8730 |
| 3.9785        | 7.0   | 245  | 1.5867          | 0.7045 |
| 3.9785        | 8.0   | 280  | 1.3179          | 0.6082 |
| 1.2266        | 9.0   | 315  | 1.2431          | 0.6066 |
| 1.2266        | 10.0  | 350  | 1.1791          | 0.5384 |
| 1.2266        | 11.0  | 385  | 1.0994          | 0.5298 |
| 0.3916        | 12.0  | 420  | 1.1552          | 0.5196 |
| 0.3916        | 13.0  | 455  | 1.1495          | 0.5486 |
| 0.3916        | 14.0  | 490  | 1.1340          | 0.5290 |
| 0.2488        | 15.0  | 525  | 1.2208          | 0.5525 |
| 0.2488        | 16.0  | 560  | 1.1682          | 0.5024 |
| 0.2488        | 17.0  | 595  | 1.1479          | 0.5008 |
| 0.1907        | 18.0  | 630  | 1.1735          | 0.4882 |
| 0.1907        | 19.0  | 665  | 1.2302          | 0.4914 |
| 0.1461        | 20.0  | 700  | 1.2497          | 0.4890 |
| 0.1461        | 21.0  | 735  | 1.2434          | 0.4914 |
| 0.1461        | 22.0  | 770  | 1.2031          | 0.5031 |
| 0.1147        | 23.0  | 805  | 1.2451          | 0.4976 |
| 0.1147        | 24.0  | 840  | 1.2746          | 0.4937 |
| 0.1147        | 25.0  | 875  | 1.2405          | 0.4828 |
| 0.0892        | 26.0  | 910  | 1.2228          | 0.4929 |
| 0.0892        | 27.0  | 945  | 1.2642          | 0.4898 |
| 0.0892        | 28.0  | 980  | 1.2586          | 0.4843 |
| 0.0709        | 29.0  | 1015 | 1.2518          | 0.4788 |
| 0.0709        | 30.0  | 1050 | 1.2537          | 0.4765 |
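
The Wer column is the word error rate on the evaluation set (lower is better). As an illustration of the metric only, here is a sketch using the `jiwer` package; the package choice and the example sentences are assumptions, since the card does not include the evaluation script.

```python
from jiwer import wer  # jiwer is an assumed WER implementation, not taken from this card

# Hypothetical reference/hypothesis pair; the actual evaluation data is not distributed here.
reference = "이 문제의 답은 삼십육 입니다"
hypothesis = "이 문제의 답은 삼십 육 입니다"

print(f"WER: {wer(reference, hypothesis):.4f}")  # fraction of word-level edits needed
```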

### Framework versions

- Transformers 4.20.1
- Pytorch 1.12.0+cu113
- Datasets 2.4.0
- Tokenizers 0.12.1