---
license: apache-2.0
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: wav2vec2-large-xls-r-300m-telugu-asr
    results: []
---

# wav2vec2-large-xls-r-300m-telugu-asr

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 3.8208
- Wer: 1.0395
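
A minimal inference sketch follows. The repo id below is a placeholder, not taken from this card; substitute the actual Hub repo id of this checkpoint. Note that the reported WER is above 1.0, meaning the model makes more errors than there are reference words, so transcriptions from this checkpoint should be treated as experimental.

```python
# Hedged sketch: loading this fine-tuned checkpoint for Telugu ASR inference.
# The repo id is a placeholder; replace it with the actual Hub repo id.
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "<user>/wav2vec2-large-xls-r-300m-telugu-asr"  # placeholder repo id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Load a 16 kHz mono waveform (XLSR wav2vec2 models expect 16 kHz input).
speech, _ = librosa.load("example_telugu.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding.
pred_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(pred_ids)[0]
print(transcription)
```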

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):

- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
- mixed_precision_training: Native AMP
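
For reference, a minimal sketch of how these values map onto `transformers.TrainingArguments`; the output directory and the surrounding `Trainer`/dataset wiring are assumptions not taken from this card:

```python
# Hedged sketch: the hyperparameters above expressed as TrainingArguments.
# output_dir is an assumption for illustration only.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-300m-telugu-asr",  # assumed output path
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,  # effective train batch size 16
    warmup_steps=500,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,  # native AMP mixed-precision training
)
```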

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 11.4834       | 1.37  | 200  | 4.8950          | 1.0    |
| 3.9043        | 2.74  | 400  | 3.8879          | 1.0    |
| 3.7152        | 4.11  | 600  | 3.6273          | 1.0    |
| 3.6229        | 5.48  | 800  | 3.8105          | 1.0    |
| 3.5465        | 6.85  | 1000 | 3.6970          | 1.0    |
| 3.5106        | 8.22  | 1200 | 3.6657          | 1.0    |
| 3.3731        | 9.59  | 1400 | 3.4669          | 0.9995 |
| 3.183         | 10.96 | 1600 | 3.1894          | 0.9983 |
| 2.9215        | 12.33 | 1800 | 2.9099          | 0.9993 |
| 2.5357        | 13.7  | 2000 | 2.8166          | 1.0407 |
| 2.1257        | 15.07 | 2200 | 2.6122          | 1.0205 |
| 1.7549        | 16.44 | 2400 | 2.5981          | 1.0457 |
| 1.4642        | 17.81 | 2600 | 2.5619          | 1.0015 |
| 1.1814        | 19.18 | 2800 | 2.8769          | 0.9836 |
| 0.9759        | 20.55 | 3000 | 2.8497          | 1.0078 |
| 0.8066        | 21.92 | 3200 | 3.1365          | 1.0655 |
| 0.6614        | 23.29 | 3400 | 3.1759          | 0.9964 |
| 0.5687        | 24.66 | 3600 | 3.3751          | 1.0103 |
| 0.4987        | 26.03 | 3800 | 3.5111          | 1.0180 |
| 0.4304        | 27.4  | 4000 | 3.6908          | 1.0200 |
| 0.3818        | 28.77 | 4200 | 3.8208          | 1.0395 |

### Framework versions

- Transformers 4.24.0
- Pytorch 1.10.0+cu113
- Datasets 1.18.3
- Tokenizers 0.13.2
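
A small hedged check that a local environment matches the versions listed above (the exact CUDA build of PyTorch is environment-specific):

```python
# Hedged sketch: print installed versions to compare against the card's values.
import transformers, torch, datasets, tokenizers

print("Transformers:", transformers.__version__)  # card reports 4.24.0
print("PyTorch:", torch.__version__)              # card reports 1.10.0+cu113
print("Datasets:", datasets.__version__)          # card reports 1.18.3
print("Tokenizers:", tokenizers.__version__)      # card reports 0.13.2
```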