
wav2vec2-large-xls-r-300m-lg-pt

This model is a fine-tuned version of Alvin-Nahabwe/wav2vec2-large-xls-r-300m-gn; the fine-tuning dataset is not specified in this card. It achieves the following results on the evaluation set:

  • Loss: 0.2974
  • WER: 0.1465
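
Since the usage sections below are still empty, here is a minimal inference sketch with greedy CTC decoding. The hub path is an assumption inferred from the base model's namespace, and the model expects 16 kHz mono audio, as all XLS-R checkpoints do.

```python
# Minimal inference sketch for this checkpoint (greedy CTC decoding).
# The model_id below is an assumed hub path, not confirmed by the card.
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "Alvin-Nahabwe/wav2vec2-large-xls-r-300m-lg-pt"  # assumed path
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id).eval()

def transcribe(waveform, sampling_rate=16_000):
    """Transcribe a 1-D float waveform sampled at 16 kHz."""
    inputs = processor(waveform, sampling_rate=sampling_rate, return_tensors="pt")
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    pred_ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(pred_ids)[0]
```

Given reference transcripts for the (unspecified) evaluation set, the WER above could be checked by comparing transcribe() outputs against them, e.g. with the jiwer package.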

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.0003
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 30
  • mixed_precision_training: Native AMP
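
As a reference point, this is roughly how the list above maps onto a Hugging Face TrainingArguments object for transformers 4.29.x; output_dir is an illustrative assumption, and the Adam betas/epsilon listed above are the library defaults, so they are left implicit.

```python
# Sketch of the hyperparameters above as a TrainingArguments config.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-300m-lg-pt",  # assumed output path
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=8,  # 16 x 8 = 128 total train batch size
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
    fp16=True,  # "Native AMP" mixed precision
)
```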

Training results

| Training Loss | Epoch | Step  | Validation Loss | WER    |
|---------------|-------|-------|-----------------|--------|
| 0.3826        | 0.81  | 400   | 0.2260          | 0.2142 |
| 0.3513        | 1.61  | 800   | 0.2164          | 0.2289 |
| 0.3211        | 2.42  | 1200  | 0.1950          | 0.1895 |
| 0.2939        | 3.22  | 1600  | 0.1977          | 0.1969 |
| 0.2886        | 4.03  | 2000  | 0.1973          | 0.1957 |
| 0.2613        | 4.84  | 2400  | 0.1897          | 0.1825 |
| 0.2566        | 5.64  | 2800  | 0.1878          | 0.1753 |
| 0.2406        | 6.45  | 3200  | 0.1844          | 0.1713 |
| 0.2292        | 7.25  | 3600  | 0.1919          | 0.1706 |
| 0.2176        | 8.06  | 4000  | 0.1965          | 0.1681 |
| 0.2115        | 8.86  | 4400  | 0.1945          | 0.1746 |
| 0.1933        | 9.67  | 4800  | 0.2041          | 0.1712 |
| 0.1878        | 10.48 | 5200  | 0.2098          | 0.1718 |
| 0.1806        | 11.29 | 5600  | 0.2071          | 0.1666 |
| 0.1737        | 12.09 | 6000  | 0.2253          | 0.1655 |
| 0.1652        | 12.9  | 6400  | 0.2087          | 0.1627 |
| 0.1627        | 13.71 | 6800  | 0.2157          | 0.1666 |
| 0.1516        | 14.51 | 7200  | 0.2120          | 0.1687 |
| 0.1432        | 15.32 | 7600  | 0.2186          | 0.1715 |
| 0.1371        | 16.12 | 8000  | 0.2199          | 0.1681 |
| 0.1284        | 16.93 | 8400  | 0.2115          | 0.1647 |
| 0.1215        | 17.74 | 8800  | 0.2304          | 0.1568 |
| 0.115         | 18.55 | 9200  | 0.2322          | 0.1549 |
| 0.1122        | 19.35 | 9600  | 0.2427          | 0.1541 |
| 0.1041        | 20.16 | 10000 | 0.2512          | 0.1531 |
| 0.0999        | 20.96 | 10400 | 0.2526          | 0.1559 |
| 0.0929        | 21.77 | 10800 | 0.2591          | 0.1536 |
| 0.0877        | 22.58 | 11200 | 0.2645          | 0.1525 |
| 0.082         | 23.39 | 11600 | 0.2692          | 0.1494 |
| 0.0787        | 24.19 | 12000 | 0.2742          | 0.1530 |
| 0.0758        | 25.0  | 12400 | 0.2794          | 0.1484 |
| 0.0713        | 25.8  | 12800 | 0.2817          | 0.1493 |
| 0.0687        | 26.61 | 13200 | 0.2881          | 0.1491 |
| 0.065         | 27.42 | 13600 | 0.2945          | 0.1487 |
| 0.0619        | 28.22 | 14000 | 0.2955          | 0.1478 |
| 0.0592        | 29.03 | 14400 | 0.2965          | 0.1472 |
| 0.0569        | 29.84 | 14800 | 0.2974          | 0.1465 |

Framework versions

  • Transformers 4.29.2
  • Pytorch 2.0.1+cu117
  • Datasets 2.12.0
  • Tokenizers 0.13.3
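
For reproducibility, a small sketch that verifies an environment matches these pins (package names as imported; not part of the original card):

```python
# Check installed versions against the pins listed above.
import transformers, torch, datasets, tokenizers

assert transformers.__version__ == "4.29.2"
assert torch.__version__.startswith("2.0.1")  # 2.0.1+cu117 build
assert datasets.__version__ == "2.12.0"
assert tokenizers.__version__ == "0.13.3"
```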