wav2vec2-large-xlsr-coraa-exp-13

This model is a fine-tuned version of Edresson/wav2vec2-large-xlsr-coraa-portuguese on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5561
  • WER (word error rate): 0.3456
  • CER (character error rate): 0.1803

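A minimal inference sketch with the transformers CTC API is shown below. The repository id is an assumption based on this card's title (no namespace is given), and the audio file path is a placeholder.

```python
# Minimal inference sketch (assumed repo id and placeholder audio path; adjust to your setup).
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "wav2vec2-large-xlsr-coraa-exp-13"  # assumption: replace with the full hub id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load audio and resample to the 16 kHz rate expected by wav2vec2 models.
waveform, sample_rate = torchaudio.load("example.wav")  # placeholder file
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform[0].numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the most likely token per frame, then collapse repeats/blanks.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```

The `pipeline("automatic-speech-recognition", model=model_id)` wrapper from transformers performs the same load/resample/decode steps if a one-liner is preferred.
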
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 150
  • mixed_precision_training: Native AMP

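For reference, the sketch below shows how the listed settings map onto transformers TrainingArguments. The output directory and the evaluation cadence are assumptions; only the values listed above are taken from this card, and the surrounding fine-tuning script is not reproduced here.

```python
# Sketch: the listed hyperparameters expressed as transformers TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-coraa-exp-13",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # effective train batch size: 16 * 2 = 32
    num_train_epochs=150,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                       # "Native AMP" mixed-precision training
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",     # assumption, consistent with the per-epoch results below
)
```
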
Training results

The evaluation results reported at the top of this card match the epoch 124 checkpoint (step 1736), the row with the lowest validation loss in the table.

Training Loss | Epoch | Step | Validation Loss | WER | CER
38.268 1.0 14 32.0584 1.0 0.9430
38.268 2.0 28 10.3763 1.0 0.9619
38.268 3.0 42 4.8976 1.0 0.9619
38.268 4.0 56 4.0406 1.0 0.9619
38.268 5.0 70 3.7470 1.0 0.9619
38.268 6.0 84 3.5903 1.0 0.9619
38.268 7.0 98 3.4750 1.0 0.9619
10.1654 8.0 112 3.3406 1.0 0.9619
10.1654 9.0 126 3.2267 1.0 0.9619
10.1654 10.0 140 3.1887 1.0 0.9619
10.1654 11.0 154 3.1301 1.0 0.9619
10.1654 12.0 168 3.1046 1.0 0.9619
10.1654 13.0 182 3.0909 1.0 0.9619
10.1654 14.0 196 3.0603 1.0 0.9619
3.0823 15.0 210 3.0584 1.0 0.9619
3.0823 16.0 224 3.0485 1.0 0.9619
3.0823 17.0 238 3.0464 1.0 0.9619
3.0823 18.0 252 3.0242 1.0 0.9619
3.0823 19.0 266 3.0237 1.0 0.9619
3.0823 20.0 280 3.0304 1.0 0.9619
3.0823 21.0 294 3.0119 1.0 0.9619
2.9562 22.0 308 3.0148 1.0 0.9619
2.9562 23.0 322 3.0061 1.0 0.9619
2.9562 24.0 336 3.0042 1.0 0.9619
2.9562 25.0 350 3.0033 1.0 0.9619
2.9562 26.0 364 3.0029 1.0 0.9619
2.9562 27.0 378 3.0082 1.0 0.9619
2.9562 28.0 392 2.9956 1.0 0.9619
2.9262 29.0 406 2.9948 1.0 0.9619
2.9262 30.0 420 2.9982 1.0 0.9619
2.9262 31.0 434 2.9962 1.0 0.9619
2.9262 32.0 448 2.9931 1.0 0.9619
2.9262 33.0 462 2.9809 1.0 0.9619
2.9262 34.0 476 2.9804 1.0 0.9619
2.9262 35.0 490 2.9742 1.0 0.9619
2.9125 36.0 504 2.9522 1.0 0.9619
2.9125 37.0 518 2.9015 1.0 0.9619
2.9125 38.0 532 2.8522 1.0 0.9619
2.9125 39.0 546 2.8285 1.0 0.9619
2.9125 40.0 560 2.7294 1.0 0.9615
2.9125 41.0 574 2.6491 1.0 0.9605
2.9125 42.0 588 2.4883 1.0 0.8950
2.7205 43.0 602 2.3631 1.0 0.8365
2.7205 44.0 616 2.0546 1.0 0.6074
2.7205 45.0 630 1.7867 1.0 0.5148
2.7205 46.0 644 1.5453 1.0 0.4532
2.7205 47.0 658 1.3554 0.9990 0.4064
2.7205 48.0 672 1.2016 0.9829 0.3670
2.7205 49.0 686 1.0777 0.8805 0.3167
1.6469 50.0 700 0.9790 0.7030 0.2594
1.6469 51.0 714 0.8962 0.5270 0.2224
1.6469 52.0 728 0.8429 0.4974 0.2176
1.6469 53.0 742 0.8159 0.4659 0.2089
1.6469 54.0 756 0.7980 0.4512 0.2066
1.6469 55.0 770 0.7541 0.4441 0.2044
1.6469 56.0 784 0.7299 0.4273 0.2015
1.6469 57.0 798 0.7078 0.4092 0.1964
0.7997 58.0 812 0.7079 0.4110 0.1973
0.7997 59.0 826 0.6861 0.4137 0.1983
0.7997 60.0 840 0.7035 0.4011 0.1975
0.7997 61.0 854 0.6676 0.4000 0.1942
0.7997 62.0 868 0.6562 0.3980 0.1937
0.7997 63.0 882 0.6580 0.3850 0.1911
0.7997 64.0 896 0.6643 0.3911 0.1925
0.5379 65.0 910 0.6532 0.3929 0.1928
0.5379 66.0 924 0.6483 0.3866 0.1906
0.5379 67.0 938 0.6267 0.3757 0.1870
0.5379 68.0 952 0.6296 0.3793 0.1880
0.5379 69.0 966 0.6415 0.3785 0.1902
0.5379 70.0 980 0.6227 0.3746 0.1885
0.5379 71.0 994 0.6213 0.3738 0.1878
0.4372 72.0 1008 0.6110 0.3726 0.1872
0.4372 73.0 1022 0.6019 0.3696 0.1862
0.4372 74.0 1036 0.6037 0.3722 0.1867
0.4372 75.0 1050 0.5994 0.3657 0.1881
0.4372 76.0 1064 0.6083 0.3704 0.1881
0.4372 77.0 1078 0.5838 0.3696 0.1865
0.4372 78.0 1092 0.5795 0.3718 0.1855
0.3912 79.0 1106 0.6201 0.3714 0.1877
0.3912 80.0 1120 0.5915 0.3661 0.1854
0.3912 81.0 1134 0.5894 0.3651 0.1843
0.3912 82.0 1148 0.5994 0.3681 0.1859
0.3912 83.0 1162 0.6001 0.3655 0.1864
0.3912 84.0 1176 0.6008 0.3653 0.1865
0.3912 85.0 1190 0.5770 0.3602 0.1832
0.3485 86.0 1204 0.5905 0.3566 0.1836
0.3485 87.0 1218 0.5810 0.3580 0.1828
0.3485 88.0 1232 0.5765 0.3584 0.1830
0.3485 89.0 1246 0.5902 0.3641 0.1845
0.3485 90.0 1260 0.5812 0.3614 0.1831
0.3485 91.0 1274 0.5966 0.3586 0.1844
0.3485 92.0 1288 0.5686 0.3557 0.1822
0.3234 93.0 1302 0.5839 0.3553 0.1828
0.3234 94.0 1316 0.5765 0.3553 0.1820
0.3234 95.0 1330 0.5780 0.3566 0.1820
0.3234 96.0 1344 0.5862 0.3596 0.1834
0.3234 97.0 1358 0.5702 0.3555 0.1821
0.3234 98.0 1372 0.5787 0.3547 0.1821
0.3234 99.0 1386 0.5767 0.3531 0.1824
0.2803 100.0 1400 0.5778 0.3570 0.1818
0.2803 101.0 1414 0.5759 0.3543 0.1817
0.2803 102.0 1428 0.5838 0.3572 0.1824
0.2803 103.0 1442 0.5696 0.3541 0.1815
0.2803 104.0 1456 0.5724 0.3541 0.1820
0.2803 105.0 1470 0.5698 0.3543 0.1820
0.2803 106.0 1484 0.5727 0.3523 0.1816
0.2803 107.0 1498 0.5609 0.3511 0.1809
0.2718 108.0 1512 0.5655 0.3497 0.1807
0.2718 109.0 1526 0.5761 0.3535 0.1816
0.2718 110.0 1540 0.5753 0.3523 0.1815
0.2718 111.0 1554 0.5703 0.3503 0.1805
0.2718 112.0 1568 0.5623 0.3470 0.1802
0.2718 113.0 1582 0.5723 0.3511 0.1813
0.2718 114.0 1596 0.5608 0.3486 0.1803
0.2614 115.0 1610 0.5613 0.3511 0.1809
0.2614 116.0 1624 0.5742 0.3533 0.1817
0.2614 117.0 1638 0.5715 0.3523 0.1817
0.2614 118.0 1652 0.5695 0.3533 0.1817
0.2614 119.0 1666 0.5713 0.3531 0.1825
0.2614 120.0 1680 0.5664 0.3533 0.1821
0.2614 121.0 1694 0.5716 0.3531 0.1822
0.2463 122.0 1708 0.5680 0.3476 0.1810
0.2463 123.0 1722 0.5760 0.3527 0.1817
0.2463 124.0 1736 0.5561 0.3456 0.1803
0.2463 125.0 1750 0.5698 0.3478 0.1812
0.2463 126.0 1764 0.5667 0.3482 0.1811
0.2463 127.0 1778 0.5677 0.3478 0.1813
0.2463 128.0 1792 0.5681 0.3446 0.1805
0.2477 129.0 1806 0.5666 0.3470 0.1809
0.2477 130.0 1820 0.5696 0.3458 0.1804
0.2477 131.0 1834 0.5704 0.3478 0.1810
0.2477 132.0 1848 0.5656 0.3470 0.1808
0.2477 133.0 1862 0.5697 0.3472 0.1807
0.2477 134.0 1876 0.5716 0.3472 0.1810
0.2477 135.0 1890 0.5742 0.3484 0.1810
0.221 136.0 1904 0.5671 0.3472 0.1807
0.221 137.0 1918 0.5670 0.3462 0.1810
0.221 138.0 1932 0.5675 0.3460 0.1810
0.221 139.0 1946 0.5704 0.3462 0.1810
0.221 140.0 1960 0.5675 0.3458 0.1808
0.221 141.0 1974 0.5618 0.3444 0.1800
0.221 142.0 1988 0.5633 0.3454 0.1800
0.2217 143.0 2002 0.5664 0.3456 0.1807
0.2217 144.0 2016 0.5682 0.3462 0.1810

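The WER and CER columns above are the standard word and character error rates over the evaluation set. Below is a minimal sketch of how such scores can be computed from reference/hypothesis pairs using the jiwer library; this is an assumption for illustration, not the exact evaluation code behind this card.

```python
# Sketch: computing WER and CER for reference/hypothesis pairs with jiwer.
# The example sentences are placeholders, not data from this model's evaluation set.
import jiwer

references = ["o gato subiu no telhado", "hoje está um dia bonito"]
hypotheses = ["o gato subiu no telhado", "hoje esta um dia bonito"]

wer = jiwer.wer(references, hypotheses)  # word error rate over the whole set
cer = jiwer.cer(references, hypotheses)  # character error rate over the whole set
print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```
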
Framework versions

  • Transformers 4.28.0
  • Pytorch 2.4.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.13.3