wav2vec2-large-xlsr-coraa-exp-9

This model is a fine-tuned version of Edresson/wav2vec2-large-xlsr-coraa-portuguese on an unknown dataset. It achieves the following results on the evaluation set (an inference sketch follows the metrics below):

  • Loss: 0.5553
  • WER (word error rate): 0.3466
  • CER (character error rate): 0.1788
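
Since this is a wav2vec2 checkpoint fine-tuned for CTC speech recognition, it should load with the standard transformers classes. The sketch below is a minimal, untested example: the repo id is a placeholder (the card does not give the model's actual Hub path), and the input is a dummy array standing in for real 16 kHz speech.

```python
import numpy as np
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder repo id -- substitute the actual Hub path of this checkpoint.
model_id = "wav2vec2-large-xlsr-coraa-exp-9"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Dummy 1-second mono clip at 16 kHz; replace with real speech samples.
speech = np.zeros(16000, dtype=np.float32)

inputs = processor(speech, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the most likely token at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids))
```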

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch in code follows the list):

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 150
  • mixed_precision_training: Native AMP
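
As a rough guide, the list above maps onto transformers.TrainingArguments as in the sketch below. This is an assumption-laden reconstruction, not the authors' actual script: output_dir, the evaluation/save strategies, and the fp16 flag are guesses consistent with "Native AMP" and the per-epoch results table.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-coraa-exp-9",  # assumed output path
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=150,
    fp16=True,                      # "Native AMP" mixed precision
    evaluation_strategy="epoch",    # assumed: metrics are logged once per epoch
    save_strategy="epoch",          # assumed
)
```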

Training results

| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
| 37.5508 | 1.0 | 14 | 23.1376 | 1.0 | 0.9619 |
| 37.5508 | 2.0 | 28 | 6.5036 | 1.0 | 0.9619 |
| 37.5508 | 3.0 | 42 | 4.3919 | 1.0 | 0.9619 |
| 37.5508 | 4.0 | 56 | 3.9441 | 1.0 | 0.9619 |
| 37.5508 | 5.0 | 70 | 3.7306 | 1.0 | 0.9619 |
| 37.5508 | 6.0 | 84 | 3.5762 | 1.0 | 0.9619 |
| 37.5508 | 7.0 | 98 | 3.4129 | 1.0 | 0.9619 |
| 8.6902 | 8.0 | 112 | 3.2859 | 1.0 | 0.9619 |
| 8.6902 | 9.0 | 126 | 3.2192 | 1.0 | 0.9619 |
| 8.6902 | 10.0 | 140 | 3.1479 | 1.0 | 0.9619 |
| 8.6902 | 11.0 | 154 | 3.1063 | 1.0 | 0.9619 |
| 8.6902 | 12.0 | 168 | 3.0897 | 1.0 | 0.9619 |
| 8.6902 | 13.0 | 182 | 3.0849 | 1.0 | 0.9619 |
| 8.6902 | 14.0 | 196 | 3.0485 | 1.0 | 0.9619 |
| 3.059 | 15.0 | 210 | 3.0496 | 1.0 | 0.9619 |
| 3.059 | 16.0 | 224 | 3.0510 | 1.0 | 0.9619 |
| 3.059 | 17.0 | 238 | 3.0428 | 1.0 | 0.9619 |
| 3.059 | 18.0 | 252 | 3.0331 | 1.0 | 0.9619 |
| 3.059 | 19.0 | 266 | 3.0353 | 1.0 | 0.9619 |
| 3.059 | 20.0 | 280 | 3.0217 | 1.0 | 0.9619 |
| 3.059 | 21.0 | 294 | 3.0107 | 1.0 | 0.9619 |
| 2.9492 | 22.0 | 308 | 3.0068 | 1.0 | 0.9619 |
| 2.9492 | 23.0 | 322 | 2.9950 | 1.0 | 0.9619 |
| 2.9492 | 24.0 | 336 | 2.9896 | 1.0 | 0.9619 |
| 2.9492 | 25.0 | 350 | 2.9687 | 1.0 | 0.9619 |
| 2.9492 | 26.0 | 364 | 2.9474 | 1.0 | 0.9619 |
| 2.9492 | 27.0 | 378 | 2.9414 | 1.0 | 0.9619 |
| 2.9492 | 28.0 | 392 | 2.8425 | 1.0 | 0.9619 |
| 2.8892 | 29.0 | 406 | 2.7813 | 1.0 | 0.9619 |
| 2.8892 | 30.0 | 420 | 2.7270 | 1.0 | 0.9619 |
| 2.8892 | 31.0 | 434 | 2.6645 | 1.0 | 0.9606 |
| 2.8892 | 32.0 | 448 | 2.5593 | 1.0 | 0.9139 |
| 2.8892 | 33.0 | 462 | 2.3230 | 1.0 | 0.7003 |
| 2.8892 | 34.0 | 476 | 1.9706 | 1.0 | 0.5358 |
| 2.8892 | 35.0 | 490 | 1.7085 | 0.9998 | 0.4548 |
| 2.3937 | 36.0 | 504 | 1.4494 | 1.0 | 0.4064 |
| 2.3937 | 37.0 | 518 | 1.2865 | 1.0 | 0.3847 |
| 2.3937 | 38.0 | 532 | 1.1509 | 0.9947 | 0.3659 |
| 2.3937 | 39.0 | 546 | 1.0467 | 0.9031 | 0.3183 |
| 2.3937 | 40.0 | 560 | 0.9832 | 0.5961 | 0.2404 |
| 2.3937 | 41.0 | 574 | 0.8921 | 0.5049 | 0.2223 |
| 2.3937 | 42.0 | 588 | 0.8306 | 0.4687 | 0.2123 |
| 1.0877 | 43.0 | 602 | 0.8017 | 0.4563 | 0.2088 |
| 1.0877 | 44.0 | 616 | 0.7716 | 0.4405 | 0.2046 |
| 1.0877 | 45.0 | 630 | 0.7694 | 0.4407 | 0.2054 |
| 1.0877 | 46.0 | 644 | 0.7451 | 0.4315 | 0.2037 |
| 1.0877 | 47.0 | 658 | 0.7112 | 0.4250 | 0.1996 |
| 1.0877 | 48.0 | 672 | 0.7008 | 0.4116 | 0.1958 |
| 1.0877 | 49.0 | 686 | 0.7140 | 0.4057 | 0.1980 |
| 0.6292 | 50.0 | 700 | 0.7208 | 0.4114 | 0.1988 |
| 0.6292 | 51.0 | 714 | 0.6675 | 0.4033 | 0.1937 |
| 0.6292 | 52.0 | 728 | 0.6650 | 0.4015 | 0.1938 |
| 0.6292 | 53.0 | 742 | 0.6550 | 0.4013 | 0.1938 |
| 0.6292 | 54.0 | 756 | 0.6477 | 0.3990 | 0.1932 |
| 0.6292 | 55.0 | 770 | 0.6362 | 0.3960 | 0.1932 |
| 0.6292 | 56.0 | 784 | 0.6323 | 0.3919 | 0.1930 |
| 0.6292 | 57.0 | 798 | 0.6264 | 0.3870 | 0.1921 |
| 0.4739 | 58.0 | 812 | 0.6290 | 0.3872 | 0.1921 |
| 0.4739 | 59.0 | 826 | 0.6207 | 0.3864 | 0.1925 |
| 0.4739 | 60.0 | 840 | 0.6178 | 0.3858 | 0.1918 |
| 0.4739 | 61.0 | 854 | 0.6217 | 0.3860 | 0.1918 |
| 0.4739 | 62.0 | 868 | 0.6078 | 0.3799 | 0.1900 |
| 0.4739 | 63.0 | 882 | 0.6072 | 0.3781 | 0.1889 |
| 0.4739 | 64.0 | 896 | 0.6068 | 0.3761 | 0.1883 |
| 0.3855 | 65.0 | 910 | 0.5945 | 0.3748 | 0.1870 |
| 0.3855 | 66.0 | 924 | 0.6194 | 0.3799 | 0.1900 |
| 0.3855 | 67.0 | 938 | 0.6044 | 0.3793 | 0.1885 |
| 0.3855 | 68.0 | 952 | 0.5946 | 0.3751 | 0.1880 |
| 0.3855 | 69.0 | 966 | 0.6116 | 0.3714 | 0.1880 |
| 0.3855 | 70.0 | 980 | 0.5877 | 0.3679 | 0.1861 |
| 0.3855 | 71.0 | 994 | 0.5861 | 0.3679 | 0.1863 |
| 0.3302 | 72.0 | 1008 | 0.5805 | 0.3685 | 0.1856 |
| 0.3302 | 73.0 | 1022 | 0.5862 | 0.3714 | 0.1862 |
| 0.3302 | 74.0 | 1036 | 0.5921 | 0.3720 | 0.1866 |
| 0.3302 | 75.0 | 1050 | 0.5692 | 0.3683 | 0.1854 |
| 0.3302 | 76.0 | 1064 | 0.5922 | 0.3702 | 0.1878 |
| 0.3302 | 77.0 | 1078 | 0.6105 | 0.3710 | 0.1883 |
| 0.3302 | 78.0 | 1092 | 0.5873 | 0.3683 | 0.1856 |
| 0.3046 | 79.0 | 1106 | 0.5826 | 0.3681 | 0.1859 |
| 0.3046 | 80.0 | 1120 | 0.5792 | 0.3633 | 0.1845 |
| 0.3046 | 81.0 | 1134 | 0.5738 | 0.3610 | 0.1835 |
| 0.3046 | 82.0 | 1148 | 0.5794 | 0.3625 | 0.1843 |
| 0.3046 | 83.0 | 1162 | 0.5766 | 0.3564 | 0.1829 |
| 0.3046 | 84.0 | 1176 | 0.5745 | 0.3578 | 0.1830 |
| 0.3046 | 85.0 | 1190 | 0.5615 | 0.3555 | 0.1814 |
| 0.2927 | 86.0 | 1204 | 0.5854 | 0.3614 | 0.1828 |
| 0.2927 | 87.0 | 1218 | 0.5818 | 0.3625 | 0.1835 |
| 0.2927 | 88.0 | 1232 | 0.5613 | 0.3578 | 0.1815 |
| 0.2927 | 89.0 | 1246 | 0.5661 | 0.3549 | 0.1813 |
| 0.2927 | 90.0 | 1260 | 0.5795 | 0.3604 | 0.1820 |
| 0.2927 | 91.0 | 1274 | 0.5604 | 0.3521 | 0.1802 |
| 0.2927 | 92.0 | 1288 | 0.5738 | 0.3590 | 0.1822 |
| 0.2576 | 93.0 | 1302 | 0.5658 | 0.3574 | 0.1814 |
| 0.2576 | 94.0 | 1316 | 0.5620 | 0.3511 | 0.1808 |
| 0.2576 | 95.0 | 1330 | 0.5709 | 0.3541 | 0.1810 |
| 0.2576 | 96.0 | 1344 | 0.5675 | 0.3503 | 0.1799 |
| 0.2576 | 97.0 | 1358 | 0.5788 | 0.3549 | 0.1815 |
| 0.2576 | 98.0 | 1372 | 0.5730 | 0.3525 | 0.1810 |
| 0.2576 | 99.0 | 1386 | 0.5694 | 0.3511 | 0.1803 |
| 0.2273 | 100.0 | 1400 | 0.5748 | 0.3527 | 0.1807 |
| 0.2273 | 101.0 | 1414 | 0.5688 | 0.3513 | 0.1797 |
| 0.2273 | 102.0 | 1428 | 0.5767 | 0.3553 | 0.1805 |
| 0.2273 | 103.0 | 1442 | 0.5758 | 0.3529 | 0.1812 |
| 0.2273 | 104.0 | 1456 | 0.5641 | 0.3507 | 0.1793 |
| 0.2273 | 105.0 | 1470 | 0.5628 | 0.3495 | 0.1789 |
| 0.2273 | 106.0 | 1484 | 0.5729 | 0.3466 | 0.1789 |
| 0.2273 | 107.0 | 1498 | 0.5722 | 0.3497 | 0.1798 |
| 0.2181 | 108.0 | 1512 | 0.5553 | 0.3466 | 0.1788 |
| 0.2181 | 109.0 | 1526 | 0.5582 | 0.3484 | 0.1792 |
| 0.2181 | 110.0 | 1540 | 0.5702 | 0.3521 | 0.1802 |
| 0.2181 | 111.0 | 1554 | 0.5691 | 0.3505 | 0.1798 |
| 0.2181 | 112.0 | 1568 | 0.5604 | 0.3470 | 0.1786 |
| 0.2181 | 113.0 | 1582 | 0.5661 | 0.3482 | 0.1795 |
| 0.2181 | 114.0 | 1596 | 0.5683 | 0.3511 | 0.1796 |
| 0.2171 | 115.0 | 1610 | 0.5738 | 0.3509 | 0.1798 |
| 0.2171 | 116.0 | 1624 | 0.5730 | 0.3458 | 0.1793 |
| 0.2171 | 117.0 | 1638 | 0.5705 | 0.3456 | 0.1789 |
| 0.2171 | 118.0 | 1652 | 0.5814 | 0.3466 | 0.1796 |
| 0.2171 | 119.0 | 1666 | 0.5715 | 0.3442 | 0.1791 |
| 0.2171 | 120.0 | 1680 | 0.5720 | 0.3470 | 0.1798 |
| 0.2171 | 121.0 | 1694 | 0.5769 | 0.3470 | 0.1797 |
| 0.1986 | 122.0 | 1708 | 0.5711 | 0.3464 | 0.1792 |
| 0.1986 | 123.0 | 1722 | 0.5728 | 0.3442 | 0.1790 |
| 0.1986 | 124.0 | 1736 | 0.5668 | 0.3450 | 0.1783 |
| 0.1986 | 125.0 | 1750 | 0.5855 | 0.3484 | 0.1797 |
| 0.1986 | 126.0 | 1764 | 0.5667 | 0.3427 | 0.1783 |
| 0.1986 | 127.0 | 1778 | 0.5711 | 0.3460 | 0.1789 |
| 0.1986 | 128.0 | 1792 | 0.5682 | 0.3444 | 0.1781 |
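
The WER and CER columns above are standard word and character error rates. A minimal sketch of how such values can be computed with the Hugging Face evaluate library (the transcript strings below are hypothetical, for illustration only):

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Hypothetical reference/prediction pair.
references = ["o gato subiu no telhado"]
predictions = ["o gato subiu no telhao"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```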

Framework versions

  • Transformers 4.28.0
  • PyTorch 2.4.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.13.3