# wav2vec2-large-xlsr-mecita-coraa-portuguese-all-grade-2-3
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1569
- WER: 0.0903
- CER: 0.0301
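WER (word error rate) and CER (character error rate) are the standard edit-distance metrics for speech recognition: the Levenshtein distance between reference and hypothesis, normalized by the reference length. A minimal stdlib sketch of their definitions (not the evaluation script used to produce the numbers above):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (one-row DP)."""
    m, n = len(ref), len(hyp)
    dp = list(range(n + 1))
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,      # deletion
                        dp[j - 1] + 1,  # insertion
                        prev + (ref[i - 1] != hyp[j - 1]))  # substitution
            prev = cur
    return dp[n]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / reference word count."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: char-level edit distance / reference length."""
    return edit_distance(reference, hypothesis) / len(reference)
```

So a WER of 0.0903 means roughly 9 word-level edits per 100 reference words.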
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
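The effective batch size and the learning-rate schedule follow directly from these values. A small sketch of the arithmetic, assuming the Transformers default of decaying linearly from the peak rate to zero (no warmup steps are listed):

```python
TRAIN_BATCH_SIZE = 16   # per-device train batch size
GRAD_ACCUM_STEPS = 2    # gradient_accumulation_steps
PEAK_LR = 3e-05         # learning_rate

# Effective (total) train batch size: per-device batch * accumulation steps.
total_train_batch_size = TRAIN_BATCH_SIZE * GRAD_ACCUM_STEPS  # 32, as listed

def linear_lr(step: int, total_steps: int, peak: float = PEAK_LR) -> float:
    """Linear decay from `peak` at step 0 down to 0 at `total_steps`."""
    return peak * max(0.0, 1.0 - step / total_steps)
```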
### Training results

Training Loss | Epoch | Step | Validation Loss | WER | CER |
---|---|---|---|---|---|
37.8998 | 0.99 | 61 | 10.6413 | 0.9815 | 0.9821 |
11.9841 | 2.0 | 123 | 8.1225 | 0.9815 | 0.9821 |
11.9841 | 2.99 | 184 | 5.2941 | 0.9842 | 0.9678 |
5.886 | 4.0 | 246 | 4.2091 | 0.9866 | 0.9765 |
4.5094 | 4.99 | 307 | 4.1291 | 0.9846 | 0.9793 |
4.5094 | 6.0 | 369 | 3.1493 | 0.9866 | 0.9831 |
3.7742 | 6.99 | 430 | 2.9688 | 1.0 | 0.9995 |
3.7742 | 8.0 | 492 | 2.9437 | 1.0 | 0.9979 |
2.994 | 8.99 | 553 | 2.9303 | 1.0 | 0.9998 |
2.9186 | 10.0 | 615 | 2.9275 | 0.9996 | 0.9965 |
2.9186 | 10.99 | 676 | 2.8990 | 0.9996 | 0.9995 |
2.8906 | 12.0 | 738 | 2.7851 | 0.9984 | 0.9364 |
2.8906 | 12.99 | 799 | 2.2908 | 1.0 | 0.8678 |
2.7313 | 14.0 | 861 | 1.2001 | 1.0 | 0.3552 |
1.7385 | 14.99 | 922 | 0.6682 | 0.9850 | 0.2257 |
1.7385 | 16.0 | 984 | 0.4400 | 0.2038 | 0.0613 |
0.8892 | 16.99 | 1045 | 0.3621 | 0.1884 | 0.0550 |
0.627 | 18.0 | 1107 | 0.3184 | 0.1648 | 0.0501 |
0.627 | 18.99 | 1168 | 0.3015 | 0.1502 | 0.0473 |
0.4719 | 20.0 | 1230 | 0.2782 | 0.1281 | 0.0428 |
0.4719 | 20.99 | 1291 | 0.2601 | 0.1301 | 0.0425 |
0.409 | 22.0 | 1353 | 0.2428 | 0.1151 | 0.0398 |
0.3604 | 22.99 | 1414 | 0.2372 | 0.1175 | 0.0398 |
0.3604 | 24.0 | 1476 | 0.2279 | 0.1115 | 0.0375 |
0.3387 | 24.99 | 1537 | 0.2193 | 0.1084 | 0.0365 |
0.3387 | 26.0 | 1599 | 0.2179 | 0.1037 | 0.0355 |
0.3119 | 26.99 | 1660 | 0.2130 | 0.1021 | 0.0353 |
0.2871 | 28.0 | 1722 | 0.2038 | 0.1001 | 0.0351 |
0.2871 | 28.99 | 1783 | 0.2019 | 0.1001 | 0.0349 |
0.2745 | 30.0 | 1845 | 0.1989 | 0.0958 | 0.0336 |
0.2584 | 30.99 | 1906 | 0.1874 | 0.0954 | 0.0328 |
0.2584 | 32.0 | 1968 | 0.1879 | 0.0930 | 0.0329 |
0.2438 | 32.99 | 2029 | 0.1866 | 0.0922 | 0.0326 |
0.2438 | 34.0 | 2091 | 0.1821 | 0.0918 | 0.0313 |
0.2221 | 34.99 | 2152 | 0.1796 | 0.0930 | 0.0324 |
0.2158 | 36.0 | 2214 | 0.1755 | 0.0978 | 0.0326 |
0.2158 | 36.99 | 2275 | 0.1743 | 0.0934 | 0.0321 |
0.2078 | 38.0 | 2337 | 0.1760 | 0.0989 | 0.0324 |
0.2078 | 38.99 | 2398 | 0.1835 | 0.0918 | 0.0327 |
0.2142 | 40.0 | 2460 | 0.1838 | 0.0962 | 0.0329 |
0.2009 | 40.99 | 2521 | 0.1758 | 0.0946 | 0.0329 |
0.2009 | 42.0 | 2583 | 0.1700 | 0.0985 | 0.0317 |
0.1887 | 42.99 | 2644 | 0.1695 | 0.0942 | 0.0315 |
0.1963 | 44.0 | 2706 | 0.1700 | 0.0942 | 0.0320 |
0.1963 | 44.99 | 2767 | 0.1677 | 0.0930 | 0.0306 |
0.1778 | 46.0 | 2829 | 0.1690 | 0.0938 | 0.0314 |
0.1778 | 46.99 | 2890 | 0.1671 | 0.0934 | 0.0311 |
0.1788 | 48.0 | 2952 | 0.1692 | 0.0899 | 0.0311 |
0.1788 | 48.99 | 3013 | 0.1638 | 0.0887 | 0.0304 |
0.1788 | 50.0 | 3075 | 0.1605 | 0.0946 | 0.0311 |
0.1517 | 50.99 | 3136 | 0.1624 | 0.0891 | 0.0306 |
0.1517 | 52.0 | 3198 | 0.1605 | 0.0875 | 0.0299 |
0.1674 | 52.99 | 3259 | 0.1569 | 0.0903 | 0.0301 |
0.1671 | 54.0 | 3321 | 0.1599 | 0.0867 | 0.0293 |
0.1671 | 54.99 | 3382 | 0.1605 | 0.0867 | 0.0294 |
0.1444 | 56.0 | 3444 | 0.1598 | 0.0867 | 0.0300 |
0.1505 | 56.99 | 3505 | 0.1633 | 0.0918 | 0.0310 |
0.1505 | 58.0 | 3567 | 0.1583 | 0.0847 | 0.0291 |
0.1545 | 58.99 | 3628 | 0.1622 | 0.0875 | 0.0291 |
0.1545 | 60.0 | 3690 | 0.1605 | 0.0887 | 0.0296 |
0.1512 | 60.99 | 3751 | 0.1662 | 0.0867 | 0.0301 |
0.159 | 62.0 | 3813 | 0.1607 | 0.0879 | 0.0299 |
0.159 | 62.99 | 3874 | 0.1603 | 0.0887 | 0.0301 |
0.1378 | 64.0 | 3936 | 0.1686 | 0.0847 | 0.0293 |
0.1378 | 64.99 | 3997 | 0.1691 | 0.0871 | 0.0299 |
0.1549 | 66.0 | 4059 | 0.1678 | 0.0875 | 0.0305 |
0.1314 | 66.99 | 4120 | 0.1610 | 0.0887 | 0.0301 |
0.1314 | 68.0 | 4182 | 0.1636 | 0.0887 | 0.0298 |
0.1332 | 68.99 | 4243 | 0.1588 | 0.0875 | 0.0287 |
0.135 | 70.0 | 4305 | 0.1599 | 0.0879 | 0.0291 |
0.135 | 70.99 | 4366 | 0.1648 | 0.0903 | 0.0304 |
0.1372 | 72.0 | 4428 | 0.1605 | 0.0895 | 0.0297 |
0.1372 | 72.99 | 4489 | 0.1689 | 0.0863 | 0.0301 |
## Framework versions
- Transformers 4.28.0
- Pytorch 2.2.1+cu121
- Datasets 2.17.0
- Tokenizers 0.13.3
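To approximate this environment, the pinned versions above can be installed with pip; the CUDA 12.1 wheel index for PyTorch is an assumption inferred from the `+cu121` build tag:

```shell
# Pinned library versions from the card above
pip install transformers==4.28.0 datasets==2.17.0 tokenizers==0.13.3
# PyTorch 2.2.1 with CUDA 12.1 wheels (assumed from "2.2.1+cu121")
pip install torch==2.2.1 --index-url https://download.pytorch.org/whl/cu121
```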