# wav2vec2-large-xlsr-mecita-coraa-portuguese-all-grade-2
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set (see the metric sketch after the list):
- Loss: 0.1949
- Wer: 0.1036
- Cer: 0.0321
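WER (word error rate) and CER (character error rate) are the standard ASR evaluation metrics. As an illustration not taken from the original card, they can be computed with the `jiwer` library; the reference and hypothesis strings below are invented:

```python
# Hedged sketch: computing WER/CER with the jiwer library.
# The reference/hypothesis pair is invented for illustration.
from jiwer import wer, cer

reference = "o menino leu o livro inteiro"
hypothesis = "o menino leu o livro intero"

print(f"WER: {wer(reference, hypothesis):.4f}")  # fraction of words wrong
print(f"CER: {cer(reference, hypothesis):.4f}")  # fraction of characters wrong
```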
## Model description
More information needed
## Intended uses & limitations
More information needed
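Although the card leaves this section unfilled, a checkpoint of this kind is typically used for Portuguese speech-to-text via CTC decoding. A minimal inference sketch follows; the repo id (`your-namespace/...`) and the audio file name are placeholders, and the input is assumed to be mono speech resampled to 16 kHz:

```python
# Hedged inference sketch for a wav2vec2 CTC checkpoint.
# Assumptions: the model is published on the Hugging Face Hub under a repo id
# like the placeholder below, and "sample.wav" is any Portuguese recording.
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "your-namespace/wav2vec2-large-xlsr-mecita-coraa-portuguese-all-grade-2"  # hypothetical
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

waveform, sample_rate = torchaudio.load("sample.wav")
if sample_rate != 16_000:  # XLSR wav2vec2 models expect 16 kHz input
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

pred_ids = torch.argmax(logits, dim=-1)  # greedy CTC decoding
transcription = processor.batch_decode(pred_ids)[0]
print(transcription)
```

Greedy argmax decoding is the simplest option; a language-model-backed CTC decoder can often reduce WER further.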
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after the list):
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
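Assuming the run used the Hugging Face `Trainer` (as the autogenerated card format suggests), these values map onto `TrainingArguments` roughly as follows; `output_dir` is a placeholder:

```python
# Hedged reconstruction of the training setup from the hyperparameters above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-mecita-coraa-portuguese-all-grade-2",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed precision
)
```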
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
35.3989 | 1.0 | 41 | 4.8783 | 1.0 | 1.0 |
35.3989 | 2.0 | 82 | 3.4224 | 1.0 | 1.0 |
9.6592 | 3.0 | 123 | 3.1635 | 1.0 | 1.0 |
9.6592 | 4.0 | 164 | 3.0371 | 1.0 | 1.0 |
3.0889 | 5.0 | 205 | 2.9919 | 1.0 | 1.0 |
3.0889 | 6.0 | 246 | 2.9810 | 1.0 | 1.0 |
3.0889 | 7.0 | 287 | 2.9552 | 1.0 | 1.0 |
2.9432 | 8.0 | 328 | 2.9460 | 1.0 | 1.0 |
2.9432 | 9.0 | 369 | 2.9532 | 1.0 | 1.0 |
2.9284 | 10.0 | 410 | 2.9459 | 1.0 | 1.0 |
2.9284 | 11.0 | 451 | 2.9130 | 0.9902 | 0.9961 |
2.9284 | 12.0 | 492 | 2.8200 | 1.0 | 1.0 |
2.8965 | 13.0 | 533 | 2.5324 | 0.9985 | 0.9459 |
2.8965 | 14.0 | 574 | 1.6967 | 1.0 | 0.4362 |
2.3114 | 15.0 | 615 | 1.0737 | 0.9992 | 0.2751 |
2.3114 | 16.0 | 656 | 0.7974 | 0.9924 | 0.2441 |
2.3114 | 17.0 | 697 | 0.6548 | 0.9070 | 0.1932 |
1.1858 | 18.0 | 738 | 0.5250 | 0.3517 | 0.0881 |
1.1858 | 19.0 | 779 | 0.4429 | 0.2474 | 0.0686 |
0.7511 | 20.0 | 820 | 0.4119 | 0.2337 | 0.0682 |
0.7511 | 21.0 | 861 | 0.3808 | 0.2247 | 0.0646 |
0.5648 | 22.0 | 902 | 0.3665 | 0.2050 | 0.0620 |
0.5648 | 23.0 | 943 | 0.3512 | 0.1906 | 0.0581 |
0.5648 | 24.0 | 984 | 0.3338 | 0.1800 | 0.0544 |
0.4966 | 25.0 | 1025 | 0.3265 | 0.1672 | 0.0513 |
0.4966 | 26.0 | 1066 | 0.3073 | 0.1513 | 0.0502 |
0.4186 | 27.0 | 1107 | 0.2998 | 0.1543 | 0.0477 |
0.4186 | 28.0 | 1148 | 0.2973 | 0.1604 | 0.0480 |
0.4186 | 29.0 | 1189 | 0.2855 | 0.1475 | 0.0459 |
0.3816 | 30.0 | 1230 | 0.2860 | 0.1467 | 0.0448 |
0.3816 | 31.0 | 1271 | 0.2582 | 0.1445 | 0.0420 |
0.335 | 32.0 | 1312 | 0.2621 | 0.1384 | 0.0426 |
0.335 | 33.0 | 1353 | 0.2585 | 0.1460 | 0.0438 |
0.335 | 34.0 | 1394 | 0.2511 | 0.1452 | 0.0441 |
0.3137 | 35.0 | 1435 | 0.2423 | 0.1399 | 0.0420 |
0.3137 | 36.0 | 1476 | 0.2477 | 0.1369 | 0.0414 |
0.2857 | 37.0 | 1517 | 0.2519 | 0.1339 | 0.0408 |
0.2857 | 38.0 | 1558 | 0.2503 | 0.1346 | 0.0414 |
0.2857 | 39.0 | 1599 | 0.2462 | 0.1331 | 0.0415 |
0.2628 | 40.0 | 1640 | 0.2512 | 0.1316 | 0.0421 |
0.2628 | 41.0 | 1681 | 0.2432 | 0.1339 | 0.0418 |
0.2687 | 42.0 | 1722 | 0.2376 | 0.1263 | 0.0402 |
0.2687 | 43.0 | 1763 | 0.2388 | 0.1195 | 0.0396 |
0.2521 | 44.0 | 1804 | 0.2350 | 0.1271 | 0.0396 |
0.2521 | 45.0 | 1845 | 0.2355 | 0.1256 | 0.0390 |
0.2521 | 46.0 | 1886 | 0.2371 | 0.1195 | 0.0390 |
0.2319 | 47.0 | 1927 | 0.2275 | 0.1203 | 0.0376 |
0.2319 | 48.0 | 1968 | 0.2297 | 0.1203 | 0.0382 |
0.248 | 49.0 | 2009 | 0.2343 | 0.1225 | 0.0388 |
0.248 | 50.0 | 2050 | 0.2265 | 0.1195 | 0.0370 |
0.248 | 51.0 | 2091 | 0.2260 | 0.1203 | 0.0376 |
0.231 | 52.0 | 2132 | 0.2204 | 0.1218 | 0.0375 |
0.231 | 53.0 | 2173 | 0.2188 | 0.1210 | 0.0364 |
0.1952 | 54.0 | 2214 | 0.2179 | 0.1195 | 0.0375 |
0.1952 | 55.0 | 2255 | 0.2171 | 0.1127 | 0.0357 |
0.1952 | 56.0 | 2296 | 0.2186 | 0.1120 | 0.0352 |
0.2148 | 57.0 | 2337 | 0.2226 | 0.1120 | 0.0358 |
0.2148 | 58.0 | 2378 | 0.2184 | 0.1142 | 0.0364 |
0.1883 | 59.0 | 2419 | 0.2141 | 0.1097 | 0.0345 |
0.1883 | 60.0 | 2460 | 0.2160 | 0.1059 | 0.0349 |
0.181 | 61.0 | 2501 | 0.2190 | 0.1082 | 0.0340 |
0.181 | 62.0 | 2542 | 0.2200 | 0.1067 | 0.0354 |
0.181 | 63.0 | 2583 | 0.2194 | 0.1120 | 0.0364 |
0.1806 | 64.0 | 2624 | 0.2152 | 0.1089 | 0.0352 |
0.1806 | 65.0 | 2665 | 0.2157 | 0.1082 | 0.0340 |
0.18 | 66.0 | 2706 | 0.2113 | 0.1059 | 0.0336 |
0.18 | 67.0 | 2747 | 0.2112 | 0.1036 | 0.0333 |
0.18 | 68.0 | 2788 | 0.2078 | 0.1082 | 0.0345 |
0.165 | 69.0 | 2829 | 0.2076 | 0.1059 | 0.0334 |
0.165 | 70.0 | 2870 | 0.2147 | 0.1044 | 0.0337 |
0.1965 | 71.0 | 2911 | 0.2044 | 0.1059 | 0.0340 |
0.1965 | 72.0 | 2952 | 0.2065 | 0.1059 | 0.0340 |
0.1965 | 73.0 | 2993 | 0.2050 | 0.1097 | 0.0345 |
0.1789 | 74.0 | 3034 | 0.2027 | 0.1097 | 0.0340 |
0.1789 | 75.0 | 3075 | 0.2003 | 0.1067 | 0.0339 |
0.1616 | 76.0 | 3116 | 0.1966 | 0.1097 | 0.0340 |
0.1616 | 77.0 | 3157 | 0.1977 | 0.1097 | 0.0336 |
0.1616 | 78.0 | 3198 | 0.2028 | 0.1074 | 0.0340 |
0.1588 | 79.0 | 3239 | 0.1991 | 0.1074 | 0.0342 |
0.1588 | 80.0 | 3280 | 0.1986 | 0.1074 | 0.0339 |
0.158 | 81.0 | 3321 | 0.2009 | 0.1059 | 0.0333 |
0.158 | 82.0 | 3362 | 0.1984 | 0.1059 | 0.0328 |
0.1484 | 83.0 | 3403 | 0.1949 | 0.1036 | 0.0321 |
0.1484 | 84.0 | 3444 | 0.1962 | 0.1029 | 0.0325 |
0.1484 | 85.0 | 3485 | 0.1961 | 0.0991 | 0.0316 |
0.1668 | 86.0 | 3526 | 0.1968 | 0.1074 | 0.0327 |
0.1668 | 87.0 | 3567 | 0.1987 | 0.1059 | 0.0330 |
0.1611 | 88.0 | 3608 | 0.1999 | 0.1059 | 0.0324 |
0.1611 | 89.0 | 3649 | 0.2001 | 0.1082 | 0.0328 |
0.1611 | 90.0 | 3690 | 0.2006 | 0.1051 | 0.0327 |
0.1609 | 91.0 | 3731 | 0.1982 | 0.1112 | 0.0336 |
0.1609 | 92.0 | 3772 | 0.1982 | 0.1082 | 0.0333 |
0.1604 | 93.0 | 3813 | 0.1984 | 0.1089 | 0.0334 |
0.1604 | 94.0 | 3854 | 0.1994 | 0.1074 | 0.0334 |
0.1604 | 95.0 | 3895 | 0.1999 | 0.1074 | 0.0334 |
0.1466 | 96.0 | 3936 | 0.1989 | 0.1074 | 0.0331 |
0.1466 | 97.0 | 3977 | 0.1991 | 0.1082 | 0.0333 |
0.1639 | 98.0 | 4018 | 0.1982 | 0.1074 | 0.0339 |
0.1639 | 99.0 | 4059 | 0.1987 | 0.1082 | 0.0339 |
0.1332 | 100.0 | 4100 | 0.1987 | 0.1059 | 0.0337 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.2.1+cu121
- Datasets 2.17.0
- Tokenizers 0.13.3