# wav2vec2-large-xlsr-mecita-coraa-portuguese-2-all-clean-05
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1042
- WER: 0.0718
- CER: 0.0214
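The card ships no usage example; below is a minimal inference sketch, assuming the checkpoint keeps the standard wav2vec2 CTC head and is hosted under this card's title as its repo id (the `model_id` string and the `audio.wav` path are assumptions, not taken from the card):

```python
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

# Assumed repo id (taken from the card title); adjust to the actual hub path.
model_id = "wav2vec2-large-xlsr-mecita-coraa-portuguese-2-all-clean-05"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# wav2vec2 expects 16 kHz mono audio; "audio.wav" is a placeholder path.
speech, _ = librosa.load("audio.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: argmax per frame, then collapse repeats and blanks.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```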
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the sketch after this list for how they map onto `TrainingArguments`):
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
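The values above translate directly into `transformers.TrainingArguments`; this is a hedged sketch of that mapping only (the `output_dir` is assumed, and the model, dataset, and data collator used in the card's actual training run are not shown):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-mecita-coraa-portuguese-2-all-clean-05",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # effective train batch size: 16 * 2 = 32
    num_train_epochs=100,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                       # "Native AMP" mixed-precision training
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```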
### Training results
Training Loss | Epoch | Step | Validation Loss | WER | CER |
---|---|---|---|---|---|
28.8714 | 1.0 | 67 | 3.3571 | 1.0 | 1.0 |
7.5799 | 2.0 | 134 | 2.9876 | 1.0 | 1.0 |
3.0284 | 3.0 | 201 | 2.9114 | 1.0 | 1.0 |
3.0284 | 4.0 | 268 | 2.8889 | 1.0 | 1.0 |
2.9172 | 5.0 | 335 | 2.8515 | 1.0 | 1.0 |
2.8101 | 6.0 | 402 | 2.1557 | 1.0 | 0.6878 |
2.8101 | 7.0 | 469 | 0.7046 | 0.3468 | 0.0850 |
1.5251 | 8.0 | 536 | 0.4276 | 0.1963 | 0.0517 |
0.7791 | 9.0 | 603 | 0.3256 | 0.1723 | 0.0455 |
0.7791 | 10.0 | 670 | 0.2743 | 0.1416 | 0.0388 |
0.5599 | 11.0 | 737 | 0.2362 | 0.1387 | 0.0378 |
0.4678 | 12.0 | 804 | 0.2119 | 0.1265 | 0.0352 |
0.4678 | 13.0 | 871 | 0.1984 | 0.1179 | 0.0339 |
0.4302 | 14.0 | 938 | 0.1834 | 0.1235 | 0.0332 |
0.3794 | 15.0 | 1005 | 0.1760 | 0.1133 | 0.0310 |
0.3794 | 16.0 | 1072 | 0.1763 | 0.1080 | 0.0309 |
0.3234 | 17.0 | 1139 | 0.1583 | 0.1018 | 0.0294 |
0.3144 | 18.0 | 1206 | 0.1570 | 0.0932 | 0.0275 |
0.3144 | 19.0 | 1273 | 0.1421 | 0.0912 | 0.0263 |
0.2824 | 20.0 | 1340 | 0.1448 | 0.0886 | 0.0263 |
0.2503 | 21.0 | 1407 | 0.1371 | 0.0916 | 0.0260 |
0.2503 | 22.0 | 1474 | 0.1387 | 0.0860 | 0.0253 |
0.2547 | 23.0 | 1541 | 0.1301 | 0.0863 | 0.0242 |
0.2397 | 24.0 | 1608 | 0.1272 | 0.0823 | 0.0239 |
0.2397 | 25.0 | 1675 | 0.1368 | 0.0827 | 0.0250 |
0.2402 | 26.0 | 1742 | 0.1303 | 0.0807 | 0.0243 |
0.2581 | 27.0 | 1809 | 0.1248 | 0.0777 | 0.0239 |
0.2581 | 28.0 | 1876 | 0.1242 | 0.0758 | 0.0225 |
0.2334 | 29.0 | 1943 | 0.1231 | 0.0774 | 0.0228 |
0.2087 | 30.0 | 2010 | 0.1226 | 0.0754 | 0.0224 |
0.2087 | 31.0 | 2077 | 0.1227 | 0.0774 | 0.0230 |
0.2175 | 32.0 | 2144 | 0.1270 | 0.0767 | 0.0231 |
0.1973 | 33.0 | 2211 | 0.1258 | 0.0754 | 0.0230 |
0.1973 | 34.0 | 2278 | 0.1186 | 0.0754 | 0.0223 |
0.1787 | 35.0 | 2345 | 0.1234 | 0.0735 | 0.0217 |
0.1958 | 36.0 | 2412 | 0.1199 | 0.0741 | 0.0222 |
0.1958 | 37.0 | 2479 | 0.1177 | 0.0754 | 0.0222 |
0.1773 | 38.0 | 2546 | 0.1138 | 0.0751 | 0.0225 |
0.2047 | 39.0 | 2613 | 0.1164 | 0.0751 | 0.0224 |
0.2047 | 40.0 | 2680 | 0.1155 | 0.0751 | 0.0227 |
0.1727 | 41.0 | 2747 | 0.1109 | 0.0728 | 0.0213 |
0.1708 | 42.0 | 2814 | 0.1132 | 0.0702 | 0.0213 |
0.1708 | 43.0 | 2881 | 0.1110 | 0.0728 | 0.0217 |
0.1814 | 44.0 | 2948 | 0.1094 | 0.0711 | 0.0215 |
0.159 | 45.0 | 3015 | 0.1091 | 0.0702 | 0.0211 |
0.159 | 46.0 | 3082 | 0.1065 | 0.0702 | 0.0208 |
0.163 | 47.0 | 3149 | 0.1110 | 0.0708 | 0.0210 |
0.1565 | 48.0 | 3216 | 0.1121 | 0.0725 | 0.0215 |
0.1565 | 49.0 | 3283 | 0.1096 | 0.0715 | 0.0215 |
0.1571 | 50.0 | 3350 | 0.1083 | 0.0718 | 0.0210 |
0.165 | 51.0 | 3417 | 0.1056 | 0.0711 | 0.0210 |
0.165 | 52.0 | 3484 | 0.1042 | 0.0718 | 0.0214 |
0.1525 | 53.0 | 3551 | 0.1067 | 0.0698 | 0.0209 |
0.1365 | 54.0 | 3618 | 0.1084 | 0.0715 | 0.0208 |
0.1365 | 55.0 | 3685 | 0.1086 | 0.0735 | 0.0215 |
0.1434 | 56.0 | 3752 | 0.1073 | 0.0711 | 0.0208 |
0.1408 | 57.0 | 3819 | 0.1062 | 0.0705 | 0.0209 |
0.1408 | 58.0 | 3886 | 0.1066 | 0.0708 | 0.0205 |
0.1364 | 59.0 | 3953 | 0.1074 | 0.0702 | 0.0207 |
0.1507 | 60.0 | 4020 | 0.1049 | 0.0725 | 0.0207 |
0.1507 | 61.0 | 4087 | 0.1086 | 0.0715 | 0.0211 |
0.1532 | 62.0 | 4154 | 0.1083 | 0.0738 | 0.0210 |
0.1255 | 63.0 | 4221 | 0.1058 | 0.0721 | 0.0207 |
0.1255 | 64.0 | 4288 | 0.1087 | 0.0708 | 0.0202 |
0.1534 | 65.0 | 4355 | 0.1073 | 0.0738 | 0.0208 |
0.1316 | 66.0 | 4422 | 0.1061 | 0.0731 | 0.0210 |
0.1316 | 67.0 | 4489 | 0.1082 | 0.0731 | 0.0208 |
0.1365 | 68.0 | 4556 | 0.1100 | 0.0751 | 0.0213 |
0.1324 | 69.0 | 4623 | 0.1104 | 0.0708 | 0.0206 |
0.1324 | 70.0 | 4690 | 0.1073 | 0.0721 | 0.0206 |
0.1299 | 71.0 | 4757 | 0.1104 | 0.0711 | 0.0211 |
0.125 | 72.0 | 4824 | 0.1078 | 0.0718 | 0.0212 |
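The WER and CER columns above are standard word- and character-error rates; a minimal sketch of computing them with the Hugging Face `evaluate` library (which needs `jiwer` installed), using placeholder strings rather than the card's actual evaluation data:

```python
import evaluate  # pip install evaluate jiwer

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["ola mundo"]   # placeholder model transcriptions
references = ["olá mundo"]    # placeholder ground-truth transcripts

wer = wer_metric.compute(predictions=predictions, references=references)
cer = cer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```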
### Framework versions
- Transformers 4.28.0
- PyTorch 2.1.1+cu121
- Datasets 2.17.1
- Tokenizers 0.13.3