# wav2vec2-large-xlsr-mecita-coraa-portuguese-2-all-clean-06
This model is a fine-tuned version of Edresson/wav2vec2-large-xlsr-coraa-portuguese on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1245
- WER (word error rate): 0.0826
- CER (character error rate): 0.0242
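The checkpoint can be loaded with the standard wav2vec 2.0 CTC classes from `transformers`. The sketch below is only illustrative: the repository id and audio file name are placeholders, and it assumes 16 kHz mono input.

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder repository id; replace with the actual Hub path of this checkpoint.
MODEL_ID = "wav2vec2-large-xlsr-mecita-coraa-portuguese-2-all-clean-06"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# wav2vec 2.0 expects mono audio sampled at 16 kHz.
speech, _ = librosa.load("example.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the most likely token per frame, then collapse repeats and blanks.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```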
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
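For reference, these hyperparameters map onto a `transformers` `TrainingArguments` configuration roughly as follows. This is a minimal sketch, not the exact training script; the output directory is a placeholder, and Adam's betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-mecita-coraa-portuguese-2-all-clean-06",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # effective train batch size: 16 * 2 = 32
    num_train_epochs=100,
    seed=42,
    lr_scheduler_type="linear",
    fp16=True,                       # native AMP mixed-precision training
    evaluation_strategy="epoch",     # assumption: the results table reports one evaluation per epoch
)
```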
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
---|---|---|---|---|---|
25.0528 | 1.0 | 67 | 3.3339 | 1.0 | 1.0 |
7.3126 | 2.0 | 134 | 2.9659 | 1.0 | 1.0 |
3.0007 | 3.0 | 201 | 2.9192 | 1.0 | 1.0 |
3.0007 | 4.0 | 268 | 2.8943 | 1.0 | 1.0 |
2.9153 | 5.0 | 335 | 2.8720 | 1.0 | 1.0 |
2.8426 | 6.0 | 402 | 2.4681 | 0.9980 | 0.9974 |
2.8426 | 7.0 | 469 | 0.8337 | 0.4926 | 0.1146 |
1.7153 | 8.0 | 536 | 0.4488 | 0.2116 | 0.0571 |
0.791 | 9.0 | 603 | 0.3345 | 0.1825 | 0.0507 |
0.791 | 10.0 | 670 | 0.2795 | 0.1584 | 0.0451 |
0.5835 | 11.0 | 737 | 0.2457 | 0.1469 | 0.0422 |
0.4836 | 12.0 | 804 | 0.2251 | 0.1398 | 0.0407 |
0.4836 | 13.0 | 871 | 0.2120 | 0.1303 | 0.0379 |
0.4026 | 14.0 | 938 | 0.1947 | 0.1141 | 0.0348 |
0.3347 | 15.0 | 1005 | 0.1764 | 0.1090 | 0.0324 |
0.3347 | 16.0 | 1072 | 0.1703 | 0.1070 | 0.0320 |
0.3405 | 17.0 | 1139 | 0.1647 | 0.1019 | 0.0311 |
0.3036 | 18.0 | 1206 | 0.1571 | 0.0911 | 0.0286 |
0.3036 | 19.0 | 1273 | 0.1541 | 0.0955 | 0.0292 |
0.2859 | 20.0 | 1340 | 0.1528 | 0.0961 | 0.0300 |
0.2499 | 21.0 | 1407 | 0.1514 | 0.0907 | 0.0288 |
0.2499 | 22.0 | 1474 | 0.1502 | 0.0884 | 0.0284 |
0.2709 | 23.0 | 1541 | 0.1435 | 0.0843 | 0.0269 |
0.2446 | 24.0 | 1608 | 0.1405 | 0.0850 | 0.0267 |
0.2446 | 25.0 | 1675 | 0.1379 | 0.0846 | 0.0272 |
0.2131 | 26.0 | 1742 | 0.1375 | 0.0843 | 0.0263 |
0.2333 | 27.0 | 1809 | 0.1374 | 0.0856 | 0.0275 |
0.2333 | 28.0 | 1876 | 0.1408 | 0.0856 | 0.0274 |
0.216 | 29.0 | 1943 | 0.1341 | 0.0809 | 0.0261 |
0.1948 | 30.0 | 2010 | 0.1344 | 0.0863 | 0.0266 |
0.1948 | 31.0 | 2077 | 0.1353 | 0.0833 | 0.0263 |
0.1957 | 32.0 | 2144 | 0.1296 | 0.0823 | 0.0255 |
0.2116 | 33.0 | 2211 | 0.1350 | 0.0826 | 0.0263 |
0.2116 | 34.0 | 2278 | 0.1344 | 0.0812 | 0.0259 |
0.1862 | 35.0 | 2345 | 0.1361 | 0.0819 | 0.0260 |
0.1819 | 36.0 | 2412 | 0.1349 | 0.0836 | 0.0264 |
0.1819 | 37.0 | 2479 | 0.1305 | 0.0836 | 0.0256 |
0.1622 | 38.0 | 2546 | 0.1284 | 0.0823 | 0.0258 |
0.1693 | 39.0 | 2613 | 0.1295 | 0.0853 | 0.0262 |
0.1693 | 40.0 | 2680 | 0.1321 | 0.0833 | 0.0261 |
0.1865 | 41.0 | 2747 | 0.1278 | 0.0812 | 0.0255 |
0.1646 | 42.0 | 2814 | 0.1307 | 0.0816 | 0.0258 |
0.1646 | 43.0 | 2881 | 0.1284 | 0.0823 | 0.0262 |
0.1526 | 44.0 | 2948 | 0.1312 | 0.0833 | 0.0256 |
0.1708 | 45.0 | 3015 | 0.1273 | 0.0819 | 0.0247 |
0.1708 | 46.0 | 3082 | 0.1266 | 0.0782 | 0.0240 |
0.1582 | 47.0 | 3149 | 0.1260 | 0.0796 | 0.0245 |
0.1448 | 48.0 | 3216 | 0.1252 | 0.0785 | 0.0240 |
0.1448 | 49.0 | 3283 | 0.1259 | 0.0789 | 0.0233 |
0.1425 | 50.0 | 3350 | 0.1262 | 0.0833 | 0.0247 |
0.1502 | 51.0 | 3417 | 0.1264 | 0.0806 | 0.0243 |
0.1502 | 52.0 | 3484 | 0.1245 | 0.0826 | 0.0242 |
0.1337 | 53.0 | 3551 | 0.1299 | 0.0809 | 0.0249 |
0.1523 | 54.0 | 3618 | 0.1303 | 0.0816 | 0.0249 |
0.1523 | 55.0 | 3685 | 0.1295 | 0.0833 | 0.0249 |
0.1365 | 56.0 | 3752 | 0.1277 | 0.0833 | 0.0255 |
0.1435 | 57.0 | 3819 | 0.1286 | 0.0829 | 0.0249 |
0.1435 | 58.0 | 3886 | 0.1264 | 0.0833 | 0.0250 |
0.1319 | 59.0 | 3953 | 0.1309 | 0.0819 | 0.0252 |
0.1473 | 60.0 | 4020 | 0.1272 | 0.0799 | 0.0241 |
0.1473 | 61.0 | 4087 | 0.1260 | 0.0806 | 0.0245 |
0.1419 | 62.0 | 4154 | 0.1264 | 0.0796 | 0.0241 |
0.1371 | 63.0 | 4221 | 0.1293 | 0.0789 | 0.0251 |
0.1371 | 64.0 | 4288 | 0.1276 | 0.0768 | 0.0244 |
0.1301 | 65.0 | 4355 | 0.1262 | 0.0789 | 0.0246 |
0.1341 | 66.0 | 4422 | 0.1280 | 0.0823 | 0.0256 |
0.1341 | 67.0 | 4489 | 0.1300 | 0.0826 | 0.0254 |
0.1283 | 68.0 | 4556 | 0.1280 | 0.0768 | 0.0241 |
0.1241 | 69.0 | 4623 | 0.1289 | 0.0768 | 0.0238 |
0.1241 | 70.0 | 4690 | 0.1305 | 0.0782 | 0.0245 |
0.1326 | 71.0 | 4757 | 0.1295 | 0.0782 | 0.0247 |
0.1244 | 72.0 | 4824 | 0.1287 | 0.0796 | 0.0250 |
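The evaluation results reported at the top of this card (loss 0.1245, WER 0.0826, CER 0.0242) correspond to the epoch-52 checkpoint in this table. WER and CER are word- and character-level edit distances normalized by the reference length; the sketch below shows how they can be computed with the `jiwer` package. `jiwer` is not listed under the framework versions, so this is an assumption about tooling, and the sentences are made-up examples.

```python
import jiwer

reference = "o menino leu o livro na escola"
hypothesis = "o menino leu o livros na escola"

# Word error rate: word-level edit distance / number of reference words.
wer = jiwer.wer(reference, hypothesis)

# Character error rate: character-level edit distance / number of reference characters.
cer = jiwer.cer(reference, hypothesis)

print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```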
### Framework versions
- Transformers 4.28.0
- Pytorch 2.2.1+cu121
- Datasets 2.17.0
- Tokenizers 0.13.3