---
library_name: transformers
language:
- ha
base_model: Alvin-Nahabwe/wav2vec2-pretrained-asr-africa
tags:
- generated_from_trainer
datasets:
- naijavoices/naijavoices-dataset
metrics:
- wer
model-index:
- name: Wav2Vec2-ASR-Africa Hausa - Alvin Nahabwe
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: NaijaVoices
      type: naijavoices/naijavoices-dataset
    metrics:
    - name: Wer
      type: wer
      value: 0.4449253474157941
---

# Wav2Vec2-ASR-Africa Hausa - Alvin Nahabwe
This model is a fine-tuned version of [Alvin-Nahabwe/wav2vec2-pretrained-asr-africa](https://huggingface.co/Alvin-Nahabwe/wav2vec2-pretrained-asr-africa) on the NaijaVoices dataset. It achieves the following results on the evaluation set:
- Cer: 0.1183
- Loss: 1.4415
- Wer: 0.4449
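WER (word error rate) is the word-level edit distance between the model's transcript and the reference, divided by the reference length; CER is the same computed over characters. A minimal sketch of the standard Levenshtein-based computation (not the exact evaluation script used for this card):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edits needed to turn ref[:i] into hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[len(ref)][len(hyp)] / len(ref)
```

A WER of 0.4449 thus means roughly 44 word-level edits per 100 reference words.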
## Model description
More information needed
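The base checkpoint is a wav2vec2 encoder fine-tuned with a CTC head, so its per-frame logits are turned into text by collapsing repeated predictions and dropping the blank token. A minimal greedy-decoding sketch (the toy vocabulary and blank id here are illustrative, not this model's actual tokenizer):

```python
from itertools import groupby

# Hypothetical toy vocabulary; the real model's tokenizer defines its own.
BLANK = 0
VOCAB = {1: "h", 2: "a", 3: "u", 4: "s", 5: " "}

def ctc_greedy_decode(frame_ids: list[int]) -> str:
    """Collapse repeated per-frame predictions, then drop CTC blanks."""
    collapsed = [k for k, _ in groupby(frame_ids)]  # merge adjacent repeats
    return "".join(VOCAB[i] for i in collapsed if i != BLANK)
```

The blank token is what lets CTC emit genuinely doubled letters: `[2, 0, 2]` decodes to `"aa"`, while `[2, 2]` collapses to a single `"a"`.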
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 100.0
- mixed_precision_training: Native AMP
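With a linear scheduler and `lr_scheduler_warmup_ratio: 0.05`, the learning rate ramps up to 5e-4 over the first 5% of optimizer steps and then decays linearly to zero. A sketch of that schedule, assuming 71300 total steps (the final step in the training-results table) and warmup steps computed as total × ratio:

```python
BASE_LR = 5e-4           # learning_rate above
TOTAL_STEPS = 71300      # final step of the 100-epoch run
WARMUP_STEPS = int(TOTAL_STEPS * 0.05)  # warmup_ratio 0.05 -> 3565 steps

def linear_schedule_lr(step: int) -> float:
    """Linear warmup to BASE_LR, then linear decay to zero."""
    if step < WARMUP_STEPS:
        return BASE_LR * step / WARMUP_STEPS
    return BASE_LR * (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)
```

The peak of 5e-4 is therefore reached around step 3565 (mid-epoch 5), after which the rate falls steadily for the remaining 95 epochs.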
### Training results

| Training Loss | Epoch | Step | Cer | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---:|:---------------:|:---:|
2.8513 | 1.0 | 713 | 0.2910 | 0.8985 | 0.8985 |
0.9475 | 2.0 | 1426 | 0.2192 | 0.7238 | 0.7567 |
0.8134 | 3.0 | 2139 | 0.1993 | 0.6482 | 0.6990 |
0.7329 | 4.0 | 2852 | 0.1849 | 0.6005 | 0.6628 |
0.6738 | 5.0 | 3565 | 0.1738 | 0.5939 | 0.6399 |
0.6331 | 6.0 | 4278 | 0.1703 | 0.5654 | 0.6220 |
0.5909 | 7.0 | 4991 | 0.1593 | 0.5385 | 0.5846 |
0.5591 | 8.0 | 5704 | 0.1591 | 0.5353 | 0.5803 |
0.5277 | 9.0 | 6417 | 0.1560 | 0.5261 | 0.5691 |
0.5005 | 10.0 | 7130 | 0.1506 | 0.5118 | 0.5559 |
0.4744 | 11.0 | 7843 | 0.1488 | 0.5212 | 0.5463 |
0.4498 | 12.0 | 8556 | 0.1473 | 0.5317 | 0.5484 |
0.4229 | 13.0 | 9269 | 0.1441 | 0.5332 | 0.5348 |
0.4046 | 14.0 | 9982 | 0.1450 | 0.5304 | 0.5323 |
0.3846 | 15.0 | 10695 | 0.1438 | 0.5677 | 0.5283 |
0.3627 | 16.0 | 11408 | 0.1431 | 0.5575 | 0.5290 |
0.3419 | 17.0 | 12121 | 0.1428 | 0.5930 | 0.5268 |
0.3275 | 18.0 | 12834 | 0.1435 | 0.5773 | 0.5248 |
0.309 | 19.0 | 13547 | 0.1405 | 0.6326 | 0.5205 |
0.2974 | 20.0 | 14260 | 0.1404 | 0.6192 | 0.5173 |
0.283 | 21.0 | 14973 | 0.1425 | 0.6331 | 0.5238 |
0.2684 | 22.0 | 15686 | 0.1418 | 0.6447 | 0.5189 |
0.2578 | 23.0 | 16399 | 0.1395 | 0.7003 | 0.5125 |
0.2502 | 24.0 | 17112 | 0.1404 | 0.6843 | 0.5158 |
0.2392 | 25.0 | 17825 | 0.1372 | 0.7490 | 0.5061 |
0.2311 | 26.0 | 18538 | 0.1369 | 0.6989 | 0.5072 |
0.2223 | 27.0 | 19251 | 0.1382 | 0.7487 | 0.5063 |
0.2168 | 28.0 | 19964 | 0.1412 | 0.7233 | 0.5075 |
0.2122 | 29.0 | 20677 | 0.1350 | 0.8055 | 0.5023 |
0.2064 | 30.0 | 21390 | 0.1356 | 0.7682 | 0.5015 |
0.2002 | 31.0 | 22103 | 0.1369 | 0.8371 | 0.5025 |
0.1953 | 32.0 | 22816 | 0.1345 | 0.8060 | 0.4984 |
0.1903 | 33.0 | 23529 | 0.1356 | 0.8662 | 0.5025 |
0.1875 | 34.0 | 24242 | 0.1343 | 0.7856 | 0.4960 |
0.184 | 35.0 | 24955 | 0.1363 | 0.8334 | 0.4967 |
0.1792 | 36.0 | 25668 | 0.1368 | 0.8326 | 0.4984 |
0.1775 | 37.0 | 26381 | 0.1335 | 0.8292 | 0.4928 |
0.1742 | 38.0 | 27094 | 0.1352 | 0.8251 | 0.4962 |
0.1716 | 39.0 | 27807 | 0.1315 | 0.8758 | 0.4868 |
0.1673 | 40.0 | 28520 | 0.1303 | 0.8811 | 0.4864 |
0.1647 | 41.0 | 29233 | 0.1309 | 0.9360 | 0.4864 |
0.163 | 42.0 | 29946 | 0.1326 | 0.9481 | 0.4893 |
0.1615 | 43.0 | 30659 | 0.1337 | 0.9580 | 0.4924 |
0.1575 | 44.0 | 31372 | 0.1314 | 0.8831 | 0.4850 |
0.1556 | 45.0 | 32085 | 0.1316 | 0.9683 | 0.4830 |
0.1531 | 46.0 | 32798 | 0.1325 | 0.9607 | 0.4871 |
0.151 | 47.0 | 33511 | 0.1295 | 0.9514 | 0.4792 |
0.1504 | 48.0 | 34224 | 0.1285 | 0.9855 | 0.4785 |
0.1468 | 49.0 | 34937 | 0.1287 | 0.9842 | 0.4821 |
0.1454 | 50.0 | 35650 | 0.1282 | 0.9534 | 0.4782 |
0.1416 | 51.0 | 36363 | 0.1267 | 0.9950 | 0.4734 |
0.1407 | 52.0 | 37076 | 0.1274 | 0.9657 | 0.4754 |
0.1356 | 53.0 | 37789 | 0.1275 | 0.9976 | 0.4723 |
0.1349 | 54.0 | 38502 | 0.1275 | 1.0288 | 0.4740 |
0.1352 | 55.0 | 39215 | 0.1284 | 1.0283 | 0.4753 |
0.1334 | 56.0 | 39928 | 0.1274 | 1.0172 | 0.4742 |
0.1317 | 57.0 | 40641 | 0.1262 | 1.0150 | 0.4725 |
0.1281 | 58.0 | 41354 | 0.1269 | 1.0258 | 0.4705 |
0.1272 | 59.0 | 42067 | 0.1270 | 1.0485 | 0.4713 |
0.1261 | 60.0 | 42780 | 0.1257 | 1.0763 | 0.4674 |
0.1257 | 61.0 | 43493 | 0.1264 | 1.0779 | 0.4717 |
0.1223 | 62.0 | 44206 | 0.1267 | 1.0957 | 0.4697 |
0.1213 | 63.0 | 44919 | 0.1240 | 1.0922 | 0.4653 |
0.1191 | 64.0 | 45632 | 0.1252 | 1.0292 | 0.4675 |
0.1163 | 65.0 | 46345 | 0.1235 | 1.1721 | 0.4635 |
0.1152 | 66.0 | 47058 | 0.1237 | 1.0529 | 0.4625 |
0.1139 | 67.0 | 47771 | 0.1244 | 1.1248 | 0.4624 |
0.1122 | 68.0 | 48484 | 0.1239 | 1.1673 | 0.4615 |
0.1107 | 69.0 | 49197 | 0.1237 | 1.1475 | 0.4630 |
0.1097 | 70.0 | 49910 | 0.1238 | 1.1714 | 0.4639 |
0.1072 | 71.0 | 50623 | 0.1232 | 1.1740 | 0.4620 |
0.1059 | 72.0 | 51336 | 0.1229 | 1.2115 | 0.4613 |
0.104 | 73.0 | 52049 | 0.1230 | 1.1782 | 0.4640 |
0.101 | 74.0 | 52762 | 0.1225 | 1.2219 | 0.4621 |
0.1 | 75.0 | 53475 | 0.1209 | 1.2251 | 0.4564 |
0.1006 | 76.0 | 54188 | 0.1217 | 1.2656 | 0.4591 |
0.0993 | 77.0 | 54901 | 0.1215 | 1.2014 | 0.4600 |
0.0975 | 78.0 | 55614 | 0.1208 | 1.2231 | 0.4564 |
0.094 | 79.0 | 56327 | 0.1207 | 1.2751 | 0.4536 |
0.0963 | 80.0 | 57040 | 0.1205 | 1.2659 | 0.4531 |
0.0934 | 81.0 | 57753 | 0.1207 | 1.3305 | 0.4548 |
0.092 | 82.0 | 58466 | 0.1207 | 1.3029 | 0.4527 |
0.0905 | 83.0 | 59179 | 0.1210 | 1.2916 | 0.4554 |
0.0899 | 84.0 | 59892 | 0.1199 | 1.3628 | 0.4513 |
0.0876 | 85.0 | 60605 | 0.1200 | 1.3117 | 0.4507 |
0.086 | 86.0 | 61318 | 0.1193 | 1.3287 | 0.4491 |
0.0861 | 87.0 | 62031 | 0.1197 | 1.3370 | 0.4504 |
0.0844 | 88.0 | 62744 | 0.1197 | 1.3742 | 0.4492 |
0.0831 | 89.0 | 63457 | 0.1192 | 1.3519 | 0.4498 |
0.0824 | 90.0 | 64170 | 0.1191 | 1.3569 | 0.4493 |
0.0811 | 91.0 | 64883 | 0.1201 | 1.3567 | 0.4495 |
0.0798 | 92.0 | 65596 | 0.1189 | 1.3839 | 0.4484 |
0.0797 | 93.0 | 66309 | 0.1191 | 1.3982 | 0.4477 |
0.0786 | 94.0 | 67022 | 0.1190 | 1.4314 | 0.4499 |
0.0776 | 95.0 | 67735 | 0.1189 | 1.4345 | 0.4462 |
0.0773 | 96.0 | 68448 | 0.1185 | 1.4470 | 0.4452 |
0.0772 | 97.0 | 69161 | 0.1186 | 1.4376 | 0.4453 |
0.0754 | 98.0 | 69874 | 0.1183 | 1.4523 | 0.4453 |
0.0757 | 99.0 | 70587 | 0.1184 | 1.4372 | 0.4454 |
0.0751 | 100.0 | 71300 | 0.1183 | 1.4415 | 0.4449 |
### Framework versions
- Transformers 4.47.1
- Pytorch 2.4.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0