---
tags:
- generated_from_trainer
datasets:
- common_voice_17_0
metrics:
- wer
model-index:
- name: MyDrive
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: common_voice_17_0
      type: common_voice_17_0
      config: ar
      split: test[:10%]
      args: ar
    metrics:
    - name: Wer
      type: wer
      value: 0.5566888976069047
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# MyDrive

This model was trained from scratch on the Arabic (`ar`) configuration of the common_voice_17_0 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1679
- Wer: 0.5567

## Model description

More information needed

## Intended uses & limitations

More information needed
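
Pending a fuller description of intended uses, the checkpoint is an automatic-speech-recognition model evaluated with WER on Arabic Common Voice data, so Arabic speech-to-text transcription is the expected use. Below is a hedged inference sketch: the repository id and the audio file name are placeholders, and it assumes the checkpoint is compatible with the `transformers` ASR pipeline.

```python
# Minimal inference sketch. The model id and audio path are placeholders;
# replace them with the real repository id and a 16 kHz mono audio file.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/MyDrive",  # hypothetical repository id
)

result = asr("arabic_utterance.wav")  # hypothetical audio file
print(result["text"])
```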

## Training and evaluation data

More information needed
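
The model-index metadata references the `ar` configuration of common_voice_17_0 with a `test[:10%]` evaluation split. As a hedged sketch (assuming this refers to the gated `mozilla-foundation/common_voice_17_0` dataset on the Hugging Face Hub), that split could be loaded as follows:

```python
# Sketch of loading the evaluation split named in the model metadata.
# Assumes access to the gated mozilla-foundation/common_voice_17_0 dataset
# (log in and accept its terms on the Hub first).
from datasets import load_dataset

eval_split = load_dataset(
    "mozilla-foundation/common_voice_17_0",
    "ar",
    split="test[:10%]",
    trust_remote_code=True,  # the dataset is distributed with a loading script
)
print(eval_split)
```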

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 10
- eval_batch_size: 10
- seed: 42
- gradient_accumulation_steps: 3
- total_train_batch_size: 30
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 25
- mixed_precision_training: Native AMP
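
As a hedged sketch only, these values map onto `transformers.TrainingArguments` roughly as below. The `output_dir` name and the 200-step evaluation cadence are inferred (the latter from the step column of the results table), not stated explicitly; the Adam betas and epsilon above match the library defaults.

```python
# Sketch reconstructing the listed hyperparameters as TrainingArguments.
# output_dir and the eval cadence are assumptions, not taken from the card text.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="MyDrive",            # hypothetical; matches the model name
    learning_rate=1e-4,
    per_device_train_batch_size=10,
    per_device_eval_batch_size=10,
    gradient_accumulation_steps=3,   # effective train batch size: 10 * 3 = 30
    num_train_epochs=25,
    lr_scheduler_type="linear",
    warmup_steps=500,
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision
    evaluation_strategy="steps",
    eval_steps=200,                  # inferred from the 200-step logging interval
)
```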

### Training results

The epoch column below is not monotonic (it drops back after steps 11400, 17600, and 20600), which suggests training was resumed from intermediate checkpoints.

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-------:|:-----:|:---------------:|:------:|
| 0.1174 | 0.4228 | 200 | 1.2291 | 0.6153 |
| 0.098 | 0.8457 | 400 | 1.2325 | 0.6275 |
| 0.1301 | 1.2685 | 600 | 1.1969 | 0.6128 |
| 0.1514 | 1.6913 | 800 | 1.2293 | 0.6489 |
| 0.1494 | 2.1142 | 1000 | 1.3062 | 0.6701 |
| 0.1382 | 2.5370 | 1200 | 1.2223 | 0.6261 |
| 0.1382 | 2.9598 | 1400 | 1.3116 | 0.6506 |
| 0.1239 | 3.3827 | 1600 | 1.1977 | 0.6189 |
| 0.1228 | 3.8055 | 1800 | 1.1852 | 0.6281 |
| 0.1117 | 4.2283 | 2000 | 1.3370 | 0.6495 |
| 0.1118 | 4.6512 | 2200 | 1.3265 | 0.6432 |
| 0.1101 | 5.0740 | 2400 | 1.3458 | 0.6310 |
| 0.1328 | 5.4968 | 2600 | 1.2545 | 0.6342 |
| 0.1384 | 5.9197 | 2800 | 1.2806 | 0.6265 |
| 0.1334 | 6.3425 | 3000 | 1.2484 | 0.6369 |
| 0.1383 | 6.7653 | 3200 | 1.2701 | 0.6479 |
| 0.1281 | 7.1882 | 3400 | 1.1926 | 0.6314 |
| 0.1232 | 7.6110 | 3600 | 1.2255 | 0.6187 |
| 0.0727 | 8.0338 | 3800 | 1.2398 | 0.6014 |
| 0.0749 | 8.4567 | 4000 | 1.2319 | 0.5957 |
| 0.0734 | 8.8795 | 4200 | 1.2247 | 0.5879 |
| 0.0684 | 9.3023 | 4400 | 1.3474 | 0.6136 |
| 0.073 | 9.7252 | 4600 | 1.2837 | 0.5936 |
| 0.0728 | 10.1480 | 4800 | 1.2477 | 0.5910 |
| 0.0718 | 10.5708 | 5000 | 1.2472 | 0.5867 |
| 0.0685 | 10.9937 | 5200 | 1.2693 | 0.5789 |
| 0.0649 | 11.4165 | 5400 | 1.2165 | 0.5787 |
| 0.0632 | 11.8393 | 5600 | 1.2447 | 0.5842 |
| 0.0625 | 12.2622 | 5800 | 1.3088 | 0.5806 |
| 0.061 | 12.6850 | 6000 | 1.3399 | 0.5924 |
| 0.0595 | 13.1078 | 6200 | 1.3049 | 0.5769 |
| 0.0608 | 13.5307 | 6400 | 1.2737 | 0.5734 |
| 0.0596 | 13.9535 | 6600 | 1.2288 | 0.5747 |
| 0.0565 | 14.3763 | 6800 | 1.2599 | 0.5677 |
| 0.0568 | 14.7992 | 7000 | 1.2705 | 0.5622 |
| 0.0538 | 15.2220 | 7200 | 1.3540 | 0.5838 |
| 0.0585 | 15.6448 | 7400 | 1.3334 | 0.5798 |
| 0.0548 | 16.0677 | 7600 | 1.3313 | 0.5724 |
| 0.0526 | 16.4905 | 7800 | 1.3299 | 0.5720 |
| 0.0577 | 16.9133 | 8000 | 1.3206 | 0.5830 |
| 0.0513 | 17.3362 | 8200 | 1.3500 | 0.5787 |
| 0.0506 | 17.7590 | 8400 | 1.3185 | 0.5698 |
| 0.0498 | 18.1818 | 8600 | 1.3656 | 0.5800 |
| 0.0515 | 18.6047 | 8800 | 1.3253 | 0.5669 |
| 0.05 | 19.0275 | 9000 | 1.3411 | 0.5810 |
| 0.048 | 19.4503 | 9200 | 1.3628 | 0.5730 |
| 0.049 | 19.8732 | 9400 | 1.3700 | 0.5730 |
| 0.0469 | 20.2960 | 9600 | 1.3646 | 0.5718 |
| 0.0474 | 20.7188 | 9800 | 1.4191 | 0.5787 |
| 0.0488 | 21.1416 | 10000 | 1.3450 | 0.5753 |
| 0.0466 | 21.5645 | 10200 | 1.2961 | 0.5612 |
| 0.0462 | 21.9873 | 10400 | 1.3379 | 0.5732 |
| 0.0479 | 22.4101 | 10600 | 1.3641 | 0.5755 |
| 0.0475 | 22.8330 | 10800 | 1.3316 | 0.5751 |
| 0.0461 | 23.2558 | 11000 | 1.4021 | 0.5779 |
| 0.0443 | 23.6786 | 11200 | 1.3808 | 0.5767 |
| 0.0448 | 24.1015 | 11400 | 1.4157 | 0.5779 |
| 0.1948 | 16.3609 | 11600 | 0.8630 | 0.5620 |
| 0.1658 | 16.6429 | 11800 | 0.9330 | 0.5692 |
| 0.1632 | 16.9248 | 12000 | 0.8790 | 0.5518 |
| 0.1373 | 17.2068 | 12200 | 0.9279 | 0.5455 |
| 0.1233 | 17.4887 | 12400 | 1.0114 | 0.5634 |
| 0.1223 | 17.7707 | 12600 | 1.0203 | 0.5638 |
| 0.1207 | 18.0526 | 12800 | 1.0660 | 0.5724 |
| 0.1009 | 18.3346 | 13000 | 1.0873 | 0.5667 |
| 0.106 | 18.6165 | 13200 | 1.1188 | 0.5667 |
| 0.0989 | 18.8985 | 13400 | 1.0954 | 0.5689 |
| 0.0981 | 19.1805 | 13600 | 1.1168 | 0.5636 |
| 0.0858 | 19.4638 | 13800 | 1.1655 | 0.5669 |
| 0.0851 | 19.7458 | 14000 | 1.1516 | 0.5596 |
| 0.0929 | 20.0277 | 14200 | 1.1067 | 0.5545 |
| 0.0816 | 20.3097 | 14400 | 1.1479 | 0.5608 |
| 0.0853 | 20.5916 | 14600 | 1.1574 | 0.5626 |
| 0.0823 | 20.8736 | 14800 | 1.1786 | 0.5655 |
| 0.0837 | 21.1555 | 15000 | 1.1809 | 0.5618 |
| 0.0806 | 21.4375 | 15200 | 1.1776 | 0.5547 |
| 0.0819 | 21.7195 | 15400 | 1.1668 | 0.5581 |
| 0.079 | 22.0014 | 15600 | 1.2081 | 0.5573 |
| 0.0739 | 22.2834 | 15800 | 1.2005 | 0.5565 |
| 0.0751 | 22.5653 | 16000 | 1.1868 | 0.5539 |
| 0.0777 | 22.8473 | 16200 | 1.1831 | 0.5569 |
| 0.0705 | 23.1292 | 16400 | 1.2246 | 0.5579 |
| 0.0704 | 23.4112 | 16600 | 1.2922 | 0.5614 |
| 0.0684 | 23.6931 | 16800 | 1.2495 | 0.5555 |
| 0.0714 | 23.9751 | 17000 | 1.2268 | 0.5539 |
| 0.0669 | 24.2570 | 17200 | 1.3074 | 0.5647 |
| 0.067 | 24.5390 | 17400 | 1.2619 | 0.5555 |
| 0.0664 | 24.8210 | 17600 | 1.2757 | 0.5587 |
| 0.1389 | 21.5232 | 17800 | 1.1468 | 0.5704 |
| 0.1246 | 21.7648 | 18000 | 1.1285 | 0.5577 |
| 0.1292 | 22.0064 | 18200 | 1.1010 | 0.5524 |
| 0.1115 | 22.2481 | 18400 | 1.1428 | 0.5563 |
| 0.1129 | 22.4897 | 18600 | 1.1834 | 0.5647 |
| 0.1178 | 22.7314 | 18800 | 1.1346 | 0.5522 |
| 0.1119 | 22.9730 | 19000 | 1.1957 | 0.5587 |
| 0.1031 | 23.2147 | 19200 | 1.1525 | 0.5457 |
| 0.1066 | 23.4563 | 19400 | 1.1926 | 0.5583 |
| 0.103 | 23.6979 | 19600 | 1.2014 | 0.5563 |
| 0.1016 | 23.9396 | 19800 | 1.2301 | 0.5583 |
| 0.1009 | 24.1812 | 20000 | 1.2208 | 0.5530 |
| 0.0953 | 24.4229 | 20200 | 1.2250 | 0.5587 |
| 0.0981 | 24.6645 | 20400 | 1.2353 | 0.5543 |
| 0.0973 | 24.9062 | 20600 | 1.2359 | 0.5571 |
| 0.11 | 24.4417 | 20800 | 1.2127 | 0.5620 |
| 0.1119 | 24.6766 | 21000 | 1.1876 | 0.5571 |
| 0.1053 | 24.9115 | 21200 | 1.1679 | 0.5567 |
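
The Wer column above is the word error rate: (substitutions + deletions + insertions) divided by the number of words in the reference transcript. As a hedged illustration (the `evaluate` library and its `jiwer` backend are assumed extra dependencies, not listed under framework versions):

```python
# Toy illustration of the WER metric reported in the table above.
# Requires: pip install evaluate jiwer  (assumed extra dependencies)
import evaluate

wer_metric = evaluate.load("wer")
wer = wer_metric.compute(
    predictions=["the cat sat on mat"],
    references=["the cat sat on the mat"],
)
print(wer)  # one deleted word out of six reference words -> ~0.167
```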

### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.1+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1