---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: wav2vec2-xls-r-300m-ftspeech
  results: []
---

# wav2vec2-xls-r-300m-ftspeech

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the FTSpeech dataset (Danish parliamentary speech; inferred from the model name).
It achieves the following results on the evaluation set:
- Loss: 17.8348
- Wer: 0.1186 (word error rate; a sketch of computing this metric follows)

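As a minimal sketch, WER scores like the one above can be computed with the `datasets` version pinned below; the transcripts here are hypothetical placeholders, not actual model outputs:

```python
from datasets import load_metric  # the "wer" metric also requires `pip install jiwer`

wer_metric = load_metric("wer")

# Hypothetical Danish transcripts standing in for model predictions and references.
predictions = ["det er en god dag"]
references = ["det var en god dag"]

# 1 substitution out of 5 reference words -> WER = 0.2
print(wer_metric.compute(predictions=predictions, references=references))
```
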
## Model description

The base model, [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m), is the 300M-parameter multilingual wav2vec 2.0 (XLS-R) checkpoint; this version has been fine-tuned for automatic speech recognition. No further details were provided.

## Intended uses & limitations

The model is intended for automatic speech recognition (transcribing 16 kHz speech to text); a hedged inference sketch is shown below. Other uses and limitations were not documented.

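A minimal inference sketch, assuming the checkpoint follows the standard fine-tuned wav2vec 2.0 setup (a CTC head with a `Wav2Vec2Processor`); the repo id below is a hypothetical placeholder for wherever this model is hosted:

```python
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "<namespace>/wav2vec2-xls-r-300m-ftspeech"  # hypothetical repo id

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# wav2vec 2.0 expects a 16 kHz mono waveform as a 1-D float array;
# one second of silence stands in for real audio here.
speech = [0.0] * 16000
inputs = processor(speech, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```
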
## Training and evaluation data

Judging by the model name, the model was fine-tuned and evaluated on FTSpeech, a corpus of Danish parliamentary speech; the exact splits used were not documented.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 1
- eval_batch_size: 1
- seed: 4242
- gradient_accumulation_steps: 32
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- num_epochs: 3
- mixed_precision_training: Native AMP

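These settings map directly onto `transformers.TrainingArguments`. A minimal sketch, assuming the standard `Trainer` setup; `output_dir` and anything not listed above is a hypothetical default:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-ftspeech",  # hypothetical output path
    learning_rate=1e-4,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=32,   # 1 x 32 = total train batch size 32
    seed=4242,
    lr_scheduler_type="linear",
    warmup_steps=2000,
    num_train_epochs=3,
    fp16=True,                        # "Native AMP" mixed precision
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults.
)
```
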
### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 311.274       | 0.02  | 500   | 329.9529        | 1.0    |
| 296.713       | 0.03  | 1000  | 305.5616        | 1.0000 |
| 69.6128       | 0.05  | 1500  | 77.2267         | 0.6466 |
| 47.4542       | 0.06  | 2000  | 56.0227         | 0.5101 |
| 39.2415       | 0.08  | 2500  | 40.1751         | 0.3483 |
| 35.9888       | 0.1   | 3000  | 33.1659         | 0.2619 |
| 34.1621       | 0.11  | 3500  | 30.4220         | 0.2296 |
| 32.3383       | 0.13  | 4000  | 28.3836         | 0.2214 |
| 31.1862       | 0.14  | 4500  | 28.7228         | 0.2220 |
| 29.818        | 0.16  | 5000  | 28.3220         | 0.2259 |
| 29.4729       | 0.18  | 5500  | 26.5646         | 0.2024 |
| 27.6171       | 0.19  | 6000  | 26.3382         | 0.1995 |
| 27.4549       | 0.21  | 6500  | 24.1257         | 0.1697 |
| 27.9176       | 0.22  | 7000  | 24.8758         | 0.1945 |
| 27.4036       | 0.24  | 7500  | 24.1006         | 0.1746 |
| 26.5633       | 0.26  | 8000  | 23.0034         | 0.1582 |
| 26.3558       | 0.27  | 8500  | 24.7499         | 0.1913 |
| 25.9604       | 0.29  | 9000  | 22.5813         | 0.1674 |
| 25.6154       | 0.31  | 9500  | 22.4642         | 0.1499 |
| 25.6231       | 0.32  | 10000 | 21.8089         | 0.1534 |
| 26.7554       | 0.34  | 10500 | 21.9619         | 0.1543 |
| 25.2901       | 0.35  | 11000 | 22.0643         | 0.1593 |
| 24.8642       | 0.37  | 11500 | 21.1113         | 0.1480 |
| 25.4664       | 0.39  | 12000 | 21.2492         | 0.1458 |
| 24.6433       | 0.4   | 12500 | 20.7650         | 0.1419 |
| 24.8455       | 0.42  | 13000 | 21.8535         | 0.1490 |
| 25.1176       | 0.43  | 13500 | 20.7491         | 0.1429 |
| 24.4585       | 0.45  | 14000 | 20.7948         | 0.1423 |
| 24.1613       | 0.47  | 14500 | 20.5817         | 0.1431 |
| 23.7281       | 0.48  | 15000 | 20.1209         | 0.1333 |
| 23.0396       | 0.5   | 15500 | 20.2883         | 0.1383 |
| 24.7056       | 0.51  | 16000 | 19.6813         | 0.1330 |
| 23.608        | 0.53  | 16500 | 20.0252         | 0.1394 |
| 23.9536       | 0.55  | 17000 | 19.9039         | 0.1341 |
| 23.1848       | 0.56  | 17500 | 19.9114         | 0.1308 |
| 23.1835       | 0.58  | 18000 | 19.7044         | 0.1345 |
| 23.9372       | 0.59  | 18500 | 19.2201         | 0.1296 |
| 23.2182       | 0.61  | 19000 | 19.3723         | 0.1350 |
| 22.3118       | 0.63  | 19500 | 19.2624         | 0.1344 |
| 22.9372       | 0.64  | 20000 | 19.5823         | 0.1387 |
| 23.1536       | 0.66  | 20500 | 18.9077         | 0.1289 |
| 22.3477       | 0.67  | 21000 | 18.7098         | 0.1257 |
| 22.3701       | 0.69  | 21500 | 19.0815         | 0.1300 |
| 22.6709       | 0.71  | 22000 | 18.4433         | 0.1242 |
| 22.2519       | 0.72  | 22500 | 18.7482         | 0.1275 |
| 21.8536       | 0.74  | 23000 | 18.6565         | 0.1236 |
| 22.4479       | 0.76  | 23500 | 18.6478         | 0.1264 |
| 21.6824       | 0.77  | 24000 | 18.4383         | 0.1257 |
| 22.1622       | 0.79  | 24500 | 18.4086         | 0.1212 |
| 22.2626       | 0.8   | 25000 | 18.4613         | 0.1230 |
| 21.0009       | 0.82  | 25500 | 18.1851         | 0.1165 |
| 20.554        | 0.84  | 26000 | 17.7352         | 0.1165 |
| 21.5141       | 0.85  | 26500 | 18.3084         | 0.1207 |
| 20.5925       | 0.87  | 27000 | 17.9997         | 0.1207 |
| 21.0997       | 0.88  | 27500 | 17.7534         | 0.1193 |
| 21.7098       | 0.9   | 28000 | 17.8348         | 0.1186 |

### Framework versions

- Transformers 4.16.2
- Pytorch 1.10.2+cu102
- Datasets 1.18.3
- Tokenizers 0.11.0