# torgo_xlsr_finetune-F01-2
This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unspecified dataset (the card metadata does not name it). It achieves the following results on the evaluation set:
- Loss: 2.0297
- Wer: 0.9501
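Note that a final WER of 0.9501 means roughly 95 words in 100 are transcribed with an error, so this checkpoint should be treated as experimental. Below is a minimal inference sketch, assuming the checkpoint is published on the Hugging Face Hub; the repo id is a placeholder, not a confirmed path.

```python
# Minimal inference sketch. The repo id is a placeholder; substitute the
# actual Hub path of this checkpoint.
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "your-username/torgo_xlsr_finetune-F01-2"  # placeholder repo id

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# XLSR checkpoints expect mono 16 kHz audio; resample if needed.
waveform, sample_rate = torchaudio.load("sample.wav")
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: frame-wise argmax, then collapse repeats and blanks.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```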
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
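These settings map directly onto the Transformers `Trainer` API. The sketch below shows one way to reproduce them with `TrainingArguments` in Transformers 4.26; the model, datasets, and data collator are assumed to be defined elsewhere, and `output_dir` is a placeholder.

```python
# Sketch of TrainingArguments matching the hyperparameters above
# (Transformers 4.26 API). The Adam betas and epsilon listed above are
# the library defaults, so they are not set explicitly. The eval cadence
# is an assumption based on the 500-step intervals in the results table.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="torgo_xlsr_finetune-F01-2",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=30,
    evaluation_strategy="steps",
    eval_steps=500,
    logging_steps=500,
)
```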
### Training results
| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 23.5657       | 0.81  | 500   | 3.3629          | 1.0    |
| 3.349         | 1.62  | 1000  | 2.9797          | 1.0    |
| 2.6233        | 2.44  | 1500  | 1.9631          | 1.0317 |
| 1.6556        | 3.25  | 2000  | 1.5230          | 1.2971 |
| 1.1942        | 4.06  | 2500  | 1.3185          | 1.1814 |
| 0.9141        | 4.87  | 3000  | 1.3889          | 1.1723 |
| 0.7505        | 5.68  | 3500  | 1.2601          | 1.1247 |
| 0.6246        | 6.49  | 4000  | 1.4765          | 1.0907 |
| 0.5506        | 7.31  | 4500  | 1.3586          | 1.1020 |
| 0.5004        | 8.12  | 5000  | 1.2889          | 1.0454 |
| 0.4591        | 8.93  | 5500  | 1.6057          | 1.0680 |
| 0.4064        | 9.74  | 6000  | 1.4433          | 1.0023 |
| 0.3865        | 10.55 | 6500  | 1.7125          | 1.0567 |
| 0.3422        | 11.36 | 7000  | 1.7667          | 1.0045 |
| 0.3084        | 12.18 | 7500  | 1.6708          | 1.0567 |
| 0.2904        | 12.99 | 8000  | 1.8301          | 1.0045 |
| 0.2826        | 13.8  | 8500  | 1.5566          | 0.9796 |
| 0.2707        | 14.61 | 9000  | 1.6200          | 0.9909 |
| 0.2499        | 15.42 | 9500  | 1.9019          | 1.0045 |
| 0.2292        | 16.23 | 10000 | 1.7184          | 0.9841 |
| 0.2245        | 17.05 | 10500 | 1.7977          | 0.9932 |
| 0.2156        | 17.86 | 11000 | 1.7847          | 0.9705 |
| 0.2152        | 18.67 | 11500 | 1.9312          | 0.9660 |
| 0.2003        | 19.48 | 12000 | 2.1059          | 0.9909 |
| 0.1908        | 20.29 | 12500 | 1.9738          | 0.9683 |
| 0.1638        | 21.1  | 13000 | 1.7796          | 0.9592 |
| 0.1606        | 21.92 | 13500 | 2.0548          | 0.9569 |
| 0.1569        | 22.73 | 14000 | 1.9577          | 0.9274 |
| 0.1458        | 23.54 | 14500 | 2.1357          | 0.9569 |
| 0.1599        | 24.35 | 15000 | 1.9791          | 0.9410 |
| 0.1489        | 25.16 | 15500 | 1.9031          | 0.9456 |
| 0.1473        | 25.97 | 16000 | 1.8522          | 0.9342 |
| 0.1167        | 26.79 | 16500 | 1.8953          | 0.9342 |
| 0.1177        | 27.6  | 17000 | 1.9675          | 0.9342 |
| 0.1268        | 28.41 | 17500 | 1.9905          | 0.9501 |
| 0.1136        | 29.22 | 18000 | 2.0297          | 0.9501 |
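The Wer column is word error rate, computed as (substitutions + insertions + deletions) divided by the number of reference words; because insertions count against the score, values above 1.0 (as in the early epochs) are possible. With the pinned Datasets 1.18.3 it can be computed as in this sketch; the transcripts are placeholders.

```python
# WER sketch using datasets.load_metric, matching the pinned Datasets 1.18.3
# (requires jiwer; newer code would use the `evaluate` library instead).
from datasets import load_metric

wer_metric = load_metric("wer")
predictions = ["the quick brown fox jumps"]   # model transcripts (placeholder)
references = ["the quick brown fox jumped"]   # ground truth (placeholder)
# 1 substitution over 5 reference words -> WER 0.2
print(wer_metric.compute(predictions=predictions, references=references))
```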
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1+cu116
- Datasets 1.18.3
- Tokenizers 0.13.2