---
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: wav2vec2-xlsr-53-ft-btb-cy
    results: []
---

# wav2vec2-xlsr-53-ft-btb-cy

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.9669
- Wer: 0.6125
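
The Wer figure is the word error rate: the word-level edit (Levenshtein) distance between the model's transcript and the reference transcript, divided by the number of reference words, so 0.6125 means roughly 61 substitutions, insertions, or deletions per 100 reference words. The training run computes this with library tooling; the following is only an illustrative sketch of the metric itself:

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word-level Levenshtein distance divided by reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over word sequences.
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        curr = [i] + [0] * len(hyp)
        for j, h in enumerate(hyp, 1):
            cost = 0 if r == h else 1
            curr[j] = min(prev[j] + 1,         # deletion
                          curr[j - 1] + 1,     # insertion
                          prev[j - 1] + cost)  # substitution
        prev = curr
    return prev[-1] / len(ref)
```

For example, dropping two words from a five-word reference gives a WER of 0.4.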

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 15.0
- mixed_precision_training: Native AMP
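
With gradient accumulation, each optimizer step aggregates 2 batches of 16, giving the effective batch size of 32. The `linear` scheduler ramps the learning rate from 0 to the 0.0003 peak over the first 500 optimizer steps, then decays it linearly to 0 at the end of training. A sketch of that shape; the total-step count of roughly 10600 is an assumption read off the last row of the results table, not a value stated in the config:

```python
def linear_schedule(step: int, warmup_steps: int = 500,
                    total_steps: int = 10600, peak_lr: float = 3e-4) -> float:
    """Linear warmup to peak_lr, then linear decay to zero."""
    if step < warmup_steps:
        # Warmup phase: ramp proportionally to the step count.
        return peak_lr * step / warmup_steps
    # Decay phase: fall linearly from peak_lr to 0 at total_steps.
    return peak_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

So the peak rate is reached at step 500 and halfway through warmup (step 250) the rate is 0.00015.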

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Wer    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|
| No log        | 0.1414  | 100   | 3.7427          | 1.0    |
| No log        | 0.2829  | 200   | 2.9179          | 1.0    |
| No log        | 0.4243  | 300   | 2.8036          | 1.0    |
| No log        | 0.5658  | 400   | 1.2196          | 0.8934 |
| 3.574         | 0.7072  | 500   | 0.9860          | 0.7276 |
| 3.574         | 0.8487  | 600   | 0.8392          | 0.6504 |
| 3.574         | 0.9901  | 700   | 0.7804          | 0.6029 |
| 3.574         | 1.1315  | 800   | 0.6122          | 0.4909 |
| 3.574         | 1.2730  | 900   | 0.5901          | 0.4883 |
| 0.811         | 1.4144  | 1000  | 0.5500          | 0.4508 |
| 0.811         | 1.5559  | 1100  | 0.5232          | 0.4142 |
| 0.811         | 1.6973  | 1200  | 0.5186          | 0.4065 |
| 0.811         | 1.8388  | 1300  | 0.4953          | 0.3929 |
| 0.811         | 1.9802  | 1400  | 0.4880          | 0.3928 |
| 0.6459        | 2.1216  | 1500  | 0.4645          | 0.3692 |
| 0.6459        | 2.2631  | 1600  | 0.4666          | 0.3586 |
| 0.6459        | 2.4045  | 1700  | 0.4502          | 0.3593 |
| 0.6459        | 2.5460  | 1800  | 0.4528          | 0.3638 |
| 0.6459        | 2.6874  | 1900  | 0.4665          | 0.3926 |
| 0.5306        | 2.8289  | 2000  | 0.4329          | 0.3505 |
| 0.5306        | 2.9703  | 2100  | 0.4245          | 0.3374 |
| 0.5306        | 3.1117  | 2200  | 0.4377          | 0.3340 |
| 0.5306        | 3.2532  | 2300  | 0.4272          | 0.3337 |
| 0.5306        | 3.3946  | 2400  | 0.4335          | 0.3326 |
| 0.4628        | 3.5361  | 2500  | 0.4268          | 0.3275 |
| 0.4628        | 3.6775  | 2600  | 0.4502          | 0.3409 |
| 0.4628        | 3.8190  | 2700  | 0.6345          | 0.4390 |
| 0.4628        | 3.9604  | 2800  | 1.0203          | 0.6403 |
| 0.4628        | 4.1018  | 2900  | 1.2208          | 0.7922 |
| 0.8685        | 4.2433  | 3000  | 1.1018          | 0.7387 |
| 0.8685        | 4.3847  | 3100  | 1.2497          | 0.8062 |
| 0.8685        | 4.5262  | 3200  | 1.6165          | 0.9616 |
| 0.8685        | 4.6676  | 3300  | 1.4655          | 0.9217 |
| 0.8685        | 4.8091  | 3400  | 1.0288          | 0.7465 |
| 1.3918        | 4.9505  | 3500  | 0.9067          | 0.5948 |
| 1.3918        | 5.0919  | 3600  | 0.9486          | 0.6353 |
| 1.3918        | 5.2334  | 3700  | 0.8674          | 0.5428 |
| 1.3918        | 5.3748  | 3800  | 0.9403          | 0.5793 |
| 1.3918        | 5.5163  | 3900  | 0.9481          | 0.5764 |
| 1.0402        | 5.6577  | 4000  | 1.0176          | 0.8257 |
| 1.0402        | 5.7992  | 4100  | 0.9857          | 0.6343 |
| 1.0402        | 5.9406  | 4200  | 1.3289          | 0.9014 |
| 1.0402        | 6.0820  | 4300  | 2.0891          | 0.7125 |
| 1.0402        | 6.2235  | 4400  | 1.2563          | 0.7696 |
| 1.2886        | 6.3649  | 4500  | 1.1441          | 0.6927 |
| 1.2886        | 6.5064  | 4600  | 1.0626          | 0.6573 |
| 1.2886        | 6.6478  | 4700  | 0.9997          | 0.6423 |
| 1.2886        | 6.7893  | 4800  | 0.9814          | 0.6380 |
| 1.2886        | 6.9307  | 4900  | 1.0955          | 0.7651 |
| 1.0984        | 7.0721  | 5000  | 0.9213          | 0.5883 |
| 1.0984        | 7.2136  | 5100  | 0.8885          | 0.5933 |
| 1.0984        | 7.3550  | 5200  | 0.9001          | 0.5899 |
| 1.0984        | 7.4965  | 5300  | 0.8784          | 0.5859 |
| 1.0984        | 7.6379  | 5400  | 0.9072          | 0.5898 |
| 0.9659        | 7.7793  | 5500  | 0.8812          | 0.5841 |
| 0.9659        | 7.9208  | 5600  | 0.8912          | 0.5855 |
| 0.9659        | 8.0622  | 5700  | 0.8816          | 0.5807 |
| 0.9659        | 8.2037  | 5800  | 0.8914          | 0.5803 |
| 0.9659        | 8.3451  | 5900  | 0.8956          | 0.5810 |
| 0.9679        | 8.4866  | 6000  | 0.9162          | 0.5780 |
| 0.9679        | 8.6280  | 6100  | 0.9409          | 0.5810 |
| 0.9679        | 8.7694  | 6200  | 0.9371          | 0.5781 |
| 0.9679        | 8.9109  | 6300  | 0.9417          | 0.5790 |
| 0.9679        | 9.0523  | 6400  | 0.9664          | 0.5784 |
| 1.0241        | 9.1938  | 6500  | 0.9720          | 0.5775 |
| 1.0241        | 9.3352  | 6600  | 0.9841          | 0.5784 |
| 1.0241        | 9.4767  | 6700  | 0.9574          | 0.5887 |
| 1.0241        | 9.6181  | 6800  | 1.0725          | 0.6068 |
| 1.0241        | 9.7595  | 6900  | 1.0362          | 0.6000 |
| 1.0797        | 9.9010  | 7000  | 1.0117          | 0.5914 |
| 1.0797        | 10.0424 | 7100  | 0.9563          | 0.6058 |
| 1.0797        | 10.1839 | 7200  | 0.9664          | 0.5978 |
| 1.0797        | 10.3253 | 7300  | 1.0209          | 0.6022 |
| 1.0797        | 10.4668 | 7400  | 0.9849          | 0.5975 |
| 1.0701        | 10.6082 | 7500  | 0.9719          | 0.6057 |
| 1.0701        | 10.7496 | 7600  | 0.9670          | 0.6123 |
| 1.0701        | 10.8911 | 7700  | 0.9669          | 0.6125 |
| 1.0701        | 11.0325 | 7800  | 0.9669          | 0.6125 |
| 1.0701        | 11.1740 | 7900  | 0.9669          | 0.6125 |
| 1.0518        | 11.3154 | 8000  | 0.9669          | 0.6125 |
| 1.0518        | 11.4569 | 8100  | 0.9669          | 0.6125 |
| 1.0518        | 11.5983 | 8200  | 0.9669          | 0.6125 |
| 1.0518        | 11.7397 | 8300  | 0.9669          | 0.6125 |
| 1.0518        | 11.8812 | 8400  | 0.9669          | 0.6125 |
| 1.0594        | 12.0226 | 8500  | 0.9669          | 0.6125 |
| 1.0594        | 12.1641 | 8600  | 0.9669          | 0.6125 |
| 1.0594        | 12.3055 | 8700  | 0.9669          | 0.6125 |
| 1.0594        | 12.4470 | 8800  | 0.9669          | 0.6125 |
| 1.0594        | 12.5884 | 8900  | 0.9669          | 0.6125 |
| 1.0584        | 12.7298 | 9000  | 0.9669          | 0.6125 |
| 1.0584        | 12.8713 | 9100  | 0.9669          | 0.6125 |
| 1.0584        | 13.0127 | 9200  | 0.9669          | 0.6125 |
| 1.0584        | 13.1542 | 9300  | 0.9669          | 0.6125 |
| 1.0584        | 13.2956 | 9400  | 0.9669          | 0.6125 |
| 1.0556        | 13.4371 | 9500  | 0.9669          | 0.6125 |
| 1.0556        | 13.5785 | 9600  | 0.9669          | 0.6125 |
| 1.0556        | 13.7199 | 9700  | 0.9669          | 0.6125 |
| 1.0556        | 13.8614 | 9800  | 0.9669          | 0.6125 |
| 1.0556        | 14.0028 | 9900  | 0.9669          | 0.6125 |
| 1.0511        | 14.1443 | 10000 | 0.9669          | 0.6125 |
| 1.0511        | 14.2857 | 10100 | 0.9669          | 0.6125 |
| 1.0511        | 14.4272 | 10200 | 0.9669          | 0.6125 |
| 1.0511        | 14.5686 | 10300 | 0.9669          | 0.6125 |
| 1.0511        | 14.7100 | 10400 | 0.9669          | 0.6125 |
| 1.0585        | 14.8515 | 10500 | 0.9669          | 0.6125 |
| 1.0585        | 14.9929 | 10600 | 0.9669          | 0.6125 |

### Framework versions

- Transformers 4.40.2
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1