---
license: apache-2.0
tags:
  - generated_from_trainer
model-index:
  - name: Waynehills-STT-doogie-server
---

# Waynehills-STT-doogie-server

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 3.9322
- Wer: 1.0368

## Model description

More information needed

## Intended uses & limitations

More information needed
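
Since the card provides no usage details, the following is a minimal inference sketch. It assumes the repository id `Doogie/Waynehills-STT-doogie-server` (inferred from this card's title and author), that the repository ships the matching processor/tokenizer files, and that the input audio is resampled to 16 kHz; the audio path is illustrative.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Repo id inferred from this card; adjust if the model lives under a different namespace.
repo_id = "Doogie/Waynehills-STT-doogie-server"
processor = Wav2Vec2Processor.from_pretrained(repo_id)
model = Wav2Vec2ForCTC.from_pretrained(repo_id)
model.eval()

# Load an audio file (hypothetical path) and resample to the 16 kHz rate
# that wav2vec2-xls-r-300m expects.
waveform, sample_rate = torchaudio.load("example.wav")
waveform = torchaudio.functional.resample(waveform, orig_freq=sample_rate, new_freq=16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding of the most likely token at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```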

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a rough `TrainingArguments` sketch follows the list):

- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
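
As a rough guide, the list above corresponds approximately to the following `TrainingArguments`; the output directory is illustrative and any argument not listed keeps its Transformers default.

```python
from transformers import TrainingArguments

# Approximate reconstruction of the hyperparameters listed above; output_dir is illustrative.
training_args = TrainingArguments(
    output_dir="Waynehills-STT-doogie-server",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    # Adam betas/epsilon as listed on the card (also the library defaults).
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=30,
)
```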

## Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---:|
| 1.8925 | 0.51 | 100 | 3.9322 | 1.0368 |
| 1.9361 | 1.01 | 200 | 3.9322 | 1.0368 |
| 1.9349 | 1.52 | 300 | 3.9322 | 1.0368 |
| 1.9047 | 2.02 | 400 | 3.9322 | 1.0368 |
| 1.8903 | 2.53 | 500 | 3.9322 | 1.0368 |
| 1.9246 | 3.03 | 600 | 3.9322 | 1.0368 |
| 1.9243 | 3.54 | 700 | 3.9322 | 1.0368 |
| 1.9048 | 4.04 | 800 | 3.9322 | 1.0368 |
| 1.9095 | 4.55 | 900 | 3.9322 | 1.0368 |
| 1.8892 | 5.05 | 1000 | 3.9322 | 1.0368 |
| 1.8724 | 5.56 | 1100 | 3.9322 | 1.0368 |
| 1.9132 | 6.06 | 1200 | 3.9322 | 1.0368 |
| 1.901 | 6.57 | 1300 | 3.9322 | 1.0368 |
| 1.9005 | 7.07 | 1400 | 3.9322 | 1.0368 |
| 1.8921 | 7.58 | 1500 | 3.9322 | 1.0368 |
| 1.9159 | 8.08 | 1600 | 3.9322 | 1.0368 |
| 1.8817 | 8.59 | 1700 | 3.9322 | 1.0368 |
| 1.9161 | 9.09 | 1800 | 3.9322 | 1.0368 |
| 1.9013 | 9.6 | 1900 | 3.9322 | 1.0368 |
| 1.8953 | 10.1 | 2000 | 3.9322 | 1.0368 |
| 1.9131 | 10.61 | 2100 | 3.9322 | 1.0368 |
| 1.8756 | 11.11 | 2200 | 3.9322 | 1.0368 |
| 1.91 | 11.62 | 2300 | 3.9322 | 1.0368 |
| 1.9115 | 12.12 | 2400 | 3.9322 | 1.0368 |
| 1.9179 | 12.63 | 2500 | 3.9322 | 1.0368 |
| 1.8858 | 13.13 | 2600 | 3.9322 | 1.0368 |
| 1.9098 | 13.64 | 2700 | 3.9322 | 1.0368 |
| 1.8918 | 14.14 | 2800 | 3.9322 | 1.0368 |
| 1.9174 | 14.65 | 2900 | 3.9322 | 1.0368 |
| 1.8735 | 15.15 | 3000 | 3.9322 | 1.0368 |
| 1.898 | 15.66 | 3100 | 3.9322 | 1.0368 |
| 1.8958 | 16.16 | 3200 | 3.9322 | 1.0368 |
| 1.8824 | 16.67 | 3300 | 3.9322 | 1.0368 |
| 1.9061 | 17.17 | 3400 | 3.9322 | 1.0368 |
| 1.8963 | 17.68 | 3500 | 3.9322 | 1.0368 |
| 1.9476 | 18.18 | 3600 | 3.9322 | 1.0368 |
| 1.9137 | 18.69 | 3700 | 3.9322 | 1.0368 |
| 1.8795 | 19.19 | 3800 | 3.9322 | 1.0368 |
| 1.8922 | 19.7 | 3900 | 3.9322 | 1.0368 |
| 1.9307 | 20.2 | 4000 | 3.9322 | 1.0368 |
| 1.9179 | 20.71 | 4100 | 3.9322 | 1.0368 |
| 1.8951 | 21.21 | 4200 | 3.9322 | 1.0368 |
| 1.9096 | 21.72 | 4300 | 3.9322 | 1.0368 |
| 1.9104 | 22.22 | 4400 | 3.9322 | 1.0368 |
| 1.8902 | 22.73 | 4500 | 3.9322 | 1.0368 |
| 1.9276 | 23.23 | 4600 | 3.9322 | 1.0368 |
| 1.9154 | 23.74 | 4700 | 3.9322 | 1.0368 |
| 1.9003 | 24.24 | 4800 | 3.9322 | 1.0368 |
| 1.9029 | 24.75 | 4900 | 3.9322 | 1.0368 |
| 1.8991 | 25.25 | 5000 | 3.9322 | 1.0368 |
| 1.9173 | 25.76 | 5100 | 3.9322 | 1.0368 |
| 1.8857 | 26.26 | 5200 | 3.9322 | 1.0368 |
| 1.9122 | 26.77 | 5300 | 3.9322 | 1.0368 |
| 1.9032 | 27.27 | 5400 | 3.9322 | 1.0368 |
| 1.8865 | 27.78 | 5500 | 3.9322 | 1.0368 |
| 1.8944 | 28.28 | 5600 | 3.9322 | 1.0368 |
| 1.9086 | 28.79 | 5700 | 3.9322 | 1.0368 |
| 1.9048 | 29.29 | 5800 | 3.9322 | 1.0368 |
| 1.9152 | 29.8 | 5900 | 3.9322 | 1.0368 |
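
For reference, the Wer column above is the word error rate. A minimal sketch of how it is typically computed with Datasets 1.17.0 follows; the example strings are illustrative only, and `load_metric("wer")` requires the `jiwer` package.

```python
from datasets import load_metric

# WER = (substitutions + insertions + deletions) / number of reference words.
# A value above 1.0, as in the table above (1.0368), means the hypotheses contain
# more errors than the references have words.
wer_metric = load_metric("wer")
wer = wer_metric.compute(
    predictions=["this is a hypothetical transcription"],  # illustrative model output
    references=["this is the reference transcription"],    # illustrative ground truth
)
print(f"WER: {wer:.4f}")
```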

## Framework versions

- Transformers 4.12.5
- Pytorch 1.10.0+cu113
- Datasets 1.17.0
- Tokenizers 0.10.3