---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: Waynehills-STT-doogie-server
---

# Waynehills-STT-doogie-server

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 3.9322
- Wer: 1.0368
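
The checkpoint follows the standard wav2vec2 CTC interface, so it can be loaded with `Wav2Vec2Processor` and `Wav2Vec2ForCTC`. Below is a minimal inference sketch, not an official usage snippet; the repository ID `Doogie/Waynehills-STT-doogie-server` and the audio file name are assumptions.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Assumed repository ID; adjust to the actual namespace of this model.
MODEL_ID = "Doogie/Waynehills-STT-doogie-server"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# Load an audio file and resample to the 16 kHz rate expected by wav2vec2.
speech, sample_rate = torchaudio.load("example.wav")  # hypothetical file
speech = torchaudio.functional.resample(speech, sample_rate, 16_000).squeeze(0)

inputs = processor(speech.numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding of the most likely token at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```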

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` reconstruction is sketched after this list):

- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
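
The values above map directly onto `transformers.TrainingArguments`. The sketch below only reconstructs the listed settings; the output directory, the evaluation cadence (inferred from the 100-step intervals in the results table), and any argument not listed above are assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Waynehills-STT-doogie-server",  # assumed name
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=30,
    evaluation_strategy="steps",  # assumption, inferred from the results table
    eval_steps=100,
)
```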

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 1.8987 | 0.51 | 100 | 3.9322 | 1.0368 |
| 1.9171 | 1.01 | 200 | 3.9322 | 1.0368 |
| 1.9058 | 1.52 | 300 | 3.9322 | 1.0368 |
| 1.9037 | 2.02 | 400 | 3.9322 | 1.0368 |
| 1.9079 | 2.53 | 500 | 3.9322 | 1.0368 |
| 1.8788 | 3.03 | 600 | 3.9322 | 1.0368 |
| 1.8973 | 3.54 | 700 | 3.9322 | 1.0368 |
| 1.9031 | 4.04 | 800 | 3.9322 | 1.0368 |
| 1.8966 | 4.55 | 900 | 3.9322 | 1.0368 |
| 1.9092 | 5.05 | 1000 | 3.9322 | 1.0368 |
| 1.9158 | 5.56 | 1100 | 3.9322 | 1.0368 |
| 1.89 | 6.06 | 1200 | 3.9322 | 1.0368 |
| 1.916 | 6.57 | 1300 | 3.9322 | 1.0368 |
| 1.8684 | 7.07 | 1400 | 3.9322 | 1.0368 |
| 1.8885 | 7.58 | 1500 | 3.9322 | 1.0368 |
| 1.9335 | 8.08 | 1600 | 3.9322 | 1.0368 |
| 1.9112 | 8.59 | 1700 | 3.9322 | 1.0368 |
| 1.8794 | 9.09 | 1800 | 3.9322 | 1.0368 |
| 1.9062 | 9.6 | 1900 | 3.9322 | 1.0368 |
| 1.9048 | 10.1 | 2000 | 3.9322 | 1.0368 |
| 1.917 | 10.61 | 2100 | 3.9322 | 1.0368 |
| 1.8809 | 11.11 | 2200 | 3.9322 | 1.0368 |
| 1.9101 | 11.62 | 2300 | 3.9322 | 1.0368 |
| 1.8867 | 12.12 | 2400 | 3.9322 | 1.0368 |
| 1.9188 | 12.63 | 2500 | 3.9322 | 1.0368 |
| 1.8933 | 13.13 | 2600 | 3.9322 | 1.0368 |
| 1.8846 | 13.64 | 2700 | 3.9322 | 1.0368 |
| 1.9327 | 14.14 | 2800 | 3.9322 | 1.0368 |
| 1.9041 | 14.65 | 2900 | 3.9322 | 1.0368 |
| 1.8733 | 15.15 | 3000 | 3.9322 | 1.0368 |
| 1.9246 | 15.66 | 3100 | 3.9322 | 1.0368 |
| 1.8925 | 16.16 | 3200 | 3.9322 | 1.0368 |
| 1.9066 | 16.67 | 3300 | 3.9322 | 1.0368 |
| 1.8991 | 17.17 | 3400 | 3.9322 | 1.0368 |
| 1.899 | 17.68 | 3500 | 3.9322 | 1.0368 |
| 1.9003 | 18.18 | 3600 | 3.9322 | 1.0368 |
| 1.9131 | 18.69 | 3700 | 3.9322 | 1.0368 |
| 1.9141 | 19.19 | 3800 | 3.9322 | 1.0368 |
| 1.9074 | 19.7 | 3900 | 3.9322 | 1.0368 |
| 1.9308 | 20.2 | 4000 | 3.9322 | 1.0368 |
| 1.876 | 20.71 | 4100 | 3.9322 | 1.0368 |
| 1.9263 | 21.21 | 4200 | 3.9322 | 1.0368 |
| 1.8956 | 21.72 | 4300 | 3.9322 | 1.0368 |
| 1.9114 | 22.22 | 4400 | 3.9322 | 1.0368 |
| 1.9189 | 22.73 | 4500 | 3.9322 | 1.0368 |
| 1.889 | 23.23 | 4600 | 3.9322 | 1.0368 |
| 1.9065 | 23.74 | 4700 | 3.9322 | 1.0368 |
| 1.9151 | 24.24 | 4800 | 3.9322 | 1.0368 |
| 1.9059 | 24.75 | 4900 | 3.9322 | 1.0368 |
| 1.8875 | 25.25 | 5000 | 3.9322 | 1.0368 |
| 1.9123 | 25.76 | 5100 | 3.9322 | 1.0368 |
| 1.9008 | 26.26 | 5200 | 3.9322 | 1.0368 |
| 1.9128 | 26.77 | 5300 | 3.9322 | 1.0368 |
| 1.9026 | 27.27 | 5400 | 3.9322 | 1.0368 |
| 1.8901 | 27.78 | 5500 | 3.9322 | 1.0368 |
| 1.9108 | 28.28 | 5600 | 3.9322 | 1.0368 |
| 1.9004 | 28.79 | 5700 | 3.9322 | 1.0368 |
| 1.9199 | 29.29 | 5800 | 3.9322 | 1.0368 |
| 1.8783 | 29.8 | 5900 | 3.9322 | 1.0368 |
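
Note that the reported WER stays above 1.0. This is possible because WER is (substitutions + deletions + insertions) divided by the number of reference words, so insertions can push it past 1.0. The sketch below illustrates this with the `wer` metric from `datasets.load_metric` (available in the Datasets version listed under Framework versions); the example strings are made up, not taken from the evaluation set.

```python
from datasets import load_metric

wer_metric = load_metric("wer")

# Toy example: 3 insertions against a 2-word reference gives WER = 3/2 = 1.5.
predictions = ["hello there world extra words"]
references = ["hello world"]
print(wer_metric.compute(predictions=predictions, references=references))
```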

### Framework versions

- Transformers 4.12.5
- Pytorch 1.10.0+cu113
- Datasets 1.17.0
- Tokenizers 0.10.3