
Model_ALL_Wav2Vec2

This model was trained from scratch on an unspecified dataset. It achieves the following results on the evaluation set (a short metric-computation sketch follows the list):

  • Loss: 0.7779
  • WER (word error rate): 0.1975
  • CER (character error rate): 0.0813
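
WER and CER are edit-distance metrics computed over words and characters respectively. As a hedged illustration of how they are typically computed (the jiwer library and the transcripts below are illustrative assumptions, not details from this card):

```python
import jiwer  # assumed metric library; the card does not state which tool was used

# Hypothetical reference transcripts and model outputs, for illustration only.
references = ["the cat sat on the mat", "speech recognition is fun"]
hypotheses = ["the cat sat on a mat", "speech recognition is run"]

print(f"WER: {jiwer.wer(references, hypotheses):.4f}")  # word-level edit distance / word count
print(f"CER: {jiwer.cer(references, hypotheses):.4f}")  # character-level edit distance / character count
```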

Model description

More information needed

Intended uses & limitations

More information needed
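
As a minimal inference sketch, assuming this is a CTC-decoded Wav2Vec2 checkpoint used with the transformers library (the model identifier, the 16 kHz sampling rate, and the audio file are assumptions, not details from this card):

```python
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "Model_ALL_Wav2Vec2"  # assumed repo id or local checkpoint directory

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Wav2Vec2 checkpoints are usually trained on 16 kHz audio; assumed here.
speech, _ = librosa.load("example.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the most likely token per frame, then collapse repeats/blanks.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```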

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 0.0003
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 30
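
A minimal sketch of these settings expressed as a transformers TrainingArguments object; output_dir and the evaluation schedule are assumptions (the 400-step interval is read off the results table below), and the Adam betas/epsilon listed above are the Trainer defaults:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Model_ALL_Wav2Vec2",    # assumed output directory
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,      # effective train batch size: 16 * 2 = 32
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
    evaluation_strategy="steps",        # assumption: evaluate every 400 steps, per the table below
    eval_steps=400,
)
```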

Training results

| Training Loss | Epoch | Step  | Validation Loss | WER    | CER    |
|---------------|-------|-------|-----------------|--------|--------|
| 0.8385        | 0.67  | 400   | 0.5656          | 0.3049 | 0.1100 |
| 0.3291        | 1.34  | 800   | 0.5395          | 0.3184 | 0.1128 |
| 0.258         | 2.01  | 1200  | 0.4904          | 0.2770 | 0.1030 |
| 0.217         | 2.68  | 1600  | 0.4673          | 0.2814 | 0.1073 |
| 0.1956        | 3.35  | 2000  | 0.5108          | 0.2697 | 0.1021 |
| 0.1872        | 4.02  | 2400  | 0.5531          | 0.2735 | 0.1050 |
| 0.168         | 4.69  | 2800  | 0.5113          | 0.2536 | 0.0967 |
| 0.1476        | 5.36  | 3200  | 0.6744          | 0.2420 | 0.0941 |
| 0.1531        | 6.04  | 3600  | 0.6433          | 0.2492 | 0.0962 |
| 0.1271        | 6.71  | 4000  | 0.5360          | 0.2392 | 0.0928 |
| 0.1362        | 7.38  | 4400  | 0.5451          | 0.2458 | 0.0958 |
| 0.1169        | 8.05  | 4800  | 0.6710          | 0.2470 | 0.0965 |
| 0.117         | 8.72  | 5200  | 0.5291          | 0.2480 | 0.0990 |
| 0.1146        | 9.39  | 5600  | 0.6168          | 0.2372 | 0.0927 |
| 0.1028        | 10.06 | 6000  | 0.5437          | 0.2294 | 0.0914 |
| 0.0918        | 10.73 | 6400  | 0.6350          | 0.2392 | 0.0947 |
| 0.1037        | 11.4  | 6800  | 0.6351          | 0.2346 | 0.0920 |
| 0.0926        | 12.07 | 7200  | 0.6677          | 0.2316 | 0.0924 |
| 0.0861        | 12.74 | 7600  | 0.5842          | 0.2301 | 0.0934 |
| 0.0791        | 13.41 | 8000  | 0.5862          | 0.2286 | 0.0916 |
| 0.08          | 14.08 | 8400  | 0.6183          | 0.2227 | 0.0900 |
| 0.0707        | 14.75 | 8800  | 0.5985          | 0.2351 | 0.0955 |
| 0.0719        | 15.42 | 9200  | 0.6327          | 0.2200 | 0.0897 |
| 0.0674        | 16.09 | 9600  | 0.6184          | 0.2193 | 0.0889 |
| 0.0612        | 16.76 | 10000 | 0.5501          | 0.2224 | 0.0912 |
| 0.0607        | 17.44 | 10400 | 0.5404          | 0.2233 | 0.0916 |
| 0.0612        | 18.11 | 10800 | 0.6111          | 0.2193 | 0.0889 |
| 0.0542        | 18.78 | 11200 | 0.6610          | 0.2196 | 0.0893 |
| 0.0517        | 19.45 | 11600 | 0.6083          | 0.2199 | 0.0905 |
| 0.0478        | 20.12 | 12000 | 0.6500          | 0.2130 | 0.0874 |
| 0.0464        | 20.79 | 12400 | 0.6671          | 0.2144 | 0.0863 |
| 0.0395        | 21.46 | 12800 | 0.7239          | 0.2113 | 0.0864 |
| 0.0391        | 22.13 | 13200 | 0.7791          | 0.2084 | 0.0851 |
| 0.0362        | 22.8  | 13600 | 0.6682          | 0.2083 | 0.0855 |
| 0.0396        | 23.47 | 14000 | 0.6608          | 0.2065 | 0.0848 |
| 0.0346        | 24.14 | 14400 | 0.7438          | 0.2065 | 0.0856 |
| 0.0368        | 24.81 | 14800 | 0.7382          | 0.2066 | 0.0842 |
| 0.0273        | 25.48 | 15200 | 0.7486          | 0.2020 | 0.0841 |
| 0.0286        | 26.15 | 15600 | 0.7566          | 0.2029 | 0.0838 |
| 0.0268        | 26.82 | 16000 | 0.7680          | 0.2015 | 0.0828 |
| 0.0248        | 27.49 | 16400 | 0.7499          | 0.1994 | 0.0813 |
| 0.0253        | 28.16 | 16800 | 0.7511          | 0.1998 | 0.0820 |
| 0.0228        | 28.83 | 17200 | 0.7686          | 0.1985 | 0.0820 |
| 0.0212        | 29.51 | 17600 | 0.7779          | 0.1975 | 0.0813 |

Framework versions

  • Transformers 4.31.0
  • Pytorch 2.0.1+cu117
  • Datasets 1.18.3
  • Tokenizers 0.13.3