
wav2vec2-base-stac-local

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.9746
  • Wer: 0.7828 (word error rate)
  • Cer: 0.3202 (character error rate)

Model description

More information needed

Intended uses & limitations

More information needed
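
No intended uses are documented, but as a wav2vec 2.0 checkpoint with a CTC head this model can typically be loaded for speech-to-text transcription through the standard Transformers API. Below is a minimal sketch, assuming the checkpoint is available under the id in this card's title and that the input is mono float audio at 16 kHz; neither detail is confirmed by the card.

```python
# Minimal transcription sketch for a wav2vec 2.0 CTC checkpoint.
# Assumptions: the checkpoint lives at "wav2vec2-base-stac-local" (local path or Hub id)
# and `speech` is a mono float waveform resampled to 16 kHz.
import torch
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "wav2vec2-base-stac-local"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

def transcribe(speech, sampling_rate=16_000):
    # Normalize and pad the raw waveform into model inputs.
    inputs = processor(speech, sampling_rate=sampling_rate, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    # Greedy CTC decoding: argmax per frame, then collapse repeats and blanks.
    predicted_ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(predicted_ids)[0]
```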

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 2
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 30
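
These settings map directly onto Transformers' TrainingArguments. The sketch below mirrors the listed values only; the output directory, evaluation strategy, and any option not listed above are assumptions rather than details taken from the original training script. The Adam betas and epsilon shown above are the library defaults, so they need no explicit arguments.

```python
# A sketch of the hyperparameters above as TrainingArguments (Transformers 4.17.0).
# Only the values listed in the card are known; everything else is an assumption.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-base-stac-local",  # assumption
    learning_rate=1e-4,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=2,          # effective train batch size of 2
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
    evaluation_strategy="epoch",            # assumption: the results table reports per-epoch eval
)
```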

Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|
| 2.0603        | 1.0   | 2369  | 2.1282          | 0.9517 | 0.5485 |
| 1.6155        | 2.0   | 4738  | 1.6196          | 0.9060 | 0.4565 |
| 1.3462        | 3.0   | 7107  | 1.4331          | 0.8379 | 0.3983 |
| 1.1819        | 4.0   | 9476  | 1.3872          | 0.8233 | 0.3717 |
| 1.0189        | 5.0   | 11845 | 1.4066          | 0.8328 | 0.3660 |
| 0.9026        | 6.0   | 14214 | 1.3502          | 0.8198 | 0.3508 |
| 0.777         | 7.0   | 16583 | 1.3016          | 0.7922 | 0.3433 |
| 0.7109        | 8.0   | 18952 | 1.2662          | 0.8302 | 0.3510 |
| 0.6766        | 9.0   | 21321 | 1.4321          | 0.8103 | 0.3368 |
| 0.6078        | 10.0  | 23690 | 1.3592          | 0.7871 | 0.3360 |
| 0.5958        | 11.0  | 26059 | 1.4389          | 0.7819 | 0.3397 |
| 0.5094        | 12.0  | 28428 | 1.3391          | 0.8017 | 0.3239 |
| 0.4567        | 13.0  | 30797 | 1.4718          | 0.8026 | 0.3347 |
| 0.4448        | 14.0  | 33166 | 1.7450          | 0.8043 | 0.3424 |
| 0.3976        | 15.0  | 35535 | 1.4581          | 0.7888 | 0.3283 |
| 0.3449        | 16.0  | 37904 | 1.5688          | 0.8078 | 0.3397 |
| 0.3046        | 17.0  | 40273 | 1.8630          | 0.8060 | 0.3448 |
| 0.2983        | 18.0  | 42642 | 1.8400          | 0.8190 | 0.3425 |
| 0.2728        | 19.0  | 45011 | 1.6726          | 0.8034 | 0.3280 |
| 0.2579        | 20.0  | 47380 | 1.6661          | 0.8138 | 0.3249 |
| 0.2169        | 21.0  | 49749 | 1.7389          | 0.8138 | 0.3277 |
| 0.2498        | 22.0  | 52118 | 1.7205          | 0.7948 | 0.3207 |
| 0.1831        | 23.0  | 54487 | 1.8641          | 0.8103 | 0.3229 |
| 0.1927        | 24.0  | 56856 | 1.8724          | 0.7784 | 0.3251 |
| 0.1649        | 25.0  | 59225 | 1.9187          | 0.7974 | 0.3277 |
| 0.1594        | 26.0  | 61594 | 1.9022          | 0.7828 | 0.3220 |
| 0.1338        | 27.0  | 63963 | 1.9303          | 0.7862 | 0.3212 |
| 0.1441        | 28.0  | 66332 | 1.9528          | 0.7845 | 0.3207 |
| 0.129         | 29.0  | 68701 | 1.9676          | 0.7819 | 0.3212 |
| 0.1169        | 30.0  | 71070 | 1.9746          | 0.7828 | 0.3202 |
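
For reference, Wer and Cer values like those in the table are typically computed with the metric scripts shipped for the listed Datasets version. A minimal sketch with made-up prediction/reference strings follows; the `cer` metric additionally requires the `jiwer` package.

```python
# Illustrative WER/CER computation with datasets 1.18.3 metric scripts.
# The prediction/reference pair below is invented for demonstration only.
from datasets import load_metric

wer_metric = load_metric("wer")
cer_metric = load_metric("cer")  # needs the `jiwer` package installed

predictions = ["the cat sat on the mat"]
references = ["the cat sat on a mat"]

wer = wer_metric.compute(predictions=predictions, references=references)
cer = cer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```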

Framework versions

  • Transformers 4.17.0
  • PyTorch 1.8.1+cu102
  • Datasets 1.18.3
  • Tokenizers 0.12.1