
torgo_xlsr_finetune_M01

This model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53 on an unspecified dataset. It achieves the following results on the evaluation set (a loading and WER-scoring sketch follows the list):

  • Loss: 1.8655
  • Wer: 0.3060
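
The sketch below shows one way to load this checkpoint and score a single utterance with WER. It is illustrative only: the hub repository id, the audio path, and the reference transcript are placeholders, and the `jiwer` dependency is an assumption; like the base `facebook/wav2vec2-large-xlsr-53` checkpoint, the model expects 16 kHz mono audio.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
from jiwer import wer

# Placeholder repository id -- replace with the actual hub path of this model.
MODEL_ID = "torgo_xlsr_finetune_M01"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID).eval()

# Load an utterance and resample to the 16 kHz rate expected by XLSR models.
waveform, sample_rate = torchaudio.load("example.wav")  # placeholder path
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

predicted_ids = torch.argmax(logits, dim=-1)
hypothesis = processor.batch_decode(predicted_ids)[0]

# Compare against a reference transcript (placeholder) to get WER.
reference = "example reference transcript"
print(hypothesis, wer(reference, hypothesis))
```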

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding TrainingArguments follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 8
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 40
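
As a rough guide, the list above maps onto a Hugging Face `TrainingArguments` configuration like the sketch below. The output directory and the evaluation/logging cadence of 1000 steps are assumptions (the cadence is inferred from the step spacing in the results table); the optimizer betas/epsilon and the linear schedule are handled by the Trainer's defaults.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the hyperparameters listed above.
# output_dir and the eval/log/save cadence are assumptions, not taken from the card.
training_args = TrainingArguments(
    output_dir="torgo_xlsr_finetune_M01",   # assumed output path
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=2,           # effective train batch size of 8
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=40,
    evaluation_strategy="steps",             # assumption: evaluate every 1000 steps
    eval_steps=1000,
    logging_steps=1000,
    save_steps=1000,
)
```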

Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|---------------|-------|-------|-----------------|--------|
| 3.4346        | 0.89  | 1000  | 3.3570          | 1.0    |
| 1.3708        | 1.79  | 2000  | 1.5774          | 0.7569 |
| 0.7783        | 2.69  | 3000  | 1.6546          | 0.6103 |
| 0.5676        | 3.58  | 4000  | 1.3849          | 0.5216 |
| 0.4476        | 4.48  | 5000  | 1.5294          | 0.5    |
| 0.4264        | 5.37  | 6000  | 1.5832          | 0.4534 |
| 0.3434        | 6.27  | 7000  | 1.4397          | 0.4233 |
| 0.3371        | 7.16  | 8000  | 1.4635          | 0.4129 |
| 0.3268        | 8.06  | 9000  | 1.5989          | 0.3828 |
| 0.2623        | 8.95  | 10000 | 1.5145          | 0.3836 |
| 0.2755        | 9.85  | 11000 | 1.6695          | 0.3569 |
| 0.2304        | 10.74 | 12000 | 1.4313          | 0.3397 |
| 0.2052        | 11.64 | 13000 | 1.4242          | 0.3466 |
| 0.199         | 12.53 | 14000 | 1.7287          | 0.3405 |
| 0.2124        | 13.43 | 15000 | 1.4715          | 0.3086 |
| 0.1858        | 14.32 | 16000 | 1.6835          | 0.3086 |
| 0.1667        | 15.22 | 17000 | 1.6080          | 0.3233 |
| 0.1551        | 16.11 | 18000 | 1.6151          | 0.3293 |
| 0.1638        | 17.01 | 19000 | 1.5014          | 0.3034 |
| 0.1584        | 17.9  | 20000 | 1.7036          | 0.3233 |
| 0.1486        | 18.8  | 21000 | 1.6527          | 0.3207 |
| 0.1337        | 19.7  | 22000 | 1.6947          | 0.3181 |
| 0.201         | 20.59 | 23000 | 1.9110          | 0.3431 |
| 0.2058        | 21.49 | 24000 | 1.6260          | 0.3560 |
| 0.1776        | 22.38 | 25000 | 1.8602          | 0.3483 |
| 0.1779        | 23.28 | 26000 | 2.0418          | 0.3578 |
| 0.1401        | 24.17 | 27000 | 2.0262          | 0.3371 |
| 0.1533        | 25.07 | 28000 | 1.7442          | 0.3069 |
| 0.1476        | 25.96 | 29000 | 1.7283          | 0.3190 |
| 0.1414        | 26.86 | 30000 | 1.7655          | 0.3181 |
| 0.1522        | 27.75 | 31000 | 1.6772          | 0.3103 |
| 0.146         | 28.65 | 32000 | 1.4420          | 0.3    |
| 0.1363        | 29.54 | 33000 | 1.5955          | 0.3276 |
| 0.1306        | 30.44 | 34000 | 1.7269          | 0.3336 |
| 0.1241        | 31.33 | 35000 | 1.7725          | 0.3216 |
| 0.1155        | 32.23 | 36000 | 1.8232          | 0.3086 |
| 0.117         | 33.12 | 37000 | 1.8145          | 0.3052 |
| 0.0973        | 34.02 | 38000 | 2.0621          | 0.3216 |
| 0.1181        | 34.91 | 39000 | 1.6758          | 0.2957 |
| 0.1063        | 35.81 | 40000 | 1.6431          | 0.2983 |
| 0.094         | 36.71 | 41000 | 1.7967          | 0.3069 |
| 0.0937        | 37.6  | 42000 | 1.8469          | 0.3052 |
| 0.0931        | 38.5  | 43000 | 1.8364          | 0.3017 |
| 0.0897        | 39.39 | 44000 | 1.8655          | 0.3060 |
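
If the Trainer state from this run is available, the validation curve above can be re-plotted from its `log_history`. The checkpoint path and the `eval_wer` metric key in the sketch below are assumptions about how the run was logged.

```python
import json
import matplotlib.pyplot as plt

# Sketch: re-plot validation WER from a saved Trainer state.
# The path and the "eval_wer" key are assumptions about this particular run.
with open("checkpoint-44000/trainer_state.json") as f:
    state = json.load(f)

eval_logs = [e for e in state["log_history"] if "eval_wer" in e]
steps = [e["step"] for e in eval_logs]
wers = [e["eval_wer"] for e in eval_logs]

plt.plot(steps, wers, marker="o")
plt.xlabel("Training step")
plt.ylabel("Validation WER")
plt.title("torgo_xlsr_finetune_M01: validation WER over training")
plt.show()
```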

Framework versions

  • Transformers 4.26.1
  • PyTorch 2.1.2
  • Datasets 2.16.1
  • Tokenizers 0.13.3