wav2vec2M

This model is a fine-tuned version of facebook/wav2vec2-base on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 2.3442
  • Wer: 1.0 (100% word error rate)
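
Since the checkpoint fine-tunes facebook/wav2vec2-base with a CTC head, it should be loadable through the standard transformers API. A minimal inference sketch, assuming 16 kHz mono audio; the repository id below is a placeholder, not the actual path of this model:

```python
import numpy as np
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

repo_id = "your-username/wav2vec2M"  # hypothetical repo id; replace with the real one

processor = Wav2Vec2Processor.from_pretrained(repo_id)
model = Wav2Vec2ForCTC.from_pretrained(repo_id)
model.eval()

# One second of silence as placeholder input; wav2vec2-base expects 16 kHz mono float audio.
speech = np.zeros(16_000, dtype=np.float32)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the highest-scoring token at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```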

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding TrainingArguments follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 3000
  • training_steps: 5000
  • mixed_precision_training: Native AMP
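
These settings map onto the transformers TrainingArguments roughly as follows; a minimal sketch, assuming the standard Trainer setup (output_dir is illustrative, and the dataset/Trainer wiring is omitted):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2M",          # illustrative; not taken from this card
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 8 * 2 = 16
    lr_scheduler_type="linear",
    warmup_steps=3000,
    max_steps=5000,                  # "training_steps" above
    fp16=True,                       # Native AMP mixed-precision training
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```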

Training results

| Training Loss | Epoch   | Step | Validation Loss | Wer |
|:-------------:|:-------:|:----:|:---------------:|:---:|
| 13.3564       | 28.57   | 100  | 27.6137         | 1.0 |
| 12.2323       | 57.14   | 200  | 23.7860         | 1.0 |
| 6.6474        | 85.71   | 300  | 11.1610         | 1.0 |
| 3.5002        | 114.29  | 400  | 4.4376          | 1.0 |
| 3.2055        | 142.86  | 500  | 3.9892          | 1.0 |
| 3.1194        | 171.43  | 600  | 3.7170          | 1.0 |
| 3.0537        | 200.0   | 700  | 3.5853          | 1.0 |
| 3.009         | 228.57  | 800  | 3.4726          | 1.0 |
| 2.9618        | 257.14  | 900  | 3.3643          | 1.0 |
| 2.9248        | 285.71  | 1000 | 3.2705          | 1.0 |
| 2.8999        | 314.29  | 1100 | 3.2090          | 1.0 |
| 2.8771        | 342.86  | 1200 | 3.1526          | 1.0 |
| 2.8548        | 371.43  | 1300 | 3.1171          | 1.0 |
| 2.832         | 400.0   | 1400 | 3.0867          | 1.0 |
| 2.8173        | 428.57  | 1500 | 3.0383          | 1.0 |
| 2.7962        | 457.14  | 1600 | 3.0360          | 1.0 |
| 2.7794        | 485.71  | 1700 | 3.0202          | 1.0 |
| 2.7656        | 514.29  | 1800 | 3.0042          | 1.0 |
| 2.7539        | 542.86  | 1900 | 2.9969          | 1.0 |
| 2.741         | 571.43  | 2000 | 2.9984          | 1.0 |
| 2.7193        | 600.0   | 2100 | 2.9676          | 1.0 |
| 2.6152        | 628.57  | 2200 | 2.8485          | 1.0 |
| 2.335         | 657.14  | 2300 | 2.5736          | 1.0 |
| 1.9601        | 685.71  | 2400 | 2.3247          | 1.0 |
| 1.5744        | 714.29  | 2500 | 2.1688          | 1.0 |
| 1.306         | 742.86  | 2600 | 2.0462          | 1.0 |
| 1.0761        | 771.43  | 2700 | 2.0612          | 1.0 |
| 0.9187        | 800.0   | 2800 | 2.0175          | 1.0 |
| 0.8063        | 828.57  | 2900 | 2.0096          | 1.0 |
| 0.685         | 857.14  | 3000 | 2.0829          | 1.0 |
| 0.6094        | 885.71  | 3100 | 2.1365          | 1.0 |
| 0.5728        | 914.29  | 3200 | 2.1460          | 1.0 |
| 0.5295        | 942.86  | 3300 | 2.1939          | 1.0 |
| 0.4776        | 971.43  | 3400 | 2.1596          | 1.0 |
| 0.4391        | 1000.0  | 3500 | 2.1430          | 1.0 |
| 0.4192        | 1028.57 | 3600 | 2.2202          | 1.0 |
| 0.4191        | 1057.14 | 3700 | 2.2345          | 1.0 |
| 0.38          | 1085.71 | 3800 | 2.2531          | 1.0 |
| 0.3636        | 1114.29 | 3900 | 2.2553          | 1.0 |
| 0.3479        | 1142.86 | 4000 | 2.2299          | 1.0 |
| 0.3447        | 1171.43 | 4100 | 2.2707          | 1.0 |
| 0.3251        | 1200.0  | 4200 | 2.2855          | 1.0 |
| 0.3332        | 1228.57 | 4300 | 2.2808          | 1.0 |
| 0.3225        | 1257.14 | 4400 | 2.3391          | 1.0 |
| 0.3053        | 1285.71 | 4500 | 2.3565          | 1.0 |
| 0.3179        | 1314.29 | 4600 | 2.3616          | 1.0 |
| 0.2818        | 1342.86 | 4700 | 2.3429          | 1.0 |
| 0.2926        | 1371.43 | 4800 | 2.3507          | 1.0 |
| 0.2909        | 1400.0  | 4900 | 2.3501          | 1.0 |
| 0.2917        | 1428.57 | 5000 | 2.3442          | 1.0 |
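
Note that the Wer column stayed at 1.0 for every evaluation step, i.e. the word error rate never improved even as the validation loss fell. Wer is the standard word error rate; a minimal sketch of how it is typically computed with the evaluate library (the example strings are illustrative, not from this model):

```python
import evaluate

# Load the standard WER metric from the evaluate library.
wer_metric = evaluate.load("wer")

# WER = (substitutions + insertions + deletions) / number of reference words.
predictions = ["hello wrld"]   # illustrative model output
references = ["hello world"]   # illustrative ground truth
print(wer_metric.compute(predictions=predictions, references=references))
# 0.5: one of the two reference words was transcribed incorrectly
```

A WER of 1.0 means the total number of errors equaled the number of words in the references.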

Framework versions

  • Transformers 4.28.0
  • Pytorch 2.0.1+cu118
  • Datasets 2.13.1
  • Tokenizers 0.13.3