
model_en

This model is a fine-tuned version of facebook/wav2vec2-large on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how the WER can be computed follows the list):

  • Loss: 1.8610
  • WER: 0.2641
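WER is the word error rate (lower is better). As a minimal sketch, a score like this could be computed with the metric shipped in the datasets 1.x series pinned below; the transcripts here are purely illustrative placeholders, and the "wer" metric requires the jiwer package:

```python
from datasets import load_metric  # datasets 1.x bundles a "wer" metric (needs jiwer installed)

wer_metric = load_metric("wer")

# Placeholder transcripts purely for illustration.
predictions = ["hello world", "good morning"]
references = ["hello word", "good morning"]

# compute() returns the corpus-level word error rate as a float.
print(wer_metric.compute(predictions=predictions, references=references))
```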

Model description

More information needed

Intended uses & limitations

More information needed
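Although the card gives no usage details, the base checkpoint (facebook/wav2vec2-large) and the WER metric suggest a CTC speech-recognition model. A minimal transcription sketch, assuming the fine-tuned checkpoint is available at a placeholder path ("path/to/model_en" is hypothetical) and the input is mono audio:

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Hypothetical checkpoint location; replace with wherever this model is hosted.
checkpoint = "path/to/model_en"
processor = Wav2Vec2Processor.from_pretrained(checkpoint)
model = Wav2Vec2ForCTC.from_pretrained(checkpoint)
model.eval()

# Load a mono audio file and resample to the 16 kHz rate wav2vec2 expects.
waveform, sample_rate = torchaudio.load("sample.wav")
waveform = torchaudio.transforms.Resample(sample_rate, 16_000)(waveform)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax over the vocabulary at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```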

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 0.0001
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 8
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 200
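The effective batch size of 8 is the per-device train batch size (4) multiplied by the gradient accumulation steps (2). As a rough sketch, the listed values would map onto transformers.TrainingArguments (as of the Transformers 4.11.x series pinned below) like this; output_dir is a placeholder:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; "./model_en" is a placeholder path.
training_args = TrainingArguments(
    output_dir="./model_en",
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,  # effective train batch size: 4 * 2 = 8
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=200,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```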

Training results

| Training Loss | Epoch  | Step  | Validation Loss | WER    |
|---------------|--------|-------|-----------------|--------|
| 6.3443        | 3.05   | 250   | 3.0966          | 1.0    |
| 2.9847        | 6.1    | 500   | 3.0603          | 1.0    |
| 2.9263        | 9.15   | 750   | 2.9131          | 1.0    |
| 2.2584        | 12.19  | 1000  | 1.4318          | 0.6575 |
| 1.2603        | 15.24  | 1250  | 1.1964          | 0.4994 |
| 0.9182        | 18.29  | 1500  | 1.1494          | 0.4485 |
| 0.7462        | 21.34  | 1750  | 1.2171          | 0.4357 |
| 0.6129        | 24.39  | 2000  | 1.0557          | 0.3468 |
| 0.5364        | 27.44  | 2250  | 1.1069          | 0.4222 |
| 0.4607        | 30.48  | 2500  | 1.3270          | 0.3370 |
| 0.4139        | 33.53  | 2750  | 1.1814          | 0.3658 |
| 0.3587        | 36.58  | 3000  | 1.2423          | 0.3419 |
| 0.321         | 39.63  | 3250  | 1.2931          | 0.3211 |
| 0.2961        | 42.68  | 3500  | 1.1409          | 0.3315 |
| 0.2635        | 45.73  | 3750  | 1.4537          | 0.3241 |
| 0.2498        | 48.78  | 4000  | 1.2643          | 0.3192 |
| 0.2352        | 51.82  | 4250  | 1.2789          | 0.3278 |
| 0.2193        | 54.87  | 4500  | 1.4220          | 0.3021 |
| 0.2068        | 57.92  | 4750  | 1.3567          | 0.3713 |
| 0.2055        | 60.97  | 5000  | 1.5375          | 0.3051 |
| 0.198         | 64.02  | 5250  | 1.2676          | 0.2782 |
| 0.1835        | 67.07  | 5500  | 1.3905          | 0.2825 |
| 0.1655        | 70.12  | 5750  | 1.7000          | 0.2978 |
| 0.1677        | 73.17  | 6000  | 1.4250          | 0.2812 |
| 0.1522        | 76.22  | 6250  | 1.4220          | 0.2941 |
| 0.1522        | 79.27  | 6500  | 1.5195          | 0.3021 |
| 0.1344        | 82.32  | 6750  | 1.3749          | 0.2996 |
| 0.1298        | 85.36  | 7000  | 1.6663          | 0.2849 |
| 0.1293        | 88.41  | 7250  | 1.4564          | 0.2892 |
| 0.1264        | 91.46  | 7500  | 1.4373          | 0.2935 |
| 0.1243        | 94.51  | 7750  | 1.6572          | 0.2972 |
| 0.1141        | 97.56  | 8000  | 1.4936          | 0.2892 |
| 0.1086        | 100.61 | 8250  | 1.5231          | 0.2868 |
| 0.1056        | 103.65 | 8500  | 1.3733          | 0.2763 |
| 0.098         | 106.7  | 8750  | 1.4887          | 0.2923 |
| 0.0984        | 109.75 | 9000  | 1.3779          | 0.2923 |
| 0.0916        | 112.8  | 9250  | 1.4868          | 0.2604 |
| 0.0881        | 115.85 | 9500  | 1.7991          | 0.2996 |
| 0.0846        | 118.9  | 9750  | 1.5845          | 0.2849 |
| 0.0861        | 121.95 | 10000 | 1.6684          | 0.2794 |
| 0.0806        | 124.99 | 10250 | 1.5774          | 0.3039 |
| 0.0822        | 128.05 | 10500 | 1.5928          | 0.2886 |
| 0.0788        | 131.1  | 10750 | 1.6158          | 0.2880 |
| 0.0704        | 134.15 | 11000 | 1.7679          | 0.2941 |
| 0.0721        | 137.19 | 11250 | 1.7055          | 0.2629 |
| 0.0723        | 140.24 | 11500 | 1.5473          | 0.2653 |
| 0.0676        | 143.29 | 11750 | 1.8963          | 0.2745 |
| 0.0665        | 146.34 | 12000 | 1.6367          | 0.2739 |
| 0.0618        | 149.39 | 12250 | 1.6757          | 0.2745 |
| 0.0595        | 152.44 | 12500 | 1.5900          | 0.2745 |
| 0.056         | 155.48 | 12750 | 1.5362          | 0.2794 |
| 0.0587        | 158.53 | 13000 | 1.4616          | 0.2684 |
| 0.0519        | 161.58 | 13250 | 1.6867          | 0.2549 |
| 0.0569        | 164.63 | 13500 | 1.8294          | 0.2574 |
| 0.0497        | 167.68 | 13750 | 1.7844          | 0.2868 |
| 0.0531        | 170.73 | 14000 | 1.7564          | 0.2770 |
| 0.0489        | 173.78 | 14250 | 1.5811          | 0.2629 |
| 0.0524        | 176.82 | 14500 | 1.6925          | 0.2684 |
| 0.0431        | 179.87 | 14750 | 1.7236          | 0.2653 |
| 0.0457        | 182.92 | 15000 | 1.7460          | 0.2512 |
| 0.045         | 185.97 | 15250 | 1.8096          | 0.2610 |
| 0.0402        | 189.02 | 15500 | 1.8795          | 0.2635 |
| 0.0529        | 192.07 | 15750 | 1.8310          | 0.2616 |
| 0.0396        | 195.12 | 16000 | 1.8380          | 0.2635 |
| 0.0432        | 198.17 | 16250 | 1.8610          | 0.2641 |

Framework versions

  • Transformers 4.11.3
  • PyTorch 1.9.0
  • Datasets 1.13.3
  • Tokenizers 0.10.3
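
To reproduce this environment, pinning the listed versions should work (note that the PyTorch package is named torch on PyPI):

```
pip install transformers==4.11.3 torch==1.9.0 datasets==1.13.3 tokenizers==0.10.3
```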