wav2vec2-large-xlsr-53-french-KM-v3

This model is a fine-tuned version of Ilyes/wav2vec2-large-xlsr-53-french; the dataset used for fine-tuning is not identified in this card. It achieves the following results on the evaluation set (a minimal inference sketch follows the results):

  • Loss: 0.2027
  • Wer: 0.1652
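
A minimal inference sketch, assuming the checkpoint and processor load directly from the Hub under Shagufta/wav2vec2-large-xlsr-53-french-KM-v3; the audio file path is a placeholder, not part of this card:

```python
# Hedged sketch: transcribe a French audio file with this checkpoint.
# "audio.wav" is a placeholder path; wav2vec2 expects 16 kHz mono audio.
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "Shagufta/wav2vec2-large-xlsr-53-french-KM-v3"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Load and resample the audio to 16 kHz.
speech, sample_rate = librosa.load("audio.wav", sr=16_000)
inputs = processor(speech, sampling_rate=sample_rate, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding of the most likely token at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```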

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged TrainingArguments sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 20
  • num_epochs: 15
  • mixed_precision_training: Native AMP
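
A hedged sketch of how these settings could be expressed as transformers TrainingArguments; the output directory is a placeholder, and the model, datasets, and data collator that would accompany these arguments in a Trainer are not shown:

```python
# Hedged sketch: the hyperparameters above mapped onto TrainingArguments.
# output_dir is a placeholder; pass these args to a Trainer together with
# a Wav2Vec2ForCTC model, train/eval datasets, and a CTC padding collator.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-53-french-KM-v3",  # placeholder
    learning_rate=5e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_steps=20,
    num_train_epochs=15,
    fp16=True,  # native AMP mixed-precision training
)
```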

Training results

| Training Loss | Epoch   | Step | Validation Loss | Wer    |
|:-------------:|:-------:|:----:|:---------------:|:------:|
| 0.5539        | 0.5405  | 20   | 0.3481          | 0.3205 |
| 0.6104        | 1.0811  | 40   | 0.3000          | 0.3083 |
| 0.5578        | 1.6216  | 60   | 0.2856          | 0.2788 |
| 0.4996        | 2.1622  | 80   | 0.3071          | 0.2625 |
| 0.4838        | 2.7027  | 100  | 0.2646          | 0.2608 |
| 0.4016        | 3.2432  | 120  | 0.2639          | 0.2428 |
| 0.4191        | 3.7838  | 140  | 0.2480          | 0.2306 |
| 0.4446        | 4.3243  | 160  | 0.2471          | 0.2142 |
| 0.3558        | 4.8649  | 180  | 0.2382          | 0.2044 |
| 0.3824        | 5.4054  | 200  | 0.2409          | 0.2052 |
| 0.3297        | 5.9459  | 220  | 0.2220          | 0.1864 |
| 0.3578        | 6.4865  | 240  | 0.2169          | 0.1864 |
| 0.3362        | 7.0270  | 260  | 0.2038          | 0.1783 |
| 0.3078        | 7.5676  | 280  | 0.2129          | 0.1766 |
| 0.2973        | 8.1081  | 300  | 0.2102          | 0.1733 |
| 0.2574        | 8.6486  | 320  | 0.2251          | 0.1733 |
| 0.3276        | 9.1892  | 340  | 0.2153          | 0.1750 |
| 0.2945        | 9.7297  | 360  | 0.2082          | 0.1709 |
| 0.2859        | 10.2703 | 380  | 0.2122          | 0.1709 |
| 0.2873        | 10.8108 | 400  | 0.2079          | 0.1676 |
| 0.2687        | 11.3514 | 420  | 0.2017          | 0.1652 |
| 0.277         | 11.8919 | 440  | 0.2052          | 0.1660 |
| 0.2781        | 12.4324 | 460  | 0.2055          | 0.1668 |
| 0.2363        | 12.9730 | 480  | 0.2026          | 0.1643 |
| 0.2535        | 13.5135 | 500  | 0.2011          | 0.1668 |
| 0.2488        | 14.0541 | 520  | 0.2052          | 0.1660 |
| 0.2591        | 14.5946 | 540  | 0.2027          | 0.1652 |

Framework versions

  • Transformers 4.43.3
  • Pytorch 2.3.1+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1
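
A small sketch for checking that a local environment matches the versions listed above; the package names are the PyPI distribution names, which is an assumption about how the environment was installed:

```python
# Hedged sketch: compare installed package versions with the ones listed above.
from importlib.metadata import PackageNotFoundError, version

expected = {
    "transformers": "4.43.3",
    "torch": "2.3.1+cu121",
    "datasets": "2.20.0",
    "tokenizers": "0.19.1",
}

for package, wanted in expected.items():
    try:
        installed = version(package)
    except PackageNotFoundError:
        installed = "not installed"
    marker = "OK" if installed == wanted else "differs"
    print(f"{package}: expected {wanted}, found {installed} ({marker})")
```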
