# wav2vec2_fleurs

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the fleurs dataset. It achieves the following results on the evaluation set:

- Loss: 0.3761
- Wer: 0.2811

## Model description

More information needed

## Intended uses & limitations

More information needed
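
In the absence of fuller documentation, here is a minimal inference sketch. It assumes the repository ships the usual processor/tokenizer files, that `audio.wav` is a placeholder for your own recording, and that the audio is resampled to the 16 kHz rate expected by XLSR-53 checkpoints.

```python
import librosa
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "hiba2/wav2vec2_fleurs"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# "audio.wav" is a placeholder; XLSR-53 models expect 16 kHz mono input.
speech, _ = librosa.load("audio.wav", sr=16_000)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the most likely token at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```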

## Training and evaluation data

More information needed
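
The base dataset is fleurs, but the card does not state which language configuration was used. As a sketch only, the reported WER could be recomputed along these lines; `"ar_eg"` is a placeholder config and `transcribe` is a hypothetical helper wrapping the inference snippet above.

```python
import evaluate
from datasets import load_dataset

# "ar_eg" is a placeholder; substitute the FLEURS config this model was trained on.
fleurs_test = load_dataset("google/fleurs", "ar_eg", split="test")
wer_metric = evaluate.load("wer")

# `transcribe` is a hypothetical helper that runs the inference snippet above
# on a raw waveform and returns the decoded string.
predictions = [transcribe(ex["audio"]["array"]) for ex in fleurs_test]
references = [ex["transcription"] for ex in fleurs_test]
print(wer_metric.compute(predictions=predictions, references=references))
```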

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 100
- num_epochs: 40
- mixed_precision_training: Native AMP
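
As a point of reference, these values map onto `transformers.TrainingArguments` roughly as sketched below; `output_dir` is an assumption, and the Adam betas/epsilon listed above are simply the Trainer's optimizer defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2_fleurs",   # assumed; not stated in the card
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,  # effective train batch size: 4 * 2 = 8
    lr_scheduler_type="cosine",
    warmup_steps=100,
    num_train_epochs=40,
    seed=42,
    fp16=True,                      # "Native AMP" mixed precision
)
```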

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 17.6592 | 0.33 | 100 | 7.5846 | 1.0 |
| 5.291 | 0.67 | 200 | 3.6732 | 1.0 |
| 3.4166 | 1.0 | 300 | 3.2280 | 1.0 |
| 3.2149 | 1.33 | 400 | 3.1839 | 1.0 |
| 3.1987 | 1.67 | 500 | 3.1726 | 1.0 |
| 3.1839 | 2.0 | 600 | 3.1652 | 1.0 |
| 3.1746 | 2.33 | 700 | 3.1607 | 1.0 |
| 3.1727 | 2.67 | 800 | 3.1576 | 1.0 |
| 3.1776 | 3.0 | 900 | 3.1446 | 1.0 |
| 3.1622 | 3.33 | 1000 | 3.1419 | 1.0 |
| 3.155 | 3.67 | 1100 | 3.1375 | 1.0 |
| 3.1611 | 4.0 | 1200 | 3.1334 | 1.0 |
| 3.1535 | 4.33 | 1300 | 3.1327 | 1.0 |
| 3.146 | 4.67 | 1400 | 3.1266 | 1.0 |
| 3.1437 | 5.0 | 1500 | 3.1036 | 1.0 |
| 3.098 | 5.33 | 1600 | 2.9865 | 1.0 |
| 2.9291 | 5.67 | 1700 | 2.6558 | 1.0 |
| 2.6039 | 6.0 | 1800 | 2.0985 | 0.9974 |
| 2.1178 | 6.33 | 1900 | 1.5524 | 0.9067 |
| 1.697 | 6.67 | 2000 | 1.1500 | 0.8066 |
| 1.4577 | 7.0 | 2100 | 0.9821 | 0.7531 |
| 1.2932 | 7.33 | 2200 | 0.8853 | 0.7084 |
| 1.1964 | 7.67 | 2300 | 0.8097 | 0.6629 |
| 1.1006 | 8.0 | 2400 | 0.7561 | 0.6190 |
| 1.05 | 8.33 | 2500 | 0.7303 | 0.6097 |
| 0.9948 | 8.67 | 2600 | 0.6801 | 0.5776 |
| 1.0013 | 9.0 | 2700 | 0.6637 | 0.5544 |
| 0.9249 | 9.33 | 2800 | 0.6479 | 0.5448 |
| 0.9226 | 9.67 | 2900 | 0.6200 | 0.5203 |
| 0.9098 | 10.0 | 3000 | 0.6051 | 0.5069 |
| 0.8517 | 10.33 | 3100 | 0.5971 | 0.4997 |
| 0.8322 | 10.67 | 3200 | 0.5780 | 0.4849 |
| 0.8497 | 11.0 | 3300 | 0.5761 | 0.4831 |
| 0.8197 | 11.33 | 3400 | 0.5599 | 0.4712 |
| 0.7526 | 11.67 | 3500 | 0.5521 | 0.4694 |
| 0.8102 | 12.0 | 3600 | 0.5462 | 0.4641 |
| 0.7809 | 12.33 | 3700 | 0.5404 | 0.4605 |
| 0.7684 | 12.67 | 3800 | 0.5371 | 0.4579 |
| 0.752 | 13.0 | 3900 | 0.5287 | 0.4517 |
| 0.7293 | 13.33 | 4000 | 0.5233 | 0.4515 |
| 0.729 | 13.67 | 4100 | 0.5225 | 0.4471 |
| 0.7107 | 14.0 | 4200 | 0.5068 | 0.4407 |
| 0.6823 | 14.33 | 4300 | 0.5064 | 0.4364 |
| 0.6988 | 14.67 | 4400 | 0.5016 | 0.4325 |
| 0.7049 | 15.0 | 4500 | 0.4939 | 0.4357 |
| 0.674 | 15.33 | 4600 | 0.4939 | 0.4295 |
| 0.7003 | 15.67 | 4700 | 0.4917 | 0.4300 |
| 0.684 | 16.0 | 4800 | 0.4862 | 0.4255 |
| 0.6581 | 16.33 | 4900 | 0.4854 | 0.4247 |
| 0.6839 | 16.67 | 5000 | 0.4806 | 0.4200 |
| 0.6494 | 17.0 | 5100 | 0.4798 | 0.4222 |
| 0.6695 | 17.33 | 5200 | 0.4770 | 0.4169 |
| 0.6396 | 17.67 | 5300 | 0.4758 | 0.4187 |
| 0.6676 | 18.0 | 5400 | 0.4740 | 0.4184 |
| 0.6309 | 18.33 | 5500 | 0.4741 | 0.4150 |
| 0.657 | 18.67 | 5600 | 0.4735 | 0.4127 |
| 0.6768 | 19.0 | 5700 | 0.4717 | 0.4129 |
| 0.6433 | 19.33 | 5800 | 0.4704 | 0.4135 |
| 0.6298 | 19.67 | 5900 | 0.4701 | 0.4131 |
| 0.6555 | 20.0 | 6000 | 0.4698 | 0.4114 |
| 0.6775 | 10.17 | 6100 | 0.4858 | 0.4204 |
| 0.6795 | 10.33 | 6200 | 0.4896 | 0.4164 |
| 0.6112 | 10.5 | 6300 | 0.4760 | 0.4071 |
| 0.6233 | 10.67 | 6400 | 0.4672 | 0.4110 |
| 0.6452 | 10.83 | 6500 | 0.4670 | 0.4110 |
| 0.6558 | 11.0 | 6600 | 0.4586 | 0.3987 |
| 0.5709 | 11.17 | 6700 | 0.4527 | 0.3937 |
| 0.5884 | 11.33 | 6800 | 0.4669 | 0.4032 |
| 0.6245 | 11.5 | 6900 | 0.4541 | 0.4010 |
| 0.6294 | 11.67 | 7000 | 0.4462 | 0.3880 |
| 0.6167 | 11.83 | 7100 | 0.4383 | 0.3768 |
| 0.6043 | 12.0 | 7200 | 0.4332 | 0.3697 |
| 0.5714 | 12.17 | 7300 | 0.4450 | 0.3705 |
| 0.5372 | 12.33 | 7400 | 0.4398 | 0.3781 |
| 0.5772 | 12.5 | 7500 | 0.4429 | 0.3755 |
| 0.5943 | 12.67 | 7600 | 0.4325 | 0.3708 |
| 0.571 | 12.83 | 7700 | 0.4447 | 0.3797 |
| 0.5055 | 13.0 | 7800 | 0.4237 | 0.3610 |
| 0.5316 | 13.17 | 7900 | 0.4279 | 0.3621 |
| 0.5225 | 13.33 | 8000 | 0.4200 | 0.3611 |
| 0.5162 | 13.5 | 8100 | 0.4295 | 0.3593 |
| 0.5353 | 13.67 | 8200 | 0.4148 | 0.3568 |
| 0.4887 | 13.83 | 8300 | 0.4096 | 0.3513 |
| 0.5302 | 14.0 | 8400 | 0.4185 | 0.3538 |
| 0.506 | 14.17 | 8500 | 0.4226 | 0.3480 |
| 0.5099 | 14.33 | 8600 | 0.4253 | 0.3517 |
| 0.473 | 14.5 | 8700 | 0.4096 | 0.3461 |
| 0.4963 | 14.67 | 8800 | 0.4074 | 0.3462 |
| 0.4984 | 14.83 | 8900 | 0.4135 | 0.3445 |
| 0.4896 | 15.0 | 9000 | 0.4038 | 0.3378 |
| 0.4836 | 15.17 | 9100 | 0.4108 | 0.3412 |
| 0.4393 | 15.33 | 9200 | 0.4258 | 0.3320 |
| 0.4589 | 15.5 | 9300 | 0.4045 | 0.3306 |
| 0.4711 | 15.67 | 9400 | 0.4052 | 0.3355 |
| 0.471 | 15.83 | 9500 | 0.4069 | 0.3337 |
| 0.4778 | 16.0 | 9600 | 0.4003 | 0.3270 |
| 0.4495 | 16.17 | 9700 | 0.3973 | 0.3276 |
| 0.4512 | 16.33 | 9800 | 0.4097 | 0.3308 |
| 0.4555 | 16.5 | 9900 | 0.4113 | 0.3283 |
| 0.4535 | 16.67 | 10000 | 0.4024 | 0.3271 |
| 0.4226 | 16.83 | 10100 | 0.3938 | 0.3265 |
| 0.457 | 17.0 | 10200 | 0.4116 | 0.3363 |
| 0.4002 | 17.17 | 10300 | 0.4037 | 0.3261 |
| 0.3894 | 17.33 | 10400 | 0.4037 | 0.3202 |
| 0.4473 | 17.5 | 10500 | 0.4005 | 0.3246 |
| 0.4059 | 17.67 | 10600 | 0.3995 | 0.3183 |
| 0.4122 | 17.83 | 10700 | 0.4039 | 0.3258 |
| 0.4519 | 18.0 | 10800 | 0.3972 | 0.3267 |
| 0.3908 | 18.17 | 10900 | 0.3988 | 0.3188 |
| 0.4182 | 18.33 | 11000 | 0.3943 | 0.3181 |
| 0.3978 | 18.5 | 11100 | 0.3901 | 0.3191 |
| 0.4396 | 18.67 | 11200 | 0.3926 | 0.3087 |
| 0.4098 | 18.83 | 11300 | 0.3844 | 0.3110 |
| 0.3765 | 19.0 | 11400 | 0.3902 | 0.3180 |
| 0.3816 | 19.17 | 11500 | 0.3895 | 0.3130 |
| 0.3959 | 19.33 | 11600 | 0.3927 | 0.3117 |
| 0.3636 | 19.5 | 11700 | 0.3922 | 0.3108 |
| 0.3503 | 19.67 | 11800 | 0.3903 | 0.3071 |
| 0.4234 | 19.83 | 11900 | 0.3922 | 0.3093 |
| 0.3963 | 20.0 | 12000 | 0.3806 | 0.3071 |
| 0.3776 | 20.17 | 12100 | 0.3831 | 0.3110 |
| 0.3729 | 20.33 | 12200 | 0.3791 | 0.3028 |
| 0.382 | 20.5 | 12300 | 0.3874 | 0.3040 |
| 0.387 | 20.67 | 12400 | 0.3895 | 0.3057 |
| 0.3756 | 20.83 | 12500 | 0.3970 | 0.3061 |
| 0.3511 | 21.0 | 12600 | 0.3884 | 0.3047 |
| 0.378 | 21.17 | 12700 | 0.3919 | 0.3027 |
| 0.3687 | 21.33 | 12800 | 0.3930 | 0.3062 |
| 0.355 | 21.5 | 12900 | 0.3837 | 0.2989 |
| 0.3381 | 21.67 | 13000 | 0.3835 | 0.2967 |
| 0.3673 | 21.83 | 13100 | 0.3870 | 0.3023 |
| 0.3883 | 22.0 | 13200 | 0.3799 | 0.2999 |
| 0.3513 | 22.17 | 13300 | 0.3783 | 0.3003 |
| 0.3259 | 22.33 | 13400 | 0.3833 | 0.2962 |
| 0.3446 | 22.5 | 13500 | 0.3843 | 0.2976 |
| 0.3519 | 22.67 | 13600 | 0.3822 | 0.2954 |
| 0.3573 | 22.83 | 13700 | 0.3802 | 0.2932 |
| 0.3458 | 23.0 | 13800 | 0.3770 | 0.2922 |
| 0.338 | 23.17 | 13900 | 0.3808 | 0.3002 |
| 0.3391 | 23.33 | 14000 | 0.3837 | 0.2952 |
| 0.3343 | 23.5 | 14100 | 0.3988 | 0.2977 |
| 0.3203 | 23.67 | 14200 | 0.3828 | 0.2947 |
| 0.3486 | 23.83 | 14300 | 0.3746 | 0.2933 |
| 0.3779 | 24.0 | 14400 | 0.3722 | 0.2919 |
| 0.3269 | 24.17 | 14500 | 0.3810 | 0.2946 |
| 0.3503 | 24.33 | 14600 | 0.3745 | 0.2907 |
| 0.3313 | 24.5 | 14700 | 0.3825 | 0.2903 |
| 0.321 | 24.67 | 14800 | 0.3872 | 0.2956 |
| 0.3327 | 24.83 | 14900 | 0.3812 | 0.2917 |
| 0.3387 | 25.0 | 15000 | 0.3822 | 0.2897 |
| 0.3207 | 25.17 | 15100 | 0.3799 | 0.2914 |
| 0.3308 | 25.33 | 15200 | 0.3916 | 0.2933 |
| 0.3253 | 25.5 | 15300 | 0.3863 | 0.2901 |
| 0.3291 | 25.67 | 15400 | 0.3824 | 0.2859 |
| 0.288 | 25.83 | 15500 | 0.3739 | 0.2884 |
| 0.3364 | 26.0 | 15600 | 0.3741 | 0.2897 |
| 0.2987 | 26.17 | 15700 | 0.3826 | 0.2882 |
| 0.3114 | 26.33 | 15800 | 0.3810 | 0.2908 |
| 0.3221 | 26.5 | 15900 | 0.3886 | 0.2873 |
| 0.3283 | 26.67 | 16000 | 0.3850 | 0.2946 |
| 0.3021 | 26.83 | 16100 | 0.3799 | 0.2879 |
| 0.3169 | 27.0 | 16200 | 0.3850 | 0.2839 |
| 0.3048 | 27.17 | 16300 | 0.3777 | 0.2862 |
| 0.3052 | 27.33 | 16400 | 0.3821 | 0.2862 |
| 0.2691 | 27.5 | 16500 | 0.3882 | 0.2859 |
| 0.3335 | 27.67 | 16600 | 0.3847 | 0.2872 |
| 0.3341 | 27.83 | 16700 | 0.3764 | 0.2869 |
| 0.3042 | 28.0 | 16800 | 0.3820 | 0.2876 |
| 0.297 | 28.17 | 16900 | 0.3801 | 0.2847 |
| 0.3218 | 28.33 | 17000 | 0.3747 | 0.2877 |
| 0.3227 | 28.5 | 17100 | 0.3794 | 0.2836 |
| 0.3247 | 28.67 | 17200 | 0.3828 | 0.2877 |
| 0.2952 | 28.83 | 17300 | 0.3887 | 0.2889 |
| 0.3078 | 29.0 | 17400 | 0.3803 | 0.2842 |
| 0.2943 | 29.17 | 17500 | 0.3798 | 0.2839 |
| 0.2769 | 29.33 | 17600 | 0.3791 | 0.2858 |
| 0.3152 | 29.5 | 17700 | 0.3839 | 0.2856 |
| 0.326 | 29.67 | 17800 | 0.3817 | 0.2839 |
| 0.3102 | 29.83 | 17900 | 0.3795 | 0.2872 |
| 0.2856 | 30.0 | 18000 | 0.3768 | 0.2851 |
| 0.2789 | 30.17 | 18100 | 0.3838 | 0.2831 |
| 0.3096 | 30.33 | 18200 | 0.3756 | 0.2853 |
| 0.3188 | 30.5 | 18300 | 0.3813 | 0.2839 |
| 0.3019 | 30.67 | 18400 | 0.3793 | 0.2834 |
| 0.297 | 30.83 | 18500 | 0.3827 | 0.2853 |
| 0.2826 | 31.0 | 18600 | 0.3778 | 0.2837 |
| 0.3096 | 31.17 | 18700 | 0.3833 | 0.2826 |
| 0.2891 | 31.33 | 18800 | 0.3830 | 0.2832 |
| 0.2959 | 31.5 | 18900 | 0.3800 | 0.2821 |
| 0.2818 | 31.67 | 19000 | 0.3767 | 0.2828 |
| 0.2677 | 31.83 | 19100 | 0.3781 | 0.2831 |
| 0.2893 | 32.0 | 19200 | 0.3810 | 0.2814 |
| 0.293 | 32.17 | 19300 | 0.3812 | 0.2789 |
| 0.3025 | 32.33 | 19400 | 0.3839 | 0.2802 |
| 0.2589 | 32.5 | 19500 | 0.3807 | 0.2788 |
| 0.3011 | 32.67 | 19600 | 0.3813 | 0.2803 |
| 0.301 | 32.83 | 19700 | 0.3824 | 0.2817 |
| 0.2989 | 33.0 | 19800 | 0.3794 | 0.2828 |
| 0.3082 | 33.17 | 19900 | 0.3770 | 0.2812 |
| 0.2806 | 33.33 | 20000 | 0.3787 | 0.2798 |
| 0.271 | 33.5 | 20100 | 0.3814 | 0.2796 |
| 0.3318 | 33.67 | 20200 | 0.3764 | 0.2801 |
| 0.3083 | 33.83 | 20300 | 0.3758 | 0.2789 |
| 0.2542 | 34.0 | 20400 | 0.3786 | 0.2822 |
| 0.2795 | 34.17 | 20500 | 0.3760 | 0.2806 |
| 0.2778 | 34.33 | 20600 | 0.3766 | 0.2814 |
| 0.2863 | 34.5 | 20700 | 0.3816 | 0.2809 |
| 0.2902 | 34.67 | 20800 | 0.3792 | 0.2811 |
| 0.3005 | 34.83 | 20900 | 0.3742 | 0.2807 |
| 0.2863 | 35.0 | 21000 | 0.3759 | 0.2801 |
| 0.3005 | 35.17 | 21100 | 0.3747 | 0.2814 |
| 0.2696 | 35.33 | 21200 | 0.3779 | 0.2816 |
| 0.326 | 35.5 | 21300 | 0.3741 | 0.2812 |
| 0.2696 | 35.67 | 21400 | 0.3770 | 0.2803 |
| 0.2756 | 35.83 | 21500 | 0.3789 | 0.2816 |
| 0.2648 | 36.0 | 21600 | 0.3802 | 0.2814 |
| 0.3 | 36.17 | 21700 | 0.3791 | 0.2822 |
| 0.2695 | 36.33 | 21800 | 0.3801 | 0.2827 |
| 0.2685 | 36.5 | 21900 | 0.3783 | 0.2813 |
| 0.2718 | 36.67 | 22000 | 0.3775 | 0.2813 |
| 0.2982 | 36.83 | 22100 | 0.3780 | 0.2814 |
| 0.302 | 37.0 | 22200 | 0.3769 | 0.2814 |
| 0.2885 | 37.17 | 22300 | 0.3774 | 0.2817 |
| 0.2918 | 37.33 | 22400 | 0.3769 | 0.2821 |
| 0.2631 | 37.5 | 22500 | 0.3776 | 0.2819 |
| 0.2854 | 37.67 | 22600 | 0.3768 | 0.2818 |
| 0.2626 | 37.83 | 22700 | 0.3763 | 0.2803 |
| 0.311 | 38.0 | 22800 | 0.3756 | 0.2806 |
| 0.2971 | 38.17 | 22900 | 0.3762 | 0.2807 |
| 0.2496 | 38.33 | 23000 | 0.3762 | 0.2808 |
| 0.3004 | 38.5 | 23100 | 0.3758 | 0.2814 |
| 0.3125 | 38.67 | 23200 | 0.3756 | 0.2814 |
| 0.272 | 38.83 | 23300 | 0.3758 | 0.2809 |
| 0.286 | 39.0 | 23400 | 0.3762 | 0.2809 |
| 0.2562 | 39.17 | 23500 | 0.3762 | 0.2811 |
| 0.2946 | 39.33 | 23600 | 0.3761 | 0.2812 |
| 0.3202 | 39.5 | 23700 | 0.3761 | 0.2813 |
| 0.2806 | 39.67 | 23800 | 0.3760 | 0.2812 |
| 0.2856 | 39.83 | 23900 | 0.3761 | 0.2803 |
| 0.2556 | 40.0 | 24000 | 0.3761 | 0.2811 |

### Framework versions

- Transformers 4.37.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0