---
language:
  - all
license: apache-2.0
tags:
  - fleurs-asr
  - google/xtreme_s
  - generated_from_trainer
model-index:
  - name: xtreme_s_xlsr_300m_fleurs_asr_western_european_nomask
    results: []
---

xtreme_s_xlsr_300m_fleurs_asr_western_european_nomask

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the GOOGLE/XTREME_S - FLEURS.ALL dataset. It achieves the following results on the evaluation set:

  • Epoch Af Za: 20.0
  • Epoch Am Et: 20.0
  • Epoch Ar Eg: 20.0
  • Epoch As In: 20.0
  • Epoch Ast Es: 20.0
  • Epoch Az Az: 20.0
  • Epoch Be By: 20.0
  • Epoch Bn In: 20.0
  • Epoch Bs Ba: 20.0
  • Epoch Ca Es: 20.0
  • Epoch Ceb Ph: 20.0
  • Epoch Cmn Hans Cn: 20.0
  • Epoch Cs Cz: 20.0
  • Epoch Cy Gb: 20.0
  • Epoch Da Dk: 20.0
  • Epoch De De: 20.0
  • Epoch El Gr: 20.0
  • Epoch En Us: 20.0
  • Epoch Es 419: 20.0
  • Epoch Et Ee: 20.0
  • Epoch Fa Ir: 20.0
  • Epoch Ff Sn: 20.0
  • Epoch Fi Fi: 20.0
  • Epoch Fil Ph: 20.0
  • Epoch Fr Fr: 20.0
  • Epoch Ga Ie: 20.0
  • Epoch Gl Es: 20.0
  • Epoch Gu In: 20.0
  • Epoch Ha Ng: 20.0
  • Epoch He Il: 20.0
  • Epoch Hi In: 20.0
  • Epoch Hr Hr: 20.0
  • Epoch Hu Hu: 20.0
  • Epoch Hy Am: 20.0
  • Epoch Id Id: 20.0
  • Epoch Ig Ng: 20.0
  • Epoch Is Is: 20.0
  • Epoch It It: 20.0
  • Epoch Ja Jp: 20.0
  • Epoch Jv Id: 20.0
  • Epoch Ka Ge: 20.0
  • Epoch Kam Ke: 20.0
  • Epoch Kea Cv: 20.0
  • Epoch Kk Kz: 20.0
  • Epoch Km Kh: 20.0
  • Epoch Kn In: 20.0
  • Epoch Ko Kr: 20.0
  • Epoch Ku Arab Iq: 20.0
  • Epoch Ky Kg: 20.0
  • Epoch Lb Lu: 20.0
  • Epoch Lg Ug: 20.0
  • Epoch Ln Cd: 20.0
  • Epoch Lo La: 20.0
  • Epoch Lt Lt: 20.0
  • Epoch Luo Ke: 20.0
  • Epoch Lv Lv: 20.0
  • Epoch Mi Nz: 20.0
  • Epoch Mk Mk: 20.0
  • Epoch Ml In: 20.0
  • Epoch Mn Mn: 20.0
  • Epoch Mr In: 20.0
  • Epoch Ms My: 20.0
  • Epoch Mt Mt: 20.0
  • Epoch My Mm: 20.0
  • Epoch Nb No: 20.0
  • Epoch Ne Np: 20.0
  • Epoch Nl Nl: 20.0
  • Epoch Nso Za: 20.0
  • Epoch Ny Mw: 20.0
  • Epoch Oci Fr: 20.0
  • Epoch Om Et: 20.0
  • Epoch Or In: 20.0
  • Epoch Pa In: 20.0
  • Epoch Pl Pl: 20.0
  • Epoch Ps Af: 20.0
  • Epoch Pt Br: 20.0
  • Epoch Ro Ro: 20.0
  • Epoch Ru Ru: 20.0
  • Epoch Rup Bg: 20.0
  • Epoch Sd Arab In: 20.0
  • Epoch Sk Sk: 20.0
  • Epoch Sl Si: 20.0
  • Epoch Sn Zw: 20.0
  • Epoch So So: 20.0
  • Epoch Sr Rs: 20.0
  • Epoch Sv Se: 20.0
  • Epoch Sw Ke: 20.0
  • Epoch Ta In: 20.0
  • Epoch Te In: 20.0
  • Epoch Tg Tj: 20.0
  • Epoch Th Th: 20.0
  • Epoch Tr Tr: 20.0
  • Epoch Uk Ua: 20.0
  • Epoch Umb Ao: 20.0
  • Epoch Ur Pk: 20.0
  • Epoch Uz Uz: 20.0
  • Epoch Vi Vn: 20.0
  • Epoch Wo Sn: 20.0
  • Epoch Xh Za: 20.0
  • Epoch Yo Ng: 20.0
  • Epoch Yue Hant Hk: 20.0
  • Epoch Zu Za: 20.0
  • Cer: 0.2484
  • Cer Ast Es: 0.1598
  • Cer Bs Ba: 0.1749
  • Cer Ca Es: 0.1655
  • Cer Cy Gb: 0.2280
  • Cer Da Dk: 0.3616
  • Cer De De: 0.1287
  • Cer El Gr: 0.6020
  • Cer En Us: 0.1938
  • Cer Es 419: 0.1288
  • Cer Fi Fi: 0.2050
  • Cer Fr Fr: 0.1811
  • Cer Ga Ie: 0.4474
  • Cer Gl Es: 0.1324
  • Cer Hr Hr: 0.1555
  • Cer Hu Hu: 0.3911
  • Cer Is Is: 0.4646
  • Cer It It: 0.1283
  • Cer Kea Cv: 0.1818
  • Cer Lb Lu: 0.2594
  • Cer Mt Mt: 0.3628
  • Cer Nb No: 0.2254
  • Cer Nl Nl: 0.1790
  • Cer Oci Fr: 0.2159
  • Cer Pt Br: 0.2275
  • Cer Sv Se: 0.3092
  • Loss: 1.3089
  • Loss Ast Es: 0.7715
  • Loss Bs Ba: 0.7378
  • Loss Ca Es: 0.7868
  • Loss Cy Gb: 1.1441
  • Loss Da Dk: 1.9130
  • Loss De De: 0.5391
  • Loss El Gr: 3.4904
  • Loss En Us: 0.9632
  • Loss Es 419: 0.6186
  • Loss Fi Fi: 0.8953
  • Loss Fr Fr: 0.9076
  • Loss Ga Ie: 3.0217
  • Loss Gl Es: 0.5788
  • Loss Hr Hr: 0.6462
  • Loss Hu Hu: 1.9029
  • Loss Is Is: 2.6551
  • Loss It It: 0.6052
  • Loss Kea Cv: 0.9107
  • Loss Lb Lu: 1.3705
  • Loss Mt Mt: 2.3651
  • Loss Nb No: 1.1518
  • Loss Nl Nl: 0.8490
  • Loss Oci Fr: 1.1421
  • Loss Pt Br: 1.1641
  • Loss Sv Se: 1.5910
  • Wer: 0.6451
  • Wer Ast Es: 0.4654
  • Wer Bs Ba: 0.5443
  • Wer Ca Es: 0.4979
  • Wer Cy Gb: 0.5962
  • Wer Da Dk: 0.8455
  • Wer De De: 0.4221
  • Wer El Gr: 0.9805
  • Wer En Us: 0.4556
  • Wer Es 419: 0.3928
  • Wer Fi Fi: 0.8116
  • Wer Fr Fr: 0.4690
  • Wer Ga Ie: 0.8519
  • Wer Gl Es: 0.4245
  • Wer Hr Hr: 0.4895
  • Wer Hu Hu: 0.9099
  • Wer Is Is: 0.9960
  • Wer It It: 0.4415
  • Wer Kea Cv: 0.5202
  • Wer Lb Lu: 0.7225
  • Wer Mt Mt: 1.0096
  • Wer Nb No: 0.6541
  • Wer Nl Nl: 0.5257
  • Wer Oci Fr: 0.5770
  • Wer Pt Br: 0.6685
  • Wer Sv Se: 0.8546
  • Predict Samples: 20043
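
The Wer and Cer values above are word error rate and character error rate: the edit distance between hypothesis and reference, normalized by reference length, reported as a fraction (e.g. Wer 0.6451 ≈ 64.5%). A minimal sketch of computing these metrics with the `evaluate` library (the transcripts below are illustrative placeholders, not outputs of this model):

```python
# pip install evaluate jiwer
import evaluate

# Word- and character-level error-rate metrics.
wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Hypothetical predictions and gold transcripts, for illustration only.
predictions = ["das ist ein test", "bonjour tout le monde"]
references = ["das ist ein fest", "bonjour tout le monde"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```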

Model description

More information needed

Intended uses & limitations

More information needed
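
The card does not document usage, but as a fine-tuned wav2vec2 CTC checkpoint the model can be loaded with transformers for speech recognition. A minimal inference sketch, assuming the hub repo id `anton-l/xtreme_s_xlsr_300m_fleurs_asr_western_european_nomask` (inferred from this card's name, so adjust as needed) and 16 kHz mono input audio:

```python
import numpy as np
import torch
from transformers import AutoModelForCTC, AutoProcessor

# Repo id is an assumption based on this card's name; adjust to the actual hub path.
model_id = "anton-l/xtreme_s_xlsr_300m_fleurs_asr_western_european_nomask"

processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)

# `speech` must be 16 kHz mono float audio; one second of silence stands in here.
speech = np.zeros(16000, dtype=np.float32)

inputs = processor(speech, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding back to text.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids))
```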

Training and evaluation data

More information needed
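
For reference, the FLEURS portion of XTREME-S named in this card can be loaded with the datasets library. A sketch, assuming the `fleurs.all` configuration (newer datasets versions may additionally require `trust_remote_code=True`):

```python
from datasets import load_dataset

# "fleurs.all" bundles every FLEURS language, matching the
# GOOGLE/XTREME_S - FLEURS.ALL dataset named in this card.
fleurs = load_dataset("google/xtreme_s", "fleurs.all")
print(fleurs)  # DatasetDict with train / validation / test splits
```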

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 1
  • seed: 42
  • distributed_type: multi-GPU
  • num_devices: 8
  • total_train_batch_size: 64
  • total_eval_batch_size: 8
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 20.0
  • mixed_precision_training: Native AMP
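
As a rough guide, these settings map onto `transformers.TrainingArguments` as in the sketch below (not the exact launch command; the run used 8 GPUs with no gradient accumulation, which is how the per-device batch size of 8 yields a total train batch size of 64):

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="xtreme_s_xlsr_300m_fleurs_asr_western_european_nomask",
    learning_rate=3e-4,
    per_device_train_batch_size=8,  # x 8 GPUs = total train batch size 64
    per_device_eval_batch_size=1,   # x 8 GPUs = total eval batch size 8
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=20.0,
    fp16=True,                      # "Native AMP" mixed precision
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```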

Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|
| 3.1411        | 0.49  | 500   | 3.1673          | 1.0    | 1.0    |
| 0.6397        | 0.97  | 1000  | 0.9039          | 0.7171 | 0.2862 |
| 0.4033        | 1.46  | 1500  | 0.8914          | 0.6862 | 0.2763 |
| 0.3473        | 1.94  | 2000  | 0.8017          | 0.6505 | 0.2536 |
| 0.3143        | 2.43  | 2500  | 0.8568          | 0.6566 | 0.2627 |
| 0.3004        | 2.91  | 3000  | 0.8898          | 0.6640 | 0.2686 |
| 0.282         | 3.4   | 3500  | 0.8489          | 0.6637 | 0.2571 |
| 0.2489        | 3.88  | 4000  | 0.8955          | 0.6744 | 0.2691 |
| 0.1706        | 4.37  | 4500  | 0.9190          | 0.6788 | 0.2688 |
| 0.3336        | 4.85  | 5000  | 0.8915          | 0.6594 | 0.2572 |
| 0.1426        | 5.34  | 5500  | 0.9501          | 0.6784 | 0.2686 |
| 0.2301        | 5.83  | 6000  | 1.0217          | 0.6719 | 0.2735 |
| 0.1325        | 6.31  | 6500  | 0.9578          | 0.6691 | 0.2655 |
| 0.1145        | 6.8   | 7000  | 0.9129          | 0.6680 | 0.2593 |
| 0.1202        | 7.28  | 7500  | 0.9646          | 0.6749 | 0.2619 |
| 0.143         | 7.77  | 8000  | 0.9200          | 0.6554 | 0.2554 |
| 0.1012        | 8.25  | 8500  | 0.9553          | 0.6787 | 0.2628 |
| 0.1018        | 8.74  | 9000  | 0.9455          | 0.6445 | 0.2511 |
| 0.1148        | 9.22  | 9500  | 1.0206          | 0.6725 | 0.2629 |
| 0.0794        | 9.71  | 10000 | 0.9305          | 0.6547 | 0.2526 |
| 0.2891        | 10.19 | 10500 | 1.0424          | 0.6709 | 0.2570 |
| 0.1665        | 10.68 | 11000 | 0.9760          | 0.6596 | 0.2507 |
| 0.1956        | 11.17 | 11500 | 0.9549          | 0.6340 | 0.2440 |
| 0.0828        | 11.65 | 12000 | 0.9598          | 0.6403 | 0.2460 |
| 0.059         | 12.14 | 12500 | 0.9972          | 0.6574 | 0.2531 |
| 0.0505        | 12.62 | 13000 | 0.9836          | 0.6534 | 0.2525 |
| 0.0336        | 13.11 | 13500 | 1.0619          | 0.6564 | 0.2519 |
| 0.0435        | 13.59 | 14000 | 1.0844          | 0.6480 | 0.2543 |
| 0.0216        | 14.08 | 14500 | 1.1084          | 0.6512 | 0.2521 |
| 0.0265        | 14.56 | 15000 | 1.1152          | 0.6607 | 0.2563 |
| 0.0975        | 15.05 | 15500 | 1.1060          | 0.6456 | 0.2471 |
| 0.1396        | 15.53 | 16000 | 1.1100          | 0.6337 | 0.2418 |
| 0.0701        | 16.02 | 16500 | 1.1731          | 0.6309 | 0.2415 |
| 0.1171        | 16.5  | 17000 | 1.1302          | 0.6315 | 0.2396 |
| 0.0778        | 16.99 | 17500 | 1.1485          | 0.6379 | 0.2447 |
| 0.0642        | 17.48 | 18000 | 1.2009          | 0.6400 | 0.2464 |
| 0.0322        | 17.96 | 18500 | 1.2028          | 0.6357 | 0.2425 |
| 0.031         | 18.45 | 19000 | 1.2381          | 0.6285 | 0.2416 |
| 0.0579        | 18.93 | 19500 | 1.2299          | 0.6265 | 0.2409 |
| 0.0628        | 19.42 | 20000 | 1.2582          | 0.6277 | 0.2395 |
| 0.074         | 19.9  | 20500 | 1.2572          | 0.6278 | 0.2394 |

Framework versions

  • Transformers 4.18.0.dev0
  • Pytorch 1.10.1+cu111
  • Datasets 1.18.4.dev0
  • Tokenizers 0.11.6
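
To approximate this environment, the nearest stable releases can be installed with `pip install transformers==4.18.0 torch==1.10.1 datasets==1.18.4 tokenizers==0.11.6`; note that the `.dev0` versions above were source installs, so an exact match would require installing Transformers and Datasets from their GitHub repositories at the corresponding commits.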