---
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: wav2vec2_xls_r_300m_FLEURS_Shona_1hr_v2
    results: []
---


# wav2vec2_xls_r_300m_FLEURS_Shona_1hr_v2

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m). The dataset field was left unset in the trainer config (hence "None"), but the model name indicates roughly one hour of Shona speech from FLEURS. It achieves the following results on the evaluation set (a Wer of 1.0 means no word was recognized exactly; note that these figures differ markedly from the final validation row in the training table below):

- Loss: 14.3078
- Wer: 1.0
- Cer: 0.9752
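For reference, Wer/Cer figures like these can be reproduced with the Hugging Face `evaluate` library. A minimal sketch (the transcript strings below are placeholders, not outputs of this model):

```python
import evaluate

# Word error rate and character error rate, as reported above.
wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["mhoro nyika"]   # hypothetical ground-truth transcript
predictions = ["mhoro nyika"]  # hypothetical model transcription

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```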

## Model description

More information needed

## Intended uses & limitations

More information needed
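Until the intended uses are documented, a minimal inference sketch using the standard `transformers` CTC API is shown below. The repository id is inferred from the model name and uploader, so verify it before use.

```python
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "Beijuka/wav2vec2_xls_r_300m_FLEURS_Shona_1hr_v2"  # assumed repo id

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

def transcribe(waveform, sampling_rate=16_000):
    """Greedy CTC decoding of a mono 16 kHz float waveform."""
    inputs = processor(waveform, sampling_rate=sampling_rate, return_tensors="pt")
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    predicted_ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(predicted_ids)[0]
```

Given the evaluation Wer of 1.0 reported above, transcriptions from this checkpoint should be treated with caution.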

## Training and evaluation data

More information needed
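The training data is not recorded in the card, but the model name points to a roughly one-hour Shona split of Google FLEURS. A loading sketch under that assumption ("sn_zw" is the FLEURS configuration code for Shona; the dataset ships a loading script, hence `trust_remote_code`):

```python
from datasets import load_dataset

# Assumed source: the Shona (sn_zw) configuration of Google FLEURS.
fleurs_sn = load_dataset("google/fleurs", "sn_zw", split="train", trust_remote_code=True)

sample = fleurs_sn[0]
print(sample["audio"]["sampling_rate"])  # FLEURS audio is 16 kHz
print(sample["transcription"])
```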

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):

- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 500
- num_epochs: 100
- mixed_precision_training: Native AMP
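These settings map onto `TrainingArguments` roughly as follows (a sketch, assuming Transformers 4.42; `output_dir` and anything not listed above is a placeholder):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2_xls_r_300m_FLEURS_Shona_1hr_v2",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    lr_scheduler_type="cosine",
    warmup_steps=500,
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed precision
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Transformers defaults.
)
```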

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer | Cer    |
|:-------------:|:-----:|:----:|:---------------:|:---:|:------:|
| No log        | 1.0   | 8    | 14.7146         | 1.0 | 0.9727 |
| No log        | 2.0   | 16   | 14.6551         | 1.0 | 0.9785 |
| No log        | 3.0   | 24   | 14.4901         | 1.0 | 0.9773 |
| No log        | 4.0   | 32   | 14.2300         | 1.0 | 0.9908 |
| No log        | 5.0   | 40   | 13.8814         | 1.0 | 0.9943 |
| No log        | 6.0   | 48   | 13.1475         | 1.0 | 1.0    |
| No log        | 7.0   | 56   | 11.5715         | 1.0 | 1.0    |
| No log        | 8.0   | 64   | 9.0914          | 1.0 | 1.0    |
| No log        | 9.0   | 72   | 7.1873          | 1.0 | 1.0    |
| No log        | 10.0  | 80   | 5.8653          | 1.0 | 1.0    |
| No log        | 11.0  | 88   | 5.1712          | 1.0 | 1.0    |
| No log        | 12.0  | 96   | 4.7087          | 1.0 | 1.0    |
| 10.9024       | 13.0  | 104  | 4.3815          | 1.0 | 1.0    |
| 10.9024       | 14.0  | 112  | 4.1675          | 1.0 | 1.0    |
| 10.9024       | 15.0  | 120  | 3.9986          | 1.0 | 1.0    |
| 10.9024       | 16.0  | 128  | 3.8652          | 1.0 | 1.0    |
| 10.9024       | 17.0  | 136  | 3.7428          | 1.0 | 1.0    |
| 10.9024       | 18.0  | 144  | 3.6411          | 1.0 | 1.0    |
| 10.9024       | 19.0  | 152  | 3.5597          | 1.0 | 1.0    |
| 10.9024       | 20.0  | 160  | 3.4685          | 1.0 | 1.0    |
| 10.9024       | 21.0  | 168  | 3.3991          | 1.0 | 1.0    |
| 10.9024       | 22.0  | 176  | 3.3369          | 1.0 | 1.0    |
| 10.9024       | 23.0  | 184  | 3.2783          | 1.0 | 1.0    |
| 10.9024       | 24.0  | 192  | 3.2254          | 1.0 | 1.0    |
| 3.5833        | 25.0  | 200  | 3.1803          | 1.0 | 1.0    |
| 3.5833        | 26.0  | 208  | 3.1373          | 1.0 | 1.0    |
| 3.5833        | 27.0  | 216  | 3.1228          | 1.0 | 1.0    |
| 3.5833        | 28.0  | 224  | 3.0768          | 1.0 | 1.0    |
| 3.5833        | 29.0  | 232  | 3.0552          | 1.0 | 1.0    |
| 3.5833        | 30.0  | 240  | 3.0262          | 1.0 | 1.0    |
| 3.5833        | 31.0  | 248  | 3.0120          | 1.0 | 1.0    |
| 3.5833        | 32.0  | 256  | 2.9929          | 1.0 | 1.0    |
| 3.5833        | 33.0  | 264  | 2.9810          | 1.0 | 1.0    |
| 3.5833        | 34.0  | 272  | 2.9802          | 1.0 | 1.0    |
| 3.5833        | 35.0  | 280  | 2.9659          | 1.0 | 1.0    |
| 3.5833        | 36.0  | 288  | 2.9622          | 1.0 | 1.0    |
| 3.5833        | 37.0  | 296  | 2.9551          | 1.0 | 1.0    |
| 3.0223        | 38.0  | 304  | 2.9496          | 1.0 | 1.0    |
| 3.0223        | 39.0  | 312  | 2.9459          | 1.0 | 1.0    |
| 3.0223        | 40.0  | 320  | 2.9398          | 1.0 | 1.0    |
| 3.0223        | 41.0  | 328  | 2.9399          | 1.0 | 1.0    |
| 3.0223        | 42.0  | 336  | 2.9365          | 1.0 | 1.0    |
| 3.0223        | 43.0  | 344  | 2.9239          | 1.0 | 1.0    |
| 3.0223        | 44.0  | 352  | 2.9338          | 1.0 | 1.0    |
| 3.0223        | 45.0  | 360  | 2.8903          | 1.0 | 1.0    |
| 3.0223        | 46.0  | 368  | 2.8530          | 1.0 | 1.0    |
| 3.0223        | 47.0  | 376  | 2.7876          | 1.0 | 0.9951 |
| 3.0223        | 48.0  | 384  | 2.7081          | 1.0 | 0.9034 |
| 3.0223        | 49.0  | 392  | 2.6055          | 1.0 | 0.9060 |
| 2.8295        | 50.0  | 400  | 2.4135          | 1.0 | 0.8673 |
| 2.8295        | 51.0  | 408  | 2.1838          | 1.0 | 0.7115 |
| 2.8295        | 52.0  | 416  | 1.8849          | 1.0 | 0.5230 |
| 2.8295        | 53.0  | 424  | 1.5853          | 1.0 | 0.3966 |
| 2.8295        | 54.0  | 432  | 1.3813          | 1.0 | 0.3381 |
| 2.8295        | 55.0  | 440  | 1.2596          | 1.0 | 0.3092 |
| 2.8295        | 56.0  | 448  | 1.1627          | 1.0 | 0.2872 |
| 2.8295        | 57.0  | 456  | 1.0824          | 1.0 | 0.2847 |
| 2.8295        | 58.0  | 464  | 1.0633          | 1.0 | 0.2739 |
| 2.8295        | 59.0  | 472  | 0.9979          | 1.0 | 0.2628 |
| 2.8295        | 60.0  | 480  | 1.0218          | 1.0 | 0.2730 |
| 2.8295        | 61.0  | 488  | 1.0080          | 1.0 | 0.2629 |
| 2.8295        | 62.0  | 496  | 0.9897          | 1.0 | 0.2627 |
| 0.9322        | 63.0  | 504  | 0.9869          | 1.0 | 0.2639 |
| 0.9322        | 64.0  | 512  | 0.9348          | 1.0 | 0.2448 |
| 0.9322        | 65.0  | 520  | 0.9493          | 1.0 | 0.2459 |
| 0.9322        | 66.0  | 528  | 0.9553          | 1.0 | 0.2400 |
| 0.9322        | 67.0  | 536  | 0.9731          | 1.0 | 0.2468 |
| 0.9322        | 68.0  | 544  | 0.9631          | 1.0 | 0.2450 |
| 0.9322        | 69.0  | 552  | 0.9685          | 1.0 | 0.2521 |
| 0.9322        | 70.0  | 560  | 0.9508          | 1.0 | 0.2409 |
| 0.9322        | 71.0  | 568  | 1.0106          | 1.0 | 0.2484 |
| 0.9322        | 72.0  | 576  | 0.9573          | 1.0 | 0.2449 |
| 0.9322        | 73.0  | 584  | 0.9931          | 1.0 | 0.2482 |
| 0.9322        | 74.0  | 592  | 0.9571          | 1.0 | 0.2447 |
| 0.2354        | 75.0  | 600  | 0.9946          | 1.0 | 0.2439 |
| 0.2354        | 76.0  | 608  | 0.9376          | 1.0 | 0.2328 |
| 0.2354        | 77.0  | 616  | 0.9882          | 1.0 | 0.2415 |
| 0.2354        | 78.0  | 624  | 0.9730          | 1.0 | 0.2343 |
| 0.2354        | 79.0  | 632  | 0.9602          | 1.0 | 0.2355 |
| 0.2354        | 80.0  | 640  | 0.9791          | 1.0 | 0.2376 |
| 0.2354        | 81.0  | 648  | 0.9717          | 1.0 | 0.2327 |
| 0.2354        | 82.0  | 656  | 0.9760          | 1.0 | 0.2388 |
| 0.2354        | 83.0  | 664  | 0.9586          | 1.0 | 0.2359 |
| 0.2354        | 84.0  | 672  | 0.9799          | 1.0 | 0.2304 |
| 0.2354        | 85.0  | 680  | 0.9717          | 1.0 | 0.2299 |
| 0.2354        | 86.0  | 688  | 0.9727          | 1.0 | 0.2338 |
| 0.2354        | 87.0  | 696  | 0.9768          | 1.0 | 0.2331 |
| 0.1354        | 88.0  | 704  | 0.9938          | 1.0 | 0.2351 |
| 0.1354        | 89.0  | 712  | 0.9861          | 1.0 | 0.2304 |
| 0.1354        | 90.0  | 720  | 0.9772          | 1.0 | 0.2330 |
| 0.1354        | 91.0  | 728  | 0.9781          | 1.0 | 0.2298 |
| 0.1354        | 92.0  | 736  | 0.9765          | 1.0 | 0.2289 |
| 0.1354        | 93.0  | 744  | 0.9721          | 1.0 | 0.2275 |
| 0.1354        | 94.0  | 752  | 0.9741          | 1.0 | 0.2282 |
| 0.1354        | 95.0  | 760  | 0.9737          | 1.0 | 0.2291 |
| 0.1354        | 96.0  | 768  | 0.9715          | 1.0 | 0.2278 |
| 0.1354        | 97.0  | 776  | 0.9715          | 1.0 | 0.2282 |
| 0.1354        | 98.0  | 784  | 0.9728          | 1.0 | 0.2282 |
| 0.1354        | 99.0  | 792  | 0.9735          | 1.0 | 0.2287 |
| 0.1046        | 100.0 | 800  | 0.9737          | 1.0 | 0.2288 |

### Framework versions

- Transformers 4.42.3
- PyTorch 2.1.0+cu118
- Datasets 2.20.0
- Tokenizers 0.19.1