---
library_name: transformers
language:
  - ba
license: apache-2.0
base_model: openai/whisper-small
tags:
  - generated_from_trainer
datasets:
  - oza75/bambara-asr
metrics:
  - wer
model-index:
  - name: Whisper-WOLOF-5-hours-ALFFA-dataset
    results:
      - task:
          name: Automatic Speech Recognition
          type: automatic-speech-recognition
        dataset:
          name: Bambara-asr
          type: oza75/bambara-asr
          args: 'config: ba, split: test'
        metrics:
          - name: Wer
            type: wer
            value: 25.36480142113945
---


Whisper-WOLOF-5-hours-ALFFA-dataset

This model is a fine-tuned version of openai/whisper-small on the Bambara-asr dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5488
  • Wer: 25.3648
  • Cer: 7.4292

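For reference, WER (word error rate) is the Levenshtein edit distance between the predicted and reference transcripts, computed over words and divided by the number of reference words; CER (character error rate) is the same computed over characters. A minimal sketch of both metrics (the exact implementation used during evaluation, typically via the `evaluate` or `jiwer` packages, may differ in text normalization):

```python
def edit_distance(ref, hyp):
    # Classic dynamic-programming Levenshtein distance over token sequences,
    # kept to a single rolling row for O(len(hyp)) memory.
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i
        for j, h in enumerate(hyp, 1):
            prev, d[j] = d[j], min(
                d[j] + 1,           # deletion
                d[j - 1] + 1,       # insertion
                prev + (r != h),    # substitution (free if tokens match)
            )
    return d[len(hyp)]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate as a percentage, matching the scale reported above."""
    ref, hyp = reference.split(), hypothesis.split()
    return 100.0 * edit_distance(ref, hyp) / len(ref)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate as a percentage."""
    return 100.0 * edit_distance(list(reference), list(hypothesis)) / len(reference)
```

A WER of 25.36 therefore means roughly one word edit (substitution, insertion, or deletion) per four reference words on the test split.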
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
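Because a warmup *ratio* is used rather than a fixed warmup step count, the number of warmup steps depends on the total number of optimizer steps. The training log below shows step 500 at epoch 3.876, i.e. about 129 steps per epoch, so a back-of-the-envelope estimate (the exact dataloader size is an assumption inferred from that log) is:

```python
# Estimate the warmup steps implied by lr_scheduler_warmup_ratio = 0.1.
# steps_per_epoch is inferred from the training log: step 500 ≈ epoch 3.876.
steps_per_epoch = round(500 / 3.876)   # ≈ 129
total_steps = 50 * steps_per_epoch     # num_epochs × steps per epoch ≈ 6450
warmup_steps = int(0.1 * total_steps)  # ≈ 645 steps of linear warmup

print(steps_per_epoch, total_steps, warmup_steps)
```

Under the linear scheduler, the learning rate ramps from 0 to 1e-05 over those warmup steps, then decays linearly back to 0 by the final step.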

Training results

| Training Loss | Epoch   | Step | Validation Loss | Wer     | Cer     |
|:-------------:|:-------:|:----:|:---------------:|:-------:|:-------:|
| 1.312         | 3.8760  | 500  | 0.4949          | 33.8409 | 10.1763 |
| 0.1965        | 7.7519  | 1000 | 0.4457          | 29.0445 | 8.7321  |
| 0.0184        | 11.6279 | 1500 | 0.4554          | 26.4433 | 7.7258  |
| 0.0049        | 15.5039 | 2000 | 0.4903          | 27.0143 | 8.0668  |
| 0.0024        | 19.3798 | 2500 | 0.4904          | 25.9865 | 7.8505  |
| 0.0012        | 23.2558 | 3000 | 0.5039          | 25.4917 | 7.5317  |
| 0.0005        | 27.1318 | 3500 | 0.5155          | 25.3140 | 7.5012  |
| 0.0003        | 31.0078 | 4000 | 0.5259          | 25.4790 | 7.5234  |
| 0.0002        | 34.8837 | 4500 | 0.5339          | 25.3394 | 7.4680  |
| 0.0002        | 38.7597 | 5000 | 0.5395          | 25.3267 | 7.4541  |
| 0.0002        | 42.6357 | 5500 | 0.5448          | 25.3267 | 7.4375  |
| 0.0001        | 46.5116 | 6000 | 0.5488          | 25.3648 | 7.4292  |

Framework versions

  • Transformers 4.45.2
  • Pytorch 2.1.0+cu118
  • Datasets 3.0.1
  • Tokenizers 0.20.1