---
library_name: peft
language:
  - zh
license: mit
base_model: openai/whisper-large-v3-turbo
tags:
  - wft
  - whisper
  - automatic-speech-recognition
  - audio
  - speech
  - generated_from_trainer
datasets:
  - JacobLinCool/mozilla-foundation-common_voice_16_1-zh-TW-preprocessed
metrics:
  - wer
model-index:
  - name: whisper-large-v3-turbo-common_voice_16_1-zh-TW-pissa
    results:
      - task:
          type: automatic-speech-recognition
          name: Automatic Speech Recognition
        dataset:
          name: JacobLinCool/mozilla-foundation-common_voice_16_1-zh-TW-preprocessed
          type: JacobLinCool/mozilla-foundation-common_voice_16_1-zh-TW-preprocessed
        metrics:
          - type: wer
            value: 63.665594855305464
            name: Wer
---

# whisper-large-v3-turbo-common_voice_16_1-zh-TW-pissa

This model is a fine-tuned version of [openai/whisper-large-v3-turbo](https://huggingface.co/openai/whisper-large-v3-turbo) on the [JacobLinCool/mozilla-foundation-common_voice_16_1-zh-TW-preprocessed](https://huggingface.co/datasets/JacobLinCool/mozilla-foundation-common_voice_16_1-zh-TW-preprocessed) dataset. It achieves the following results on the evaluation set:

- Loss: 0.5133
- Wer: 63.6656
- Cer: 23.5752
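
For reference, below is a minimal inference sketch. It assumes the adapter is published under `JacobLinCool/whisper-large-v3-turbo-common_voice_16_1-zh-TW-pissa` (matching the model name above) and that `sample.wav` is a hypothetical local audio file; adjust both to your setup.

```python
import torch
import librosa
from peft import PeftModel
from transformers import WhisperForConditionalGeneration, WhisperProcessor

BASE = "openai/whisper-large-v3-turbo"
# Assumed adapter repo id, derived from the model name above.
ADAPTER = "JacobLinCool/whisper-large-v3-turbo-common_voice_16_1-zh-TW-pissa"

processor = WhisperProcessor.from_pretrained(BASE)
model = WhisperForConditionalGeneration.from_pretrained(BASE, torch_dtype=torch.float16)
model = PeftModel.from_pretrained(model, ADAPTER)
model.to("cuda").eval()

# Whisper expects 16 kHz mono audio; "sample.wav" is a hypothetical file.
audio, _ = librosa.load("sample.wav", sr=16000)
inputs = processor(audio, sampling_rate=16000, return_tensors="pt")
features = inputs.input_features.to("cuda", dtype=torch.float16)

with torch.no_grad():
    ids = model.generate(features, language="zh", task="transcribe")
print(processor.batch_decode(ids, skip_special_tokens=True)[0])
```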

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an illustrative configuration sketch follows the list):

- learning_rate: 0.0005
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
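
The card does not include the training script itself, so the following is only a sketch of how these values could map onto a PEFT PiSSA configuration and `Seq2SeqTrainingArguments`. The LoRA rank, alpha, target modules, and output directory are assumptions, not values reported above.

```python
from peft import LoraConfig
from transformers import Seq2SeqTrainingArguments

# PiSSA is selected through LoRA's init_lora_weights option in PEFT.
# Rank, alpha, and target modules below are illustrative assumptions.
peft_config = LoraConfig(
    r=32,
    lora_alpha=64,
    target_modules=["q_proj", "v_proj"],
    init_lora_weights="pissa",
)

# Effective batch size: 4 per device x 8 accumulation steps = 32,
# matching total_train_batch_size above.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-large-v3-turbo-common_voice_16_1-zh-TW-pissa",
    learning_rate=5e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=8,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```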

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer     | Cer     |
|:-------------:|:------:|:----:|:---------------:|:-------:|:-------:|
| No log        | 0      | 0    | 2.7520          | 77.6125 | 20.7783 |
| 7.6982        | 0.9987 | 377  | 0.8744          | 87.9421 | 41.2804 |
| 5.1677        | 2.0    | 755  | 0.7499          | 82.5965 | 36.6407 |
| 3.3647        | 2.9987 | 1132 | 0.6433          | 76.8087 | 31.6068 |
| 3.4711        | 4.0    | 1510 | 0.6397          | 76.2460 | 30.2862 |
| 1.5694        | 4.9987 | 1887 | 0.5779          | 71.5434 | 27.5471 |
| 0.7951        | 6.0    | 2265 | 0.5664          | 71.3223 | 27.0600 |
| 0.4709        | 6.9987 | 2642 | 0.5492          | 68.8706 | 26.0131 |
| 0.116         | 8.0    | 3020 | 0.5427          | 66.7605 | 24.8104 |
| 0.0512        | 8.9987 | 3397 | 0.5298          | 66.1375 | 24.8632 |
| 0.0273        | 9.9868 | 3770 | 0.5133          | 63.6656 | 23.5752 |
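
The large gap between Wer and Cer is expected for Chinese: WER operates on whitespace-separated tokens, which are coarse for unsegmented zh-TW text, while CER compares individual characters. A small illustration with the `evaluate` library (the sentence pair is made up):

```python
import evaluate

wer = evaluate.load("wer")
cer = evaluate.load("cer")

# Hypothetical pair differing in a single character.
predictions = ["今天天氣很好"]
references = ["今天天氣真好"]

# Without whitespace segmentation the whole sentence is one "word",
# so a single wrong character yields WER = 1.0 ...
print(wer.compute(predictions=predictions, references=references))
# ... while CER counts 1 substitution over 6 characters, about 0.167.
print(cer.compute(predictions=predictions, references=references))
```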

### Framework versions

- PEFT 0.13.2
- Transformers 4.46.0
- PyTorch 2.4.0
- Datasets 3.0.2
- Tokenizers 0.20.1