---
language:
  - 'no'
license: apache-2.0
base_model: NbAiLab/nb-whisper-large-v3-RC4
tags:
  - audio
  - asr
  - automatic-speech-recognition
  - hf-asr-leaderboard
model-index:
  - name: nb-whisper-large-v0.8-vad3
    results: []
---

# nb-whisper-large-v0.8-vad3

This model is a fine-tuned version of [NbAiLab/nb-whisper-large-v3-RC4](https://huggingface.co/NbAiLab/nb-whisper-large-v3-RC4) on the NbAiLab/ncc_speech_styling_v2_vad3 dataset.
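
As a fine-tuned Whisper checkpoint, it can be loaded through the standard `transformers` automatic-speech-recognition pipeline. The snippet below is a minimal sketch: the repository id `NbAiLab/nb-whisper-large` and the file name `audio.mp3` are assumptions for illustration, not values taken from this card.

```python
# Minimal sketch of transcribing Norwegian audio with this checkpoint.
# Assumptions: the model is available on the Hub as "NbAiLab/nb-whisper-large"
# and "audio.mp3" is a local audio file.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="NbAiLab/nb-whisper-large",  # assumed repo id
    chunk_length_s=30,                 # process long audio in 30-second chunks
)

result = asr(
    "audio.mp3",
    generate_kwargs={"language": "no", "task": "transcribe"},
    return_timestamps=True,
)
print(result["text"])
```

Passing `language` and `task` explicitly skips Whisper's automatic language detection, which can be unreliable on short clips.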

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 7e-05
- lr_scheduler_type: linear
- per_device_train_batch_size: 8
- total_train_batch_size_per_node: 32
- total_train_batch_size: 1024
- total_optimization_steps: 50,000
- starting_optimization_step: None
- finishing_optimization_step: 50,000
- num_train_dataset_workers: 32
- num_hosts: 32
- total_num_training_examples: 51,200,000
- steps_per_epoch: 7482
- num_beams: None
- weight_decay: 0.01
- adam_beta1: 0.9
- adam_beta2: 0.98
- adam_epsilon: 1e-06
- dropout: True
- bpe_dropout_probability: 0.2
- activation_dropout_probability: 0.1
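
The batch-size and step figures above are mutually consistent; the sketch below is plain illustrative arithmetic (not code from the training run) showing how they relate. The device count per host is not stated on this card and is inferred from the per-node and per-device batch sizes.

```python
# Illustrative arithmetic relating the hyperparameters listed above.
per_device_batch = 8
devices_per_host = 4            # inferred: 32 (per-node batch) / 8 (per-device batch)
num_hosts = 32

batch_per_node = per_device_batch * devices_per_host   # 32
global_batch = batch_per_node * num_hosts               # 1024

total_examples = 51_200_000
total_steps = total_examples // global_batch            # 50,000 optimization steps

steps_per_epoch = 7482
examples_per_epoch = steps_per_epoch * global_batch     # about 7.66M examples per epoch
epochs_covered = total_steps / steps_per_epoch          # about 6.7 passes over the data

print(global_batch, total_steps, examples_per_epoch, round(epochs_covered, 1))
```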

### Training results

| step  | validation_nst_loss | train_loss | validation_nst_wer | validation_nst_cer | validation_nst_exact_wer | validation_nst_exact_cer | validation_clean_stortinget_no_loss | validation_clean_stortinget_no_wer | validation_clean_stortinget_no_cer | validation_clean_stortinget_no_exact_wer | validation_clean_stortinget_no_exact_cer |
|-------|---------------------|------------|--------------------|--------------------|--------------------------|--------------------------|-------------------------------------|------------------------------------|------------------------------------|------------------------------------------|------------------------------------------|
| 0     | 0.4259 | 0.9588 | 2.1721 | 0.6246 | 2.7111 | 0.7079 | 0.6807 | 8.5931 | 5.4608 | 11.4221 | 5.8946 |
| 5000  | 0.4376 | 0.5822 | 2.5859 | 0.7793 | 3.0867 | 0.8563 | 0.6738 | 9.1686 | 5.8478 | 12.0792 | 6.3020 |
| 10000 | 0.4368 | 0.5675 | 2.5913 | 0.7271 | 3.2337 | 0.8269 | 0.6875 | 9.2705 | 5.9200 | 12.1741 | 6.3750 |
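
Metrics of the same kind as the WER/CER columns above can be computed with the `evaluate` library. The snippet below is a minimal sketch using made-up reference and hypothesis strings; it is not the evaluation code behind this table.

```python
# Minimal sketch of computing word and character error rates,
# using the `evaluate` library with illustrative (made-up) strings.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["hun gikk til butikken"]   # ground-truth transcript (example)
predictions = ["hun gikk til butikk"]    # model output (example)

wer = wer_metric.compute(references=references, predictions=predictions)
cer = cer_metric.compute(references=references, predictions=predictions)

print(f"WER: {100 * wer:.2f}%  CER: {100 * cer:.2f}%")
```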

### Framework versions

- Transformers 4.36.2
- Datasets 2.16.1
- Tokenizers 0.15.0
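
To verify that a local environment matches the versions listed above, a quick sanity check like the sketch below can be used; it only assumes the three packages are installed.

```python
# Quick sanity check that locally installed versions match the ones listed above.
import datasets
import tokenizers
import transformers

expected = {"transformers": "4.36.2", "datasets": "2.16.1", "tokenizers": "0.15.0"}
installed = {
    "transformers": transformers.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}

for name, want in expected.items():
    have = installed[name]
    status = "OK" if have == want else f"expected {want}"
    print(f"{name}: {have} ({status})")
```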