---
language:
- 'no'
license: apache-2.0
base_model: NbAiLab/nb-whisper-small-RC1
tags:
- audio
- asr
- automatic-speech-recognition
- hf-asr-leaderboard
model-index:
- name: nb-whisper-small-v0.8-vad3
  results: []
---
# nb-whisper-small-v0.8-vad3
This model is a fine-tuned version of [NbAiLab/nb-whisper-small-RC1](https://huggingface.co/NbAiLab/nb-whisper-small-RC1) on the NbAiLab/ncc_speech_styling_v2_vad3 dataset. It achieves the following results on the evaluation set:
- step: 49999
- validation_nst_loss: 0.4444
- train_loss: 0.4400
- validation_nst_wer: 3.0595
- validation_nst_cer: 0.9443
- validation_nst_exact_wer: 3.7237
- validation_nst_exact_cer: 1.0431
- validation_clean_stortinget_no_loss: 0.7056
- validation_clean_stortinget_no_wer: 10.0663
- validation_clean_stortinget_no_cer: 6.2768
- validation_clean_stortinget_no_exact_wer: 13.4172
- validation_clean_stortinget_no_exact_cer: 6.8042
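The WER and CER figures above are reported in percent. As an illustrative sketch (not the exact evaluation code used for this model), word and character error rates are typically computed from the Levenshtein edit distance between the reference transcript and the model's hypothesis:

```python
# Illustrative WER/CER computation via Levenshtein edit distance.
# This is a minimal sketch, not the evaluation script used for this model.

def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (insert/delete/substitute)."""
    m, n = len(ref), len(hyp)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            cur[j] = min(prev[j] + 1,        # deletion
                         cur[j - 1] + 1,     # insertion
                         prev[j - 1] + cost) # substitution
        prev = cur
    return prev[n]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate in percent, computed over whitespace-split tokens."""
    ref_words = reference.split()
    return 100.0 * edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate in percent, computed over individual characters."""
    return 100.0 * edit_distance(list(reference), list(hypothesis)) / len(reference)
```

The "exact" variants reported above presumably score the raw transcript without normalization, which is why they run slightly higher than the plain WER/CER columns.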
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- lr_scheduler_type: linear
- per_device_train_batch_size: 32
- total_train_batch_size_per_node: 128
- total_train_batch_size: 1024
- total_optimization_steps: 50,000
- starting_optimization_step: None
- finishing_optimization_step: 50,000
- num_train_dataset_workers: 32
- num_hosts: 8
- total_num_training_examples: 51,200,000
- steps_per_epoch: 7455
- num_beams: None
- weight_decay: 0.01
- adam_beta1: 0.9
- adam_beta2: 0.98
- adam_epsilon: 1e-06
- dropout: True
- bpe_dropout_probability: 0.2
- activation_dropout_probability: 0.1
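The batch-size figures above are internally consistent; the following sketch checks the arithmetic (the inferred 4 devices per host is an assumption derived from the listed numbers, not stated in the card):

```python
# Sanity-check of the effective batch-size arithmetic from the hyperparameter
# list above. The devices-per-host count is inferred, not stated in the card.
per_device_train_batch_size = 32
total_train_batch_size_per_node = 128
total_train_batch_size = 1024
num_hosts = 8
total_optimization_steps = 50_000
steps_per_epoch = 7455

# 128 / 32 = 4 accelerator devices per host (inferred).
devices_per_host = total_train_batch_size_per_node // per_device_train_batch_size

# 8 hosts x 128 examples per host = 1024 examples per optimization step.
global_batch = total_train_batch_size_per_node * num_hosts

# 50,000 steps x 1024 examples = 51,200,000 examples seen in total,
# matching total_num_training_examples above.
total_examples = total_optimization_steps * total_train_batch_size

# 50,000 / 7455 ~= 6.7 passes over the training data.
approx_epochs = total_optimization_steps / steps_per_epoch
```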
### Training results
| step | validation_nst_loss | train_loss | validation_nst_wer | validation_nst_cer | validation_nst_exact_wer | validation_nst_exact_cer | validation_clean_stortinget_no_loss | validation_clean_stortinget_no_wer | validation_clean_stortinget_no_cer | validation_clean_stortinget_no_exact_wer | validation_clean_stortinget_no_exact_cer |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0.4313 | 1.0396 | 2.8254 | 0.8865 | 3.5168 | 0.9900 | 0.5547 | 9.6092 | 5.9949 | 12.6794 | 6.4755 |
| 5000 | 0.4484 | 0.5692 | 3.2010 | 1.0142 | 3.8870 | 1.1172 | 0.6138 | 10.1824 | 6.1896 | 13.4124 | 6.6954 |
| 10000 | 0.4477 | 0.5317 | 3.3589 | 1.0347 | 4.0176 | 1.1337 | 0.6275 | 10.3316 | 6.4310 | 13.6022 | 6.9442 |
| 15000 | 0.4493 | 0.5132 | 3.3099 | 1.0086 | 3.9904 | 1.1145 | 0.6599 | 10.2203 | 6.3042 | 13.4100 | 6.8175 |
| 20000 | 0.4491 | 0.4911 | 3.2283 | 1.0226 | 3.8924 | 1.1227 | 0.6755 | 10.1421 | 6.3188 | 13.4409 | 6.8428 |
| 25000 | 0.4441 | 0.4766 | 3.1575 | 0.9816 | 3.8924 | 1.0898 | 0.6763 | 10.2700 | 6.3383 | 13.5951 | 6.8658 |
| 30000 | 0.4498 | 0.4632 | 3.1357 | 0.9741 | 3.8543 | 1.0797 | 0.6599 | 10.2274 | 6.3787 | 13.5144 | 6.8974 |
| 35000 | 0.4480 | 0.4523 | 3.0432 | 0.9378 | 3.7727 | 1.0486 | 0.6948 | 10.2416 | 6.3617 | 13.5547 | 6.8912 |
| 40000 | 0.4471 | 0.4606 | 3.0486 | 0.9080 | 3.7291 | 1.0101 | 0.6754 | 10.2155 | 6.3375 | 13.5097 | 6.8506 |
| 45000 | 0.4442 | 0.4412 | 2.9778 | 0.9275 | 3.6366 | 1.0229 | 0.7021 | 10.1468 | 6.2994 | 13.5286 | 6.8358 |
| 49999 | 0.4444 | 0.4400 | 3.0595 | 0.9443 | 3.7237 | 1.0431 | 0.7056 | 10.0663 | 6.2768 | 13.4172 | 6.8042 |
### Framework versions
- Transformers 4.34.1
- Datasets 2.16.1
- Tokenizers 0.14.1