---
language:
- 'no'
license: apache-2.0
base_model: NbAiLabBeta/nb-whisper-medium
tags:
- audio
- asr
- automatic-speech-recognition
- hf-asr-leaderboard
model-index:
- name: nb-whisper-medium-v0.7-semantic
  results: []
---
# nb-whisper-medium-v0.7-semantic
This model is a fine-tuned version of NbAiLabBeta/nb-whisper-medium on the NbAiLab/ncc_speech_styling_v4 dataset. It achieves the following results on the evaluation set:
- step: 249
- validation_nst_loss: 0.6407
- train_loss: 0.6265
- validation_nst_wer: 2.4389
- validation_nst_cer: 0.7737
- validation_nst_exact_wer: 3.1412
- validation_nst_exact_cer: 0.8810
- validation_clean_stortinget_no_loss: 0.7441
- validation_clean_stortinget_no_wer: 9.1639
- validation_clean_stortinget_no_cer: 5.9596
- validation_clean_stortinget_no_exact_wer: 12.2050
- validation_clean_stortinget_no_exact_cer: 6.4346
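
A minimal transcription sketch using the Transformers `pipeline` API. The repository ID below is assumed from the model name and may differ from the actual Hub path:

```python
from transformers import pipeline

# NOTE: repo ID assumed from the model name above; adjust to the actual Hub path.
asr = pipeline(
    "automatic-speech-recognition",
    model="NbAiLab/nb-whisper-medium-v0.7-semantic",
)

# Transcribe a local audio file; Whisper models expect 16 kHz audio,
# and the pipeline resamples automatically when decoding from file.
result = asr(
    "sample.mp3",
    chunk_length_s=30,
    generate_kwargs={"task": "transcribe", "language": "no"},
)
print(result["text"])
```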
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.5e-05
- lr_scheduler_type: linear
- per_device_train_batch_size: 32
- total_train_batch_size_per_node: 128
- total_train_batch_size: 1024
- total_optimization_steps: 250
- starting_optimization_step: None
- finishing_optimization_step: 250
- num_train_dataset_workers: 32
- num_hosts: 8
- total_num_training_examples: 256,000
- steps_per_epoch: To be computed after first epoch
- num_beams: None
- weight_decay: 0.01
- adam_beta1: 0.9
- adam_beta2: 0.98
- adam_epsilon: 1e-06
- dropout: True
- bpe_dropout_probability: 0.2
- activation_dropout_probability: 0.1
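
The listed batch values are mutually consistent: 128 / 32 implies four devices per node, 128 × 8 hosts gives the global batch of 1024, and 1024 × 250 steps yields the 256,000 training examples. A quick check in Python:

```python
per_device_train_batch_size = 32
total_train_batch_size_per_node = 128
num_hosts = 8
total_optimization_steps = 250

devices_per_node = total_train_batch_size_per_node // per_device_train_batch_size  # 4
total_train_batch_size = total_train_batch_size_per_node * num_hosts               # 1024
total_num_training_examples = total_train_batch_size * total_optimization_steps    # 256,000

assert total_train_batch_size == 1024
assert total_num_training_examples == 256_000
```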
### Training results
| step | validation_nst_loss | train_loss | validation_nst_wer | validation_nst_cer | validation_nst_exact_wer | validation_nst_exact_cer | validation_clean_stortinget_no_loss | validation_clean_stortinget_no_wer | validation_clean_stortinget_no_cer | validation_clean_stortinget_no_exact_wer | validation_clean_stortinget_no_exact_cer |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0.4372 | 1.1581 | 2.2865 | 0.7038 | 2.9452 | 0.8059 | 0.7951 | 8.8939 | 5.6567 | 11.9393 | 6.1348 |
| 40 | 0.7551 | 0.7309 | 2.7546 | 0.9443 | 3.4950 | 1.0568 | 0.7489 | 9.8437 | 6.4691 | 12.9830 | 6.9559 |
| 80 | 0.7655 | 0.6341 | 2.5695 | 0.8250 | 3.3317 | 0.9414 | 0.7355 | 9.3794 | 6.1000 | 12.4090 | 6.5664 |
| 120 | 0.7493 | 0.6262 | 2.4716 | 0.8129 | 3.2283 | 0.9286 | 0.7432 | 9.3344 | 6.0544 | 12.3592 | 6.5247 |
| 160 | 0.7165 | 0.6353 | 2.5859 | 0.8259 | 3.3154 | 0.9378 | 0.7455 | 9.3179 | 6.0647 | 12.3852 | 6.5418 |
| 200 | 0.6678 | 0.6169 | 2.4661 | 0.7812 | 3.1575 | 0.8892 | 0.7430 | 9.2752 | 6.0298 | 12.3378 | 6.5079 |
| 240 | 0.6409 | 0.6385 | 2.4498 | 0.7849 | 3.1520 | 0.8911 | 0.7434 | 9.1900 | 5.9941 | 12.1931 | 6.4639 |
| 249 | 0.6407 | 0.6265 | 2.4389 | 0.7737 | 3.1412 | 0.8810 | 0.7441 | 9.1639 | 5.9596 | 12.2050 | 6.4346 |
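
All WER and CER figures above are percentages. The exact evaluation script is not part of this card; the sketch below shows how such scores are commonly computed with the `evaluate` library (the predictions and references are placeholders, not data from this model):

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Placeholder data; in practice these come from model transcriptions
# and the gold transcripts of the validation sets.
predictions = ["hallo verden"]
references = ["hallo, verden"]

# evaluate returns a fraction; multiply by 100 to match the percentages in the table.
print("WER:", 100 * wer_metric.compute(predictions=predictions, references=references))
print("CER:", 100 * cer_metric.compute(predictions=predictions, references=references))
```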
### Framework versions
- Transformers 4.34.1
- Datasets 2.15.0
- Tokenizers 0.14.1