---
language:
  - 'no'
license: apache-2.0
base_model: NbAiLab/nb-whisper-tiny-v0.8-vad3
tags:
  - audio
  - asr
  - automatic-speech-recognition
  - hf-asr-leaderboard
model-index:
  - name: nb-whisper-tiny-v0.8-vad3-verbatim
    results: []
---

# nb-whisper-tiny-v0.8-vad3-verbatim

This model is a fine-tuned version of NbAiLab/nb-whisper-tiny-v0.8-vad3 on the NbAiLab/NPSC dataset. It achieves the following results on the evaluation set:

- step: 249
- validation_loss: 0.6217
- train_loss: 0.5135
- validation_wer: 14.8034
- validation_cer: 5.2777
- validation_exact_wer: 15.0102
- validation_exact_cer: 5.3185
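
For quick orientation, below is a minimal inference sketch using the Hugging Face Transformers `pipeline` API. It assumes the checkpoint is published on the Hub as `NbAiLab/nb-whisper-tiny-v0.8-vad3-verbatim` (the model name above under the base model's organization, which this card does not confirm); substitute a local path or the correct identifier as needed.

```python
from transformers import pipeline

# Assumed Hub id, inferred from the model name and the base model's organization.
MODEL_ID = "NbAiLab/nb-whisper-tiny-v0.8-vad3-verbatim"

asr = pipeline("automatic-speech-recognition", model=MODEL_ID)

# Transcribe Norwegian speech; long inputs are processed in 30-second chunks.
result = asr(
    "audio.mp3",
    chunk_length_s=30,
    generate_kwargs={"task": "transcribe", "language": "no"},
)
print(result["text"])
```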

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.00015
- lr_scheduler_type: linear
- per_device_train_batch_size: 32
- total_train_batch_size_per_node: 128
- total_train_batch_size: 1024
- total_optimization_steps: 250
- starting_optimization_step: None
- finishing_optimization_step: 250
- num_train_dataset_workers: 32
- num_hosts: 8
- total_num_training_examples: 256,000
- steps_per_epoch: 45
- num_beams: None
- weight_decay: 0.01
- adam_beta1: 0.9
- adam_beta2: 0.98
- adam_epsilon: 1e-06
- dropout: True
- bpe_dropout_probability: 0.2
- activation_dropout_probability: 0.1
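
The optimizer values above describe AdamW with a linear learning-rate schedule, and they are internally consistent with the data figures: 250 optimization steps × a total batch size of 1,024 gives the 256,000 training examples. The snippet below is a hypothetical reconstruction of just the optimizer and schedule in `optax`, not the card's actual training code; no warmup steps are listed, so none are applied.

```python
import optax

# Hypothetical reconstruction of the optimizer/schedule from the values above.
total_steps = 250  # total_optimization_steps

schedule = optax.linear_schedule(
    init_value=1.5e-4,            # learning_rate
    end_value=0.0,                # decay linearly to zero (no warmup listed)
    transition_steps=total_steps,
)

optimizer = optax.adamw(
    learning_rate=schedule,
    b1=0.9,                       # adam_beta1
    b2=0.98,                      # adam_beta2
    eps=1e-6,                     # adam_epsilon
    weight_decay=0.01,            # weight_decay
)
```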

### Training results

| step | validation_loss | train_loss | validation_wer | validation_cer | validation_exact_wer | validation_exact_cer |
|:----:|:---------------:|:----------:|:--------------:|:--------------:|:--------------------:|:--------------------:|
| 0    | 1.2428          | 1.2274     | 23.3208        | 12.5036        | 37.5903              | 15.7108              |
| 40   | 0.6608          | 0.6532     | 17.0908        | 6.1566         | 17.2721              | 6.2179               |
| 80   | 0.6306          | 0.6049     | 15.7542        | 5.6424         | 16.0029              | 5.7030               |
| 120  | 0.6208          | 0.5465     | 15.2676        | 5.4967         | 15.5573              | 5.5526               |
| 160  | 0.6187          | 0.5377     | 15.1334        | 5.4006         | 15.3260              | 5.4422               |
| 200  | 0.6178          | 0.5273     | 14.7475        | 5.2368         | 14.9763              | 5.2794               |
| 240  | 0.6192          | 0.5216     | 14.6133        | 5.2120         | 14.8579              | 5.2556               |
| 249  | 0.6217          | 0.5135     | 14.8034        | 5.2777         | 15.0102              | 5.3185               |
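
The WER and CER columns are reported on a 0–100 scale (lower is better). For reference, below is a minimal sketch of computing word and character error rates with the `evaluate` library; the exact evaluation script is not part of this card, and the strings are placeholders.

```python
import evaluate  # also requires the `jiwer` package

# Placeholder transcripts; in practice these would be model outputs and the
# NPSC verbatim reference transcriptions.
predictions = ["dette er et eksempel"]
references = ["dette er eit eksempel"]

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# `evaluate` returns fractions; multiply by 100 to match the table above.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
cer = 100 * cer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```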

### Framework versions

- Transformers 4.34.1
- Datasets 2.16.1
- Tokenizers 0.14.1