metadata
language:
  - 'no'
  - nb
  - multilingual
license: apache-2.0
tags:
  - whisper-event
  - generated_from_trainer
datasets:
  - NbAiLab/NCC_S
metrics:
  - wer
model-index:
  - name: Whisper Large Norwegian
    results:
      - task:
          type: automatic-speech-recognition
          name: Automatic Speech Recognition
        dataset:
          name: NbAiLab/NCC_S
          type: NbAiLab/NCC_S
          config: 'no'
          split: validation
          args: 'no'
        metrics:
          - type: wer
            value: 12.058465286236297
            name: Wer

Whisper Large Norwegian

This model is a fine-tuned version of openai/whisper-large-v2 on the NbAiLab/NCC_S dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2784
  • Wer: 12.0585
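
For reference, a model fine-tuned from openai/whisper-large-v2 like this one can be loaded for inference with the transformers pipeline API. The sketch below is a minimal illustration; the repository ID used there is a placeholder, not taken from this card, and should be replaced with the model's actual Hub ID.

```python
# Minimal inference sketch using the transformers ASR pipeline.
# "your-org/whisper-large-norwegian" is a placeholder repository ID (assumption);
# replace it with this model's actual Hugging Face Hub ID.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-org/whisper-large-norwegian",  # placeholder ID
    chunk_length_s=30,  # Whisper operates on 30-second audio windows
)

# Transcribe a local audio file; 16 kHz mono audio matches Whisper's expected input.
print(asr("sample_norwegian.wav")["text"])
```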

Model description

This model adapts openai/whisper-large-v2 to Norwegian automatic speech recognition; its metadata tags it for Norwegian ('no'), Norwegian Bokmål ('nb'), and multilingual use.

Intended uses & limitations

More information needed

Training and evaluation data

The model was trained and evaluated on the NbAiLab/NCC_S dataset (config 'no'); the WER reported above is measured on its validation split.

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-06
  • train_batch_size: 12
  • eval_batch_size: 6
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 5000
  • mixed_precision_training: Native AMP
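
As a rough illustration, these values map onto transformers' Seq2SeqTrainingArguments roughly as sketched below. The output directory, evaluation schedule, and generation setting are assumptions inferred from the results table, not taken from the original training script.

```python
# Hedged sketch of the listed hyperparameters expressed as Seq2SeqTrainingArguments.
# output_dir, evaluation_strategy/eval_steps, and predict_with_generate are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-norwegian",  # assumed output path
    learning_rate=5e-6,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=6,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=5000,
    fp16=True,  # mixed precision training ("Native AMP")
    evaluation_strategy="steps",  # assumed: the results table reports eval every 1000 steps
    eval_steps=1000,
    predict_with_generate=True,   # assumed: needed to compute WER during evaluation
)
```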

Training results

Training Loss   Epoch   Step   Validation Loss   Wer
0.6755          0.2     1000   0.3108            14.3118
0.6730          0.4     2000   0.3004            13.4592
0.6378          0.6     3000   0.2865            13.0024
0.5776          0.8     4000   0.2809            12.6675
0.5962          1.0     5000   0.2784            12.0585
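
The Wer column reports word error rate in percent. A minimal sketch of computing the same metric with the evaluate library is shown below; the transcripts in it are hypothetical examples, not model output.

```python
# Minimal WER computation sketch with the evaluate library.
import evaluate

wer_metric = evaluate.load("wer")
predictions = ["det var en fin dag"]          # hypothetical model transcript
references = ["det var en fin dag i dag"]     # hypothetical reference transcript

# evaluate's WER is a fraction; multiply by 100 to match the percentages reported above.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}")
```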

Framework versions

  • Transformers 4.26.0.dev0
  • Pytorch 1.13.0+cu117
  • Datasets 2.7.1.dev0
  • Tokenizers 0.11.0