
Whisper Small FIFA_commentary

This model is a fine-tuned version of openai/whisper-small on the FIFA_commentary dataset. It achieves the following results on the evaluation set:

  • Loss: 1.3021
  • WER: 23.0032
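
As a usage illustration only: the sketch below loads the checkpoint for speech-to-text with the transformers pipeline, assuming the model is hosted as jmcastelo17/whisper-small-FIFA-best (the repo this card belongs to); the audio file name is a hypothetical placeholder.

```python
# Minimal inference sketch. The repo id is taken from this page;
# "commentary_clip.wav" is a hypothetical placeholder file.
import torch
from transformers import pipeline

device = "cuda:0" if torch.cuda.is_available() else "cpu"

asr = pipeline(
    "automatic-speech-recognition",
    model="jmcastelo17/whisper-small-FIFA-best",
    device=device,
)

# Long clips are split into 30-second chunks before transcription.
result = asr("commentary_clip.wav", chunk_length_s=30)
print(result["text"])
```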

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an illustrative configuration sketch follows the list):

  • learning_rate: 5e-06
  • train_batch_size: 32
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 64
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 10
  • mixed_precision_training: Native AMP
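
As a rough reconstruction of how these values map onto transformers Seq2SeqTrainingArguments; anything not listed above (such as the output directory) is an assumption, not taken from this card.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the listed hyperparameters; values not stated in the card
# (e.g. output_dir) are placeholders.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-fifa-commentary",  # hypothetical path
    learning_rate=5e-6,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # effective train batch size: 64
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=10,
    fp16=True,                       # Native AMP mixed precision
)
# The Adam betas=(0.9, 0.999) and epsilon=1e-08 listed above match the
# Trainer's default optimizer settings, so no extra arguments are needed.
```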

Training results

| Training Loss | Epoch | Step | Validation Loss | WER     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| No log        | 1.0   | 3    | 1.3065          | 23.6422 |
| No log        | 2.0   | 6    | 1.3063          | 23.6422 |
| No log        | 3.0   | 9    | 1.3060          | 23.6422 |
| No log        | 4.0   | 12   | 1.3054          | 23.6422 |
| No log        | 5.0   | 15   | 1.3047          | 23.6422 |
| No log        | 6.0   | 18   | 1.3041          | 23.6422 |
| No log        | 7.0   | 21   | 1.3038          | 23.3227 |
| No log        | 8.0   | 24   | 1.3030          | 23.0032 |
| No log        | 9.0   | 27   | 1.3025          | 23.0032 |
| No log        | 10.0  | 30   | 1.3021          | 23.0032 |

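The WER values above appear to be on a percentage scale; a minimal sketch of how such a score can be computed with the evaluate library (the example strings are placeholders):

```python
import evaluate

wer_metric = evaluate.load("wer")

# Placeholder reference/hypothesis pair purely for illustration.
references = ["what a strike from outside the box"]
predictions = ["what a strike from outside the blocks"]

# compute() returns a fraction; multiplying by 100 gives the percentage
# scale used in the table above (e.g. 23.0032).
wer = 100 * wer_metric.compute(references=references, predictions=predictions)
print(f"WER: {wer:.4f}")
```
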
Framework versions

  • Transformers 4.39.3
  • PyTorch 2.2.1+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2