Whisper Small - FutureProofGlitch

This model is a fine-tuned version of openai/whisper-small on the AMI Meeting Corpus dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

  • Loss: 0.4325
  • Wer Ortho (orthographic WER, %): 19.5838
  • Wer (WER on normalized text, %): 19.3832
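
As a minimal usage sketch (not part of the original card), the checkpoint can be loaded through the Transformers automatic-speech-recognition pipeline; the audio file path below is a placeholder:

```python
# Minimal inference sketch; assumes the transformers library and a local audio file.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="futureProofGlitch/whisper-small",  # model id as published on the Hub
    chunk_length_s=30,                        # Whisper decodes 30-second windows
)

result = asr("meeting.wav")  # placeholder path to a 16 kHz mono recording
print(result["text"])
```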

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a code mapping is sketched after this list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: constant_with_warmup
  • lr_scheduler_warmup_steps: 50
  • training_steps: 4000
  • mixed_precision_training: Native AMP
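
The hyperparameters above map onto Seq2SeqTrainingArguments roughly as follows. This is an illustrative reconstruction, not the original training script; output_dir is a placeholder and unlisted arguments are assumed defaults:

```python
# Illustrative reconstruction of the listed hyperparameters; not the original training script.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-ami",       # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,          # effective train batch size of 16
    seed=42,
    lr_scheduler_type="constant_with_warmup",
    warmup_steps=50,
    max_steps=4000,
    fp16=True,                              # native AMP mixed precision
    evaluation_strategy="steps",
    eval_steps=500,                         # matches the 500-step cadence in the results table
    predict_with_generate=True,             # needed so evaluation can report WER
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the default optimizer.
```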

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer Ortho | Wer     |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:-------:|
| 0.2735        | 0.61  | 500  | 0.3324          | 21.5310   | 21.2081 |
| 0.1235        | 1.22  | 1000 | 0.3473          | 19.6819   | 19.4991 |
| 0.1317        | 1.83  | 1500 | 0.3342          | 19.0920   | 18.7929 |
| 0.0647        | 2.44  | 2000 | 0.3671          | 22.8615   | 22.6949 |
| 0.0294        | 3.05  | 2500 | 0.3842          | 18.5566   | 18.4101 |
| 0.0534        | 3.66  | 3000 | 0.4044          | 20.8094   | 20.5998 |
| 0.0366        | 4.27  | 3500 | 0.4277          | 20.2686   | 20.1372 |
| 0.0328        | 4.88  | 4000 | 0.4325          | 19.5838   | 19.3832 |
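
For reference, the two WER columns can be reproduced with the evaluate library: "Wer Ortho" is computed on raw (orthographic) text, while "Wer" applies Whisper's basic text normalizer first. A minimal sketch, with placeholder prediction and reference strings:

```python
# Sketch showing how orthographic vs. normalized WER differ; example strings are placeholders.
import evaluate
from transformers.models.whisper.english_normalizer import BasicTextNormalizer

wer_metric = evaluate.load("wer")
normalizer = BasicTextNormalizer()

predictions = ["Okay, let's start the meeting."]  # placeholder model transcripts
references = ["OK let's start the meeting"]       # placeholder ground-truth transcripts

wer_ortho = 100 * wer_metric.compute(predictions=predictions, references=references)
wer_norm = 100 * wer_metric.compute(
    predictions=[normalizer(p) for p in predictions],
    references=[normalizer(r) for r in references],
)
print(f"WER (orthographic): {wer_ortho:.2f}%   WER (normalized): {wer_norm:.2f}%")
```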

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu121
  • Datasets 2.17.0
  • Tokenizers 0.15.2