whisper-base.en-fsc-h

This model is a fine-tuned version of openai/whisper-base.en on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0700
  • Accuracy: 0.9863

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0005
  • train_batch_size: 48
  • eval_batch_size: 48
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 192
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 25
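Two of the quantities above are derived rather than set directly: the total train batch size is the per-device batch size times the accumulation steps, and a linear scheduler with a warmup ratio warms up for `warmup_ratio × total_steps` optimizer steps. A quick sanity check (the total step count of 3000 is taken from the final row of the results table below):

```python
# Sanity-check the derived training quantities implied by the hyperparameters.
train_batch_size = 48
gradient_accumulation_steps = 4
lr_scheduler_warmup_ratio = 0.1
total_steps = 3000  # final logged step in the results table

# Effective (total) train batch size = per-device batch * accumulation steps
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 192

# Linear schedule: warmup lasts warmup_ratio * total_steps optimizer steps
warmup_steps = int(lr_scheduler_warmup_ratio * total_steps)
print(warmup_steps)  # 300
```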

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| No log        | 0.9959  | 120  | 0.7203          | 0.7332   |
| No log        | 2.0     | 241  | 0.3685          | 0.8505   |
| No log        | 2.9959  | 361  | 0.2024          | 0.9293   |
| No log        | 4.0     | 482  | 0.1429          | 0.9552   |
| No log        | 4.9959  | 602  | 0.1143          | 0.9636   |
| No log        | 6.0     | 723  | 0.1738          | 0.9565   |
| No log        | 6.9959  | 843  | 0.1295          | 0.9560   |
| No log        | 8.0     | 964  | 0.1407          | 0.9599   |
| 0.3435        | 8.9959  | 1084 | 0.1104          | 0.9694   |
| 0.3435        | 10.0    | 1205 | 0.1054          | 0.9718   |
| 0.3435        | 10.9959 | 1325 | 0.1042          | 0.9728   |
| 0.3435        | 12.0    | 1446 | 0.1084          | 0.9710   |
| 0.3435        | 12.9959 | 1566 | 0.0926          | 0.9789   |
| 0.3435        | 14.0    | 1687 | 0.1120          | 0.9763   |
| 0.3435        | 14.9959 | 1807 | 0.0903          | 0.9797   |
| 0.3435        | 16.0    | 1928 | 0.0957          | 0.9800   |
| 0.0255        | 16.9959 | 2048 | 0.0851          | 0.9815   |
| 0.0255        | 18.0    | 2169 | 0.0723          | 0.9844   |
| 0.0255        | 18.9959 | 2289 | 0.0784          | 0.9834   |
| 0.0255        | 20.0    | 2410 | 0.0673          | 0.9860   |
| 0.0255        | 20.9959 | 2530 | 0.0700          | 0.9863   |
| 0.0255        | 22.0    | 2651 | 0.0707          | 0.9858   |
| 0.0255        | 22.9959 | 2771 | 0.0712          | 0.9858   |
| 0.0255        | 24.0    | 2892 | 0.0715          | 0.9858   |
| 0.0015        | 24.8963 | 3000 | 0.0716          | 0.9858   |

Framework versions

  • Transformers 4.43.3
  • Pytorch 2.2.2+cu121
  • Datasets 2.18.0
  • Tokenizers 0.19.1
Model details

  • Model size: 20M params
  • Tensor type: F32 (Safetensors)

Model tree

  • Base model: openai/whisper-base.en
  • This model: gokuls/whisper-base.en-fsc-h