
whisper-nm-clp

This model is a fine-tuned version of openai/whisper-small on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3038
  • WER: 37.9421
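
The card does not yet document intended uses, but the checkpoint can be loaded like any other Whisper fine-tune. A minimal inference sketch with the transformers pipeline follows; the audio path is a placeholder, and the pipeline resamples input audio to 16 kHz automatically.

```python
# Minimal sketch: transcribe an audio file with this checkpoint.
# "sample.wav" is a placeholder path, not a file shipped with the model.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="susmitabhatt/whisper-nm-clp",
)

result = asr("sample.wav")
print(result["text"])
```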

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0004
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 132
  • num_epochs: 50
  • mixed_precision_training: Native AMP
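
For reproducibility, the list above maps onto Seq2SeqTrainingArguments roughly as sketched below. output_dir and eval_steps are assumptions (the results table reports metrics every 100 steps, so eval_steps=100 is inferred, not confirmed by the card).

```python
# Hedged sketch of Seq2SeqTrainingArguments matching the hyperparameters above.
# output_dir and eval_steps are assumptions; the rest is taken from the card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-nm-clp",     # assumed
    learning_rate=4e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,     # effective train batch size: 16
    lr_scheduler_type="linear",
    warmup_steps=132,
    num_train_epochs=50,
    seed=42,
    fp16=True,                         # Native AMP
    eval_strategy="steps",
    eval_steps=100,                    # inferred from the results table
)
```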

Training results

| Training Loss | Epoch   | Step | Validation Loss | WER      |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| No log        | 2.4390  | 100  | 0.3751          | 228.2958 |
| 0.9619        | 4.8780  | 200  | 0.4282          | 51.4469  |
| 0.9619        | 7.3171  | 300  | 0.2953          | 46.3023  |
| 0.2051        | 9.7561  | 400  | 0.3202          | 49.3569  |
| 0.2051        | 12.1951 | 500  | 0.3298          | 45.9807  |
| 0.1113        | 14.6341 | 600  | 0.2517          | 44.6945  |
| 0.1113        | 17.0732 | 700  | 0.3117          | 47.4277  |
| 0.0846        | 19.5122 | 800  | 0.3111          | 113.5048 |
| 0.0846        | 21.9512 | 900  | 0.3486          | 47.9100  |
| 0.0584        | 24.3902 | 1000 | 0.3190          | 48.3923  |
| 0.0584        | 26.8293 | 1100 | 0.2865          | 37.2990  |
| 0.0264        | 29.2683 | 1200 | 0.4153          | 47.9100  |
| 0.0264        | 31.7073 | 1300 | 0.3360          | 43.8907  |
| 0.0122        | 34.1463 | 1400 | 0.2909          | 36.8167  |
| 0.0122        | 36.5854 | 1500 | 0.3393          | 38.4244  |
| 0.004         | 39.0244 | 1600 | 0.3064          | 37.9421  |
| 0.004         | 41.4634 | 1700 | 0.2980          | 34.7267  |
| 0.0006        | 43.9024 | 1800 | 0.3004          | 36.8167  |
| 0.0006        | 46.3415 | 1900 | 0.3034          | 37.9421  |
| 0.0002        | 48.7805 | 2000 | 0.3038          | 37.9421  |
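
The WER column is a percentage. For reference, a score like this can be computed with the evaluate library; the strings below are toy examples, and compute() returns a fraction that is scaled by 100 to match the table.

```python
# Sketch: computing word error rate (WER) with the `evaluate` library.
# The reference/prediction strings are illustrative only.
import evaluate

wer_metric = evaluate.load("wer")

references = ["the quick brown fox"]
predictions = ["the quick brown box"]

wer = 100 * wer_metric.compute(references=references, predictions=predictions)
print(f"WER: {wer:.4f}")  # 25.0000: one substitution over four words
```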

Framework versions

  • Transformers 4.47.0.dev0
  • PyTorch 2.4.0
  • Datasets 3.0.1
  • Tokenizers 0.20.0