whisper_nmc_nomimose_30

This model is a fine-tuned version of openai/whisper-small on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2184
  • Wer: 24.2321
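
The checkpoint can be loaded directly with the transformers automatic-speech-recognition pipeline. This is a minimal loading sketch, assuming the weights are published on the Hugging Face Hub under susmitabhatt/whisper_nmc_nomimose_30 (the repository this card belongs to); the audio path is a placeholder.

```python
# Minimal inference sketch; the repository id and audio path are illustrative assumptions.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="susmitabhatt/whisper_nmc_nomimose_30",
)

# Transcribe a local audio file (placeholder path).
result = asr("sample.wav")
print(result["text"])
```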

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch reproducing them follows the list):

  • learning_rate: 0.0004
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 132
  • num_epochs: 30
  • mixed_precision_training: Native AMP
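
As a hedged sketch, the hyperparameters above map onto a transformers Seq2SeqTrainingArguments configuration roughly as follows; the output directory and any evaluation/save strategy are assumptions not recorded on this card.

```python
# Configuration sketch mirroring the listed hyperparameters; values not on the card
# (output_dir, eval/save strategy) are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper_nmc_nomimose_30",  # assumed name
    learning_rate=4e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,           # effective train batch size of 16
    seed=42,
    optim="adamw_torch",
    lr_scheduler_type="linear",
    warmup_steps=132,
    num_train_epochs=30,
    fp16=True,                               # "Native AMP" mixed precision
)
```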

Training results

| Training Loss | Epoch   | Step | Validation Loss | Wer      |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| No log        | 1.0     | 59   | 0.4281          | 266.4391 |
| 1.3358        | 2.0     | 118  | 1.5591          | 60.6371  |
| 1.3358        | 3.0     | 177  | 0.5338          | 308.3049 |
| 0.4635        | 4.0     | 236  | 0.3016          | 44.3686  |
| 0.4635        | 5.0     | 295  | 0.2477          | 34.8123  |
| 0.2439        | 6.0     | 354  | 0.3004          | 45.3925  |
| 0.1766        | 7.0     | 413  | 0.2947          | 38.2253  |
| 0.1766        | 8.0     | 472  | 0.4231          | 63.8225  |
| 0.1258        | 9.0     | 531  | 0.2170          | 37.5427  |
| 0.1258        | 10.0    | 590  | 0.2435          | 104.2093 |
| 0.1033        | 11.0    | 649  | 0.2698          | 36.7463  |
| 0.0805        | 12.0    | 708  | 0.2398          | 35.1536  |
| 0.0805        | 13.0    | 767  | 0.2139          | 54.9488  |
| 0.0553        | 14.0    | 826  | 0.3215          | 41.5245  |
| 0.0553        | 15.0    | 885  | 0.2489          | 39.5904  |
| 0.0559        | 16.0    | 944  | 0.1868          | 52.3322  |
| 0.0377        | 17.0    | 1003 | 0.2378          | 56.3140  |
| 0.0377        | 18.0    | 1062 | 0.2427          | 30.2617  |
| 0.029         | 19.0    | 1121 | 0.1880          | 25.4835  |
| 0.029         | 20.0    | 1180 | 0.1866          | 27.7588  |
| 0.0111        | 21.0    | 1239 | 0.2203          | 22.8669  |
| 0.0111        | 22.0    | 1298 | 0.2507          | 28.4414  |
| 0.0037        | 23.0    | 1357 | 0.2227          | 26.9625  |
| 0.0025        | 24.0    | 1416 | 0.2157          | 24.4596  |
| 0.0025        | 25.0    | 1475 | 0.2189          | 24.6871  |
| 0.0005        | 26.0    | 1534 | 0.2184          | 24.2321  |
| 0.0005        | 27.0    | 1593 | 0.2184          | 25.1422  |
| 0.0002        | 28.0    | 1652 | 0.2184          | 24.2321  |
| 0.0001        | 29.0    | 1711 | 0.2184          | 24.2321  |
| 0.0001        | 29.4957 | 1740 | 0.2184          | 24.2321  |
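
The Wer column follows the common convention of word error rate scaled by 100 (values above 100 are possible when hypotheses contain many insertions or substitutions). A minimal sketch of how such a score is typically computed with the evaluate library, using placeholder strings rather than data from this model:

```python
# WER computation sketch; the strings below are placeholders, not model outputs.
import evaluate

wer_metric = evaluate.load("wer")
predictions = ["the cat sat on the mat"]
references = ["the cat sat on a mat"]

# evaluate returns a fraction; the table above reports it scaled by 100.
print(100 * wer_metric.compute(predictions=predictions, references=references))
```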

Framework versions

  • Transformers 4.50.0.dev0
  • Pytorch 2.5.1+cu121
  • Datasets 3.3.1
  • Tokenizers 0.21.0