Whisper-Small-Proctor-lora

This model is a fine-tuned version of openai/whisper-small on the Procotor-Dataset dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3882
  • WER: 17.3304

Model description

This model is a LoRA (PEFT) adapter for openai/whisper-small, trained for automatic speech recognition on the Procotor-Dataset. Only the adapter weights live in this repository; at inference time the base Whisper model is loaded first and the adapter is applied on top, as sketched below.
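
A minimal inference sketch. It assumes the adapter lives at SGzK/whisper-small-inbrowser-proctor (the repository this card belongs to) and that input audio is a 16 kHz mono float waveform; the silent dummy clip stands in for real audio:

```python
import numpy as np
import torch
from peft import PeftModel
from transformers import WhisperForConditionalGeneration, WhisperProcessor

# Load the frozen base model, then apply this LoRA adapter on top of it.
base = WhisperForConditionalGeneration.from_pretrained("openai/whisper-small")
model = PeftModel.from_pretrained(base, "SGzK/whisper-small-inbrowser-proctor")
model.eval()

processor = WhisperProcessor.from_pretrained("openai/whisper-small")

# Transcribe a 16 kHz mono waveform (one second of silence as a stand-in).
audio = np.zeros(16000, dtype=np.float32)
inputs = processor(audio, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    generated_ids = model.generate(input_features=inputs.input_features)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```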

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

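Since this is a PEFT LoRA adapter, training starts by wrapping the base model with a LoraConfig. The rank, alpha, dropout, and target modules below are typical choices for Whisper's attention projections, not values recorded in this card; a minimal sketch:

```python
from peft import LoraConfig, get_peft_model
from transformers import WhisperForConditionalGeneration

base = WhisperForConditionalGeneration.from_pretrained("openai/whisper-small")

# Illustrative LoRA setup; r, lora_alpha, lora_dropout, and target_modules
# are assumptions, not values recorded in this card. task_type is left
# unset: PEFT has no dedicated speech-to-text task type, and LoraConfig
# works without one.
lora_config = LoraConfig(
    r=32,
    lora_alpha=64,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    bias="none",
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # only the LoRA matrices are trainable
```
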
Training hyperparameters

The following hyperparameters were used during training (a Seq2SeqTrainingArguments sketch mirroring them follows the list):

  • learning_rate: 5e-06
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 50
  • training_steps: 500
  • mixed_precision_training: Native AMP
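
A minimal sketch of these settings as Hugging Face Seq2SeqTrainingArguments; output_dir is a hypothetical path, and anything not listed in the card is left at its default:

```python
from transformers import Seq2SeqTrainingArguments

# Mirrors the hyperparameters listed above.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-proctor-lora",  # hypothetical path
    learning_rate=5e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=50,
    max_steps=500,
    fp16=True,  # Native AMP mixed-precision training
)
```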

Training results

Training Loss   Epoch     Step   Validation Loss   WER
0.7627          0.8929     25    0.7457            31.6183
0.3173          1.7857     50    0.4644            29.4680
0.2013          2.6786     75    0.3984            20.4205
0.1569          3.5714    100    0.3754            14.9570
0.1148          4.4643    125    0.3716            18.4932
0.0794          5.3571    150    0.3643            14.9888
0.0658          6.2500    175    0.3637            20.1338
0.0272          7.1429    200    0.3527            18.5091
0.0169          8.0357    225    0.3457            22.0452
0.0085          8.9286    250    0.3644            24.3071
0.0052          9.8214    275    0.3716            20.5639
0.0042         10.7143    300    0.3802            18.7958
0.0019         11.6071    325    0.3738            16.6773
0.0017         12.5000    350    0.3811            18.5409
0.0015         13.3929    375    0.3824            19.1940
0.0015         14.2857    400    0.3863            19.0347
0.0014         15.1786    425    0.3866            19.1144
0.0012         16.0714    450    0.3869            17.3304
0.0012         16.9643    475    0.3878            17.3304
0.0010         17.8571    500    0.3882            17.3304
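
WER here is the word error rate expressed as a percentage. A minimal sketch of computing such a score with the Hugging Face evaluate library (evaluate is not listed under the framework versions, so using it here is an assumption; the sentences are hypothetical):

```python
import evaluate

# WER in this card is reported as a percentage, hence the factor of 100.
wer_metric = evaluate.load("wer")
predictions = ["the candidate shared their screen"]  # hypothetical output
references = ["the candidate shared her screen"]     # hypothetical target
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```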

Framework versions

  • PEFT 0.14.1.dev0
  • Transformers 4.49.0.dev0
  • Pytorch 2.2.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0