
openai/whisper-small

This model is a fine-tuned version of openai/whisper-small on the Hanhpt23/GermanMed-full dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6502
  • WER: 22.8325
  • CER: 15.2331
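WER (word error rate) and CER (character error rate) are reported as percentages: the edit distance between the model's transcription and the reference, divided by the reference length in words or characters. A minimal self-contained sketch of how these metrics are computed (the example strings are hypothetical; the card's numbers come from the training run's own evaluation):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences via dynamic programming."""
    m, n = len(ref), len(hyp)
    dp = list(range(n + 1))  # dp[j] = distance for ref[:i], hyp[:j]
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(
                dp[j] + 1,                          # deletion
                dp[j - 1] + 1,                      # insertion
                prev + (ref[i - 1] != hyp[j - 1]),  # substitution / match
            )
            prev = cur
    return dp[n]


def wer(reference: str, hypothesis: str) -> float:
    """Word error rate in percent, computed over whitespace-split tokens."""
    ref, hyp = reference.split(), hypothesis.split()
    return 100.0 * edit_distance(ref, hyp) / len(ref)


def cer(reference: str, hypothesis: str) -> float:
    """Character error rate in percent, computed over individual characters."""
    return 100.0 * edit_distance(list(reference), list(hypothesis)) / len(reference)


# Hypothetical German example: one inserted word out of four reference words
print(wer("der patient hat fieber", "der patient hat hohes fieber"))  # 25.0
```

In practice libraries such as `evaluate` or `jiwer` are typically used for these metrics; the sketch above only illustrates the definition.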

Model description

More information needed

Intended uses & limitations

More information needed
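As a fine-tuned Whisper checkpoint, the model can be loaded through the standard transformers automatic-speech-recognition pipeline. A minimal usage sketch (the audio file path is a placeholder, and downloading the checkpoint requires network access):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hugging Face Hub
asr = pipeline(
    "automatic-speech-recognition",
    model="Hanhpt23/whisper-small-germanmed-free_ED0-8",
)

# Transcribe a local audio file (placeholder path)
result = asr("german_medical_sample.wav")
print(result["text"])
</imports>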

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 20
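The hyperparameters above map directly onto the transformers `Seq2SeqTrainingArguments` API. A configuration sketch, assuming the usual `Seq2SeqTrainer` setup for Whisper fine-tuning (`output_dir` is a placeholder; the Adam settings listed above are the Trainer's defaults):

```python
from transformers import Seq2SeqTrainingArguments

# Mirrors the hyperparameter list above; output_dir is a placeholder
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-germanmed",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",  # linear decay after warmup
    warmup_steps=100,
    num_train_epochs=20,
    # Optimizer: Adam with betas=(0.9, 0.999), eps=1e-8 (Trainer defaults)
)
```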

Training results

Training Loss   Epoch   Step   Validation Loss   WER       CER
0.4898          1.0     194    0.4951            31.3586   21.1891
0.2315          2.0     388    0.4999            33.0968   22.6097
0.1009          3.0     582    0.5097            29.4456   19.8430
0.0558          4.0     776    0.5487            27.8412   19.0253
0.0379          5.0     970    0.5783            28.7566   19.9297
0.0232          6.0     1164   0.5849            24.9203   16.3505
0.0136          7.0     1358   0.5879            27.7486   19.3562
0.0143          8.0     1552   0.6237            25.2700   17.6567
0.0036          9.0     1746   0.6323            26.4939   18.1002
0.0035          10.0    1940   0.6338            26.5762   18.5022
0.0045          11.0    2134   0.6337            23.3364   15.9694
0.0041          12.0    2328   0.6363            23.6655   16.0144
0.0004          13.0    2522   0.6397            22.1125   14.6475
0.0002          14.0    2716   0.6415            22.7502   15.1846
0.0002          15.0    2910   0.6443            22.7193   15.1742
0.0002          16.0    3104   0.6461            22.7296   15.1915
0.0001          17.0    3298   0.6477            22.6885   15.1777
0.0002          18.0    3492   0.6490            22.6782   15.1586
0.0001          19.0    3686   0.6499            22.7296   15.1673
0.0001          20.0    3880   0.6502            22.8325   15.2331

Framework versions

  • Transformers 4.41.1
  • PyTorch 2.3.0
  • Datasets 2.19.1
  • Tokenizers 0.19.1

Model tree for Hanhpt23/whisper-small-germanmed-free_ED0-8

Fine-tuned from openai/whisper-small