
Language: Hebrew (he)

This model is a fine-tuned version of openai/whisper-medium on an unknown dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the metric list):

  • Loss: 0.1756
  • Wer: 20.1811
  • Avg Precision Exact: 0.8083
  • Avg Recall Exact: 0.8102
  • Avg F1 Exact: 0.8087
  • Avg Precision Letter Shift: 0.8373
  • Avg Recall Letter Shift: 0.8394
  • Avg F1 Letter Shift: 0.8377
  • Avg Precision Word Level: 0.8427
  • Avg Recall Word Level: 0.8450
  • Avg F1 Word Level: 0.8432
  • Avg Precision Word Shift: 0.9448
  • Avg Recall Word Shift: 0.9489
  • Avg F1 Word Shift: 0.9460
  • Precision Median Exact: 0.9091
  • Recall Median Exact: 0.9091
  • F1 Median Exact: 0.9091
  • Precision Max Exact: 1.0
  • Recall Max Exact: 1.0
  • F1 Max Exact: 1.0
  • Precision Min Exact: 0.0
  • Recall Min Exact: 0.0
  • F1 Min Exact: 0.0
  • Precision Min Letter Shift: 0.0
  • Recall Min Letter Shift: 0.0
  • F1 Min Letter Shift: 0.0
  • Precision Min Word Level: 0.0
  • Recall Min Word Level: 0.0
  • F1 Min Word Level: 0.0
  • Precision Min Word Shift: 0.1429
  • Recall Min Word Shift: 0.1
  • F1 Min Word Shift: 0.1176
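
The card does not yet include a usage example. The following is a minimal inference sketch, assuming the standard transformers automatic-speech-recognition pipeline; the checkpoint path and audio file name are placeholders, not values taken from this card.

```python
# Minimal inference sketch (assumption: standard transformers ASR pipeline).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="path/to/this-checkpoint",  # placeholder, not the actual repo id
)

# Transcribe a local audio file; the file name is illustrative.
result = asr("sample.wav")
print(result["text"])
```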

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch in code follows this list):

  • learning_rate: 1e-05
  • train_batch_size: 8
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • training_steps: 40000
  • mixed_precision_training: Native AMP
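
As a hedged sketch only, the list above maps onto a Hugging Face Seq2SeqTrainingArguments configuration roughly as follows; the output directory is a placeholder, and the evaluation interval of 2000 steps is inferred from the results table below, not stated in the card.

```python
# Approximate reconstruction of the training configuration (not the authors' script).
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-medium-finetuned",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    max_steps=40000,
    fp16=True,                        # Native AMP mixed precision
    evaluation_strategy="steps",
    eval_steps=2000,                  # inferred from the eval cadence in the results table
)
```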

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer | Avg Precision Exact | Avg Recall Exact | Avg F1 Exact | Avg Precision Letter Shift | Avg Recall Letter Shift | Avg F1 Letter Shift | Avg Precision Word Level | Avg Recall Word Level | Avg F1 Word Level | Avg Precision Word Shift | Avg Recall Word Shift | Avg F1 Word Shift | Precision Median Exact | Recall Median Exact | F1 Median Exact | Precision Max Exact | Recall Max Exact | F1 Max Exact | Precision Min Exact | Recall Min Exact | F1 Min Exact | Precision Min Letter Shift | Recall Min Letter Shift | F1 Min Letter Shift | Precision Min Word Level | Recall Min Word Level | F1 Min Word Level | Precision Min Word Shift | Recall Min Word Shift | F1 Min Word Shift |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.2873 | 0.16 | 2000 | 0.3088 | 44.3311 | 0.5598 | 0.5685 | 0.5633 | 0.6026 | 0.6118 | 0.6062 | 0.6138 | 0.6239 | 0.6178 | 0.8019 | 0.8196 | 0.8092 | 0.6154 | 0.625 | 0.6207 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1752 | 0.32 | 4000 | 0.2328 | 35.1811 | 0.6557 | 0.6595 | 0.6568 | 0.6946 | 0.6985 | 0.6957 | 0.7041 | 0.7082 | 0.7053 | 0.8676 | 0.8745 | 0.8698 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.117 | 0.48 | 6000 | 0.1997 | 29.4605 | 0.7124 | 0.7125 | 0.7117 | 0.7514 | 0.7513 | 0.7506 | 0.7604 | 0.7606 | 0.7597 | 0.9031 | 0.9063 | 0.9037 | 0.8182 | 0.8182 | 0.8148 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1111 | 0.1 | 0.1053 |
| 0.0994 | 0.64 | 8000 | 0.1881 | 27.5610 | 0.7359 | 0.7407 | 0.7376 | 0.7708 | 0.7758 | 0.7726 | 0.7783 | 0.7837 | 0.7803 | 0.9117 | 0.9191 | 0.9144 | 0.8333 | 0.8462 | 0.8387 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.0909 | 0.1111 |
| 0.0664 | 0.8 | 10000 | 0.1837 | 25.9682 | 0.7446 | 0.7529 | 0.7480 | 0.7785 | 0.7873 | 0.7821 | 0.7857 | 0.7944 | 0.7893 | 0.9194 | 0.9277 | 0.9226 | 0.8462 | 0.8571 | 0.8571 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1111 | 0.0909 | 0.1111 |
| 0.0767 | 0.96 | 12000 | 0.1760 | 24.6157 | 0.7561 | 0.7662 | 0.7604 | 0.7868 | 0.7973 | 0.7913 | 0.7936 | 0.8040 | 0.7980 | 0.9194 | 0.9315 | 0.9245 | 0.8667 | 0.875 | 0.8723 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0593 | 1.12 | 14000 | 0.1716 | 23.6511 | 0.7669 | 0.7732 | 0.7694 | 0.7988 | 0.8054 | 0.8014 | 0.8047 | 0.8114 | 0.8073 | 0.9275 | 0.9343 | 0.9300 | 0.875 | 0.8889 | 0.8800 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1111 | 0.1111 | 0.1111 |
| 0.0525 | 1.28 | 16000 | 0.1712 | 23.1264 | 0.7778 | 0.7788 | 0.7777 | 0.8092 | 0.8104 | 0.8092 | 0.8156 | 0.8169 | 0.8156 | 0.9345 | 0.9383 | 0.9356 | 0.8889 | 0.8889 | 0.8889 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1111 | 0.1111 | 0.125 |
| 0.0358 | 1.44 | 18000 | 0.1699 | 22.4538 | 0.7841 | 0.7845 | 0.7837 | 0.8150 | 0.8157 | 0.8147 | 0.8212 | 0.8222 | 0.8211 | 0.9344 | 0.9376 | 0.9351 | 0.8889 | 0.8889 | 0.8889 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1 | 0.1 | 0.1053 |
| 0.0323 | 1.6 | 20000 | 0.1713 | 22.2173 | 0.7873 | 0.7926 | 0.7893 | 0.8170 | 0.8224 | 0.8190 | 0.8230 | 0.8286 | 0.8251 | 0.9362 | 0.9424 | 0.9385 | 0.9 | 0.9 | 0.9 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1111 | 0.1 | 0.1111 |
| 0.0248 | 1.76 | 22000 | 0.1683 | 21.7480 | 0.7934 | 0.7945 | 0.7933 | 0.8235 | 0.8248 | 0.8235 | 0.8294 | 0.8310 | 0.8295 | 0.9407 | 0.9433 | 0.9412 | 0.9 | 0.9 | 0.9 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |
| 0.0209 | 1.92 | 24000 | 0.1696 | 21.0310 | 0.7982 | 0.8000 | 0.7986 | 0.8275 | 0.8292 | 0.8278 | 0.8331 | 0.8351 | 0.8335 | 0.9424 | 0.9461 | 0.9435 | 0.9 | 0.9 | 0.9 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |
| 0.0202 | 2.08 | 26000 | 0.1713 | 21.1936 | 0.7954 | 0.7988 | 0.7965 | 0.8250 | 0.8285 | 0.8262 | 0.8309 | 0.8346 | 0.8321 | 0.9404 | 0.9460 | 0.9424 | 0.9 | 0.9091 | 0.9 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0769 | 0.0909 | 0.0833 |
| 0.0172 | 2.24 | 28000 | 0.1716 | 20.7761 | 0.8013 | 0.8053 | 0.8027 | 0.8304 | 0.8346 | 0.8319 | 0.8359 | 0.8407 | 0.8376 | 0.9404 | 0.9469 | 0.9428 | 0.9091 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2 | 0.1818 | 0.2000 |
| 0.0161 | 2.4 | 30000 | 0.1740 | 20.6135 | 0.8052 | 0.8079 | 0.8059 | 0.8351 | 0.8380 | 0.8359 | 0.8408 | 0.8440 | 0.8417 | 0.9439 | 0.9494 | 0.9459 | 0.9091 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.125 | 0.1 | 0.1176 |
| 0.01 | 2.56 | 32000 | 0.1743 | 20.6948 | 0.8031 | 0.8048 | 0.8033 | 0.8322 | 0.8339 | 0.8323 | 0.8380 | 0.8399 | 0.8382 | 0.9441 | 0.9480 | 0.9452 | 0.9091 | 0.9091 | 0.9062 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |
| 0.025 | 2.72 | 34000 | 0.1753 | 20.6282 | 0.8033 | 0.8072 | 0.8046 | 0.8327 | 0.8368 | 0.8341 | 0.8383 | 0.8430 | 0.8400 | 0.9419 | 0.9489 | 0.9446 | 0.9091 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |
| 0.0082 | 2.88 | 36000 | 0.1756 | 20.3991 | 0.8060 | 0.8081 | 0.8064 | 0.8354 | 0.8378 | 0.8359 | 0.8406 | 0.8433 | 0.8412 | 0.9436 | 0.9484 | 0.9452 | 0.9091 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |
| 0.013 | 3.04 | 38000 | 0.1754 | 20.3030 | 0.8078 | 0.8097 | 0.8082 | 0.8374 | 0.8395 | 0.8378 | 0.8427 | 0.8452 | 0.8433 | 0.9447 | 0.9488 | 0.9459 | 0.9091 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |
| 0.0183 | 3.2 | 40000 | 0.1756 | 20.1811 | 0.8083 | 0.8102 | 0.8087 | 0.8373 | 0.8394 | 0.8377 | 0.8427 | 0.8450 | 0.8432 | 0.9448 | 0.9489 | 0.9460 | 0.9091 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |

Framework versions

  • Transformers 4.39.0.dev0
  • Pytorch 2.2.1+cu121
  • Datasets 2.16.1
  • Tokenizers 0.15.0