# he-cantillation
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.2139
- Wer: 17.8049
- Avg Precision Exact: 0.8322
- Avg Recall Exact: 0.8338
- Avg F1 Exact: 0.8325
- Avg Precision Letter Shift: 0.8586
- Avg Recall Letter Shift: 0.8603
- Avg F1 Letter Shift: 0.8589
- Avg Precision Word Level: 0.8646
- Avg Recall Word Level: 0.8661
- Avg F1 Word Level: 0.8648
- Avg Precision Word Shift: 0.9560
- Avg Recall Word Shift: 0.9594
- Avg F1 Word Shift: 0.9570
- Precision Median Exact: 0.9231
- Recall Median Exact: 0.9231
- F1 Median Exact: 0.9231
- Precision Max Exact: 1.0
- Recall Max Exact: 1.0
- F1 Max Exact: 1.0
- Precision Min Exact: 0.0
- Recall Min Exact: 0.0
- F1 Min Exact: 0.0
- Precision Min Letter Shift: 0.0
- Recall Min Letter Shift: 0.0
- F1 Min Letter Shift: 0.0
- Precision Min Word Level: 0.0
- Recall Min Word Level: 0.0
- F1 Min Word Level: 0.0
- Precision Min Word Shift: 0.1111
- Recall Min Word Shift: 0.1111
- F1 Min Word Shift: 0.125
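For quick inference, here is a minimal sketch using the 🤗 Transformers ASR pipeline. The repo id and audio path below are placeholders, not values from this card; substitute the model's actual Hub id:

```python
# Minimal inference sketch (assumed usage; the repo id below is a
# placeholder, replace it with this model's actual Hub id).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/he-cantillation",  # hypothetical repo id
)

# "audio.wav" is a placeholder path to a Hebrew audio recording.
print(asr("audio.wav")["text"])
```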
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the `Seq2SeqTrainingArguments` sketch after this list):
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- training_steps: 100000
- mixed_precision_training: Native AMP
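For reference, these settings map onto roughly the following `Seq2SeqTrainingArguments`; this is a reconstruction from the list above, not the original training script:

```python
# Reconstruction of the hyperparameters listed above as Transformers
# Seq2SeqTrainingArguments (the output_dir is a hypothetical placeholder).
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./he-cantillation",  # hypothetical output path
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    max_steps=100_000,
    fp16=True,  # "Native AMP" mixed-precision training
)
```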
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer | Avg Precision Exact | Avg Recall Exact | Avg F1 Exact | Avg Precision Letter Shift | Avg Recall Letter Shift | Avg F1 Letter Shift | Avg Precision Word Level | Avg Recall Word Level | Avg F1 Word Level | Avg Precision Word Shift | Avg Recall Word Shift | Avg F1 Word Shift | Precision Median Exact | Recall Median Exact | F1 Median Exact | Precision Max Exact | Recall Max Exact | F1 Max Exact | Precision Min Exact | Recall Min Exact | F1 Min Exact | Precision Min Letter Shift | Recall Min Letter Shift | F1 Min Letter Shift | Precision Min Word Level | Recall Min Word Level | F1 Min Word Level | Precision Min Word Shift | Recall Min Word Shift | F1 Min Word Shift |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
No log | 0.0 | 1 | 9.8845 | 100.0 | 0.0005 | 0.0042 | 0.0008 | 0.0151 | 0.0160 | 0.0155 | 0.0037 | 0.0339 | 0.0066 | 0.0689 | 0.0732 | 0.0706 | 0.0 | 0.0 | 0.0 | 0.1429 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
0.0658 | 0.8 | 10000 | 0.1829 | 26.4745 | 0.7434 | 0.7474 | 0.7447 | 0.7770 | 0.7813 | 0.7784 | 0.7850 | 0.7890 | 0.7862 | 0.9183 | 0.9221 | 0.9192 | 0.8516 | 0.8571 | 0.8571 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.0909 | 0.0870 |
0.03 | 1.6 | 20000 | 0.1693 | 21.5854 | 0.7959 | 0.7942 | 0.7945 | 0.8256 | 0.8239 | 0.8241 | 0.8319 | 0.8300 | 0.8302 | 0.9415 | 0.9414 | 0.9406 | 0.9091 | 0.9 | 0.9 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1111 | 0.1111 | 0.125 |
0.0279 | 2.4 | 30000 | 0.1784 | 20.1072 | 0.8107 | 0.8146 | 0.8121 | 0.8398 | 0.8438 | 0.8412 | 0.8459 | 0.8503 | 0.8474 | 0.9442 | 0.9508 | 0.9467 | 0.9091 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
0.0175 | 3.2 | 40000 | 0.1883 | 19.3459 | 0.8145 | 0.8149 | 0.8141 | 0.8411 | 0.8415 | 0.8407 | 0.8465 | 0.8472 | 0.8463 | 0.9480 | 0.9503 | 0.9485 | 0.9167 | 0.9167 | 0.9167 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0909 | 0.1111 | 0.1000 |
0.0046 | 4.0 | 50000 | 0.1997 | 19.2535 | 0.8253 | 0.8230 | 0.8236 | 0.8527 | 0.8504 | 0.8510 | 0.8592 | 0.8567 | 0.8574 | 0.9523 | 0.9517 | 0.9513 | 0.9167 | 0.9167 | 0.9167 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
0.0072 | 4.8 | 60000 | 0.2034 | 18.3407 | 0.8247 | 0.8272 | 0.8254 | 0.8527 | 0.8553 | 0.8535 | 0.8595 | 0.8619 | 0.8602 | 0.9551 | 0.9586 | 0.9562 | 0.9167 | 0.9167 | 0.9167 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
0.0041 | 5.6 | 70000 | 0.2095 | 18.2262 | 0.8263 | 0.8263 | 0.8258 | 0.8534 | 0.8535 | 0.8529 | 0.8589 | 0.8591 | 0.8584 | 0.9554 | 0.9578 | 0.9560 | 0.9231 | 0.9167 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
0.0011 | 6.4 | 80000 | 0.2145 | 18.1744 | 0.8287 | 0.8292 | 0.8284 | 0.8550 | 0.8555 | 0.8547 | 0.8602 | 0.8606 | 0.8598 | 0.9545 | 0.9570 | 0.9551 | 0.9167 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0909 | 0.1111 | 0.1000 |
0.002 | 7.2 | 90000 | 0.2144 | 17.8640 | 0.8323 | 0.8332 | 0.8323 | 0.8587 | 0.8598 | 0.8587 | 0.8646 | 0.8655 | 0.8646 | 0.9559 | 0.9587 | 0.9566 | 0.9231 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0909 | 0.1111 | 0.1000 |
0.0018 | 8.0 | 100000 | 0.2139 | 17.8049 | 0.8322 | 0.8338 | 0.8325 | 0.8586 | 0.8603 | 0.8589 | 0.8646 | 0.8661 | 0.8648 | 0.9560 | 0.9594 | 0.9570 | 0.9231 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1111 | 0.1111 | 0.125 |
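The Wer column reports a percentage score. As a sketch, such a score is typically computed with the `evaluate` library (the predictions and references below are placeholders; the exact evaluation code for this model is not published here):

```python
# Sketch of a percentage WER computation using the `evaluate` library
# (assumed setup; predictions and references are placeholder examples).
import evaluate

wer_metric = evaluate.load("wer")
predictions = ["model transcription here"]     # placeholder model outputs
references = ["reference transcription here"]  # placeholder ground truth
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```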
### Framework versions
- Transformers 4.39.0.dev0
- PyTorch 2.2.1+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0