---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_syl_cv12_pad_lob100_low__0065
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_syl_cv12_pad_lob100_low__0065
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0042
- Train Accuracy: 0.0362
- Train Wermet: 0.0019
- Validation Loss: 0.6914
- Validation Accuracy: 0.0233
- Validation Wermet: 0.2424
- Epoch: 64
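
The checkpoint can be loaded like any other TensorFlow Whisper model. A minimal sketch follows; the Hub repository ID below is a placeholder and the audio input (a 16 kHz mono float array) is assumed, so adapt both to your setup:

```python
import numpy as np
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration

# Placeholder repository ID; replace with the actual Hub path of this checkpoint.
model_id = "your-username/whisper_syl_cv12_pad_lob100_low__0065"

# If the processor files are not stored in this repo, load them from the base
# model ("openai/whisper-tiny") instead.
processor = WhisperProcessor.from_pretrained(model_id)
model = TFWhisperForConditionalGeneration.from_pretrained(model_id)

# `audio` is assumed to be a 1-D float array of 16 kHz mono speech;
# one second of silence is used here as a stand-in.
audio = np.zeros(16000, dtype=np.float32)
inputs = processor(audio, sampling_rate=16000, return_tensors="tf")

generated_ids = model.generate(inputs.input_features)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```
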
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training; a small reproduction sketch follows the list:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
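
A minimal sketch of how this optimizer configuration could be reconstructed with the `AdamWeightDecay` class shipped in Transformers; the surrounding training loop (data pipeline, loss, callbacks) is not part of this card and is assumed:

```python
from transformers import AdamWeightDecay

# Mirrors the hyperparameters listed above. The 'decay' entry in the logged
# config is the legacy Keras learning-rate decay and is left at its 0.0 default.
optimizer = AdamWeightDecay(
    learning_rate=1e-05,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)

# The model would then be compiled and fit as usual, e.g.:
# model.compile(optimizer=optimizer)
# model.fit(train_dataset, validation_data=eval_dataset, epochs=..., callbacks=[...])
```
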
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 5.2930 | 0.0113 | 2.0658 | 3.9415 | 0.0117 | 0.9401 | 0 |
| 4.6215 | 0.0121 | 0.8917 | 3.7803 | 0.0120 | 0.9294 | 1 |
| 4.4086 | 0.0128 | 0.8403 | 3.6070 | 0.0124 | 0.9223 | 2 |
| 4.1842 | 0.0135 | 0.8337 | 3.4291 | 0.0128 | 0.8867 | 3 |
| 3.9981 | 0.0141 | 0.8182 | 3.3251 | 0.0131 | 0.8750 | 4 |
| 3.8531 | 0.0145 | 0.8058 | 3.2385 | 0.0133 | 0.8699 | 5 |
| 3.7345 | 0.0149 | 0.7925 | 3.1751 | 0.0134 | 0.8665 | 6 |
| 3.6307 | 0.0152 | 0.7851 | 3.1031 | 0.0136 | 0.8507 | 7 |
| 3.5437 | 0.0155 | 0.7717 | 3.0752 | 0.0138 | 0.8286 | 8 |
| 3.4649 | 0.0157 | 0.7651 | 3.0334 | 0.0139 | 0.8417 | 9 |
| 3.3926 | 0.0159 | 0.7531 | 3.0022 | 0.0139 | 0.8413 | 10 |
| 3.3262 | 0.0162 | 0.7462 | 2.9669 | 0.0140 | 0.8264 | 11 |
| 3.2625 | 0.0164 | 0.7367 | 2.9342 | 0.0141 | 0.8520 | 12 |
| 3.1979 | 0.0166 | 0.7231 | 2.9046 | 0.0144 | 0.8196 | 13 |
| 3.1319 | 0.0169 | 0.7133 | 2.8607 | 0.0145 | 0.8026 | 14 |
| 3.0616 | 0.0172 | 0.7007 | 2.8165 | 0.0146 | 0.7788 | 15 |
| 2.9792 | 0.0176 | 0.6816 | 2.7552 | 0.0149 | 0.7643 | 16 |
| 2.8905 | 0.0180 | 0.6641 | 2.6788 | 0.0151 | 0.7473 | 17 |
| 2.7749 | 0.0186 | 0.6424 | 2.5824 | 0.0155 | 0.7241 | 18 |
| 2.6263 | 0.0193 | 0.6159 | 2.4206 | 0.0161 | 0.7047 | 19 |
| 2.4352 | 0.0203 | 0.5829 | 2.2230 | 0.0168 | 0.6500 | 20 |
| 2.1941 | 0.0216 | 0.5411 | 2.0349 | 0.0175 | 0.5980 | 21 |
| 1.9184 | 0.0231 | 0.4922 | 1.7850 | 0.0184 | 0.5659 | 22 |
| 1.6174 | 0.0249 | 0.4371 | 1.5664 | 0.0192 | 0.5081 | 23 |
| 1.3542 | 0.0265 | 0.3851 | 1.3992 | 0.0199 | 0.4690 | 24 |
| 1.1499 | 0.0278 | 0.3408 | 1.2512 | 0.0205 | 0.4299 | 25 |
| 0.9878 | 0.0288 | 0.3029 | 1.1479 | 0.0209 | 0.4013 | 26 |
| 0.8600 | 0.0297 | 0.2735 | 1.0527 | 0.0213 | 0.3755 | 27 |
| 0.7516 | 0.0305 | 0.2441 | 0.9803 | 0.0216 | 0.3570 | 28 |
| 0.6626 | 0.0311 | 0.2197 | 0.9314 | 0.0219 | 0.3416 | 29 |
| 0.5863 | 0.0316 | 0.1993 | 0.8730 | 0.0221 | 0.3238 | 30 |
| 0.5187 | 0.0321 | 0.1775 | 0.8357 | 0.0223 | 0.3136 | 31 |
| 0.4608 | 0.0326 | 0.1610 | 0.8059 | 0.0224 | 0.3033 | 32 |
| 0.4087 | 0.0330 | 0.1467 | 0.7746 | 0.0226 | 0.2949 | 33 |
| 0.3642 | 0.0334 | 0.1298 | 0.7476 | 0.0227 | 0.2847 | 34 |
| 0.3221 | 0.0337 | 0.1168 | 0.7330 | 0.0228 | 0.2802 | 35 |
| 0.2837 | 0.0340 | 0.1030 | 0.7093 | 0.0229 | 0.2728 | 36 |
| 0.2509 | 0.0343 | 0.0882 | 0.6941 | 0.0229 | 0.2687 | 37 |
| 0.2209 | 0.0346 | 0.0747 | 0.6892 | 0.0230 | 0.2656 | 38 |
| 0.1934 | 0.0349 | 0.0670 | 0.6824 | 0.0230 | 0.2630 | 39 |
| 0.1688 | 0.0351 | 0.0542 | 0.6773 | 0.0230 | 0.2625 | 40 |
| 0.1469 | 0.0353 | 0.0429 | 0.6700 | 0.0231 | 0.2633 | 41 |
| 0.1268 | 0.0355 | 0.0365 | 0.6680 | 0.0231 | 0.2578 | 42 |
| 0.1086 | 0.0357 | 0.0284 | 0.6643 | 0.0231 | 0.2540 | 43 |
| 0.0920 | 0.0358 | 0.0221 | 0.6645 | 0.0231 | 0.2530 | 44 |
| 0.0783 | 0.0359 | 0.0169 | 0.6621 | 0.0232 | 0.2540 | 45 |
| 0.0667 | 0.0360 | 0.0121 | 0.6714 | 0.0232 | 0.2532 | 46 |
| 0.0563 | 0.0361 | 0.0094 | 0.6604 | 0.0232 | 0.2503 | 47 |
| 0.0477 | 0.0361 | 0.0072 | 0.6620 | 0.0232 | 0.2489 | 48 |
| 0.0397 | 0.0362 | 0.0055 | 0.6611 | 0.0232 | 0.2502 | 49 |
| 0.0330 | 0.0362 | 0.0045 | 0.6686 | 0.0232 | 0.2496 | 50 |
| 0.0283 | 0.0362 | 0.0033 | 0.6705 | 0.0232 | 0.2503 | 51 |
| 0.0242 | 0.0362 | 0.0034 | 0.6686 | 0.0232 | 0.2486 | 52 |
| 0.0212 | 0.0362 | 0.0031 | 0.6686 | 0.0232 | 0.2493 | 53 |
| 0.0197 | 0.0362 | 0.0028 | 0.6688 | 0.0232 | 0.2530 | 54 |
| 0.0226 | 0.0362 | 0.0041 | 0.6598 | 0.0233 | 0.2451 | 55 |
| 0.0158 | 0.0362 | 0.0024 | 0.6605 | 0.0233 | 0.2428 | 56 |
| 0.0115 | 0.0362 | 0.0018 | 0.6648 | 0.0233 | 0.2435 | 57 |
| 0.0094 | 0.0362 | 0.0017 | 0.6672 | 0.0233 | 0.2446 | 58 |
| 0.0081 | 0.0362 | 0.0018 | 0.6731 | 0.0233 | 0.2439 | 59 |
| 0.0071 | 0.0362 | 0.0017 | 0.6762 | 0.0233 | 0.2429 | 60 |
| 0.0062 | 0.0362 | 0.0017 | 0.6794 | 0.0233 | 0.2426 | 61 |
| 0.0055 | 0.0362 | 0.0017 | 0.6825 | 0.0233 | 0.2429 | 62 |
| 0.0048 | 0.0362 | 0.0017 | 0.6895 | 0.0233 | 0.2450 | 63 |
| 0.0042 | 0.0362 | 0.0019 | 0.6914 | 0.0233 | 0.2424 | 64 |
### Framework versions
- Transformers 4.33.0.dev0
- TensorFlow 2.13.0
- Tokenizers 0.13.3
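
To approximate this environment, the runtime can be checked against the versions listed above; `4.33.0.dev0` is a development build, so a source install of Transformers (or the nearest 4.33.x release) is assumed:

```python
# Quick sanity check that the runtime matches the versions listed above.
import tensorflow
import tokenizers
import transformers

print(transformers.__version__)  # expected: 4.33.0.dev0 (or a nearby 4.33.x release)
print(tensorflow.__version__)    # expected: 2.13.0
print(tokenizers.__version__)    # expected: 0.13.3
```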