---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_4_with_init_sun__0065
  results: []
---

# whisper_4_with_init_sun__0065

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.4037
- Train Accuracy: 0.0324
- Train Wermet: 0.0912
- Validation Loss: 1.1798
- Validation Accuracy: 0.0206
- Validation Wermet: 0.3272
- Epoch: 64

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32

### Training results

| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 5.3333     | 0.0111         | 1.3132       | 3.9675          | 0.0114              | 0.9339            | 0     |
| 4.7131     | 0.0116         | 0.8607       | 3.9360          | 0.0114              | 0.9503            | 1     |
| 4.6717     | 0.0117         | 0.8449       | 3.9196          | 0.0113              | 0.9768            | 2     |
| 4.6474     | 0.0117         | 0.8338       | 3.9039          | 0.0114              | 0.9557            | 3     |
| 4.6273     | 0.0118         | 0.8243       | 3.8721          | 0.0115              | 0.9414            | 4     |
| 4.6101     | 0.0118         | 0.8167       | 3.8629          | 0.0116              | 0.9156            | 5     |
| 4.5912     | 0.0119         | 0.7985       | 3.8361          | 0.0116              | 0.8988            | 6     |
| 4.5645     | 0.0120         | 0.7753       | 3.8298          | 0.0116              | 0.9045            | 7     |
| 4.5386     | 0.0121         | 0.7558       | 3.7904          | 0.0118              | 0.8426            | 8     |
| 4.5075     | 0.0122         | 0.7405       | 3.7472          | 0.0119              | 0.8103            | 9     |
| 4.4586     | 0.0124         | 0.7255       | 3.7163          | 0.0120              | 0.8189            | 10    |
| 4.3978     | 0.0126         | 0.7174       | 3.6168          | 0.0122              | 0.8163            | 11    |
| 4.3031     | 0.0128         | 0.7107       | 3.4956          | 0.0125              | 0.7847            | 12    |
| 4.1606     | 0.0133         | 0.7025       | 3.3414          | 0.0128              | 0.7897            | 13    |
| 3.9636     | 0.0138         | 0.6991       | 3.1311          | 0.0133              | 0.7495            | 14    |
| 3.7290     | 0.0145         | 0.6827       | 2.8892          | 0.0139              | 0.7292            | 15    |
| 3.4993     | 0.0152         | 0.6643       | 2.7195          | 0.0143              | 0.7129            | 16    |
| 3.2810     | 0.0159         | 0.6448       | 2.5418          | 0.0148              | 0.6803            | 17    |
| 3.0604     | 0.0167         | 0.6182       | 2.3572          | 0.0153              | 0.6538            | 18    |
| 2.8748     | 0.0174         | 0.5946       | 2.2575          | 0.0156              | 0.6337            | 19    |
| 2.6889     | 0.0181         | 0.5699       | 2.0988          | 0.0162              | 0.6016            | 20    |
| 2.5493     | 0.0187         | 0.5449       | 1.9878          | 0.0166              | 0.5834            | 21    |
| 2.3921     | 0.0194         | 0.5207       | 1.9029          | 0.0168              | 0.5597            | 22    |
| 2.2491     | 0.0201         | 0.4987       | 1.8642          | 0.0169              | 0.5409            | 23    |
| 2.1254     | 0.0207         | 0.4766       | 1.7354          | 0.0175              | 0.5231            | 24    |
| 1.9980     | 0.0213         | 0.4552       | 1.6661          | 0.0178              | 0.5049            | 25    |
| 1.9147     | 0.0217         | 0.4382       | 1.6140          | 0.0180              | 0.4921            | 26    |
| 1.8008     | 0.0223         | 0.4196       | 1.5652          | 0.0182              | 0.4742            | 27    |
| 1.7185     | 0.0228         | 0.4028       | 1.5159          | 0.0184              | 0.4632            | 28    |
| 1.6401     | 0.0232         | 0.3867       | 1.4891          | 0.0185              | 0.4548            | 29    |
| 1.5786     | 0.0235         | 0.3728       | 1.5141          | 0.0183              | 0.4548            | 30    |
| 1.4950     | 0.0241         | 0.3582       | 1.4345          | 0.0188              | 0.4340            | 31    |
| 1.4323     | 0.0244         | 0.3448       | 1.3694          | 0.0191              | 0.4226            | 32    |
| 1.3495     | 0.0250         | 0.3319       | 1.3780          | 0.0190              | 0.4172            | 33    |
| 1.3007     | 0.0253         | 0.3187       | 1.3296          | 0.0193              | 0.4109            | 34    |
| 1.2320     | 0.0257         | 0.3074       | 1.3116          | 0.0194              | 0.4029            | 35    |
| 1.1836     | 0.0261         | 0.2958       | 1.3025          | 0.0195              | 0.3992            | 36    |
| 1.1131     | 0.0266         | 0.2842       | 1.2885          | 0.0195              | 0.3894            | 37    |
| 1.0630     | 0.0269         | 0.2730       | 1.2627          | 0.0197              | 0.3850            | 38    |
| 1.0189     | 0.0272         | 0.2628       | 1.2633          | 0.0197              | 0.3822            | 39    |
| 1.0025     | 0.0273         | 0.2550       | 1.2561          | 0.0197              | 0.3760            | 40    |
| 0.9498     | 0.0277         | 0.2445       | 1.2288          | 0.0199              | 0.3710            | 41    |
| 0.9027     | 0.0281         | 0.2337       | 1.2188          | 0.0199              | 0.3684            | 42    |
| 0.8469     | 0.0286         | 0.2240       | 1.2072          | 0.0200              | 0.3637            | 43    |
| 0.8056     | 0.0289         | 0.2153       | 1.2046          | 0.0201              | 0.3599            | 44    |
| 0.7761     | 0.0291         | 0.2070       | 1.1989          | 0.0201              | 0.3579            | 45    |
| 0.7369     | 0.0295         | 0.1982       | 1.1938          | 0.0202              | 0.3528            | 46    |
| 0.7026     | 0.0298         | 0.1902       | 1.1934          | 0.0202              | 0.3508            | 47    |
| 0.6976     | 0.0298         | 0.1834       | 1.1803          | 0.0203              | 0.3469            | 48    |
| 0.6880     | 0.0298         | 0.1765       | 1.1844          | 0.0203              | 0.3470            | 49    |
| 0.6674     | 0.0300         | 0.1702       | 1.1741          | 0.0203              | 0.3446            | 50    |
| 0.6099     | 0.0305         | 0.1606       | 1.1753          | 0.0203              | 0.3440            | 51    |
| 0.5972     | 0.0306         | 0.1549       | 1.1692          | 0.0204              | 0.3401            | 52    |
| 0.5555     | 0.0310         | 0.1475       | 1.1744          | 0.0204              | 0.3382            | 53    |
| 0.5275     | 0.0313         | 0.1412       | 1.1743          | 0.0204              | 0.3384            | 54    |
| 0.5103     | 0.0315         | 0.1344       | 1.1720          | 0.0205              | 0.3355            | 55    |
| 0.5268     | 0.0313         | 0.1308       | 1.1709          | 0.0205              | 0.3343            | 56    |
| 0.5060     | 0.0315         | 0.1251       | 1.2090          | 0.0203              | 0.3318            | 57    |
| 0.4696     | 0.0318         | 0.1172       | 1.1748          | 0.0205              | 0.3321            | 58    |
| 0.4737     | 0.0318         | 0.1136       | 1.1764          | 0.0205              | 0.3313            | 59    |
| 0.4749     | 0.0318         | 0.1115       | 1.1684          | 0.0206              | 0.3289            | 60    |
| 0.4208     | 0.0323         | 0.1015       | 1.1704          | 0.0206              | 0.3275            | 61    |
| 0.3895     | 0.0326         | 0.0958       | 1.1777          | 0.0206              | 0.3286            | 62    |
| 0.3721     | 0.0328         | 0.0909       | 1.1754          | 0.0206              | 0.3267            | 63    |
| 0.4037     | 0.0324         | 0.0912       | 1.1798          | 0.0206              | 0.3272            | 64    |

### Framework versions

- Transformers 4.34.0.dev0
- TensorFlow 2.13.0
- Tokenizers 0.13.3
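
For reference, the optimizer dictionary listed under Training hyperparameters corresponds to the TensorFlow `AdamWeightDecay` class shipped with Transformers. The snippet below is only a sketch of how such an optimizer could be recreated, not the original training script; compiling against the base `openai/whisper-tiny` checkpoint here is an assumption for illustration.

```python
from transformers import AdamWeightDecay, TFWhisperForConditionalGeneration

# Recreate the optimizer from the hyperparameters listed above.
optimizer = AdamWeightDecay(
    learning_rate=1e-05,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)

# Training used float32 (no mixed-precision policy), so the default Keras precision applies.
# Loading the base model here is illustrative; the original run started from openai/whisper-tiny.
model = TFWhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny")

# Transformers TF models compute their internal loss when no loss is passed to compile().
model.compile(optimizer=optimizer)
```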
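A minimal TensorFlow inference sketch is shown below. The repository id `your-username/whisper_4_with_init_sun__0065` is a placeholder (the actual Hub path is not stated in this card), and the processor is loaded from the base `openai/whisper-tiny` model on the assumption that processor files were not pushed with the fine-tuned weights.

```python
import numpy as np
from transformers import TFWhisperForConditionalGeneration, WhisperProcessor

model_id = "your-username/whisper_4_with_init_sun__0065"  # placeholder repo id

# Processor taken from the base model (assumption: the fine-tuned repo has no processor files).
processor = WhisperProcessor.from_pretrained("openai/whisper-tiny")
model = TFWhisperForConditionalGeneration.from_pretrained(model_id)

# `audio` should be a 1-D float array sampled at 16 kHz; a silent clip stands in here
# so the snippet runs end to end.
audio = np.zeros(16000, dtype=np.float32)
inputs = processor(audio, sampling_rate=16000, return_tensors="tf")

generated_ids = model.generate(inputs.input_features)
transcription = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(transcription)
```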