---
license: apache-2.0
base_model: bigmorning/whisper_charsplit_new_round2__0061
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round3__0046
  results: []
---
# whisper_charsplit_new_round3__0046
This model is a fine-tuned version of [bigmorning/whisper_charsplit_new_round2__0061](https://huggingface.co/bigmorning/whisper_charsplit_new_round2__0061) on an unknown dataset. It achieves the following results on the evaluation set (a minimal loading and inference sketch follows the results):
- Train Loss: 0.0000
- Train Accuracy: 0.0795
- Train Wermet: 7.7766
- Validation Loss: 0.5639
- Validation Accuracy: 0.0772
- Validation Wermet: 6.8982
- Epoch: 45
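The card itself does not document a loading or inference recipe. The sketch below is a minimal example, assuming the checkpoint follows the standard TensorFlow Whisper layout on the Hugging Face Hub; the model id comes from this card, the silent 16 kHz waveform is only a placeholder, and none of this is taken from the original training scripts.

```python
# Minimal sketch (not from the original scripts): load the checkpoint with the
# TensorFlow Whisper classes and transcribe a placeholder waveform.
import numpy as np
from transformers import TFWhisperForConditionalGeneration, WhisperProcessor

model_id = "bigmorning/whisper_charsplit_new_round3__0046"

model = TFWhisperForConditionalGeneration.from_pretrained(model_id)
# If processor files were not uploaded alongside these weights, load them from
# the base model (bigmorning/whisper_charsplit_new_round2__0061) instead.
processor = WhisperProcessor.from_pretrained(model_id)

# Placeholder input: one second of 16 kHz silence; replace with real audio.
waveform = np.zeros(16000, dtype=np.float32)
inputs = processor(waveform, sampling_rate=16000, return_tensors="tf")

generated_ids = model.generate(inputs.input_features)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```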
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a reconstruction sketch follows the list):
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
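The optimizer dictionary above matches the `AdamWeightDecay` implementation that ships with Transformers' TensorFlow utilities; the sketch below reconstructs it under that assumption (the original training script is not part of this card).

```python
# Sketch only: rebuild the optimizer described above with Transformers'
# TensorFlow AdamWeightDecay class; the actual training code is unknown.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=1e-05,     # fixed learning rate from the config above
    weight_decay_rate=0.01,  # decoupled weight decay
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)
# 'decay': 0.0 in the dump is the legacy Keras learning-rate decay, i.e. no
# schedule on top of the fixed rate; training_precision float32 means no
# mixed-precision policy needs to be set.
```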
### Training results
Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
---|---|---|---|---|---|---|
0.0009 | 0.0795 | 7.9492 | 0.5730 | 0.0769 | 7.2856 | 0 |
0.0015 | 0.0795 | 8.4221 | 0.5756 | 0.0769 | 7.1487 | 1 |
0.0012 | 0.0795 | 7.8476 | 0.5699 | 0.0769 | 6.5976 | 2 |
0.0010 | 0.0795 | 7.6843 | 0.5740 | 0.0769 | 6.9513 | 3 |
0.0014 | 0.0795 | 8.0796 | 0.5763 | 0.0768 | 7.4043 | 4 |
0.0019 | 0.0795 | 7.7274 | 0.5724 | 0.0769 | 6.4922 | 5 |
0.0008 | 0.0795 | 7.3468 | 0.5734 | 0.0769 | 6.1909 | 6 |
0.0009 | 0.0795 | 7.2393 | 0.5816 | 0.0769 | 6.5734 | 7 |
0.0010 | 0.0795 | 7.5822 | 0.5755 | 0.0769 | 6.6613 | 8 |
0.0004 | 0.0795 | 7.3807 | 0.5698 | 0.0770 | 7.0671 | 9 |
0.0001 | 0.0795 | 7.7157 | 0.5681 | 0.0771 | 6.8391 | 10 |
0.0001 | 0.0795 | 7.7540 | 0.5725 | 0.0771 | 6.9281 | 11 |
0.0001 | 0.0795 | 7.7721 | 0.5726 | 0.0771 | 6.8911 | 12 |
0.0000 | 0.0795 | 7.8163 | 0.5721 | 0.0771 | 6.8876 | 13 |
0.0000 | 0.0795 | 7.7745 | 0.5741 | 0.0771 | 6.8770 | 14 |
0.0000 | 0.0795 | 7.7277 | 0.5752 | 0.0771 | 6.8671 | 15 |
0.0000 | 0.0795 | 7.7355 | 0.5765 | 0.0771 | 6.8447 | 16 |
0.0000 | 0.0795 | 7.7109 | 0.5784 | 0.0771 | 6.8560 | 17 |
0.0000 | 0.0795 | 7.7427 | 0.5796 | 0.0771 | 6.8406 | 18 |
0.0003 | 0.0795 | 7.6709 | 0.6610 | 0.0762 | 7.0119 | 19 |
0.0115 | 0.0793 | 8.3288 | 0.5580 | 0.0769 | 7.1457 | 20 |
0.0013 | 0.0795 | 8.2537 | 0.5574 | 0.0770 | 6.7708 | 21 |
0.0004 | 0.0795 | 8.0507 | 0.5619 | 0.0770 | 7.0678 | 22 |
0.0003 | 0.0795 | 8.0534 | 0.5593 | 0.0771 | 7.0433 | 23 |
0.0002 | 0.0795 | 8.1738 | 0.5604 | 0.0771 | 7.1617 | 24 |
0.0001 | 0.0795 | 8.1494 | 0.5589 | 0.0771 | 7.1609 | 25 |
0.0000 | 0.0795 | 8.2151 | 0.5614 | 0.0771 | 7.1972 | 26 |
0.0000 | 0.0795 | 8.2332 | 0.5633 | 0.0771 | 7.1736 | 27 |
0.0000 | 0.0795 | 8.2573 | 0.5648 | 0.0771 | 7.2086 | 28 |
0.0000 | 0.0795 | 8.2571 | 0.5667 | 0.0771 | 7.1787 | 29 |
0.0000 | 0.0795 | 8.2607 | 0.5689 | 0.0771 | 7.2107 | 30 |
0.0000 | 0.0795 | 8.2992 | 0.5700 | 0.0772 | 7.2006 | 31 |
0.0000 | 0.0795 | 8.3059 | 0.5721 | 0.0772 | 7.2341 | 32 |
0.0000 | 0.0795 | 8.2872 | 0.5744 | 0.0772 | 7.2069 | 33 |
0.0080 | 0.0794 | 8.3693 | 0.5947 | 0.0762 | 7.3034 | 34 |
0.0063 | 0.0794 | 8.2517 | 0.5491 | 0.0769 | 7.1324 | 35 |
0.0008 | 0.0795 | 7.9115 | 0.5447 | 0.0771 | 6.9422 | 36 |
0.0002 | 0.0795 | 7.6265 | 0.5471 | 0.0771 | 6.8107 | 37 |
0.0001 | 0.0795 | 7.6685 | 0.5493 | 0.0771 | 6.6914 | 38 |
0.0001 | 0.0795 | 7.6100 | 0.5515 | 0.0771 | 6.7738 | 39 |
0.0000 | 0.0795 | 7.6623 | 0.5535 | 0.0771 | 6.7829 | 40 |
0.0000 | 0.0795 | 7.6768 | 0.5556 | 0.0771 | 6.8287 | 41 |
0.0000 | 0.0795 | 7.7199 | 0.5578 | 0.0772 | 6.8398 | 42 |
0.0000 | 0.0795 | 7.7423 | 0.5600 | 0.0772 | 6.8518 | 43 |
0.0000 | 0.0795 | 7.7561 | 0.5617 | 0.0772 | 6.8898 | 44 |
0.0000 | 0.0795 | 7.7766 | 0.5639 | 0.0772 | 6.8982 | 45 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
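A quick way to check that a local environment is close to these versions (4.32.0.dev0 was a development build of Transformers, so a nearby released version is normally the practical substitute):

```python
# Environment check only; the printed versions should be close to those listed
# above for best reproducibility.
import tensorflow
import tokenizers
import transformers

print("transformers:", transformers.__version__)  # card: 4.32.0.dev0
print("tensorflow:", tensorflow.__version__)      # card: 2.12.0
print("tokenizers:", tokenizers.__version__)      # card: 0.13.3
```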