---
license: apache-2.0
base_model: bigmorning/whisper_charsplit_new_round2__0061
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round3__0027
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round3__0027
This model is a fine-tuned version of [bigmorning/whisper_charsplit_new_round2__0061](https://huggingface.co/bigmorning/whisper_charsplit_new_round2__0061) on an unknown dataset.
It achieves the following results on the training and validation sets (a brief loading sketch follows the list):
- Train Loss: 0.0000
- Train Accuracy: 0.0795
- Train Wermet: 8.2151
- Validation Loss: 0.5614
- Validation Accuracy: 0.0771
- Validation Wermet: 7.1972
- Epoch: 26
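As a usage note (not part of the original training setup): the checkpoint is a TensorFlow Whisper model, so it can in principle be loaded with the TF Whisper classes from Transformers. The sketch below is illustrative only; it assumes the repository ships the matching processor/tokenizer files, and the audio input and sampling rate are placeholders, since the fine-tuning data and preprocessing are not documented here.

```python
# Illustrative loading/inference sketch for this TensorFlow checkpoint.
# The dummy audio array is a placeholder; real input should be 16 kHz mono audio.
import numpy as np
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration

model_id = "bigmorning/whisper_charsplit_new_round3__0027"
processor = WhisperProcessor.from_pretrained(model_id)
model = TFWhisperForConditionalGeneration.from_pretrained(model_id)

audio = np.zeros(16000, dtype=np.float32)  # placeholder: 1 second of silence
inputs = processor(audio, sampling_rate=16000, return_tensors="tf")
generated_ids = model.generate(input_features=inputs.input_features)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```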
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the optimizer sketch after this list):
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
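For readers reconstructing the setup, the optimizer dictionary above corresponds to the `AdamWeightDecay` optimizer shipped with Transformers' TensorFlow utilities. The sketch below is an approximation: whether any parameters (e.g. LayerNorm weights or biases) were excluded from weight decay in the original run is not recorded here.

```python
# Approximate reconstruction of the optimizer listed above, using the
# AdamWeightDecay class from Transformers (TensorFlow). No parameters are
# excluded from weight decay because the original exclusions are unknown.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=1e-05,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
)

# Typical Keras usage (model definition and datasets not shown):
# model.compile(optimizer=optimizer)
# model.fit(train_dataset, validation_data=eval_dataset)
```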
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0009 | 0.0795 | 7.9492 | 0.5730 | 0.0769 | 7.2856 | 0 |
| 0.0015 | 0.0795 | 8.4221 | 0.5756 | 0.0769 | 7.1487 | 1 |
| 0.0012 | 0.0795 | 7.8476 | 0.5699 | 0.0769 | 6.5976 | 2 |
| 0.0010 | 0.0795 | 7.6843 | 0.5740 | 0.0769 | 6.9513 | 3 |
| 0.0014 | 0.0795 | 8.0796 | 0.5763 | 0.0768 | 7.4043 | 4 |
| 0.0019 | 0.0795 | 7.7274 | 0.5724 | 0.0769 | 6.4922 | 5 |
| 0.0008 | 0.0795 | 7.3468 | 0.5734 | 0.0769 | 6.1909 | 6 |
| 0.0009 | 0.0795 | 7.2393 | 0.5816 | 0.0769 | 6.5734 | 7 |
| 0.0010 | 0.0795 | 7.5822 | 0.5755 | 0.0769 | 6.6613 | 8 |
| 0.0004 | 0.0795 | 7.3807 | 0.5698 | 0.0770 | 7.0671 | 9 |
| 0.0001 | 0.0795 | 7.7157 | 0.5681 | 0.0771 | 6.8391 | 10 |
| 0.0001 | 0.0795 | 7.7540 | 0.5725 | 0.0771 | 6.9281 | 11 |
| 0.0001 | 0.0795 | 7.7721 | 0.5726 | 0.0771 | 6.8911 | 12 |
| 0.0000 | 0.0795 | 7.8163 | 0.5721 | 0.0771 | 6.8876 | 13 |
| 0.0000 | 0.0795 | 7.7745 | 0.5741 | 0.0771 | 6.8770 | 14 |
| 0.0000 | 0.0795 | 7.7277 | 0.5752 | 0.0771 | 6.8671 | 15 |
| 0.0000 | 0.0795 | 7.7355 | 0.5765 | 0.0771 | 6.8447 | 16 |
| 0.0000 | 0.0795 | 7.7109 | 0.5784 | 0.0771 | 6.8560 | 17 |
| 0.0000 | 0.0795 | 7.7427 | 0.5796 | 0.0771 | 6.8406 | 18 |
| 0.0003 | 0.0795 | 7.6709 | 0.6610 | 0.0762 | 7.0119 | 19 |
| 0.0115 | 0.0793 | 8.3288 | 0.5580 | 0.0769 | 7.1457 | 20 |
| 0.0013 | 0.0795 | 8.2537 | 0.5574 | 0.0770 | 6.7708 | 21 |
| 0.0004 | 0.0795 | 8.0507 | 0.5619 | 0.0770 | 7.0678 | 22 |
| 0.0003 | 0.0795 | 8.0534 | 0.5593 | 0.0771 | 7.0433 | 23 |
| 0.0002 | 0.0795 | 8.1738 | 0.5604 | 0.0771 | 7.1617 | 24 |
| 0.0001 | 0.0795 | 8.1494 | 0.5589 | 0.0771 | 7.1609 | 25 |
| 0.0000 | 0.0795 | 8.2151 | 0.5614 | 0.0771 | 7.1972 | 26 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3