---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0055
  results: []
---

<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->

# whisper_charsplit_new_round2__0055

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0014
- Train Accuracy: 0.0795
- Train Wermet: 7.7408
- Validation Loss: 0.5661
- Validation Accuracy: 0.0769
- Validation Wermet: 7.1664
- Epoch: 54

## Model description

More information needed

## Intended uses & limitations

More information needed
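
Pending a fuller write-up, the sketch below shows one plausible way to run inference with this checkpoint using the TensorFlow Whisper classes from Transformers. The Hub path (`<namespace>/whisper_charsplit_new_round2__0055`) and the silent placeholder audio are assumptions, and the decoding settings behind the reported metrics are not documented in this card.

```python
import numpy as np
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration

# Placeholder Hub path; replace <namespace> with the repository's actual owner.
model_id = "<namespace>/whisper_charsplit_new_round2__0055"
processor = WhisperProcessor.from_pretrained(model_id)
model = TFWhisperForConditionalGeneration.from_pretrained(model_id)

# Placeholder input: one second of silence at 16 kHz (Whisper's expected rate);
# replace with real speech samples as a 1-D float array.
audio = np.zeros(16000, dtype=np.float32)
inputs = processor(audio, sampling_rate=16000, return_tensors="tf")

# Greedy decoding; the generation settings used for the reported metrics are unknown.
generated_ids = model.generate(inputs.input_features)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```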

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
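
The optimizer configuration above matches the `AdamWeightDecay` class that Transformers provides for TensorFlow training. The following is a minimal sketch of reconstructing it from the listed values; any layer-wise exclusions from weight decay (e.g. LayerNorm or bias parameters) used in the original run are not recorded in this card.

```python
from transformers import AdamWeightDecay

# Mirrors the hyperparameters listed above; exclusion patterns for weight decay
# (if any were used) are not documented here.
optimizer = AdamWeightDecay(
    learning_rate=1e-05,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)

# The model would then be compiled with this optimizer, e.g.:
# model.compile(optimizer=optimizer)
```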

### Training results

| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010     | 0.0795         | 8.7507       | 0.5575          | 0.0767              | 7.6778            | 0     |
| 0.0013     | 0.0795         | 8.9468       | 0.5652          | 0.0766              | 8.3360            | 1     |
| 0.0025     | 0.0795         | 8.7338       | 0.5673          | 0.0765              | 8.3770            | 2     |
| 0.0019     | 0.0795         | 8.9450       | 0.5623          | 0.0766              | 7.7117            | 3     |
| 0.0011     | 0.0795         | 8.9053       | 0.5609          | 0.0767              | 7.5155            | 4     |
| 0.0012     | 0.0795         | 8.8862       | 0.5667          | 0.0767              | 8.2913            | 5     |
| 0.0009     | 0.0795         | 8.7510       | 0.5642          | 0.0766              | 7.9083            | 6     |
| 0.0037     | 0.0795         | 9.3428       | 0.5717          | 0.0764              | 8.2631            | 7     |
| 0.0031     | 0.0795         | 9.2135       | 0.5636          | 0.0766              | 8.2384            | 8     |
| 0.0011     | 0.0795         | 8.9730       | 0.5605          | 0.0767              | 8.3958            | 9     |
| 0.0005     | 0.0795         | 9.3749       | 0.5552          | 0.0768              | 8.0800            | 10    |
| 0.0003     | 0.0795         | 9.3340       | 0.5584          | 0.0768              | 8.1322            | 11    |
| 0.0005     | 0.0795         | 9.2292       | 0.5687          | 0.0767              | 8.5576            | 12    |
| 0.0037     | 0.0795         | 9.2838       | 0.5751          | 0.0765              | 7.4189            | 13    |
| 0.0038     | 0.0795         | 8.7270       | 0.5605          | 0.0767              | 7.7098            | 14    |
| 0.0012     | 0.0795         | 8.8259       | 0.5563          | 0.0768              | 8.2647            | 15    |
| 0.0005     | 0.0795         | 9.0553       | 0.5620          | 0.0768              | 8.5020            | 16    |
| 0.0004     | 0.0795         | 9.1734       | 0.5607          | 0.0768              | 8.0252            | 17    |
| 0.0003     | 0.0795         | 9.0084       | 0.5571          | 0.0769              | 8.1563            | 18    |
| 0.0014     | 0.0795         | 8.7153       | 0.5804          | 0.0765              | 7.8654            | 19    |
| 0.0058     | 0.0794         | 8.8460       | 0.5706          | 0.0766              | 7.4342            | 20    |
| 0.0020     | 0.0795         | 8.6599       | 0.5612          | 0.0767              | 7.7369            | 21    |
| 0.0007     | 0.0795         | 8.6456       | 0.5543          | 0.0768              | 7.4625            | 22    |
| 0.0008     | 0.0795         | 8.3246       | 0.5620          | 0.0768              | 7.4475            | 23    |
| 0.0012     | 0.0795         | 7.9451       | 0.5615          | 0.0768              | 7.0907            | 24    |
| 0.0025     | 0.0795         | 8.1065       | 0.5619          | 0.0768              | 7.7020            | 25    |
| 0.0011     | 0.0795         | 8.4237       | 0.5710          | 0.0768              | 7.4035            | 26    |
| 0.0009     | 0.0795         | 8.3074       | 0.5641          | 0.0768              | 7.1747            | 27    |
| 0.0007     | 0.0795         | 8.5183       | 0.5688          | 0.0768              | 7.4310            | 28    |
| 0.0014     | 0.0795         | 8.6604       | 0.5750          | 0.0767              | 8.0751            | 29    |
| 0.0022     | 0.0795         | 8.2353       | 0.5789          | 0.0767              | 7.4442            | 30    |
| 0.0019     | 0.0795         | 8.6037       | 0.5715          | 0.0767              | 7.6157            | 31    |
| 0.0009     | 0.0795         | 8.4768       | 0.5611          | 0.0769              | 7.6392            | 32    |
| 0.0005     | 0.0795         | 8.2728       | 0.5669          | 0.0768              | 7.1451            | 33    |
| 0.0010     | 0.0795         | 8.1006       | 0.5918          | 0.0766              | 7.4447            | 34    |
| 0.0036     | 0.0795         | 8.9171       | 0.5687          | 0.0767              | 7.6962            | 35    |
| 0.0018     | 0.0795         | 8.4062       | 0.5713          | 0.0768              | 7.2127            | 36    |
| 0.0012     | 0.0795         | 8.3370       | 0.5683          | 0.0768              | 7.1040            | 37    |
| 0.0005     | 0.0795         | 7.9931       | 0.5658          | 0.0769              | 6.8043            | 38    |
| 0.0002     | 0.0795         | 7.9500       | 0.5660          | 0.0769              | 7.0891            | 39    |
| 0.0001     | 0.0795         | 8.1912       | 0.5632          | 0.0770              | 7.1929            | 40    |
| 0.0001     | 0.0795         | 8.2484       | 0.5678          | 0.0769              | 7.6993            | 41    |
| 0.0001     | 0.0795         | 8.2925       | 0.5648          | 0.0770              | 7.1917            | 42    |
| 0.0001     | 0.0795         | 7.9155       | 0.5752          | 0.0769              | 6.4900            | 43    |
| 0.0095     | 0.0793         | 8.3244       | 0.5662          | 0.0767              | 6.9524            | 44    |
| 0.0019     | 0.0795         | 7.8491       | 0.5533          | 0.0769              | 6.9541            | 45    |
| 0.0006     | 0.0795         | 8.0596       | 0.5573          | 0.0768              | 6.9489            | 46    |
| 0.0008     | 0.0795         | 8.0277       | 0.5581          | 0.0769              | 6.9081            | 47    |
| 0.0005     | 0.0795         | 7.6084       | 0.5604          | 0.0769              | 6.7158            | 48    |
| 0.0006     | 0.0795         | 8.0561       | 0.5729          | 0.0767              | 7.4189            | 49    |
| 0.0014     | 0.0795         | 8.2875       | 0.5658          | 0.0768              | 7.5768            | 50    |
| 0.0011     | 0.0795         | 8.4376       | 0.5665          | 0.0768              | 7.2469            | 51    |
| 0.0018     | 0.0795         | 8.3093       | 0.5771          | 0.0768              | 7.2637            | 52    |
| 0.0021     | 0.0795         | 7.8370       | 0.5680          | 0.0768              | 7.0030            | 53    |
| 0.0014     | 0.0795         | 7.7408       | 0.5661          | 0.0769              | 7.1664            | 54    |


### Framework versions

- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3