---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: whisper_new_split_0035
  results: []
---


# whisper_new_split_0035

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the training and evaluation sets:
- Train Loss: 0.0054
- Train Accuracy: 0.0335
- Train Wermet: 27.4530
- Validation Loss: 0.4807
- Validation Accuracy: 0.0315
- Validation Wermet: 30.3598
- Epoch: 34
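
The card does not define the "Wermet" metric, but it appears to be a word-error-rate-style measure (note the reported values are well above 1.0, so the metric is presumably scaled or unnormalized rather than a plain 0–1 WER). As an illustrative sketch only, a standard word error rate can be computed from the word-level Levenshtein distance:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance divided by reference length."""
    ref = reference.split()
    hyp = hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution
    return dp[len(ref)][len(hyp)] / max(len(ref), 1)
```

For example, `wer("the cat sat", "the hat sat")` gives one substitution over three reference words, i.e. ~0.33.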

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
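
`AdamWeightDecay` is Adam with decoupled (AdamW-style) weight decay: the decay term is applied directly to the parameter rather than folded into the gradient. As a hedged, pure-Python sketch of one update step using the hyperparameters above (this is illustrative, not the Transformers implementation):

```python
def adamw_step(param, grad, m, v, t,
               lr=1e-5, beta1=0.9, beta2=0.999,
               eps=1e-7, weight_decay=0.01):
    """One decoupled-weight-decay Adam (AdamW) update for a scalar parameter.

    m, v are the running first/second moment estimates; t is the 1-based step.
    """
    m = beta1 * m + (1 - beta1) * grad           # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad * grad    # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                 # bias correction
    v_hat = v / (1 - beta2 ** t)
    # decoupled weight decay: applied to the parameter, not via the gradient
    param = param - lr * (m_hat / (v_hat ** 0.5 + eps) + weight_decay * param)
    return param, m, v
```

Running this on a toy objective such as f(x) = x² (gradient 2x) moves the parameter slowly toward zero, as expected with the small learning rate of 1e-05.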

### Training results

| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 5.1027     | 0.0113         | 52.5530      | 4.4267          | 0.0121              | 41.4796           | 0     |
| 4.3285     | 0.0126         | 38.6893      | 3.9835          | 0.0145              | 33.6050           | 1     |
| 3.4573     | 0.0168         | 30.7714      | 2.5568          | 0.0215              | 31.7559           | 2     |
| 2.0878     | 0.0226         | 20.5131      | 1.5738          | 0.0257              | 21.2159           | 3     |
| 1.3529     | 0.0258         | 17.4367      | 1.1712          | 0.0276              | 17.7695           | 4     |
| 0.9953     | 0.0275         | 18.7308      | 0.9389          | 0.0287              | 20.5259           | 5     |
| 0.7852     | 0.0286         | 18.5731      | 0.8074          | 0.0294              | 17.6576           | 6     |
| 0.6428     | 0.0293         | 18.2945      | 0.7219          | 0.0298              | 19.9850           | 7     |
| 0.5384     | 0.0299         | 18.9258      | 0.6610          | 0.0301              | 18.9327           | 8     |
| 0.4565     | 0.0304         | 19.0749      | 0.6117          | 0.0304              | 21.9796           | 9     |
| 0.3901     | 0.0308         | 19.2099      | 0.5693          | 0.0306              | 18.0965           | 10    |
| 0.3348     | 0.0312         | 20.4777      | 0.5449          | 0.0307              | 19.9518           | 11    |
| 0.2877     | 0.0315         | 20.3181      | 0.5232          | 0.0309              | 20.4017           | 12    |
| 0.2471     | 0.0318         | 19.2073      | 0.5057          | 0.0310              | 18.7612           | 13    |
| 0.2120     | 0.0320         | 19.0961      | 0.4925          | 0.0311              | 22.3187           | 14    |
| 0.1809     | 0.0323         | 20.7944      | 0.4849          | 0.0311              | 27.2314           | 15    |
| 0.1539     | 0.0325         | 22.0951      | 0.4787          | 0.0312              | 25.2171           | 16    |
| 0.1299     | 0.0327         | 22.7652      | 0.4733          | 0.0312              | 22.7492           | 17    |
| 0.1087     | 0.0329         | 25.2223      | 0.4701          | 0.0312              | 28.9044           | 18    |
| 0.0899     | 0.0330         | 24.8354      | 0.4715          | 0.0313              | 21.1618           | 19    |
| 0.0739     | 0.0332         | 25.4987      | 0.4680          | 0.0313              | 29.6304           | 20    |
| 0.0604     | 0.0333         | 27.6465      | 0.4693          | 0.0313              | 27.6937           | 21    |
| 0.0498     | 0.0333         | 27.7045      | 0.4711          | 0.0313              | 27.5013           | 22    |
| 0.0414     | 0.0334         | 28.0547      | 0.4689          | 0.0313              | 29.1776           | 23    |
| 0.0327     | 0.0334         | 27.5594      | 0.4718          | 0.0313              | 31.5623           | 24    |
| 0.0256     | 0.0335         | 27.3983      | 0.4710          | 0.0313              | 27.1071           | 25    |
| 0.0210     | 0.0335         | 24.7398      | 0.4736          | 0.0313              | 30.8282           | 26    |
| 0.0165     | 0.0335         | 25.1927      | 0.4773          | 0.0313              | 24.1750           | 27    |
| 0.0133     | 0.0335         | 25.6261      | 0.4807          | 0.0313              | 29.9520           | 28    |
| 0.0110     | 0.0335         | 25.8127      | 0.4825          | 0.0314              | 27.0813           | 29    |
| 0.0171     | 0.0335         | 26.0445      | 0.4858          | 0.0313              | 39.8503           | 30    |
| 0.0154     | 0.0335         | 28.6186      | 0.4766          | 0.0314              | 28.4465           | 31    |
| 0.0094     | 0.0335         | 27.8978      | 0.4778          | 0.0314              | 28.7775           | 32    |
| 0.0071     | 0.0335         | 27.8180      | 0.4775          | 0.0314              | 28.5229           | 33    |
| 0.0054     | 0.0335         | 27.4530      | 0.4807          | 0.0315              | 30.3598           | 34    |
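
In the table above, training loss keeps falling while validation loss bottoms out around epoch 20 and then drifts up, a typical overfitting signature. If one wanted to pick a checkpoint by validation loss, a quick scan over the table's own numbers (copied below) would suffice:

```python
# (epoch, validation loss) pairs copied from the results table above
val_loss = [
    (0, 4.4267), (1, 3.9835), (2, 2.5568), (3, 1.5738), (4, 1.1712),
    (5, 0.9389), (6, 0.8074), (7, 0.7219), (8, 0.6610), (9, 0.6117),
    (10, 0.5693), (11, 0.5449), (12, 0.5232), (13, 0.5057), (14, 0.4925),
    (15, 0.4849), (16, 0.4787), (17, 0.4733), (18, 0.4701), (19, 0.4715),
    (20, 0.4680), (21, 0.4693), (22, 0.4711), (23, 0.4689), (24, 0.4718),
    (25, 0.4710), (26, 0.4736), (27, 0.4773), (28, 0.4807), (29, 0.4825),
    (30, 0.4858), (31, 0.4766), (32, 0.4778), (33, 0.4775), (34, 0.4807),
]

best_epoch, best_loss = min(val_loss, key=lambda pair: pair[1])
print(f"best checkpoint by validation loss: epoch {best_epoch} ({best_loss})")
```

By this criterion the epoch-20 checkpoint (validation loss 0.4680) would be preferred over the final epoch-34 one (0.4807), though validation Wermet is noisy enough that other selection criteria are defensible.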


### Framework versions

- Transformers 4.27.0.dev0
- TensorFlow 2.11.0
- Tokenizers 0.13.2