---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_train_on_validated_cv_model__0035
  results: []
---

<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->

# whisper_train_on_validated_cv_model__0035

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results after the final training epoch:
- Train Loss: 0.0089
- Train Accuracy: 0.0821
- Train Wermet: 5.1607
- Validation Loss: 0.5637
- Validation Accuracy: 0.0725
- Validation Wermet: 7.3612
- Epoch: 34
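
Since this checkpoint was produced with the TensorFlow/Keras Whisper implementation in 🤗 Transformers, it can presumably be loaded for inference roughly as sketched below. The repository id and the choice of taking the processor from the base `openai/whisper-tiny` checkpoint are assumptions; adjust them to match how the model was actually published.

```python
# Minimal inference sketch (assumptions: the model id below is hypothetical,
# and the processor is loaded from the base openai/whisper-tiny checkpoint).
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration

model_id = "whisper_train_on_validated_cv_model__0035"  # hypothetical repo id / local path
processor = WhisperProcessor.from_pretrained("openai/whisper-tiny")
model = TFWhisperForConditionalGeneration.from_pretrained(model_id)

def transcribe(audio, sampling_rate=16_000):
    # `audio` is a 1-D float array sampled at 16 kHz (Whisper's expected rate).
    inputs = processor(audio, sampling_rate=sampling_rate, return_tensors="tf")
    predicted_ids = model.generate(inputs.input_features)
    return processor.batch_decode(predicted_ids, skip_special_tokens=True)[0]
```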

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
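
For reference, the optimizer configuration above corresponds approximately to the `AdamWeightDecay` optimizer that ships with 🤗 Transformers for TensorFlow. The sketch below is a hedged reconstruction: the learning-rate schedule and any parameters excluded from weight decay are not recorded in this card, so treat it as an approximation rather than the exact training setup.

```python
# Approximate reconstruction of the optimizer described above
# (uses the AdamWeightDecay class from transformers' TensorFlow utilities).
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=2e-5,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    weight_decay_rate=0.01,
    # The card does not record which parameters were excluded from weight decay;
    # skipping LayerNorm/bias terms is a common default and an assumption here.
    exclude_from_weight_decay=["LayerNorm", "layer_norm", "bias"],
)
```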

### Training results

| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 2.2992     | 0.0331         | 6.3187       | 1.9677          | 0.0372              | 7.2782            | 0     |
| 1.7447     | 0.0431         | 5.6345       | 1.7865          | 0.0408              | 6.3969            | 1     |
| 1.5036     | 0.0480         | 5.6512       | 1.4821          | 0.0468              | 5.1810            | 2     |
| 1.0955     | 0.0567         | 4.4606       | 1.0492          | 0.0557              | 3.7886            | 3     |
| 0.7076     | 0.0654         | 4.9157       | 0.7312          | 0.0627              | 4.2416            | 4     |
| 0.4703     | 0.0710         | 4.9749       | 0.5668          | 0.0664              | 4.8883            | 5     |
| 0.3552     | 0.0737         | 5.1823       | 0.4725          | 0.0685              | 4.8141            | 6     |
| 0.2865     | 0.0754         | 4.1024       | 0.4358          | 0.0694              | 3.8550            | 7     |
| 0.2380     | 0.0765         | 3.3689       | 0.3947          | 0.0704              | 2.2953            | 8     |
| 0.1998     | 0.0775         | 2.6219       | 0.3848          | 0.0707              | 3.0529            | 9     |
| 0.1687     | 0.0782         | 2.2111       | 0.3689          | 0.0711              | 1.8146            | 10    |
| 0.1417     | 0.0789         | 2.5284       | 0.3709          | 0.0713              | 1.9439            | 11    |
| 0.1190     | 0.0795         | 2.8048       | 0.3631          | 0.0716              | 3.0845            | 12    |
| 0.0983     | 0.0800         | 3.2668       | 0.3657          | 0.0717              | 3.6423            | 13    |
| 0.0815     | 0.0804         | 3.8567       | 0.3806          | 0.0717              | 6.1506            | 14    |
| 0.0659     | 0.0808         | 5.0931       | 0.3920          | 0.0718              | 6.4431            | 15    |
| 0.0520     | 0.0812         | 5.6397       | 0.3935          | 0.0720              | 5.1514            | 16    |
| 0.0409     | 0.0814         | 5.7797       | 0.4147          | 0.0720              | 4.2822            | 17    |
| 0.0330     | 0.0816         | 5.1017       | 0.4354          | 0.0719              | 6.2876            | 18    |
| 0.0257     | 0.0818         | 6.1581       | 0.4476          | 0.0720              | 7.0531            | 19    |
| 0.0212     | 0.0819         | 6.3234       | 0.4647          | 0.0720              | 7.4961            | 20    |
| 0.0183     | 0.0820         | 5.8886       | 0.4744          | 0.0721              | 5.6633            | 21    |
| 0.0141     | 0.0821         | 6.0894       | 0.5076          | 0.0718              | 5.6186            | 22    |
| 0.0130     | 0.0821         | 5.8770       | 0.5010          | 0.0721              | 6.2209            | 23    |
| 0.0123     | 0.0821         | 5.7417       | 0.5214          | 0.0720              | 6.8845            | 24    |
| 0.0115     | 0.0821         | 5.7680       | 0.5333          | 0.0720              | 7.2049            | 25    |
| 0.0091     | 0.0822         | 5.3959       | 0.5272          | 0.0723              | 4.1630            | 26    |
| 0.0097     | 0.0821         | 5.0201       | 0.5545          | 0.0720              | 5.1619            | 27    |
| 0.0100     | 0.0821         | 5.2278       | 0.5328          | 0.0724              | 5.7914            | 28    |
| 0.0069     | 0.0822         | 4.9319       | 0.5432          | 0.0723              | 3.8214            | 29    |
| 0.0083     | 0.0822         | 4.4749       | 0.5610          | 0.0722              | 3.6943            | 30    |
| 0.0075     | 0.0822         | 4.8208       | 0.5609          | 0.0724              | 5.1153            | 31    |
| 0.0066     | 0.0822         | 4.0023       | 0.5662          | 0.0724              | 3.1397            | 32    |
| 0.0067     | 0.0822         | 4.3423       | 0.5831          | 0.0723              | 5.5127            | 33    |
| 0.0089     | 0.0821         | 5.1607       | 0.5637          | 0.0725              | 7.3612            | 34    |
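
The `Wermet` columns presumably track a word-error-rate-style metric computed during training; the exact implementation used is not recorded in this card. A generic way to compute WER against reference transcripts with the `evaluate` library is sketched below.

```python
# Generic WER computation sketch (assumption: the "Wermet" columns above are a
# word-error-rate-style metric; the exact metric code used here is unknown).
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["hello world", "this is a test"]   # model transcriptions (example data)
references = ["hello world", "this is the test"]  # ground-truth transcripts (example data)

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.3f}")
```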


### Framework versions

- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3