---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_4_with_init_sun_char_0025
  results: []
---


# whisper_4_with_init_sun_char_0025

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the training and evaluation sets:
- Train Loss: 2.1717
- Train Accuracy: 0.0392
- Train Wermet: 0.0635
- Validation Loss: 1.9791
- Validation Accuracy: 0.0282
- Validation Wermet: 0.0928
- Epoch: 24

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: AdamWeightDecay
  - learning_rate: 1e-05
  - weight_decay_rate: 0.01
  - beta_1: 0.9
  - beta_2: 0.999
  - epsilon: 1e-07
  - amsgrad: False
  - decay: 0.0
- training_precision: float32
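
AdamWeightDecay applies weight decay decoupled from the gradient-based Adam update, rather than folding it into the gradient. A minimal pure-Python sketch of a single parameter update under the hyperparameters above (the function name and scalar treatment are illustrative, not taken from the training code):

```python
def adamw_step(param, grad, m, v, t,
               lr=1e-5, beta_1=0.9, beta_2=0.999,
               epsilon=1e-7, weight_decay_rate=0.01):
    """One decoupled-weight-decay Adam step on a scalar parameter.

    Illustrative sketch only; training used the Keras AdamWeightDecay
    optimizer, which applies the same update element-wise to tensors.
    """
    # Biased first and second moment estimates.
    m = beta_1 * m + (1 - beta_1) * grad
    v = beta_2 * v + (1 - beta_2) * grad ** 2
    # Bias correction for step t (1-indexed).
    m_hat = m / (1 - beta_1 ** t)
    v_hat = v / (1 - beta_2 ** t)
    # Adam update plus weight decay applied directly to the parameter,
    # not mixed into the gradient.
    param = param - lr * (m_hat / (v_hat ** 0.5 + epsilon)
                          + weight_decay_rate * param)
    return param, m, v

# Example: one step from param=1.0 with gradient 0.5.
p, m, v = adamw_step(1.0, 0.5, 0.0, 0.0, t=1)
```

Decoupling means the decay term `weight_decay_rate * param` is not rescaled by the adaptive second-moment denominator, which is the distinguishing property of AdamW-style optimizers.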

### Training results

| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 3.2071     | 0.0313         | 0.1237       | 2.8546          | 0.0225              | 0.1109            | 0     |
| 3.0365     | 0.0325         | 0.0375       | 2.8115          | 0.0228              | 0.1215            | 1     |
| 3.0162     | 0.0326         | 0.0484       | 2.7884          | 0.0231              | 0.1318            | 2     |
| 3.0042     | 0.0327         | 0.0555       | 2.7853          | 0.0233              | 0.1393            | 3     |
| 2.9934     | 0.0328         | 0.0614       | 2.7657          | 0.0232              | 0.1273            | 4     |
| 2.9858     | 0.0329         | 0.0654       | 2.7542          | 0.0234              | 0.1073            | 5     |
| 2.9735     | 0.0330         | 0.0673       | 2.7367          | 0.0234              | 0.1414            | 6     |
| 2.9574     | 0.0332         | 0.0704       | 2.6961          | 0.0240              | 0.1429            | 7     |
| 2.9320     | 0.0335         | 0.0723       | 2.6652          | 0.0239              | 0.0990            | 8     |
| 2.8976     | 0.0339         | 0.0729       | 2.5997          | 0.0245              | 0.0944            | 9     |
| 2.8460     | 0.0343         | 0.0728       | 2.5378          | 0.0248              | 0.1435            | 10    |
| 2.7781     | 0.0347         | 0.0741       | 2.4355          | 0.0254              | 0.1372            | 11    |
| 2.7083     | 0.0352         | 0.0747       | 2.5163          | 0.0248              | 0.0987            | 12    |
| 2.6445     | 0.0356         | 0.0720       | 2.2997          | 0.0261              | 0.1484            | 13    |
| 2.5838     | 0.0360         | 0.0724       | 2.2386          | 0.0266              | 0.1419            | 14    |
| 2.5294     | 0.0363         | 0.0721       | 2.1855          | 0.0269              | 0.1289            | 15    |
| 2.4760     | 0.0367         | 0.0711       | 2.1682          | 0.0271              | 0.1214            | 16    |
| 2.4339     | 0.0370         | 0.0698       | 2.1018          | 0.0273              | 0.1264            | 17    |
| 2.3867     | 0.0373         | 0.0684       | 2.0647          | 0.0275              | 0.1403            | 18    |
| 2.3528     | 0.0376         | 0.0669       | 2.0705          | 0.0275              | 0.1089            | 19    |
| 2.3145     | 0.0379         | 0.0658       | 2.0179          | 0.0280              | 0.1209            | 20    |
| 2.2765     | 0.0382         | 0.0654       | 2.0182          | 0.0279              | 0.1023            | 21    |
| 2.2415     | 0.0385         | 0.0650       | 1.9558          | 0.0284              | 0.1523            | 22    |
| 2.2102     | 0.0388         | 0.0643       | 1.9395          | 0.0285              | 0.1123            | 23    |
| 2.1717     | 0.0392         | 0.0635       | 1.9791          | 0.0282              | 0.0928            | 24    |
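
Note that validation loss reaches its minimum at epoch 23 (1.9395) and rises again at epoch 24 (1.9791), so the final checkpoint is not necessarily the best one. A minimal sketch for selecting the best epoch from the (epoch, validation loss) pairs in the table above (variable names are illustrative):

```python
# (epoch, validation loss) pairs taken from the last rows of the table.
val_loss = {21: 2.0182, 22: 1.9558, 23: 1.9395, 24: 1.9791}

# Pick the epoch whose checkpoint minimizes validation loss.
best_epoch = min(val_loss, key=val_loss.get)
```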


### Framework versions

- Transformers 4.34.0.dev0
- TensorFlow 2.13.0
- Tokenizers 0.13.3