---
license: apache-2.0
tags:
  - generated_from_keras_callback
model-index:
  - name: whisper_havest_0015
    results: []
---

# whisper_havest_0015

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset. It achieves the following results on the evaluation set:

- Train Loss: 0.3473
- Train Accuracy: 0.0321
- Validation Loss: 0.5945
- Validation Accuracy: 0.0311
- Epoch: 14

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
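AdamWeightDecay is the Keras-style AdamW optimizer shipped with `transformers`, i.e. Adam with decoupled weight decay. As a rough illustration of what the configuration above means, here is a minimal sketch of a single AdamW update on a scalar parameter, using the hyperparameters from the card (the real optimizer also handles bias-correction per variable, gradient clipping, and excluding some variables from decay):

```python
import math

# Hyperparameters copied from the optimizer config above.
LR, BETA1, BETA2, EPS, WD = 1e-05, 0.9, 0.999, 1e-07, 0.01

def adamw_step(param, grad, m, v, t):
    """One decoupled-weight-decay Adam (AdamW) step on a scalar parameter.

    A simplified sketch, not the transformers implementation.
    """
    m = BETA1 * m + (1 - BETA1) * grad        # first-moment running average
    v = BETA2 * v + (1 - BETA2) * grad ** 2   # second-moment running average
    m_hat = m / (1 - BETA1 ** t)              # bias correction, step t >= 1
    v_hat = v / (1 - BETA2 ** t)
    # Weight decay is applied directly to the parameter, not via the gradient.
    param -= LR * (m_hat / (math.sqrt(v_hat) + EPS) + WD * param)
    return param, m, v
```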

### Training results

| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 5.0949     | 0.0116         | 4.4444          | 0.0124              | 0     |
| 4.3242     | 0.0130         | 4.0648          | 0.0140              | 1     |
| 3.9308     | 0.0145         | 3.6837          | 0.0157              | 2     |
| 3.5552     | 0.0159         | 3.3410          | 0.0171              | 3     |
| 3.1591     | 0.0175         | 2.8089          | 0.0198              | 4     |
| 2.2408     | 0.0221         | 1.7104          | 0.0255              | 5     |
| 1.4220     | 0.0261         | 1.2181          | 0.0279              | 6     |
| 1.0460     | 0.0280         | 0.9912          | 0.0290              | 7     |
| 0.8363     | 0.0291         | 0.8645          | 0.0296              | 8     |
| 0.6967     | 0.0299         | 0.7748          | 0.0301              | 9     |
| 0.5942     | 0.0305         | 0.7201          | 0.0304              | 10    |
| 0.5151     | 0.0309         | 0.6675          | 0.0307              | 11    |
| 0.4496     | 0.0314         | 0.6382          | 0.0308              | 12    |
| 0.3951     | 0.0318         | 0.6060          | 0.0310              | 13    |
| 0.3473     | 0.0321         | 0.5945          | 0.0311              | 14    |

### Framework versions

- Transformers 4.25.0.dev0
- TensorFlow 2.9.2
- Datasets 2.6.1
- Tokenizers 0.13.2