---
license: apache-2.0
tags:
  - generated_from_keras_callback
model-index:
  - name: whisper_wermet_nosup_0015
    results: []
---

# whisper_wermet_nosup_0015

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset. It achieves the following results on the evaluation set:

- Train Loss: 0.2807
- Train Accuracy: 0.0326
- Train Wermet: 5.9455
- Validation Loss: 0.5636
- Validation Accuracy: 0.0313
- Validation Wermet: 5.3382
- Epoch: 14
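The "Wermet" figures above track a word-error-rate-style metric. The card does not say which implementation the training callback used, so the following is only a minimal sketch of standard WER (word-level Levenshtein distance divided by reference length):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming Levenshtein distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deleting all remaining reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # inserting all remaining hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            substitution = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(substitution, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(ref)][len(hyp)] / max(len(ref), 1)
```

Note that WER can exceed 1.0 when the hypothesis is much longer than the reference, which is consistent with the large "Wermet" values in the early epochs below.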

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
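The optimizer config above can be reconstructed in code. This is a sketch, assuming the TensorFlow extras of `transformers` (which provide `AdamWeightDecay`) are installed; the values are copied verbatim from the hyperparameter dict:

```python
# Rebuild the training optimizer from the config listed above.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=1e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    weight_decay_rate=0.01,  # decoupled weight decay, as in AdamW
)
```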

### Training results

| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 5.0860     | 0.0116         | 45.4352      | 4.4455          | 0.0124              | 36.1611           | 0     |
| 4.3098     | 0.0131         | 29.4890      | 4.0321          | 0.0144              | 24.9514           | 1     |
| 3.6711     | 0.0160         | 25.7380      | 2.7995          | 0.0205              | 32.2126           | 2     |
| 2.2582     | 0.0224         | 31.5946      | 1.6772          | 0.0257              | 23.9282           | 3     |
| 1.4268     | 0.0262         | 23.3380      | 1.2097          | 0.0279              | 18.7331           | 4     |
| 1.0613     | 0.0279         | 13.6764      | 0.9972          | 0.0289              | 10.7707           | 5     |
| 0.8545     | 0.0290         | 9.2746       | 0.8605          | 0.0296              | 6.5566            | 6     |
| 0.7144     | 0.0297         | 7.4723       | 0.7768          | 0.0300              | 5.4825            | 7     |
| 0.6116     | 0.0303         | 6.9092       | 0.7125          | 0.0304              | 6.5220            | 8     |
| 0.5307     | 0.0308         | 6.4113       | 0.6640          | 0.0306              | 5.7589            | 9     |
| 0.4647     | 0.0312         | 5.5865       | 0.6338          | 0.0309              | 4.8846            | 10    |
| 0.4085     | 0.0316         | 5.9874       | 0.6035          | 0.0310              | 3.8731            | 11    |
| 0.3602     | 0.0320         | 5.5558       | 0.5838          | 0.0311              | 6.3441            | 12    |
| 0.3184     | 0.0323         | 6.2887       | 0.5690          | 0.0312              | 3.9274            | 13    |
| 0.2807     | 0.0326         | 5.9455       | 0.5636          | 0.0313              | 5.3382            | 14    |

### Framework versions

- Transformers 4.25.0.dev0
- TensorFlow 2.9.2
- Datasets 2.6.1
- Tokenizers 0.13.2