
whisper_input_decoder_shift_r_labels_with_force__0045

This model is a fine-tuned version of openai/whisper-tiny on an unknown dataset. It achieves the following results on the training and evaluation sets:

  • Train Loss: 0.1194
  • Train Accuracy: 0.0334
  • Train Wermet: 0.3137
  • Validation Loss: 0.7596
  • Validation Accuracy: 0.0207
  • Validation Wermet: 0.5479
  • Epoch: 44
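The "Wermet" columns appear to track a word-error-rate-style metric (the exact implementation is not stated in this card). As a point of reference, standard WER is the word-level edit distance between hypothesis and reference, divided by the reference length. A minimal sketch:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref = reference.split()
    hyp = hypothesis.split()
    # Dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deleting i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # inserting j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(ref)][len(hyp)] / max(len(ref), 1)
```

Lower is better; a value of 0.5479 (the final validation figure above) would mean roughly one word error for every two reference words under this definition.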

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: AdamWeightDecay (learning_rate=1e-05, decay=0.0, beta_1=0.9, beta_2=0.999, epsilon=1e-07, amsgrad=False, weight_decay_rate=0.01)
  • training_precision: float32
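AdamWeightDecay is the Transformers TensorFlow variant of AdamW: Adam moment estimates plus weight decay applied directly to the parameter rather than folded into the gradient. A minimal pure-Python sketch of one scalar update step, using the hyperparameters listed above (the function and variable names here are illustrative, not part of the library API):

```python
import math

# Hyperparameters from the card's AdamWeightDecay config.
LR, BETA1, BETA2, EPS, WD = 1e-05, 0.9, 0.999, 1e-07, 0.01

def adamw_step(param, grad, m, v, t):
    """One AdamW update on a scalar parameter (t is the 1-based step count)."""
    m = BETA1 * m + (1 - BETA1) * grad          # first-moment estimate
    v = BETA2 * v + (1 - BETA2) * grad * grad   # second-moment estimate
    m_hat = m / (1 - BETA1 ** t)                # bias correction
    v_hat = v / (1 - BETA2 ** t)
    # Decoupled weight decay: WD * param is added outside the Adam ratio.
    param = param - LR * (m_hat / (math.sqrt(v_hat) + EPS) + WD * param)
    return param, m, v
```

With `decay=0.0` in the config, the learning rate stays fixed at 1e-05 for the whole run rather than following a schedule.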

Training results

| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 5.6249     | 0.0091         | 1.7162       | 4.2965          | 0.0094              | 0.9447            | 0     |
| 4.9223     | 0.0099         | 0.9041       | 4.1562          | 0.0097              | 0.9327            | 1     |
| 4.6814     | 0.0107         | 0.8376       | 3.9245          | 0.0103              | 0.8927            | 2     |
| 4.4407     | 0.0114         | 0.8311       | 3.7252          | 0.0107              | 0.8775            | 3     |
| 4.2445     | 0.0119         | 0.8228       | 3.6283          | 0.0108              | 0.8695            | 4     |
| 4.0889     | 0.0123         | 0.8067       | 3.5310          | 0.0110              | 0.8916            | 5     |
| 3.9575     | 0.0127         | 0.7908       | 3.4478          | 0.0113              | 0.8407            | 6     |
| 3.8547     | 0.0130         | 0.7781       | 3.4227          | 0.0113              | 0.8670            | 7     |
| 3.7599     | 0.0133         | 0.7654       | 3.3519          | 0.0115              | 0.8375            | 8     |
| 3.6763     | 0.0136         | 0.7543       | 3.3183          | 0.0116              | 0.8678            | 9     |
| 3.6006     | 0.0138         | 0.7421       | 3.2581          | 0.0117              | 0.8120            | 10    |
| 3.5300     | 0.0140         | 0.7296       | 3.2415          | 0.0118              | 0.8257            | 11    |
| 3.4554     | 0.0143         | 0.7179       | 3.2163          | 0.0119              | 0.8078            | 12    |
| 3.3930     | 0.0145         | 0.7057       | 3.1612          | 0.0121              | 0.7758            | 13    |
| 3.3218     | 0.0148         | 0.6946       | 3.1357          | 0.0122              | 0.7760            | 14    |
| 3.2424     | 0.0151         | 0.6806       | 3.0812          | 0.0123              | 0.7639            | 15    |
| 3.1577     | 0.0155         | 0.6633       | 3.0193          | 0.0126              | 0.7428            | 16    |
| 3.0655     | 0.0159         | 0.6454       | 2.9643          | 0.0128              | 0.7423            | 17    |
| 2.9579     | 0.0164         | 0.6271       | 2.8510          | 0.0132              | 0.7103            | 18    |
| 2.8149     | 0.0170         | 0.6022       | 2.7020          | 0.0136              | 0.6811            | 19    |
| 2.6475     | 0.0178         | 0.5775       | 2.5406          | 0.0142              | 0.6495            | 20    |
| 2.4340     | 0.0189         | 0.5451       | 2.3364          | 0.0149              | 0.6166            | 21    |
| 2.2002     | 0.0200         | 0.5065       | 2.1300          | 0.0155              | 0.5766            | 22    |
| 1.9511     | 0.0213         | 0.4658       | 1.9335          | 0.0162              | 0.5419            | 23    |
| 1.6777     | 0.0228         | 0.4184       | 1.7327          | 0.0169              | 0.5071            | 24    |
| 1.4282     | 0.0243         | 0.3754       | 1.5461          | 0.0176              | 0.4669            | 25    |
| 1.2219     | 0.0255         | 0.3365       | 1.4027          | 0.0181              | 0.4326            | 26    |
| 1.0535     | 0.0265         | 0.3016       | 1.2979          | 0.0185              | 0.4134            | 27    |
| 0.9205     | 0.0274         | 0.2731       | 1.1891          | 0.0189              | 0.3843            | 28    |
| 0.8079     | 0.0281         | 0.2453       | 1.1135          | 0.0192              | 0.3659            | 29    |
| 0.7140     | 0.0288         | 0.2218       | 1.0532          | 0.0195              | 0.3495            | 30    |
| 0.6318     | 0.0293         | 0.1975       | 0.9976          | 0.0197              | 0.3351            | 31    |
| 0.5623     | 0.0298         | 0.1770       | 0.9571          | 0.0199              | 0.3256            | 32    |
| 0.4990     | 0.0303         | 0.1582       | 0.9184          | 0.0200              | 0.3147            | 33    |
| 0.4444     | 0.0307         | 0.1424       | 0.8865          | 0.0202              | 0.3062            | 34    |
| 0.3949     | 0.0311         | 0.1260       | 0.8532          | 0.0203              | 0.2968            | 35    |
| 0.3505     | 0.0314         | 0.1118       | 0.8333          | 0.0204              | 0.2898            | 36    |
| 0.3104     | 0.0317         | 0.0988       | 0.8245          | 0.0204              | 0.2881            | 37    |
| 0.2743     | 0.0321         | 0.0886       | 0.8014          | 0.0205              | 0.2825            | 38    |
| 0.2428     | 0.0323         | 0.0842       | 0.7944          | 0.0206              | 0.2794            | 39    |
| 0.2120     | 0.0326         | 0.0880       | 0.7742          | 0.0206              | 0.2762            | 40    |
| 0.1863     | 0.0328         | 0.1289       | 0.7744          | 0.0206              | 0.2863            | 41    |
| 0.1621     | 0.0330         | 0.1792       | 0.7683          | 0.0207              | 0.2873            | 42    |
| 0.1390     | 0.0332         | 0.1918       | 0.7664          | 0.0207              | 0.4006            | 43    |
| 0.1194     | 0.0334         | 0.3137       | 0.7596          | 0.0207              | 0.5479            | 44    |

Framework versions

  • Transformers 4.34.0.dev0
  • TensorFlow 2.13.0
  • Tokenizers 0.13.3