
whisper_input_decoder_no_lob__0075

This model is a fine-tuned version of openai/whisper-tiny on an unknown dataset. It achieves the following results on the evaluation set:

  • Train Loss: 0.2529
  • Train Accuracy: 0.0339
  • Train Wermet: 0.0685
  • Validation Loss: 1.2239
  • Validation Accuracy: 0.0207
  • Validation Wermet: 0.3321
  • Epoch: 74
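Since training used TensorFlow (see the framework versions below), the checkpoint can be loaded with the TF Whisper classes. A minimal loading sketch follows; the repo id is hypothetical (substitute the actual Hub path where this checkpoint is published), and the commented transcription steps assume 16 kHz mono audio as a NumPy array:

```python
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration

model_id = "your-username/whisper_input_decoder_no_lob__0075"  # hypothetical repo id

# The processor (feature extractor + tokenizer) comes from the base checkpoint.
processor = WhisperProcessor.from_pretrained("openai/whisper-tiny")
model = TFWhisperForConditionalGeneration.from_pretrained(model_id)

# Transcribing a 16 kHz waveform `audio` (NumPy array) would look like:
# inputs = processor(audio, sampling_rate=16000, return_tensors="tf")
# generated_ids = model.generate(inputs.input_features)
# text = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
```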

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
  • training_precision: float32
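As a sketch, the optimizer configuration above can be reconstructed with the `AdamWeightDecay` class from Transformers' TensorFlow utilities; the logged `'decay': 0.0` is the legacy Keras learning-rate decay, which is already the default and so is omitted here:

```python
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=1e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    weight_decay_rate=0.01,
)
# model.compile(optimizer=optimizer)  # then train with model.fit(...) as usual
```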

Training results

| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 5.4122     | 0.0107         | 0.9328       | 3.9759          | 0.0114              | 0.9606            | 0     |
| 4.7176     | 0.0116         | 0.8683       | 3.9404          | 0.0114              | 0.9334            | 1     |
| 4.6750     | 0.0117         | 0.8478       | 3.9211          | 0.0115              | 0.9237            | 2     |
| 4.6511     | 0.0117         | 0.8413       | 3.8864          | 0.0115              | 0.9331            | 3     |
| 4.6294     | 0.0118         | 0.8270       | 3.8729          | 0.0115              | 0.9228            | 4     |
| 4.6134     | 0.0118         | 0.8199       | 3.8690          | 0.0114              | 0.9451            | 5     |
| 4.5980     | 0.0118         | 0.8102       | 3.8491          | 0.0115              | 0.9152            | 6     |
| 4.5759     | 0.0119         | 0.7890       | 3.8366          | 0.0116              | 0.8691            | 7     |
| 4.5518     | 0.0120         | 0.7694       | 3.8081          | 0.0116              | 0.9013            | 8     |
| 4.5219     | 0.0121         | 0.7591       | 3.7734          | 0.0118              | 0.8383            | 9     |
| 4.4761     | 0.0122         | 0.7400       | 3.7156          | 0.0120              | 0.8125            | 10    |
| 4.4139     | 0.0125         | 0.7257       | 3.6311          | 0.0121              | 0.8188            | 11    |
| 4.3113     | 0.0128         | 0.7127       | 3.5089          | 0.0124              | 0.8008            | 12    |
| 4.1608     | 0.0132         | 0.7088       | 3.3587          | 0.0127              | 0.7742            | 13    |
| 3.9595     | 0.0138         | 0.7012       | 3.1493          | 0.0132              | 0.7718            | 14    |
| 3.7188     | 0.0145         | 0.6820       | 2.8784          | 0.0139              | 0.7292            | 15    |
| 3.4775     | 0.0153         | 0.6678       | 2.6716          | 0.0144              | 0.7074            | 16    |
| 3.2575     | 0.0160         | 0.6481       | 2.4980          | 0.0149              | 0.6764            | 17    |
| 3.0615     | 0.0167         | 0.6314       | 2.3456          | 0.0153              | 0.6476            | 18    |
| 2.8715     | 0.0174         | 0.6094       | 2.2090          | 0.0158              | 0.6210            | 19    |
| 2.6930     | 0.0181         | 0.5931       | 2.0918          | 0.0162              | 0.5992            | 20    |
| 2.5383     | 0.0187         | 0.5739       | 1.9769          | 0.0166              | 0.5791            | 21    |
| 2.3952     | 0.0193         | 0.5512       | 1.9042          | 0.0168              | 0.5589            | 22    |
| 2.2427     | 0.0201         | 0.5333       | 1.8028          | 0.0172              | 0.5394            | 23    |
| 2.1236     | 0.0206         | 0.5174       | 1.7434          | 0.0174              | 0.5240            | 24    |
| 2.0315     | 0.0211         | 0.4978       | 1.6755          | 0.0177              | 0.5084            | 25    |
| 1.9066     | 0.0217         | 0.4773       | 1.6534          | 0.0178              | 0.4947            | 26    |
| 1.8279     | 0.0221         | 0.4596       | 1.5606          | 0.0182              | 0.4788            | 27    |
| 1.7325     | 0.0227         | 0.4412       | 1.5173          | 0.0184              | 0.4667            | 28    |
| 1.6416     | 0.0232         | 0.4199       | 1.4733          | 0.0186              | 0.4511            | 29    |
| 1.5702     | 0.0236         | 0.4028       | 1.4519          | 0.0187              | 0.4442            | 30    |
| 1.4787     | 0.0241         | 0.3839       | 1.4213          | 0.0188              | 0.4322            | 31    |
| 1.4238     | 0.0244         | 0.3700       | 1.3971          | 0.0190              | 0.4272            | 32    |
| 1.3561     | 0.0249         | 0.3594       | 1.3499          | 0.0192              | 0.4171            | 33    |
| 1.2828     | 0.0254         | 0.3431       | 1.3555          | 0.0192              | 0.4097            | 34    |
| 1.2318     | 0.0257         | 0.3277       | 1.3183          | 0.0194              | 0.4035            | 35    |
| 1.1668     | 0.0262         | 0.3201       | 1.3068          | 0.0195              | 0.3978            | 36    |
| 1.1571     | 0.0261         | 0.3105       | 1.2901          | 0.0195              | 0.3916            | 37    |
| 1.0812     | 0.0267         | 0.2989       | 1.2720          | 0.0197              | 0.3860            | 38    |
| 1.0134     | 0.0273         | 0.2863       | 1.2593          | 0.0197              | 0.3777            | 39    |
| 0.9986     | 0.0273         | 0.2769       | 1.2629          | 0.0198              | 0.3754            | 40    |
| 0.9322     | 0.0279         | 0.2653       | 1.2320          | 0.0199              | 0.3694            | 41    |
| 0.9021     | 0.0281         | 0.2552       | 1.2308          | 0.0200              | 0.3651            | 42    |
| 0.8583     | 0.0284         | 0.2444       | 1.2199          | 0.0200              | 0.3614            | 43    |
| 0.8101     | 0.0288         | 0.2355       | 1.2120          | 0.0200              | 0.3597            | 44    |
| 0.8045     | 0.0288         | 0.2299       | 1.2023          | 0.0201              | 0.3567            | 45    |
| 0.7823     | 0.0290         | 0.2213       | 1.2075          | 0.0201              | 0.3529            | 46    |
| 0.7186     | 0.0296         | 0.2107       | 1.1917          | 0.0202              | 0.3530            | 47    |
| 0.6949     | 0.0298         | 0.2028       | 1.1926          | 0.0202              | 0.3465            | 48    |
| 0.6669     | 0.0300         | 0.1943       | 1.1902          | 0.0203              | 0.3446            | 49    |
| 0.6125     | 0.0305         | 0.1842       | 1.1892          | 0.0203              | 0.3437            | 50    |
| 0.5926     | 0.0307         | 0.1778       | 1.2058          | 0.0203              | 0.3450            | 51    |
| 0.6055     | 0.0305         | 0.1738       | 1.1859          | 0.0203              | 0.3394            | 52    |
| 0.5828     | 0.0307         | 0.1653       | 1.1921          | 0.0203              | 0.3379            | 53    |
| 0.5507     | 0.0311         | 0.1569       | 1.1906          | 0.0204              | 0.3385            | 54    |
| 0.5050     | 0.0315         | 0.1485       | 1.1834          | 0.0205              | 0.3361            | 55    |
| 0.4878     | 0.0316         | 0.1447       | 1.1815          | 0.0205              | 0.3329            | 56    |
| 0.4825     | 0.0317         | 0.1410       | 1.2096          | 0.0204              | 0.3359            | 57    |
| 0.4987     | 0.0315         | 0.1374       | 1.2000          | 0.0204              | 0.3352            | 58    |
| 0.4576     | 0.0319         | 0.1305       | 1.1868          | 0.0205              | 0.3329            | 59    |
| 0.4185     | 0.0323         | 0.1215       | 1.2043          | 0.0205              | 0.3322            | 60    |
| 0.3889     | 0.0326         | 0.1156       | 1.1853          | 0.0206              | 0.3302            | 61    |
| 0.3790     | 0.0327         | 0.1101       | 1.2028          | 0.0205              | 0.3316            | 62    |
| 0.4072     | 0.0324         | 0.1110       | 1.2502          | 0.0203              | 0.3309            | 63    |
| 0.3519     | 0.0330         | 0.1020       | 1.1959          | 0.0206              | 0.3284            | 64    |
| 0.3861     | 0.0326         | 0.1034       | 1.1885          | 0.0206              | 0.3271            | 65    |
| 0.3789     | 0.0326         | 0.0961       | 1.1969          | 0.0206              | 0.3298            | 66    |
| 0.3233     | 0.0332         | 0.0905       | 1.1922          | 0.0207              | 0.3280            | 67    |
| 0.2956     | 0.0335         | 0.0854       | 1.2003          | 0.0207              | 0.3296            | 68    |
| 0.2666     | 0.0339         | 0.0796       | 1.2141          | 0.0207              | 0.3252            | 69    |
| 0.3181     | 0.0333         | 0.0813       | 1.2133          | 0.0207              | 0.3302            | 70    |
| 0.3032     | 0.0335         | 0.0770       | 1.2170          | 0.0207              | 0.3315            | 71    |
| 0.2746     | 0.0337         | 0.0741       | 1.2180          | 0.0207              | 0.3299            | 72    |
| 0.2549     | 0.0339         | 0.0705       | 1.2496          | 0.0206              | 0.3308            | 73    |
| 0.2529     | 0.0339         | 0.0685       | 1.2239          | 0.0207              | 0.3321            | 74    |
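The "Wermet" columns appear to track a word-error-rate-style metric, though the exact implementation used during training is not documented here. A minimal sketch of an equivalent check, assuming standard WER and using the `evaluate` library as a stand-in:

```python
import evaluate

# Standard word error rate: word-level edits divided by reference length.
# This is an assumption; the training script's "Wermet" metric may differ.
wer = evaluate.load("wer")
score = wer.compute(
    predictions=["the cat sat on the mat"],
    references=["the cat sat on a mat"],
)
print(score)  # one substitution over six reference words -> ~0.167
```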

Framework versions

  • Transformers 4.33.0.dev0
  • TensorFlow 2.13.0
  • Tokenizers 0.13.3