
whisper_charsplit_new_0099

This model is a fine-tuned version of openai/whisper-tiny on an unknown dataset. It achieves the following results at the end of training:

  • Train Loss: 0.0022
  • Train Accuracy: 0.0795
  • Train Wermet: 8.7907
  • Validation Loss: 0.5556
  • Validation Accuracy: 0.0766
  • Validation Wermet: 7.7851
  • Epoch: 98
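
Since the card does not document usage, here is a minimal TensorFlow inference sketch. It assumes the checkpoint and its processor files are published on the Hugging Face Hub; the repo id below is a placeholder, not one given in this card.

```python
import numpy as np
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration

repo_id = "your-username/whisper_charsplit_new_0099"  # placeholder repo id

processor = WhisperProcessor.from_pretrained(repo_id)
model = TFWhisperForConditionalGeneration.from_pretrained(repo_id)

# Whisper expects 16 kHz mono float audio; a silent dummy clip stands in here.
audio = np.zeros(16000, dtype=np.float32)
inputs = processor(audio, sampling_rate=16000, return_tensors="tf")

generated_ids = model.generate(inputs.input_features)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```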

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
  • training_precision: float32
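
For reference, these settings correspond to the AdamWeightDecay optimizer shipped with Transformers' TensorFlow utilities. A sketch reconstructing it; the exclude_from_weight_decay list is an assumption, since the card does not record which parameters were excluded from decay:

```python
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=1e-05,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    # Assumption: LayerNorm and bias parameters excluded from weight decay,
    # as is common practice; not recorded in this card.
    exclude_from_weight_decay=["LayerNorm", "layer_norm", "bias"],
)
```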

Training results

| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.8733 | 0.0602 | 13.0686 | 0.6470 | 0.0676 | 11.4066 | 0 |
| 0.5740 | 0.0666 | 12.7778 | 0.5113 | 0.0706 | 11.1022 | 1 |
| 0.4553 | 0.0692 | 12.2404 | 0.4371 | 0.0723 | 10.9105 | 2 |
| 0.3813 | 0.0708 | 11.9157 | 0.3935 | 0.0733 | 9.4615 | 3 |
| 0.3292 | 0.0720 | 11.5732 | 0.3630 | 0.0740 | 9.9885 | 4 |
| 0.2886 | 0.0729 | 11.5171 | 0.3403 | 0.0745 | 9.8042 | 5 |
| 0.2561 | 0.0736 | 11.3173 | 0.3256 | 0.0749 | 9.9431 | 6 |
| 0.2282 | 0.0743 | 11.7308 | 0.3159 | 0.0752 | 9.2086 | 7 |
| 0.2036 | 0.0748 | 11.4503 | 0.3071 | 0.0754 | 9.5236 | 8 |
| 0.1820 | 0.0754 | 11.7175 | 0.3005 | 0.0756 | 10.0755 | 9 |
| 0.1628 | 0.0758 | 11.7056 | 0.2993 | 0.0757 | 9.9497 | 10 |
| 0.1450 | 0.0762 | 11.7637 | 0.2971 | 0.0758 | 10.1481 | 11 |
| 0.1287 | 0.0766 | 11.8509 | 0.3029 | 0.0759 | 10.2042 | 12 |
| 0.1140 | 0.0770 | 12.1100 | 0.3004 | 0.0760 | 10.3873 | 13 |
| 0.0998 | 0.0773 | 11.9502 | 0.3025 | 0.0761 | 10.7066 | 14 |
| 0.0872 | 0.0777 | 12.3196 | 0.3129 | 0.0759 | 10.7707 | 15 |
| 0.0760 | 0.0779 | 12.2637 | 0.3142 | 0.0761 | 10.2638 | 16 |
| 0.0651 | 0.0782 | 12.1215 | 0.3192 | 0.0761 | 10.0750 | 17 |
| 0.0547 | 0.0785 | 12.0551 | 0.3294 | 0.0761 | 10.4732 | 18 |
| 0.0463 | 0.0787 | 11.9677 | 0.3402 | 0.0760 | 10.2814 | 19 |
| 0.0386 | 0.0789 | 11.6855 | 0.3517 | 0.0760 | 10.0599 | 20 |
| 0.0318 | 0.0790 | 11.6314 | 0.3628 | 0.0760 | 9.6652 | 21 |
| 0.0262 | 0.0792 | 11.4603 | 0.3728 | 0.0760 | 10.0035 | 22 |
| 0.0224 | 0.0792 | 11.4330 | 0.3824 | 0.0760 | 9.1995 | 23 |
| 0.0181 | 0.0793 | 11.3124 | 0.3982 | 0.0759 | 9.8710 | 24 |
| 0.0142 | 0.0794 | 11.3562 | 0.4057 | 0.0760 | 9.6831 | 25 |
| 0.0118 | 0.0794 | 11.0532 | 0.4207 | 0.0759 | 9.7227 | 26 |
| 0.0101 | 0.0794 | 11.2963 | 0.4282 | 0.0760 | 9.5792 | 27 |
| 0.0114 | 0.0794 | 11.3093 | 0.4431 | 0.0758 | 9.5545 | 28 |
| 0.0109 | 0.0794 | 11.4214 | 0.4419 | 0.0760 | 9.4377 | 29 |
| 0.0084 | 0.0794 | 10.9143 | 0.4474 | 0.0760 | 9.3668 | 30 |
| 0.0043 | 0.0795 | 10.9497 | 0.4525 | 0.0761 | 9.3202 | 31 |
| 0.0036 | 0.0795 | 10.7759 | 0.4667 | 0.0761 | 9.0385 | 32 |
| 0.0047 | 0.0795 | 10.7613 | 0.4788 | 0.0759 | 9.4065 | 33 |
| 0.0130 | 0.0793 | 11.1022 | 0.4748 | 0.0760 | 9.4521 | 34 |
| 0.0074 | 0.0794 | 10.9738 | 0.4730 | 0.0760 | 9.3348 | 35 |
| 0.0032 | 0.0795 | 10.6370 | 0.4750 | 0.0762 | 8.8298 | 36 |
| 0.0020 | 0.0795 | 10.7428 | 0.4835 | 0.0762 | 9.0566 | 37 |
| 0.0014 | 0.0795 | 10.6908 | 0.4937 | 0.0761 | 9.2445 | 38 |
| 0.0035 | 0.0795 | 10.6833 | 0.5276 | 0.0757 | 8.9798 | 39 |
| 0.0120 | 0.0793 | 10.4810 | 0.4963 | 0.0760 | 8.9194 | 40 |
| 0.0045 | 0.0795 | 10.2251 | 0.5014 | 0.0761 | 8.5737 | 41 |
| 0.0028 | 0.0795 | 10.3174 | 0.4968 | 0.0762 | 8.8525 | 42 |
| 0.0023 | 0.0795 | 10.4871 | 0.5027 | 0.0762 | 8.6712 | 43 |
| 0.0024 | 0.0795 | 10.3731 | 0.5055 | 0.0762 | 8.6347 | 44 |
| 0.0041 | 0.0795 | 10.2751 | 0.5242 | 0.0760 | 8.3671 | 45 |
| 0.0070 | 0.0794 | 10.2166 | 0.5169 | 0.0760 | 8.8409 | 46 |
| 0.0037 | 0.0795 | 10.0455 | 0.5174 | 0.0762 | 8.2514 | 47 |
| 0.0023 | 0.0795 | 9.9201 | 0.5167 | 0.0763 | 8.9537 | 48 |
| 0.0008 | 0.0795 | 10.0022 | 0.5166 | 0.0764 | 8.4855 | 49 |
| 0.0006 | 0.0795 | 9.9494 | 0.5233 | 0.0763 | 8.5719 | 50 |
| 0.0069 | 0.0794 | 10.2037 | 0.5434 | 0.0759 | 8.5399 | 51 |
| 0.0083 | 0.0794 | 9.9557 | 0.5173 | 0.0762 | 8.2406 | 52 |
| 0.0032 | 0.0795 | 10.0283 | 0.5240 | 0.0763 | 9.0101 | 53 |
| 0.0018 | 0.0795 | 10.0694 | 0.5247 | 0.0763 | 8.5717 | 54 |
| 0.0008 | 0.0795 | 10.1079 | 0.5217 | 0.0764 | 8.5608 | 55 |
| 0.0005 | 0.0795 | 10.0546 | 0.5286 | 0.0764 | 8.8830 | 56 |
| 0.0007 | 0.0795 | 10.2557 | 0.5328 | 0.0764 | 8.5665 | 57 |
| 0.0006 | 0.0795 | 10.2165 | 0.5412 | 0.0763 | 8.4623 | 58 |
| 0.0124 | 0.0792 | 10.2304 | 0.5284 | 0.0762 | 9.1194 | 59 |
| 0.0044 | 0.0795 | 10.3884 | 0.5223 | 0.0764 | 8.8152 | 60 |
| 0.0015 | 0.0795 | 9.8557 | 0.5227 | 0.0764 | 8.3774 | 61 |
| 0.0005 | 0.0795 | 9.8123 | 0.5233 | 0.0765 | 8.5043 | 62 |
| 0.0003 | 0.0795 | 9.7631 | 0.5282 | 0.0765 | 8.3860 | 63 |
| 0.0003 | 0.0795 | 9.7593 | 0.5320 | 0.0765 | 8.4815 | 64 |
| 0.0002 | 0.0795 | 9.7663 | 0.5357 | 0.0765 | 8.4281 | 65 |
| 0.0034 | 0.0795 | 9.8382 | 0.5771 | 0.0758 | 8.8051 | 66 |
| 0.0123 | 0.0792 | 10.2575 | 0.5261 | 0.0763 | 9.3701 | 67 |
| 0.0027 | 0.0795 | 10.3802 | 0.5272 | 0.0764 | 8.8216 | 68 |
| 0.0011 | 0.0795 | 10.1683 | 0.5291 | 0.0764 | 8.5736 | 69 |
| 0.0012 | 0.0795 | 10.1305 | 0.5336 | 0.0765 | 8.6648 | 70 |
| 0.0008 | 0.0795 | 10.2545 | 0.5315 | 0.0765 | 9.0617 | 71 |
| 0.0006 | 0.0795 | 10.4562 | 0.5369 | 0.0765 | 9.6485 | 72 |
| 0.0032 | 0.0795 | 10.2347 | 0.5569 | 0.0763 | 8.4947 | 73 |
| 0.0062 | 0.0794 | 10.1654 | 0.5471 | 0.0763 | 8.8666 | 74 |
| 0.0029 | 0.0795 | 10.1320 | 0.5376 | 0.0765 | 8.7713 | 75 |
| 0.0012 | 0.0795 | 10.2943 | 0.5406 | 0.0765 | 8.6959 | 76 |
| 0.0006 | 0.0795 | 10.1888 | 0.5371 | 0.0767 | 8.9689 | 77 |
| 0.0005 | 0.0795 | 10.2138 | 0.5398 | 0.0766 | 8.7470 | 78 |
| 0.0016 | 0.0795 | 10.2173 | 0.5497 | 0.0764 | 8.9675 | 79 |
| 0.0065 | 0.0794 | 10.2806 | 0.5559 | 0.0763 | 9.4487 | 80 |
| 0.0028 | 0.0795 | 10.7728 | 0.5394 | 0.0766 | 8.9716 | 81 |
| 0.0012 | 0.0795 | 10.3247 | 0.5453 | 0.0765 | 8.9986 | 82 |
| 0.0013 | 0.0795 | 10.3174 | 0.5535 | 0.0765 | 8.9229 | 83 |
| 0.0011 | 0.0795 | 10.2846 | 0.5452 | 0.0766 | 9.1239 | 84 |
| 0.0007 | 0.0795 | 10.1996 | 0.5491 | 0.0766 | 8.9308 | 85 |
| 0.0034 | 0.0795 | 10.5048 | 0.5578 | 0.0764 | 8.9920 | 86 |
| 0.0038 | 0.0795 | 10.1430 | 0.5538 | 0.0765 | 9.1635 | 87 |
| 0.0019 | 0.0795 | 10.3176 | 0.5492 | 0.0766 | 8.5812 | 88 |
| 0.0007 | 0.0795 | 10.2569 | 0.5488 | 0.0766 | 8.9133 | 89 |
| 0.0006 | 0.0795 | 10.2538 | 0.5541 | 0.0766 | 8.7676 | 90 |
| 0.0029 | 0.0795 | 10.1412 | 0.5666 | 0.0764 | 9.0822 | 91 |
| 0.0042 | 0.0795 | 9.5603 | 0.5582 | 0.0765 | 7.6837 | 92 |
| 0.0015 | 0.0795 | 9.4004 | 0.5495 | 0.0766 | 7.7859 | 93 |
| 0.0008 | 0.0795 | 9.5417 | 0.5503 | 0.0767 | 7.8876 | 94 |
| 0.0005 | 0.0795 | 9.3473 | 0.5590 | 0.0766 | 7.8967 | 95 |
| 0.0016 | 0.0795 | 9.1740 | 0.5746 | 0.0765 | 7.8469 | 96 |
| 0.0044 | 0.0794 | 8.8948 | 0.5589 | 0.0765 | 7.4085 | 97 |
| 0.0022 | 0.0795 | 8.7907 | 0.5556 | 0.0766 | 7.7851 | 98 |
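
The Wermet columns appear to track a word-error-rate-style metric (lower is better); its exact definition is not documented in this card, and values above 1.0 suggest it is not the standard WER fraction. For comparison, a standard WER computation with the evaluate library:

```python
import evaluate

wer = evaluate.load("wer")
predictions = ["the cat sat on the mat"]
references = ["the cat sat on a mat"]
# One substitution over six reference words -> WER of about 0.167
print(wer.compute(predictions=predictions, references=references))
```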

Framework versions

  • Transformers 4.32.0.dev0
  • TensorFlow 2.12.0
  • Tokenizers 0.13.3