
whisper_tn_hi

This model is a fine-tuned version of openai/whisper-tiny on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.6400
  • WER: 147.3885
  • CER: 295.3741
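
Note that WER and CER can exceed 100%: both metrics count substitutions, deletions, and insertions against the length of the reference, so a model that emits many extra tokens is penalized beyond 100%. Below is a minimal sketch of computing these metrics with the `evaluate` library; the card does not state which tooling produced the reported numbers, so this is illustrative only.

```python
import evaluate  # both metrics are backed by the jiwer package

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Toy example: insertions can push the error rate past 100%,
# as in the evaluation figures above.
references = ["a short reference transcript"]
predictions = ["a short predicted transcript padded with many extra words"]

# compute() returns a fraction; multiply by 100 to match the
# percentage-style numbers in this card.
print("WER:", 100 * wer_metric.compute(predictions=predictions, references=references))
print("CER:", 100 * cer_metric.compute(predictions=predictions, references=references))
```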

Model description

More information needed

Intended uses & limitations

More information needed
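
The intended task is not stated, but since the base checkpoint openai/whisper-tiny is a speech-to-text model, a plain transcription pipeline is the most likely entry point. A minimal inference sketch, assuming this checkpoint is published on the Hub under a repo id like "whisper_tn_hi" (an assumption; substitute the real path) and that ffmpeg is available for audio decoding:

```python
from transformers import pipeline

# "whisper_tn_hi" is a placeholder repo id; replace it with the
# actual Hub path of this checkpoint.
asr = pipeline("automatic-speech-recognition", model="whisper_tn_hi")

# Transcribe a local audio file; ffmpeg handles decoding and resampling.
result = asr("sample.wav")
print(result["text"])
```

Given the evaluation WER above 100%, transcriptions from this checkpoint should be treated as unreliable without further validation.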

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list for how they map to Seq2SeqTrainingArguments):

  • learning_rate: 1e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 100
  • mixed_precision_training: Native AMP
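
For reference, the hyperparameters above map onto transformers.Seq2SeqTrainingArguments roughly as sketched below. The output_dir is an assumption, and the stated Adam betas/epsilon are the Trainer defaults, so they need no explicit arguments.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper_tn_hi",   # assumed; not stated in the card
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # total train batch size: 8 * 2 = 16
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=100,
    fp16=True,                      # native AMP mixed precision
)
```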

Training results

| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
| 1.6414 | 1.0 | 409 | 0.8809 | 83.9795 | 248.4138 |
| 0.531 | 2.0 | 818 | 0.6346 | 75.4042 | 280.7284 |
| 0.3538 | 3.0 | 1227 | 0.5810 | 69.8595 | 259.1276 |
| 0.2679 | 4.0 | 1636 | 0.5639 | 84.1488 | 332.292 |
| 0.2074 | 5.0 | 2045 | 0.5715 | 85.9773 | 264.435 |
| 0.1599 | 6.0 | 2454 | 0.6074 | 86.481 | 262.8894 |
| 0.1216 | 7.0 | 2863 | 0.6402 | 110.0398 | 285.4649 |
| 0.0903 | 8.0 | 3272 | 0.6736 | 92.7961 | 278.7952 |
| 0.0663 | 9.0 | 3681 | 0.7023 | 96.3896 | 268.8475 |
| 0.0472 | 10.0 | 4090 | 0.7527 | 104.0041 | 276.3805 |
| 0.0335 | 11.0 | 4499 | 0.7907 | 99.0646 | 274.9479 |
| 0.0235 | 12.0 | 4908 | 0.8320 | 128.1004 | 282.2682 |
| 0.0169 | 13.0 | 5317 | 0.8741 | 116.1305 | 277.659 |
| 0.0124 | 14.0 | 5726 | 0.9090 | 137.6534 | 290.6719 |
| 0.0094 | 15.0 | 6135 | 0.9492 | 117.0405 | 285.2422 |
| 0.007 | 16.0 | 6544 | 0.9905 | 122.1663 | 280.3801 |
| 0.0061 | 17.0 | 6953 | 1.0199 | 125.0656 | 277.5049 |
| 0.0051 | 18.0 | 7362 | 1.0383 | 117.3368 | 278.8806 |
| 0.0044 | 19.0 | 7771 | 1.0617 | 110.8736 | 275.7667 |
| 0.0041 | 20.0 | 8180 | 1.0867 | 142.9061 | 291.1584 |
| 0.0037 | 21.0 | 8589 | 1.1224 | 119.377 | 273.63 |
| 0.0026 | 22.0 | 8998 | 1.1322 | 158.2452 | 295.0852 |
| 0.0024 | 23.0 | 9407 | 1.1619 | 134.9446 | 283.1038 |
| 0.0022 | 24.0 | 9816 | 1.1677 | 124.5789 | 283.5701 |
| 0.002 | 25.0 | 10225 | 1.1898 | 125.0275 | 288.5318 |
| 0.0019 | 26.0 | 10634 | 1.1994 | 138.0386 | 288.9011 |
| 0.0023 | 27.0 | 11043 | 1.2216 | 119.7071 | 279.7329 |
| 0.0021 | 28.0 | 11452 | 1.2521 | 96.3388 | 266.6656 |
| 0.0018 | 29.0 | 11861 | 1.2568 | 148.4932 | 288.7504 |
| 0.0018 | 30.0 | 12270 | 1.2541 | 115.3771 | 283.8205 |
| 0.0021 | 31.0 | 12679 | 1.2291 | 98.8995 | 271.6817 |
| 0.0014 | 32.0 | 13088 | 1.2821 | 130.6654 | 293.2197 |
| 0.0014 | 33.0 | 13497 | 1.2804 | 121.8954 | 287.2249 |
| 0.0013 | 34.0 | 13906 | 1.2802 | 137.5857 | 293.6275 |
| 0.0015 | 35.0 | 14315 | 1.3010 | 147.5789 | 296.7907 |
| 0.0014 | 36.0 | 14724 | 1.2945 | 139.6766 | 292.4335 |
| 0.0012 | 37.0 | 15133 | 1.3310 | 144.5653 | 288.2045 |
| 0.0011 | 38.0 | 15542 | 1.3200 | 160.6493 | 306.8297 |
| 0.0009 | 39.0 | 15951 | 1.3394 | 211.9783 | 341.8621 |
| 0.0013 | 40.0 | 16360 | 1.3367 | 133.4166 | 304.8621 |
| 0.0007 | 41.0 | 16769 | 1.3472 | 154.6601 | 319.8702 |
| 0.0005 | 42.0 | 17178 | 1.3617 | 149.0815 | 301.4669 |
| 0.0009 | 43.0 | 17587 | 1.3570 | 163.2312 | 319.4675 |
| 0.0009 | 44.0 | 17996 | 1.3723 | 149.9915 | 310.0088 |
| 0.0009 | 45.0 | 18405 | 1.3809 | 133.1118 | 289.5232 |
| 0.0009 | 46.0 | 18814 | 1.3664 | 166.6427 | 308.9287 |
| 0.0008 | 47.0 | 19223 | 1.3894 | 150.0127 | 304.3739 |
| 0.0005 | 48.0 | 19632 | 1.3632 | 129.4929 | 307.7766 |
| 0.0005 | 49.0 | 20041 | 1.3917 | 143.9304 | 313.3529 |
| 0.0005 | 50.0 | 20450 | 1.4006 | 113.0111 | 295.1966 |
| 0.0007 | 51.0 | 20859 | 1.3966 | 158.3129 | 303.4328 |
| 0.0009 | 52.0 | 21268 | 1.4149 | 138.1613 | 304.2098 |
| 0.0003 | 53.0 | 21677 | 1.3998 | 163.519 | 314.8466 |
| 0.0002 | 54.0 | 22086 | 1.4192 | 141.4035 | 302.2313 |
| 0.0001 | 55.0 | 22495 | 1.4183 | 150.2878 | 300.9336 |
| 0.0002 | 56.0 | 22904 | 1.4281 | 172.598 | 321.0298 |
| 0.0018 | 57.0 | 23313 | 1.4229 | 151.9597 | 309.6211 |
| 0.0009 | 58.0 | 23722 | 1.4263 | 128.9554 | 290.2265 |
| 0.0003 | 59.0 | 24131 | 1.4430 | 135.6599 | 301.7223 |
| 0.0002 | 60.0 | 24540 | 1.4487 | 156.1034 | 307.6167 |
| 0.0004 | 61.0 | 24949 | 1.4252 | 107.7161 | 272.8312 |
| 0.0001 | 62.0 | 25358 | 1.4254 | 123.5122 | 289.272 |
| 0.0 | 63.0 | 25767 | 1.4510 | 121.2901 | 280.6162 |
| 0.0002 | 64.0 | 26176 | 1.4407 | 111.6482 | 284.5364 |
| 0.0003 | 65.0 | 26585 | 1.4512 | 123.5207 | 285.948 |
| 0.0006 | 66.0 | 26994 | 1.4476 | 108.9224 | 280.1608 |
| 0.0005 | 67.0 | 27403 | 1.4721 | 153.8178 | 309.4788 |
| 0.0004 | 68.0 | 27812 | 1.4675 | 132.1341 | 289.9678 |
| 0.0001 | 69.0 | 28221 | 1.4712 | 135.9096 | 292.8338 |
| 0.0001 | 70.0 | 28630 | 1.4712 | 137.0228 | 294.8725 |
| 0.0 | 71.0 | 29039 | 1.4727 | 137.9582 | 292.8438 |
| 0.0 | 72.0 | 29448 | 1.4766 | 135.6514 | 291.9329 |
| 0.0 | 73.0 | 29857 | 1.4808 | 135.7784 | 292.3431 |
| 0.0 | 74.0 | 30266 | 1.4850 | 135.5414 | 291.5527 |
| 0.0 | 75.0 | 30675 | 1.4901 | 134.3224 | 290.6803 |
| 0.0 | 76.0 | 31084 | 1.4943 | 135.9562 | 291.6507 |
| 0.0 | 77.0 | 31493 | 1.4986 | 136.0069 | 291.0294 |
| 0.0 | 78.0 | 31902 | 1.5039 | 139.228 | 292.1162 |
| 0.0 | 79.0 | 32311 | 1.5092 | 138.6862 | 291.7796 |
| 0.0 | 80.0 | 32720 | 1.5146 | 139.8375 | 292.3959 |
| 0.0 | 81.0 | 33129 | 1.5208 | 138.9782 | 292.097 |
| 0.0 | 82.0 | 33538 | 1.5270 | 140.976 | 293.3127 |
| 0.0 | 83.0 | 33947 | 1.5334 | 141.3993 | 292.2359 |
| 0.0 | 84.0 | 34356 | 1.5401 | 141.2258 | 292.2309 |
| 0.0 | 85.0 | 34765 | 1.5472 | 140.7686 | 291.4648 |
| 0.0 | 86.0 | 35174 | 1.5550 | 140.6163 | 291.7997 |
| 0.0 | 87.0 | 35583 | 1.5617 | 142.9104 | 293.0816 |
| 0.0 | 88.0 | 35992 | 1.5700 | 140.9972 | 292.0618 |
| 0.0 | 89.0 | 36401 | 1.5781 | 141.5559 | 292.3054 |
| 0.0 | 90.0 | 36810 | 1.5855 | 142.4109 | 293.033 |
| 0.0 | 91.0 | 37219 | 1.5925 | 145.0436 | 293.8586 |
| 0.0 | 92.0 | 37628 | 1.6010 | 144.2648 | 293.2315 |
| 0.0 | 93.0 | 38037 | 1.6083 | 144.3833 | 293.3211 |
| 0.0 | 94.0 | 38446 | 1.6153 | 146.3007 | 294.4095 |
| 0.0 | 95.0 | 38855 | 1.6207 | 146.9864 | 295.1798 |
| 0.0 | 96.0 | 39264 | 1.6269 | 145.179 | 293.7054 |
| 0.0 | 97.0 | 39673 | 1.6321 | 148.0107 | 295.6043 |
| 0.0 | 98.0 | 40082 | 1.6358 | 147.088 | 295.2686 |
| 0.0 | 99.0 | 40491 | 1.6389 | 148.1503 | 295.822 |
| 0.0 | 100.0 | 40900 | 1.6400 | 147.3885 | 295.3741 |

Framework versions

  • Transformers 4.24.0
  • PyTorch 1.13.0
  • Datasets 2.6.1
  • Tokenizers 0.11.0