
jdrt_byclass_rinnna_hubert_asr_2

This model is a fine-tuned version of rinna/japanese-hubert-base on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4574
  • WER: 0.4461
  • CER: 0.3253
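
WER (word error rate) and CER (character error rate) are both normalized Levenshtein edit distances between the reference transcript and the model output; for Japanese ASR, CER is usually the more informative of the two, since Japanese text has no whitespace word boundaries. A minimal sketch of how these metrics are computed (function names are illustrative, not part of this repository):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (one-row DP)."""
    n = len(hyp)
    dp = list(range(n + 1))
    for i in range(1, len(ref) + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,       # deletion
                        dp[j - 1] + 1,   # insertion
                        prev + (ref[i - 1] != hyp[j - 1]))  # substitution
            prev = cur
    return dp[n]

def wer(reference, hypothesis):
    """Word error rate: edit distance over whitespace-split tokens."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference, hypothesis):
    """Character error rate: edit distance over characters."""
    return edit_distance(reference, hypothesis) / len(reference)
```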

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 256
  • eval_batch_size: 256
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 260
  • num_epochs: 50
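
With these settings, the learning rate ramps linearly from 0 to 5e-05 over the first 260 steps and then follows a cosine decay to 0 by the end of training (2650 steps total, per the results table). A sketch of that schedule, assuming the standard linear-warmup-plus-cosine formula used by the Transformers trainer:

```python
import math

def lr_at_step(step, base_lr=5e-5, warmup_steps=260, total_steps=2650):
    """Learning rate at a given optimizer step: linear warmup to base_lr,
    then cosine decay from base_lr down to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

For example, the rate peaks at exactly 5e-05 when warmup ends at step 260 and has fallen halfway by the midpoint of the cosine phase.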

Training results

| Training Loss | Epoch | Step | Validation Loss | WER    | CER    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 11.8822       | 1.0   | 53   | 8.0828          | 0.9838 | 0.9911 |
| 6.3754        | 2.0   | 106  | 4.6774          | 0.9838 | 0.9911 |
| 4.5264        | 3.0   | 159  | 3.9963          | 0.9838 | 0.9911 |
| 3.7867        | 4.0   | 212  | 3.3575          | 0.9838 | 0.9911 |
| 3.2022        | 5.0   | 265  | 2.9007          | 0.9838 | 0.9911 |
| 2.8134        | 6.0   | 318  | 2.7079          | 0.9838 | 0.9911 |
| 2.708         | 7.0   | 371  | 2.6779          | 0.9838 | 0.9911 |
| 2.6923        | 8.0   | 424  | 2.6719          | 0.9838 | 0.9911 |
| 2.6555        | 9.0   | 477  | 2.4997          | 0.9838 | 0.9911 |
| 2.3308        | 10.0  | 530  | 2.1709          | 0.9960 | 0.9522 |
| 2.1951        | 11.0  | 583  | 2.0579          | 0.9960 | 0.9522 |
| 2.0032        | 12.0  | 636  | 1.8255          | 0.9989 | 0.9302 |
| 1.8394        | 13.0  | 689  | 1.6870          | 0.9999 | 0.9021 |
| 1.6562        | 14.0  | 742  | 1.4793          | 0.9999 | 0.8139 |
| 1.5057        | 15.0  | 795  | 1.2942          | 0.9997 | 0.7813 |
| 1.4022        | 16.0  | 848  | 1.2146          | 0.9997 | 0.7594 |
| 1.2945        | 17.0  | 901  | 1.1512          | 0.9941 | 0.7395 |
| 1.2062        | 18.0  | 954  | 1.0732          | 0.9844 | 0.6726 |
| 1.1396        | 19.0  | 1007 | 1.0457          | 0.9694 | 0.6559 |
| 1.0419        | 20.0  | 1060 | 0.8567          | 0.5929 | 0.4214 |
| 0.8799        | 21.0  | 1113 | 0.7172          | 0.5273 | 0.3951 |
| 0.8249        | 22.0  | 1166 | 0.6980          | 0.5204 | 0.4048 |
| 0.7649        | 23.0  | 1219 | 0.6157          | 0.4923 | 0.3737 |
| 0.7401        | 24.0  | 1272 | 0.6171          | 0.4992 | 0.3722 |
| 0.7074        | 25.0  | 1325 | 0.5948          | 0.4835 | 0.3669 |
| 0.6543        | 26.0  | 1378 | 0.5520          | 0.4759 | 0.3537 |
| 0.6164        | 27.0  | 1431 | 0.5512          | 0.4745 | 0.3467 |
| 0.5758        | 28.0  | 1484 | 0.5286          | 0.4677 | 0.3416 |
| 0.5648        | 29.0  | 1537 | 0.5099          | 0.4644 | 0.3447 |
| 0.578         | 30.0  | 1590 | 0.5262          | 0.4701 | 0.3530 |
| 0.5343        | 31.0  | 1643 | 0.5025          | 0.4621 | 0.3463 |
| 0.5236        | 32.0  | 1696 | 0.4927          | 0.4588 | 0.3411 |
| 0.5256        | 33.0  | 1749 | 0.4971          | 0.4614 | 0.3434 |
| 0.5309        | 34.0  | 1802 | 0.4882          | 0.4553 | 0.3400 |
| 0.4938        | 35.0  | 1855 | 0.4663          | 0.4497 | 0.3309 |
| 0.4874        | 36.0  | 1908 | 0.4734          | 0.4508 | 0.3316 |
| 0.4519        | 37.0  | 1961 | 0.4642          | 0.4483 | 0.3289 |
| 0.5137        | 38.0  | 2014 | 0.4655          | 0.4490 | 0.3303 |
| 0.4832        | 39.0  | 2067 | 0.4613          | 0.4474 | 0.3282 |
| 0.4581        | 40.0  | 2120 | 0.4565          | 0.4450 | 0.3236 |
| 0.4542        | 41.0  | 2173 | 0.4550          | 0.4461 | 0.3262 |
| 0.4478        | 42.0  | 2226 | 0.4548          | 0.4442 | 0.3247 |
| 0.448         | 43.0  | 2279 | 0.4523          | 0.4450 | 0.3238 |
| 0.4468        | 44.0  | 2332 | 0.4560          | 0.4455 | 0.3224 |
| 0.4681        | 45.0  | 2385 | 0.4599          | 0.4468 | 0.3253 |
| 0.455         | 46.0  | 2438 | 0.4589          | 0.4472 | 0.3246 |
| 0.4479        | 47.0  | 2491 | 0.4583          | 0.4465 | 0.3245 |
| 0.4533        | 48.0  | 2544 | 0.4582          | 0.4469 | 0.3253 |
| 0.4486        | 49.0  | 2597 | 0.4575          | 0.4462 | 0.3253 |
| 0.4466        | 50.0  | 2650 | 0.4574          | 0.4461 | 0.3253 |

Framework versions

  • Transformers 4.34.0.dev0
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.5
  • Tokenizers 0.13.3