
jdrt_byclass_rinnna_hubert_asr_1

This model is a fine-tuned version of rinna/japanese-hubert-base on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3647
  • Wer: 0.4190
  • Cer: 0.2827
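WER and CER are the word- and character-error rates; for Japanese ASR, CER is often the more informative metric because word segmentation is ambiguous. As a reference, here is a minimal pure-Python sketch of how these edit-distance metrics are computed (not the exact evaluation code used for this model, which typically uses a library such as `jiwer` or `evaluate`):

```python
def edit_distance(ref, hyp):
    # Levenshtein distance between two sequences, single-row DP.
    dp = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, dp[0] = dp[0], i
        for j, h in enumerate(hyp, 1):
            # prev = d[i-1][j-1]; dp[j] (old) = d[i-1][j]; dp[j-1] = d[i][j-1]
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (r != h))
    return dp[len(hyp)]

def wer(ref, hyp):
    # word error rate: edit distance over whitespace-split tokens
    return edit_distance(ref.split(), hyp.split()) / len(ref.split())

def cer(ref, hyp):
    # character error rate: edit distance over characters
    return edit_distance(list(ref), list(hyp)) / len(ref)
```

For example, `wer("a b c", "a x c")` is 1/3: one substitution over three reference words.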

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 256
  • eval_batch_size: 256
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 250
  • num_epochs: 50
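The cosine scheduler with warmup ramps the learning rate linearly from 0 to 1e-4 over the first 250 steps, then decays it along a half-cosine toward 0. A minimal sketch of that schedule, assuming roughly 53 optimizer steps per epoch (as the results table below suggests) for 2650 total steps over 50 epochs:

```python
import math

def lr_at_step(step, base_lr=1e-4, warmup_steps=250, total_steps=2650):
    """Warmup-then-cosine schedule matching the hyperparameters above.

    total_steps is an assumption: ~53 steps/epoch x 50 epochs.
    """
    if step < warmup_steps:
        # linear warmup from 0 to base_lr
        return base_lr * step / warmup_steps
    # cosine decay from base_lr to 0 over the remaining steps
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

In the actual training run this is what `transformers` builds internally when `lr_scheduler_type` is `cosine` with `warmup_steps=250`.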

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    | Cer    |
|---------------|-------|------|-----------------|--------|--------|
| 10.589        | 1.0   | 53   | 5.6588          | 0.9156 | 0.9495 |
| 5.0974        | 2.0   | 106  | 4.0914          | 0.9156 | 0.9495 |
| 3.6701        | 3.0   | 159  | 3.1732          | 0.9156 | 0.9495 |
| 2.9285        | 4.0   | 212  | 2.7238          | 0.9156 | 0.9495 |
| 2.6943        | 5.0   | 265  | 2.6600          | 0.9156 | 0.9495 |
| 2.4567        | 6.0   | 318  | 2.2231          | 0.9960 | 0.9112 |
| 2.1447        | 7.0   | 371  | 1.9716          | 0.9960 | 0.9112 |
| 1.8452        | 8.0   | 424  | 1.5058          | 0.9062 | 0.7431 |
| 1.4358        | 9.0   | 477  | 1.0988          | 0.7370 | 0.5347 |
| 1.1898        | 10.0  | 530  | 0.9512          | 0.6981 | 0.5062 |
| 1.0261        | 11.0  | 583  | 0.8354          | 0.6510 | 0.4779 |
| 0.8371        | 12.0  | 636  | 0.7158          | 0.5560 | 0.3784 |
| 0.7896        | 13.0  | 689  | 0.6381          | 0.5330 | 0.3686 |
| 0.6846        | 14.0  | 742  | 0.5720          | 0.5183 | 0.3555 |
| 0.6357        | 15.0  | 795  | 0.5879          | 0.5030 | 0.3505 |
| 0.5893        | 16.0  | 848  | 0.5501          | 0.4884 | 0.3468 |
| 0.558         | 17.0  | 901  | 0.4291          | 0.4487 | 0.3154 |
| 0.5019        | 18.0  | 954  | 0.4354          | 0.4552 | 0.3064 |
| 0.4784        | 19.0  | 1007 | 0.4199          | 0.4490 | 0.3014 |
| 0.4564        | 20.0  | 1060 | 0.4439          | 0.4508 | 0.3153 |
| 0.4291        | 21.0  | 1113 | 0.4143          | 0.4352 | 0.2845 |
| 0.4144        | 22.0  | 1166 | 0.4415          | 0.4384 | 0.2812 |
| 0.3766        | 23.0  | 1219 | 0.3706          | 0.4264 | 0.2918 |
| 0.3792        | 24.0  | 1272 | 0.3933          | 0.4377 | 0.3015 |
| 0.3759        | 25.0  | 1325 | 0.3708          | 0.4231 | 0.3023 |
| 0.337         | 26.0  | 1378 | 0.3762          | 0.4250 | 0.2942 |
| 0.3282        | 27.0  | 1431 | 0.3595          | 0.4253 | 0.2937 |
| 0.3174        | 28.0  | 1484 | 0.3998          | 0.4269 | 0.2898 |
| 0.3156        | 29.0  | 1537 | 0.4056          | 0.4268 | 0.3093 |
| 0.2921        | 30.0  | 1590 | 0.3694          | 0.4274 | 0.3041 |
| 0.2929        | 31.0  | 1643 | 0.3917          | 0.4228 | 0.2881 |
| 0.2686        | 32.0  | 1696 | 0.3880          | 0.4276 | 0.2969 |
| 0.2776        | 33.0  | 1749 | 0.4038          | 0.4273 | 0.2963 |
| 0.2619        | 34.0  | 1802 | 0.3647          | 0.4190 | 0.2827 |

Framework versions

  • Transformers 4.34.0.dev0
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.5
  • Tokenizers 0.13.3