
jdrt_byclass_rinnna_hubert_asr_3

This model is a fine-tuned version of rinna/japanese-hubert-base on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4223
  • Wer: 0.4080
  • Cer: 0.2885
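
The WER and CER above are standard ASR error rates (lower is better). As a hedged illustration of how such figures are typically computed with the Hugging Face `evaluate` library (the actual evaluation script is not part of this card, and the strings below are placeholder examples, not data from this model):

```python
# Sketch only: illustrates the WER/CER metrics reported above.
# Note that WER for Japanese depends on how the text is segmented into words.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["きょう は いい てんき です"]  # hypothetical model output
references = ["今日 は いい 天気 です"]        # hypothetical reference transcript

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```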

Model description

More information needed

Intended uses & limitations

More information needed
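
A minimal inference sketch is shown below, assuming the checkpoint includes a CTC head and a processor/tokenizer configuration loadable via AutoProcessor (verify against the repository files before use):

```python
# Hedged usage sketch: transcribe a single 16 kHz mono audio file.
import torch
import librosa
from transformers import AutoProcessor, HubertForCTC

# Assumption: replace with the full Hub path of this checkpoint.
model_id = "jdrt_byclass_rinnna_hubert_asr_3"

processor = AutoProcessor.from_pretrained(model_id)
model = HubertForCTC.from_pretrained(model_id)
model.eval()

# HuBERT expects 16 kHz mono audio.
speech, _ = librosa.load("sample.wav", sr=16000)

inputs = processor(speech, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

pred_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(pred_ids)[0]
print(transcription)
```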

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 7.5e-05
  • train_batch_size: 256
  • eval_batch_size: 256
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 260
  • num_epochs: 50
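
As a hedged sketch (the original training script is not included in this card), the hyperparameters above could map onto transformers TrainingArguments roughly as follows; the Adam betas and epsilon listed are the library defaults:

```python
# Sketch only: mapping of the listed hyperparameters to TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="jdrt_byclass_rinnna_hubert_asr_3",
    learning_rate=7.5e-5,
    per_device_train_batch_size=256,
    per_device_eval_batch_size=256,
    seed=42,
    lr_scheduler_type="cosine",
    warmup_steps=260,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: the results table shows one evaluation per epoch
)
```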

Training results

| Training Loss | Epoch | Step | Validation Loss | WER    | CER    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 11.2994       | 1.0   | 53   | 6.9048          | 0.9156 | 0.9495 |
| 5.5642        | 2.0   | 106  | 4.4074          | 0.9156 | 0.9495 |
| 4.1184        | 3.0   | 159  | 3.5723          | 0.9156 | 0.9495 |
| 3.2849        | 4.0   | 212  | 2.9362          | 0.9156 | 0.9495 |
| 2.7998        | 5.0   | 265  | 2.6897          | 0.9156 | 0.9495 |
| 2.6983        | 6.0   | 318  | 2.6367          | 0.9156 | 0.9495 |
| 2.4519        | 7.0   | 371  | 2.2030          | 0.9960 | 0.9112 |
| 2.1019        | 8.0   | 424  | 1.8801          | 1.0000 | 0.8929 |
| 1.8091        | 9.0   | 477  | 1.5845          | 1.0000 | 0.8639 |
| 1.5947        | 10.0  | 530  | 1.3550          | 1.0000 | 0.7570 |
| 1.3709        | 11.0  | 583  | 1.2357          | 1.0000 | 0.7344 |
| 1.2377        | 12.0  | 636  | 1.0982          | 1.0000 | 0.6984 |
| 1.1595        | 13.0  | 689  | 0.9865          | 0.9997 | 0.6737 |
| 1.0386        | 14.0  | 742  | 0.9245          | 0.9125 | 0.5754 |
| 0.9280        | 15.0  | 795  | 0.8553          | 0.8591 | 0.5117 |
| 0.8691        | 16.0  | 848  | 0.7590          | 0.8435 | 0.4966 |
| 0.7983        | 17.0  | 901  | 0.6782          | 0.5164 | 0.3451 |
| 0.6839        | 18.0  | 954  | 0.5806          | 0.4843 | 0.3323 |
| 0.5901        | 19.0  | 1007 | 0.5280          | 0.4438 | 0.3133 |
| 0.5553        | 20.0  | 1060 | 0.5312          | 0.4434 | 0.3143 |
| 0.5274        | 21.0  | 1113 | 0.5229          | 0.4357 | 0.2939 |
| 0.4843        | 22.0  | 1166 | 0.4674          | 0.4215 | 0.2844 |
| 0.4770        | 23.0  | 1219 | 0.4996          | 0.4335 | 0.2984 |
| 0.4624        | 24.0  | 1272 | 0.4762          | 0.4334 | 0.3005 |
| 0.4485        | 25.0  | 1325 | 0.4241          | 0.4286 | 0.3003 |
| 0.4301        | 26.0  | 1378 | 0.4485          | 0.4247 | 0.2923 |
| 0.3953        | 27.0  | 1431 | 0.4292          | 0.4175 | 0.2944 |
| 0.4010        | 28.0  | 1484 | 0.4241          | 0.4102 | 0.2868 |
| 0.3833        | 29.0  | 1537 | 0.4053          | 0.3995 | 0.2691 |
| 0.4125        | 30.0  | 1590 | 0.4210          | 0.4013 | 0.2690 |
| 0.3703        | 31.0  | 1643 | 0.4385          | 0.4070 | 0.2744 |
| 0.3441        | 32.0  | 1696 | 0.4126          | 0.4035 | 0.2718 |
| 0.3411        | 33.0  | 1749 | 0.4286          | 0.4125 | 0.2875 |
| 0.3302        | 34.0  | 1802 | 0.4311          | 0.4128 | 0.2943 |
| 0.3422        | 35.0  | 1855 | 0.4350          | 0.4084 | 0.2880 |
| 0.3428        | 36.0  | 1908 | 0.4223          | 0.4080 | 0.2885 |

Framework versions

  • Transformers 4.34.0.dev0
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.5
  • Tokenizers 0.13.3