
whisper-large-v2-translate-zh-v0.1-lt-ct2

This model is a fine-tuned version of openai/whisper-large-v2.

Model description

Fine-tuned on 3,500 hours of (Japanese audio, Chinese subtitle) data; in translate mode the model outputs Chinese directly.

This is the CTranslate2 version of the model.
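A CTranslate2 checkpoint like this one is typically produced by converting the fine-tuned Transformers weights. A minimal sketch, assuming the fine-tuned checkpoint lives at ./whisper-large-v2-translate-zh-v0.1-lt (a hypothetical path) and using float16 quantization (an assumption, not stated by this card):

```python
import ctranslate2

# Convert the fine-tuned Transformers checkpoint into CTranslate2 format.
converter = ctranslate2.converters.TransformersConverter(
    "./whisper-large-v2-translate-zh-v0.1-lt"  # hypothetical input path
)
converter.convert(
    "whisper-large-v2-translate-zh-v0.1-lt-ct2",
    quantization="float16",  # assumed; omit to keep the original precision
)
```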

Usage

Use task='translate' with language='ja'.
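A minimal sketch using faster-whisper, the usual front end for CTranslate2 Whisper models; the local model path, audio filename, compute type, and beam size are illustrative assumptions:

```python
from faster_whisper import WhisperModel

# Load the CTranslate2 model directory (local path or downloaded snapshot).
model = WhisperModel(
    "whisper-large-v2-translate-zh-v0.1-lt-ct2",
    device="cuda",
    compute_type="float16",  # assumed; e.g. "int8" is common on CPU
)

# task='translate' with language='ja' makes this fine-tune emit Chinese text.
segments, info = model.transcribe(
    "audio.wav",  # hypothetical input file
    task="translate",
    language="ja",
    beam_size=5,  # assumed decoding setting
)

for seg in segments:
    print(f"[{seg.start:.2f}s -> {seg.end:.2f}s] {seg.text}")
```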

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 64
  • optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 4000
  • dropout: 0.1
  • mask_time_prob: 0.05
  • mask_feature_prob: 0.2
  • condition_on_previous_text_rate: 0.5
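These values map onto Hugging Face Transformers training configuration roughly as follows. This is a reconstruction from the list above, not the author's actual training script; the output directory name and the apply_spec_augment flag are assumptions:

```python
from transformers import Seq2SeqTrainingArguments, WhisperForConditionalGeneration

# Model-side regularization: dropout plus SpecAugment-style masking.
model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-large-v2")
model.config.dropout = 0.1
model.config.apply_spec_augment = True  # assumed, so the mask_* probabilities take effect
model.config.mask_time_prob = 0.05
model.config.mask_feature_prob = 0.2

# Optimizer and schedule; 32 per device x 2 accumulation steps = 64 total batch size.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-large-v2-translate-zh-v0.1-lt",  # hypothetical
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=2,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=4000,
)

# condition_on_previous_text_rate (0.5) is presumably a data-pipeline setting:
# the fraction of training examples whose decoder prompt includes the previous
# segment's text. It is not a Trainer argument.
```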

Training results

| Training Loss | Epoch  | Step  | Validation Loss | CER    | WER    |
|:-------------:|:------:|:-----:|:---------------:|:------:|:------:|
| 1.743         | 0.0740 | 1000  | 1.5631          | 0.8223 | 1.4517 |
| 1.6014        | 0.1479 | 2000  | 1.4808          | 0.6775 | 1.0950 |
| 1.5549        | 0.2219 | 3000  | 1.4381          | 0.6756 | 1.1158 |
| 1.5283        | 0.2958 | 4000  | 1.4174          | 0.6992 | 1.1137 |
| 1.474         | 0.3698 | 5000  | 1.3849          | 0.6570 | 1.1369 |
| 1.4193        | 0.4437 | 6000  | 1.3657          | 0.6544 | 1.1339 |
| 1.4148        | 0.5177 | 7000  | 1.3477          | 0.6386 | 1.1647 |
| 1.3754        | 0.5916 | 8000  | 1.3392          | 0.6228 | 1.0461 |
| 1.3441        | 0.6656 | 9000  | 1.3362          | 0.6196 | 1.0609 |
| 1.3545        | 0.7395 | 10000 | 1.3176          | 0.6354 | 1.2138 |
| 1.3498        | 0.8135 | 11000 | 1.3236          | 0.6631 | 1.2232 |
| 1.31          | 0.8874 | 12000 | 1.3020          | 0.6199 | 1.0018 |
| 1.3213        | 0.9614 | 13000 | 1.2966          | 0.5922 | 1.0021 |
| 1.2375        | 1.0353 | 14000 | 1.2900          | 0.6097 | 1.0639 |
| 1.2334        | 1.1093 | 15000 | 1.2963          | 0.6150 | 1.0920 |
| 1.2277        | 1.1832 | 16000 | 1.2888          | 0.6077 | 1.0929 |
| 1.2087        | 1.2572 | 17000 | 1.2779          | 0.5954 | 1.0012 |
| 1.2131        | 1.3311 | 18000 | 1.2722          | 0.5776 | 1.0075 |
| 1.2012        | 1.4051 | 19000 | 1.2716          | 0.5726 | 1.0211 |
| 1.1912        | 1.4790 | 20000 | 1.2707          | 0.6007 | 1.1538 |
| 1.2127        | 1.5530 | 21000 | 1.2749          | 0.6086 | 1.0742 |
| 1.1789        | 1.6269 | 22000 | 1.2797          | 0.5765 | 1.0072 |
| 1.1527        | 1.7009 | 23000 | 1.2761          | 0.5855 | 1.0588 |
| 1.1693        | 1.7748 | 24000 | 1.2701          | 0.5635 | 0.9928 |
| 1.1709        | 1.8488 | 25000 | 1.2662          | 0.5980 | 1.0697 |
| 1.1637        | 1.9227 | 26000 | 1.2749          | 0.5872 | 1.0392 |
| 1.1562        | 1.9967 | 27000 | 1.2587          | 0.5651 | 1.0121 |
| 1.0929        | 2.0706 | 28000 | 1.2668          | 0.5857 | 1.0139 |
| 1.1232        | 2.1446 | 29000 | 1.2710          | 0.5742 | 0.9997 |
| 1.1045        | 2.2185 | 30000 | 1.2656          | 0.5643 | 0.9897 |
| 1.0841        | 2.2925 | 31000 | 1.2695          | 0.5835 | 1.0181 |
| 1.0868        | 2.3664 | 32000 | 1.2707          | 0.5673 | 0.9964 |
| 1.0938        | 2.4404 | 33000 | 1.2644          | 0.5712 | 0.9928 |
| 1.0938        | 2.5143 | 34000 | 1.2662          | 0.5750 | 1.0109 |
| 1.0848        | 2.5883 | 35000 | 1.2677          | 0.5841 | 1.0832 |
| 1.0914        | 2.6622 | 36000 | 1.2638          | 0.5801 | 1.0299 |
| 1.0688        | 2.7362 | 37000 | 1.2587          | 0.5694 | 1.0072 |
| 1.0856        | 2.8101 | 38000 | 1.2581          | 0.5646 | 1.0057 |
| 1.1037        | 2.8841 | 39000 | 1.2557          | 0.5771 | 1.0262 |
| 1.0652        | 2.9580 | 40000 | 1.2566          | 0.5634 | 0.9979 |

Framework versions

  • Transformers 4.41.1
  • PyTorch 2.3.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1
