
whisper-large-v3-translate-zh-v0.1-lt

This model is a fine-tuned version of openai/whisper-large-v3.

Model description

Fine-tuned on 3,500 hours of paired data (Japanese audio, Chinese subtitles). In translate mode the model outputs Chinese directly.

Usage

Generate with task='translate' and language='ja' to translate Japanese audio directly into Chinese text.
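A minimal sketch of loading the model with the standard transformers speech-recognition pipeline, assuming the checkpoint is available locally or on the Hub (the model id below is the card's title and may need adjusting):

```python
import torch
from transformers import pipeline

# Hypothetical model id taken from this card's title; replace with the
# actual Hub repo id or a local checkpoint path.
asr = pipeline(
    "automatic-speech-recognition",
    model="whisper-large-v3-translate-zh-v0.1-lt",
    torch_dtype=torch.float16,
    device="cuda:0",
)

# task='translate' with language='ja' makes the model emit Chinese directly.
result = asr(
    "sample_ja.wav",
    generate_kwargs={"task": "translate", "language": "ja"},
)
print(result["text"])
```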

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 64
  • optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 4000
  • dropout: 0.1
  • mask_time_prob: 0.05
  • mask_feature_prob: 0.2
  • condition_on_previous_text_rate: 0.5
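As a sketch, the core hyperparameters above can be expressed as a plain dict in the shape that transformers' Seq2SeqTrainingArguments accepts (the augmentation-related keys dropout, mask_time_prob, mask_feature_prob, and condition_on_previous_text_rate belong to the model/data config rather than the training arguments and are omitted here):

```python
# Sketch of the training hyperparameters from this card, keyed by the
# corresponding Seq2SeqTrainingArguments parameter names.
training_config = {
    "learning_rate": 1e-5,
    "per_device_train_batch_size": 32,
    "per_device_eval_batch_size": 32,
    "seed": 42,
    "gradient_accumulation_steps": 2,
    "adam_beta1": 0.9,
    "adam_beta2": 0.98,
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "warmup_steps": 4000,
}

# The total train batch size of 64 reported above is the per-device
# batch size multiplied by the gradient accumulation steps.
total_train_batch_size = (
    training_config["per_device_train_batch_size"]
    * training_config["gradient_accumulation_steps"]
)
print(total_train_batch_size)  # 64
```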

Training results

| Training Loss | Epoch  | Step  | Validation Loss | CER    | WER    |
|---------------|--------|-------|-----------------|--------|--------|
| 2.1282        | 0.0739 | 1000  | 2.1852          | 1.9014 | 4.4904 |
| 1.8567        | 0.1478 | 2000  | 1.8366          | 1.7295 | 3.8716 |
| 1.6968        | 0.2217 | 3000  | 1.2615          | 1.6279 | 2.4825 |
| 1.6264        | 0.2956 | 4000  | 1.0536          | 1.5625 | 1.8101 |
| 1.5687        | 0.3695 | 5000  | 1.0932          | 1.5410 | 2.1218 |
| 1.531         | 0.4433 | 6000  | 1.5156          | 1.2533 | 2.3689 |
| 1.4875        | 0.5172 | 7000  | 1.4697          | 0.9560 | 1.5588 |
| 1.4518        | 0.5911 | 8000  | 1.4521          | 1.0170 | 1.6392 |
| 1.4472        | 0.6650 | 9000  | 1.4463          | 1.0084 | 1.6420 |
| 1.3991        | 0.7389 | 10000 | 1.4238          | 0.9266 | 1.6992 |
| 1.4266        | 0.8128 | 11000 | 1.4141          | 0.8365 | 1.3056 |
| 1.3755        | 0.8867 | 12000 | 1.4033          | 0.7904 | 1.3119 |
| 1.3833        | 0.9606 | 13000 | 1.4004          | 0.8600 | 1.3333 |
| 1.3224        | 1.0345 | 14000 | 1.3770          | 0.8243 | 1.4560 |
| 1.3295        | 1.1084 | 15000 | 1.3770          | 0.7852 | 1.4298 |
| 1.3136        | 1.1823 | 16000 | 1.3564          | 0.7176 | 1.1826 |
| 1.2832        | 1.2561 | 17000 | 1.3535          | 0.6767 | 1.1781 |
| 1.2917        | 1.3300 | 18000 | 1.3584          | 0.7255 | 1.1218 |
| 1.27          | 1.4039 | 19000 | 1.3330          | 0.6590 | 1.1242 |
| 1.2704        | 1.4778 | 20000 | 1.3379          | 0.6934 | 1.1944 |
| 1.2614        | 1.5517 | 21000 | 1.3330          | 0.6949 | 1.1820 |
| 1.2455        | 1.6256 | 22000 | 1.3350          | 0.6931 | 1.0892 |
| 1.2475        | 1.6995 | 23000 | 1.3154          | 0.6662 | 1.1576 |
| 1.2583        | 1.7734 | 24000 | 1.3164          | 0.6490 | 1.0705 |
| 1.2333        | 1.8473 | 25000 | 1.3184          | 0.6833 | 1.1480 |
| 1.2462        | 1.9212 | 26000 | 1.3125          | 0.6672 | 1.1612 |
| 1.2279        | 1.9950 | 27000 | 1.3047          | 0.6644 | 1.2179 |
| 1.1908        | 2.0689 | 28000 | 1.3047          | 0.6938 | 1.2221 |
| 1.1831        | 2.1428 | 29000 | 1.2998          | 0.6316 | 1.0717 |
| 1.1705        | 2.2167 | 30000 | 1.3018          | 0.6165 | 1.0958 |
| 1.171         | 2.2906 | 31000 | 1.3027          | 0.6109 | 1.0868 |
| 1.1567        | 2.3645 | 32000 | 1.3037          | 0.6485 | 1.1736 |
| 1.1705        | 2.4384 | 33000 | 1.2969          | 0.6078 | 1.0515 |
| 1.1819        | 2.5123 | 34000 | 1.2949          | 0.6158 | 1.0362 |
| 1.1447        | 2.5862 | 35000 | 1.2920          | 0.6365 | 1.0558 |
| 1.17          | 2.6601 | 36000 | 1.2881          | 0.6339 | 1.0868 |
| 1.1495        | 2.7340 | 37000 | 1.2949          | 0.6297 | 1.0437 |
| 1.1395        | 2.8078 | 38000 | 1.2900          | 0.6285 | 1.1221 |
| 1.15          | 2.8817 | 39000 | 1.2891          | 0.5997 | 1.0217 |
| 1.1623        | 2.9556 | 40000 | 1.2881          | 0.6085 | 1.0395 |

Framework versions

  • Transformers 4.41.1
  • Pytorch 2.3.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1