---
license: mit
base_model: microsoft/speecht5_tts
tags:
  - generated_from_trainer
model-index:
  - name: speecht5_tts
    results: []
---

# speecht5_tts

This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.6228

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 30000
- mixed_precision_training: Native AMP
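The `linear` scheduler with 500 warmup steps means the learning rate ramps from 0 to 1e-4 over the first 500 steps, then decays linearly back to 0 at step 30000. A minimal pure-Python sketch of that schedule (mirroring the behavior of `transformers.get_linear_schedule_with_warmup`):

```python
def linear_lr(step, base_lr=1e-4, warmup_steps=500, total_steps=30000):
    """Linear warmup to base_lr, then linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_lr(0))      # 0.0
print(linear_lr(500))    # 0.0001 (peak)
print(linear_lr(30000))  # 0.0
```

Note that halfway through warmup (step 250, the first eval point in the table below) the model is still training at only 5e-5.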

### Training results

| Training Loss | Epoch  | Step  | Validation Loss |
|:-------------:|:------:|:-----:|:---------------:|
| No log        | 3.85   | 250   | 0.5310          |
| 0.6287        | 7.69   | 500   | 0.5088          |
| 0.6287        | 11.54  | 750   | 0.4855          |
| 0.5138        | 15.38  | 1000  | 0.4986          |
| 0.5138        | 19.23  | 1250  | 0.4820          |
| 0.4735        | 23.08  | 1500  | 0.4775          |
| 0.4735        | 26.92  | 1750  | 0.5104          |
| 0.4512        | 30.77  | 2000  | 0.4953          |
| 0.4512        | 34.62  | 2250  | 0.4838          |
| 0.4419        | 38.46  | 2500  | 0.4969          |
| 0.4419        | 42.31  | 2750  | 0.5057          |
| 0.4313        | 46.15  | 3000  | 0.4931          |
| 0.4313        | 50.0   | 3250  | 0.4975          |
| 0.4164        | 53.85  | 3500  | 0.5145          |
| 0.4164        | 57.69  | 3750  | 0.5070          |
| 0.4055        | 61.54  | 4000  | 0.4921          |
| 0.4055        | 65.38  | 4250  | 0.5139          |
| 0.3999        | 69.23  | 4500  | 0.5111          |
| 0.3999        | 73.08  | 4750  | 0.5118          |
| 0.3895        | 76.92  | 5000  | 0.5184          |
| 0.3895        | 80.77  | 5250  | 0.5246          |
| 0.3843        | 84.62  | 5500  | 0.5244          |
| 0.3843        | 88.46  | 5750  | 0.5252          |
| 0.3731        | 92.31  | 6000  | 0.5092          |
| 0.3731        | 96.15  | 6250  | 0.5098          |
| 0.3698        | 100.0  | 6500  | 0.5357          |
| 0.3698        | 103.85 | 6750  | 0.5315          |
| 0.363         | 107.69 | 7000  | 0.5297          |
| 0.363         | 111.54 | 7250  | 0.5429          |
| 0.358         | 115.38 | 7500  | 0.5418          |
| 0.358         | 119.23 | 7750  | 0.5483          |
| 0.3539        | 123.08 | 8000  | 0.5449          |
| 0.3539        | 126.92 | 8250  | 0.5466          |
| 0.3503        | 130.77 | 8500  | 0.5505          |
| 0.3503        | 134.62 | 8750  | 0.5402          |
| 0.346         | 138.46 | 9000  | 0.5372          |
| 0.346         | 142.31 | 9250  | 0.5547          |
| 0.3421        | 146.15 | 9500  | 0.5650          |
| 0.3421        | 150.0  | 9750  | 0.5544          |
| 0.3376        | 153.85 | 10000 | 0.5594          |
| 0.3376        | 157.69 | 10250 | 0.5624          |
| 0.3331        | 161.54 | 10500 | 0.5574          |
| 0.3331        | 165.38 | 10750 | 0.5605          |
| 0.3285        | 169.23 | 11000 | 0.5710          |
| 0.3285        | 173.08 | 11250 | 0.5671          |
| 0.3253        | 176.92 | 11500 | 0.5561          |
| 0.3253        | 180.77 | 11750 | 0.5677          |
| 0.3233        | 184.62 | 12000 | 0.5841          |
| 0.3233        | 188.46 | 12250 | 0.5770          |
| 0.3203        | 192.31 | 12500 | 0.5705          |
| 0.3203        | 196.15 | 12750 | 0.5642          |
| 0.317         | 200.0  | 13000 | 0.5830          |
| 0.317         | 203.85 | 13250 | 0.5800          |
| 0.3132        | 207.69 | 13500 | 0.5833          |
| 0.3132        | 211.54 | 13750 | 0.5658          |
| 0.31          | 215.38 | 14000 | 0.5874          |
| 0.31          | 219.23 | 14250 | 0.5911          |
| 0.3084        | 223.08 | 14500 | 0.5907          |
| 0.3084        | 226.92 | 14750 | 0.5982          |
| 0.3046        | 230.77 | 15000 | 0.5962          |
| 0.3046        | 234.62 | 15250 | 0.5846          |
| 0.3003        | 238.46 | 15500 | 0.5886          |
| 0.3003        | 242.31 | 15750 | 0.6019          |
| 0.2995        | 246.15 | 16000 | 0.6022          |
| 0.2995        | 250.0  | 16250 | 0.5986          |
| 0.2985        | 253.85 | 16500 | 0.5994          |
| 0.2985        | 257.69 | 16750 | 0.5967          |
| 0.2925        | 261.54 | 17000 | 0.5928          |
| 0.2925        | 265.38 | 17250 | 0.6138          |
| 0.2911        | 269.23 | 17500 | 0.6000          |
| 0.2911        | 273.08 | 17750 | 0.6025          |
| 0.2909        | 276.92 | 18000 | 0.5917          |
| 0.2909        | 280.77 | 18250 | 0.6016          |
| 0.2875        | 284.62 | 18500 | 0.6151          |
| 0.2875        | 288.46 | 18750 | 0.6035          |
| 0.2866        | 292.31 | 19000 | 0.6019          |
| 0.2866        | 296.15 | 19250 | 0.6014          |
| 0.2821        | 300.0  | 19500 | 0.6029          |
| 0.2821        | 303.85 | 19750 | 0.5953          |
| 0.2814        | 307.69 | 20000 | 0.6202          |
| 0.2814        | 311.54 | 20250 | 0.5953          |
| 0.2798        | 315.38 | 20500 | 0.6153          |
| 0.2798        | 319.23 | 20750 | 0.6232          |
| 0.2766        | 323.08 | 21000 | 0.6175          |
| 0.2766        | 326.92 | 21250 | 0.6162          |
| 0.2755        | 330.77 | 21500 | 0.6047          |
| 0.2755        | 334.62 | 21750 | 0.6052          |
| 0.2742        | 338.46 | 22000 | 0.6138          |
| 0.2742        | 342.31 | 22250 | 0.6225          |
| 0.2746        | 346.15 | 22500 | 0.6015          |
| 0.2746        | 350.0  | 22750 | 0.6029          |
| 0.2716        | 353.85 | 23000 | 0.6105          |
| 0.2716        | 357.69 | 23250 | 0.6132          |
| 0.2697        | 361.54 | 23500 | 0.6129          |
| 0.2697        | 365.38 | 23750 | 0.6045          |
| 0.2704        | 369.23 | 24000 | 0.6155          |
| 0.2704        | 373.08 | 24250 | 0.6075          |
| 0.2694        | 376.92 | 24500 | 0.6154          |
| 0.2694        | 380.77 | 24750 | 0.6263          |
| 0.2672        | 384.62 | 25000 | 0.6181          |
| 0.2672        | 388.46 | 25250 | 0.6185          |
| 0.2649        | 392.31 | 25500 | 0.6131          |
| 0.2649        | 396.15 | 25750 | 0.6113          |
| 0.2641        | 400.0  | 26000 | 0.6151          |
| 0.2641        | 403.85 | 26250 | 0.6219          |
| 0.2642        | 407.69 | 26500 | 0.6228          |
| 0.2642        | 411.54 | 26750 | 0.6258          |
| 0.2621        | 415.38 | 27000 | 0.6161          |
| 0.2621        | 419.23 | 27250 | 0.6316          |
| 0.2634        | 423.08 | 27500 | 0.6159          |
| 0.2634        | 426.92 | 27750 | 0.6192          |
| 0.2611        | 430.77 | 28000 | 0.6210          |
| 0.2611        | 434.62 | 28250 | 0.6246          |
| 0.2593        | 438.46 | 28500 | 0.6142          |
| 0.2593        | 442.31 | 28750 | 0.6157          |
| 0.26          | 446.15 | 29000 | 0.6198          |
| 0.26          | 450.0  | 29250 | 0.6182          |
| 0.262         | 453.85 | 29500 | 0.6188          |
| 0.262         | 457.69 | 29750 | 0.6223          |
| 0.2616        | 461.54 | 30000 | 0.6228          |
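The epoch/step pairs in the log give a rough idea of the training-set size, which the card otherwise omits. A back-of-the-envelope check (assuming no gradient accumulation, which the hyperparameters do not mention):

```python
# 250 optimizer steps correspond to 3.85 epochs in the first log row.
steps_per_epoch = 250 / 3.85
print(round(steps_per_epoch))       # 65 steps per epoch

# With train_batch_size = 4, that implies roughly 65 * 4 = 260 training
# examples -- a very small dataset, trained for over 460 epochs.
print(round(steps_per_epoch) * 4)   # 260
```

That small dataset size is consistent with the table's trajectory: training loss keeps falling while validation loss bottoms out around step 1500 (0.4775) and then climbs steadily, a classic overfitting pattern.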

### Framework versions

- Transformers 4.36.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.15.0
- Tokenizers 0.14.1