---
license: mit
base_model: microsoft/speecht5_tts
tags:
  - generated_from_trainer
model-index:
  - name: speecht5_tts
    results: []
---

# speecht5_tts

This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.6815
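
Since this is a SpeechT5 text-to-speech checkpoint, it should be loadable with the standard `transformers` SpeechT5 classes. Below is a minimal inference sketch, not a verified recipe: the repo id `JBZhang2342/speecht5_tts` and the zero speaker embedding are assumptions.

```python
import torch
import soundfile as sf
from transformers import SpeechT5ForTextToSpeech, SpeechT5HifiGan, SpeechT5Processor

# Processor comes from the base model; fine-tuning does not change the tokenizer.
processor = SpeechT5Processor.from_pretrained("microsoft/speecht5_tts")

# Repo id is an assumption -- replace with the actual checkpoint path.
model = SpeechT5ForTextToSpeech.from_pretrained("JBZhang2342/speecht5_tts")
vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan")

inputs = processor(text="Hello, this is a test.", return_tensors="pt")

# SpeechT5 conditions generation on a 512-dim speaker embedding. A zero
# vector is only a placeholder; substitute a real x-vector for natural speech.
speaker_embeddings = torch.zeros((1, 512))

speech = model.generate_speech(inputs["input_ids"], speaker_embeddings, vocoder=vocoder)
sf.write("speech.wav", speech.numpy(), samplerate=16000)  # SpeechT5 outputs 16 kHz audio
```

In practice, speaker embeddings are usually extracted from a few seconds of reference audio with a pretrained x-vector model such as `speechbrain/spkrec-xvect-voxceleb`.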

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 30000
- mixed_precision_training: Native AMP
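
These settings map onto a `Seq2SeqTrainingArguments` configuration roughly as follows. This is a reconstruction, not the original training script: `output_dir` and the evaluation/logging cadence (inferred from the results table below) are assumptions.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="speecht5_tts",       # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    warmup_steps=500,
    max_steps=30000,
    lr_scheduler_type="linear",
    fp16=True,                       # "Native AMP" mixed precision
    evaluation_strategy="steps",     # inferred: eval ran every 250 steps
    eval_steps=250,
    logging_steps=500,               # inferred from the results table
)
```

Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer's optimizer defaults, so it needs no explicit arguments.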

### Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| No log        | 0.53  | 250   | 1.1437          |
| 1.3289        | 1.06  | 500   | 0.8521          |
| 1.3289        | 1.6   | 750   | 0.7901          |
| 0.8977        | 2.13  | 1000  | 0.7478          |
| 0.8977        | 2.66  | 1250  | 0.7437          |
| 0.8131        | 3.19  | 1500  | 0.7243          |
| 0.8131        | 3.72  | 1750  | 0.7106          |
| 0.771         | 4.26  | 2000  | 0.7072          |
| 0.771         | 4.79  | 2250  | 0.7008          |
| 0.7562        | 5.32  | 2500  | 0.6916          |
| 0.7562        | 5.85  | 2750  | 0.6850          |
| 0.7472        | 6.38  | 3000  | 0.6876          |
| 0.7472        | 6.91  | 3250  | 0.6807          |
| 0.7266        | 7.45  | 3500  | 0.6804          |
| 0.7266        | 7.98  | 3750  | 0.6763          |
| 0.715         | 8.51  | 4000  | 0.6769          |
| 0.715         | 9.04  | 4250  | 0.6698          |
| 0.7005        | 9.57  | 4500  | 0.6690          |
| 0.7005        | 10.11 | 4750  | 0.6653          |
| 0.6932        | 10.64 | 5000  | 0.6656          |
| 0.6932        | 11.17 | 5250  | 0.6684          |
| 0.6854        | 11.7  | 5500  | 0.6645          |
| 0.6854        | 12.23 | 5750  | 0.6634          |
| 0.6739        | 12.77 | 6000  | 0.6674          |
| 0.6739        | 13.3  | 6250  | 0.6606          |
| 0.6754        | 13.83 | 6500  | 0.6663          |
| 0.6754        | 14.36 | 6750  | 0.6681          |
| 0.6592        | 14.89 | 7000  | 0.6589          |
| 0.6592        | 15.43 | 7250  | 0.6601          |
| 0.6528        | 15.96 | 7500  | 0.6739          |
| 0.6528        | 16.49 | 7750  | 0.6643          |
| 0.6539        | 17.02 | 8000  | 0.6605          |
| 0.6539        | 17.55 | 8250  | 0.6614          |
| 0.6437        | 18.09 | 8500  | 0.6551          |
| 0.6437        | 18.62 | 8750  | 0.6604          |
| 0.6341        | 19.15 | 9000  | 0.6606          |
| 0.6341        | 19.68 | 9250  | 0.6582          |
| 0.6305        | 20.21 | 9500  | 0.6714          |
| 0.6305        | 20.74 | 9750  | 0.6618          |
| 0.627         | 21.28 | 10000 | 0.6600          |
| 0.627         | 21.81 | 10250 | 0.6636          |
| 0.6244        | 22.34 | 10500 | 0.6692          |
| 0.6244        | 22.87 | 10750 | 0.6645          |
| 0.6178        | 23.4  | 11000 | 0.6670          |
| 0.6178        | 23.94 | 11250 | 0.6611          |
| 0.6157        | 24.47 | 11500 | 0.6697          |
| 0.6157        | 25.0  | 11750 | 0.6651          |
| 0.6108        | 25.53 | 12000 | 0.6642          |
| 0.6108        | 26.06 | 12250 | 0.6646          |
| 0.6008        | 26.6  | 12500 | 0.6672          |
| 0.6008        | 27.13 | 12750 | 0.6601          |
| 0.6067        | 27.66 | 13000 | 0.6760          |
| 0.6067        | 28.19 | 13250 | 0.6639          |
| 0.5985        | 28.72 | 13500 | 0.6662          |
| 0.5985        | 29.26 | 13750 | 0.6720          |
| 0.5957        | 29.79 | 14000 | 0.6710          |
| 0.5957        | 30.32 | 14250 | 0.6688          |
| 0.5944        | 30.85 | 14500 | 0.6714          |
| 0.5944        | 31.38 | 14750 | 0.6760          |
| 0.5886        | 31.91 | 15000 | 0.6639          |
| 0.5886        | 32.45 | 15250 | 0.6714          |
| 0.5868        | 32.98 | 15500 | 0.6722          |
| 0.5868        | 33.51 | 15750 | 0.6790          |
| 0.5851        | 34.04 | 16000 | 0.6728          |
| 0.5851        | 34.57 | 16250 | 0.6812          |
| 0.5819        | 35.11 | 16500 | 0.6756          |
| 0.5819        | 35.64 | 16750 | 0.6679          |
| 0.5811        | 36.17 | 17000 | 0.6719          |
| 0.5811        | 36.7  | 17250 | 0.6684          |
| 0.5759        | 37.23 | 17500 | 0.6776          |
| 0.5759        | 37.77 | 17750 | 0.6743          |
| 0.5743        | 38.3  | 18000 | 0.6725          |
| 0.5743        | 38.83 | 18250 | 0.6730          |
| 0.5761        | 39.36 | 18500 | 0.6712          |
| 0.5761        | 39.89 | 18750 | 0.6765          |
| 0.576         | 40.43 | 19000 | 0.6779          |
| 0.576         | 40.96 | 19250 | 0.6801          |
| 0.5734        | 41.49 | 19500 | 0.6756          |
| 0.5734        | 42.02 | 19750 | 0.6761          |
| 0.5743        | 42.55 | 20000 | 0.6857          |
| 0.5743        | 43.09 | 20250 | 0.6734          |
| 0.5732        | 43.62 | 20500 | 0.6753          |
| 0.5732        | 44.15 | 20750 | 0.6803          |
| 0.5657        | 44.68 | 21000 | 0.6743          |
| 0.5657        | 45.21 | 21250 | 0.6831          |
| 0.565         | 45.74 | 21500 | 0.6799          |
| 0.565         | 46.28 | 21750 | 0.6769          |
| 0.565         | 46.81 | 22000 | 0.6786          |
| 0.565         | 47.34 | 22250 | 0.6788          |
| 0.5583        | 47.87 | 22500 | 0.6830          |
| 0.5583        | 48.4  | 22750 | 0.6884          |
| 0.5652        | 48.94 | 23000 | 0.6827          |
| 0.5652        | 49.47 | 23250 | 0.6795          |
| 0.5625        | 50.0  | 23500 | 0.6807          |
| 0.5625        | 50.53 | 23750 | 0.6788          |
| 0.5605        | 51.06 | 24000 | 0.6862          |
| 0.5605        | 51.6  | 24250 | 0.6822          |
| 0.5571        | 52.13 | 24500 | 0.6819          |
| 0.5571        | 52.66 | 24750 | 0.6797          |
| 0.5633        | 53.19 | 25000 | 0.6835          |
| 0.5633        | 53.72 | 25250 | 0.6835          |
| 0.5572        | 54.26 | 25500 | 0.6881          |
| 0.5572        | 54.79 | 25750 | 0.6791          |
| 0.5571        | 55.32 | 26000 | 0.6815          |
| 0.5571        | 55.85 | 26250 | 0.6868          |
| 0.5534        | 56.38 | 26500 | 0.6876          |
| 0.5534        | 56.91 | 26750 | 0.6871          |
| 0.5525        | 57.45 | 27000 | 0.6836          |
| 0.5525        | 57.98 | 27250 | 0.6841          |
| 0.5542        | 58.51 | 27500 | 0.6911          |
| 0.5542        | 59.04 | 27750 | 0.6835          |
| 0.5512        | 59.57 | 28000 | 0.6806          |
| 0.5512        | 60.11 | 28250 | 0.6805          |
| 0.5474        | 60.64 | 28500 | 0.6858          |
| 0.5474        | 61.17 | 28750 | 0.6874          |
| 0.5548        | 61.7  | 29000 | 0.6811          |
| 0.5548        | 62.23 | 29250 | 0.6808          |
| 0.5545        | 62.77 | 29500 | 0.6868          |
| 0.5545        | 63.3  | 29750 | 0.6894          |
| 0.5522        | 63.83 | 30000 | 0.6815          |

### Framework versions

- Transformers 4.36.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.15.0
- Tokenizers 0.14.1