
speecht5_feniks

This model is a fine-tuned version of microsoft/speecht5_tts on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5319
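
Since this is a standard SpeechT5 text-to-speech checkpoint, it can be loaded with the usual Transformers classes. Below is a minimal inference sketch, assuming the checkpoint is published under the repo id Masternlp/tts and pairs with the stock microsoft/speecht5_hifigan vocoder; the zero speaker embedding is a placeholder and should be replaced with a real 512-dimensional x-vector for the target voice:

```python
import torch
import soundfile as sf
from transformers import SpeechT5Processor, SpeechT5ForTextToSpeech, SpeechT5HifiGan

# Load the fine-tuned acoustic model and the stock HiFi-GAN vocoder.
processor = SpeechT5Processor.from_pretrained("Masternlp/tts")
model = SpeechT5ForTextToSpeech.from_pretrained("Masternlp/tts")
vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan")

inputs = processor(text="Hello, this is a test.", return_tensors="pt")

# SpeechT5 conditions generation on a 512-dim x-vector speaker embedding.
# torch.zeros is only a placeholder; extract a real x-vector from a
# reference utterance of the target speaker for usable output.
speaker_embeddings = torch.zeros((1, 512))

speech = model.generate_speech(
    inputs["input_ids"], speaker_embeddings, vocoder=vocoder
)
sf.write("output.wav", speech.numpy(), samplerate=16000)  # 16 kHz output
```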

Model description

A text-to-speech model fine-tuned from SpeechT5 (~144M parameters, stored as float32 safetensors). No further description has been provided.

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent Seq2SeqTrainingArguments follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 4
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 200
  • training_steps: 2000
  • mixed_precision_training: Native AMP
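
As a hedged reconstruction, the list above maps onto Transformers Seq2SeqTrainingArguments roughly as sketched below; output_dir and the 100-step evaluation cadence are assumptions (the cadence is inferred from the results table that follows), and the Adam betas/epsilon above are the optimizer defaults, so they need no explicit arguments:

```python
from transformers import Seq2SeqTrainingArguments

# Sketch only: reconstructs the card's hyperparameter list; this is not
# the author's original training script.
training_args = Seq2SeqTrainingArguments(
    output_dir="speecht5_feniks",     # assumed run/output name
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=8,    # 4 * 8 = 32 total train batch size
    lr_scheduler_type="linear",
    warmup_steps=200,
    max_steps=2000,
    seed=42,
    fp16=True,                        # "Native AMP" mixed-precision training
    eval_strategy="steps",            # assumed: matches the 100-step eval cadence
    eval_steps=100,
)
```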

Training results

| Training Loss | Epoch   | Step | Validation Loss |
|---------------|---------|------|-----------------|
| 0.4478        | 3.0418  | 100  | 0.4543          |
| 0.4534        | 6.0837  | 200  | 0.4621          |
| 0.4373        | 9.1255  | 300  | 0.4543          |
| 0.4224        | 12.1673 | 400  | 0.4494          |
| 0.4127        | 15.2091 | 500  | 0.4657          |
| 0.4134        | 18.2510 | 600  | 0.4529          |
| 0.4047        | 21.2928 | 700  | 0.4724          |
| 0.3932        | 24.3346 | 800  | 0.4777          |
| 0.3907        | 27.3764 | 900  | 0.4942          |
| 0.3855        | 30.4183 | 1000 | 0.4870          |
| 0.3783        | 33.4601 | 1100 | 0.4860          |
| 0.3794        | 36.5019 | 1200 | 0.4867          |
| 0.3704        | 39.5437 | 1300 | 0.4965          |
| 0.3687        | 42.5856 | 1400 | 0.5151          |
| 0.3674        | 45.6274 | 1500 | 0.5165          |
| 0.3618        | 48.6692 | 1600 | 0.5377          |
| 0.3536        | 51.7110 | 1700 | 0.5206          |
| 0.3621        | 54.7529 | 1800 | 0.5419          |
| 0.3533        | 57.7947 | 1900 | 0.5337          |
| 0.3513        | 60.8365 | 2000 | 0.5319          |

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.1+cu121
  • Datasets 3.0.1
  • Tokenizers 0.19.1