---
license: mit
base_model: microsoft/speecht5_tts
tags:
- generated_from_trainer
model-index:
- name: speecht5_tts-wolof
  results: []
---

# speecht5_tts-wolof

This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3697

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 255000
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch   | Step  | Validation Loss |
|:-------------:|:-------:|:-----:|:---------------:|
| 0.5033        | 0.9989  | 562   | 0.4592          |
| 0.471         | 1.9996  | 1125  | 0.4319          |
| 0.4603        | 2.9984  | 1687  | 0.4224          |
| 0.4496        | 3.9991  | 2250  | 0.4135          |
| 0.4418        | 4.9998  | 2813  | 0.4090          |
| 0.4296        | 5.9987  | 3375  | 0.4001          |
| 0.425         | 6.9993  | 3938  | 0.3950          |
| 0.4197        | 8.0     | 4501  | 0.3927          |
| 0.4173        | 8.9989  | 5063  | 0.3888          |
| 0.4133        | 9.9996  | 5626  | 0.3852          |
| 0.4112        | 10.9984 | 6188  | 0.3832          |
| 0.4072        | 11.9991 | 6751  | 0.3808          |
| 0.404         | 12.9998 | 7314  | 0.3788          |
| 0.4055        | 13.9987 | 7876  | 0.3792          |
| 0.401         | 14.9993 | 8439  | 0.3759          |
| 0.3988        | 16.0    | 9002  | 0.3755          |
| 0.3984        | 16.9989 | 9564  | 0.3761          |
| 0.3992        | 17.9996 | 10127 | 0.3735          |
| 0.392         | 18.9984 | 10689 | 0.3731          |
| 0.393         | 19.9991 | 11252 | 0.3730          |
| 0.3945        | 20.9998 | 11815 | 0.3713          |
| 0.3929        | 21.9987 | 12377 | 0.3720          |
| 0.3919        | 22.9993 | 12940 | 0.3736          |
| 0.3907        | 24.0    | 13503 | 0.3702          |
| 0.3893        | 24.9989 | 14065 | 0.3700          |
| 0.3894        | 25.9996 | 14628 | 0.3707          |
| 0.39          | 26.9984 | 15190 | 0.3687          |
| 0.3858        | 27.9991 | 15753 | 0.3712          |
| 0.3874        | 28.9998 | 16316 | 0.3669          |
| 0.3887        | 29.9987 | 16878 | 0.3685          |
| 0.3854        | 30.9993 | 17441 | 0.3670          |
| 0.3856        | 32.0    | 18004 | 0.3697          |

### Framework versions

- Transformers 4.41.2
- PyTorch 2.4.0+cu121
- Datasets 3.2.0
- Tokenizers 0.19.1