Sync from GitHub repo
This Space is synced from the GitHub repo: https://github.com/SWivid/F5-TTS. Please submit contributions to the Space there.
README_REPO.md (CHANGED: +4 -3)
Before:

@@ -2,8 +2,9 @@

[](https://github.com/SWivid/F5-TTS)
[](https://arxiv.org/abs/2410.06885)
-[](https://x-lance.sjtu.edu.cn/)
<img src="https://github.com/user-attachments/assets/12d7749c-071a-427c-81bf-b87b91def670" alt="Watermark" style="width: 40px; height: auto">

@@ -66,7 +67,7 @@ An initial guidance on Finetuning [#57](https://github.com/SWivid/F5-TTS/discuss

## Inference

-The pretrained model checkpoints can be reached at [🤗 Hugging Face](https://huggingface.co/SWivid/F5-TTS) and [

Currently supports up to 30s for a single generation, which is the **TOTAL** length of the prompt audio and the generated speech. Batch inference with chunks is supported by `inference-cli` and `gradio_app`.
- To avoid possible inference failures, make sure you have read through the following instructions.
After:

[](https://github.com/SWivid/F5-TTS)
[](https://arxiv.org/abs/2410.06885)
+[](https://swivid.github.io/F5-TTS/)
+[](https://huggingface.co/spaces/mrfakename/E2-F5-TTS)
+[](https://modelscope.cn/studios/modelscope/E2-F5-TTS)
[](https://x-lance.sjtu.edu.cn/)
<img src="https://github.com/user-attachments/assets/12d7749c-071a-427c-81bf-b87b91def670" alt="Watermark" style="width: 40px; height: auto">

## Inference

+The pretrained model checkpoints can be reached at [🤗 Hugging Face](https://huggingface.co/SWivid/F5-TTS) and [🤖 Model Scope](https://www.modelscope.cn/models/SWivid/F5-TTS_Emilia-ZH-EN), or automatically downloaded with `inference-cli` and `gradio_app`.

Currently supports up to 30s for a single generation, which is the **TOTAL** length of the prompt audio and the generated speech. Batch inference with chunks is supported by `inference-cli` and `gradio_app`.
- To avoid possible inference failures, make sure you have read through the following instructions.
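
The updated README points inference users at checkpoints hosted on the Hugging Face Hub and ModelScope. For reference, here is a minimal sketch of fetching the Hugging Face checkpoint manually with `huggingface_hub`; the repo id comes from the README above, but the checkpoint filename below is an assumption and may not match the actual file layout of `SWivid/F5-TTS`.

```python
# Minimal sketch: manually fetch an F5-TTS checkpoint from the Hugging Face Hub.
# In practice, inference-cli and gradio_app download the checkpoint automatically;
# this only illustrates where the weights live.
from huggingface_hub import hf_hub_download

ckpt_path = hf_hub_download(
    repo_id="SWivid/F5-TTS",                          # repo named in the README
    filename="F5TTS_Base/model_1200000.safetensors",  # assumed path; check the repo's file list
)
print(f"Checkpoint cached at: {ckpt_path}")
```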
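
The 30-second limit described above applies to the prompt audio plus the generated speech combined, which is why `inference-cli` and `gradio_app` chunk longer inputs. The snippet below is only a rough illustration of that budget under stated assumptions (a fixed speaking rate and a hypothetical `chunk_text` helper); it is not the chunking logic the project actually uses.

```python
# Rough illustration of the 30 s total budget: prompt audio + generated speech
# must fit within ~30 seconds, so long target text is split into chunks.
# NOT the actual logic of inference-cli/gradio_app; the speaking rate is a crude guess.
import soundfile as sf

MAX_TOTAL_SECONDS = 30.0
WORDS_PER_SECOND = 2.5  # assumed average speaking rate for length estimation

def estimate_speech_seconds(text: str) -> float:
    """Very rough estimate of how long the synthesized text would take to speak."""
    return len(text.split()) / WORDS_PER_SECOND

def chunk_text(text: str, prompt_wav_path: str) -> list[str]:
    """Split text so that each chunk, together with the prompt audio, stays under the limit."""
    audio, sample_rate = sf.read(prompt_wav_path)
    prompt_seconds = len(audio) / sample_rate
    budget = MAX_TOTAL_SECONDS - prompt_seconds  # seconds left for generated speech per chunk

    chunks, current = [], []
    for word in text.split():
        current.append(word)
        if estimate_speech_seconds(" ".join(current)) >= budget:
            chunks.append(" ".join(current))
            current = []
    if current:
        chunks.append(" ".join(current))
    return chunks
```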