jbetker committed
Commit e857911 (1 parent: e4384d4)
Files changed (1): README.md (+12 −0)

README.md CHANGED
@@ -201,6 +201,18 @@ Imagine what a TTS model trained at or near GPT-3 or DALLE scale could achieve.
 If you are an ethical organization with computational resources to spare interested in seeing what this model could do
 if properly scaled out, please reach out to me! I would love to collaborate on this.
 
+## Acknowledgements
+
+This project has garnered more praise than I expected. I am standing on the shoulders of giants, though, and I want to
+credit a few of the amazing folks in the community who have helped make this happen:
+
+- Hugging Face, who wrote the GPT model and the generate API used by Tortoise, and who hosts the model weights.
+- [Ramesh et al](https://arxiv.org/pdf/2102.12092.pdf), who authored the DALLE paper, which is the inspiration behind Tortoise.
+- [Nichol and Dhariwal](https://arxiv.org/pdf/2102.09672.pdf), who authored the (revision of the) code that drives the diffusion model.
+- [Jang et al](https://arxiv.org/pdf/2106.07889.pdf), who developed and open-sourced univnet, the vocoder this repo uses.
+- [lucidrains](https://github.com/lucidrains), who writes awesome open-source pytorch models, many of which are used here.
+- [Patrick von Platen](https://huggingface.co/patrickvonplaten), whose guides on setting up wav2vec were invaluable to building my dataset.
+
 ## Notice
 
 Tortoise was built entirely by me using my own hardware. My employer was not involved in any facet of Tortoise's development.