Can people help reduce the load on the A.I.?

#34
by Sobek - opened

DALL-E Mini is getting quite popular since the actual DALL-E is out of reach of end users, and as a result it is quickly being overwhelmed with requests. The "Too much traffic, please try again." message is nearly constant; a prompt can take up to a dozen submissions before anything comes back, if you are lucky.

Could people help the dev by offering their own CPUs and GPUs to take some of the load? Maybe the dev could set up a BOINC fork for people to contribute to, a Dalle_mini@home?

Patreon. Assuming the devs are running on an elastic cloud service of some kind, throwing more money at it should scale capacity roughly linearly. :)

And/or, it looks like there's a pricing tab? So individuals (or "organizations") may be able to pay for dedicated capacity.
Does anyone have feedback on whether that is effective, or what hurdles it might entail?
(For example, "organizations" getting a pay-as-you-go tier without the $9/mo fee... what's up with that?)

I'm also curious whether there is, or could be, a pricing tier that sends roughly half of a person's contribution toward more aggregate hardware for the pool of free users.

I am a bit skeptical of Patreon and the like; they would work, but could take a while to ramp up and would be subject to the whims of the market, among other issues. With a BOINC @home project, anyone with spare hardware could help, and it would lighten the load for everyone.

That being said, an @home project might not even be feasible depending on how the model runs: BOINC-style projects suit batch workloads that split into independent work units, while interactive image generation wants the whole model loaded on one machine with a fast response. So the Patreon idea has some merit. Other crowdfunding platforms exist too, like SubscribeStar and Ko-fi.

We can donate to Boris on GitHub:

https://github.com/sponsors/borisdayma?sc=t&sp=cypherlh

I just did it after seeing his post about it on Twitter.

Ah, that's a good start! That leaves one avenue open already.

I wonder how the dev would feel about adding something like crypto wallet links, so people could donate without having to go through GitHub Sponsors.

And I still wonder whether an @home DALL-E is possible.

I believe the author of the repository, Boris, provides a Colab notebook for inference. You could likely use one of Google's free GPU-enabled notebooks to generate as many images as you want. It might be worth mentioning that somewhere on the Hugging Face Space in case people keep hitting the traffic error.
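For anyone who wants to try that route, here is a condensed sketch of self-hosted inference in a GPU Colab, loosely adapted from the dalle-mini repo's inference notebook. Treat it as an outline: the package names, model IDs, and generation parameters below are what the repo used around mid-2022 and may have changed, so check the official notebook if anything errors.

```python
# Sketch of self-hosted DALL-E Mini inference (adapted from the repo's
# inference notebook; APIs and model IDs as of mid-2022, may have changed).
# In Colab, first:
#   pip install dalle-mini
#   pip install git+https://github.com/patil-suraj/vqgan-jax.git
import jax
import jax.numpy as jnp
import numpy as np
from PIL import Image

from dalle_mini import DalleBart, DalleBartProcessor
from vqgan_jax.modeling_flax_vqgan import VQModel

DALLE_MODEL = "dalle-mini/dalle-mini/mini-1:v0"  # wandb artifact reference
VQGAN_REPO = "dalle-mini/vqgan_imagenet_f16_16384"

# Text -> image-token model, plus the VQGAN that turns tokens into pixels.
model, params = DalleBart.from_pretrained(DALLE_MODEL, dtype=jnp.float16, _do_init=False)
vqgan, vqgan_params = VQModel.from_pretrained(VQGAN_REPO, _do_init=False)
processor = DalleBartProcessor.from_pretrained(DALLE_MODEL)

tokenized = processor(["an armchair in the shape of an avocado"])

# Sample a sequence of image tokens, then decode it into a 256x256 image.
encoded = model.generate(
    **tokenized,
    prng_key=jax.random.PRNGKey(0),
    params=params,
    top_k=None, top_p=None, temperature=None,
    condition_scale=10.0,  # "super conditioning" weight from the notebook
)
decoded = vqgan.decode_code(encoded.sequences[..., 1:], params=vqgan_params)
decoded = decoded.clip(0.0, 1.0).reshape((-1, 256, 256, 3))
Image.fromarray(np.asarray(decoded[0] * 255, dtype=np.uint8)).save("out.png")
```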

Yes, you can use the Colab notebook; I got mini DALL-E working quite easily. Follow the "Fast Usage" instructions here:
https://github.com/saharmor/dalle-playground

Just press play on the Colab cells and it will start a server; then copy/paste the URL it gives you (prepended with his proxy URL, or follow step 6 under "Local Development" to run your own localhost interface and paste in the URL without his proxy). Good luck!
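If you'd rather skip the web UI entirely, you can also hit the backend the Colab starts directly. A minimal sketch, assuming the backend exposes a POST /dalle route that takes {"text", "num_images"} and returns base64-encoded images; that's my recollection of dalle-playground's backend, so check backend/app.py in the repo if the route or payload differs:

```python
# Hypothetical client for the dalle-playground backend. BACKEND_URL is
# whatever the Colab cell prints; the /dalle route and JSON payload are
# assumptions based on the repo's backend and may differ in your version.
import base64
import requests

BACKEND_URL = "https://your-colab-backend.example"  # replace with the printed URL

resp = requests.post(
    f"{BACKEND_URL}/dalle",
    json={"text": "an armchair in the shape of an avocado", "num_images": 2},
    timeout=600,  # generation on a free Colab GPU can take a while
)
resp.raise_for_status()

# Save each returned base64-encoded image to disk.
for i, img_b64 in enumerate(resp.json()):
    with open(f"dalle_{i}.png", "wb") as f:
        f.write(base64.b64decode(img_b64))
```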
