Downloadable version?

#361
by Semperr - opened

Since the servers are in flames 24/7, would it be possible to make the AI a runnable .exe program? I'm not sure how long it would take to make, though, so take this suggestion with a pinch of salt.

You would need at least a GTX 980 Ti 6GB in 4-way SLI or 3x RX 580 in CrossFire.
DALL-E Mini requires at least 24GB of VRAM. You could use system RAM instead, but that would be slower than just waiting for the website.
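If you want a quick sanity check on whether your own machine clears that bar, something like the sketch below works. The 24GB threshold is just the figure quoted in this thread, not an official requirement, and the GPU index is assumed to be 0.

```python
# Rough sanity check of local VRAM / RAM against the figure quoted above.
# The 24 GB threshold comes from this thread, not an official spec.
import torch
import psutil

REQUIRED_VRAM_GB = 24  # assumption: the number people are citing here

if torch.cuda.is_available():
    vram_gb = torch.cuda.get_device_properties(0).total_memory / 1024**3
    verdict = ("should fit on the GPU" if vram_gb >= REQUIRED_VRAM_GB
               else "would need RAM offload or a smaller model")
    print(f"GPU VRAM: {vram_gb:.1f} GB -> {verdict}")
else:
    print("No CUDA GPU detected; CPU-only inference will be very slow")

ram_gb = psutil.virtual_memory().total / 1024**3
print(f"System RAM: {ram_gb:.1f} GB (usable as a slow fallback)")
```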

Super, super possible for any single company or entity with servers to host.

@sonicnoose yep. So head over to GitHub if that’s you…

I mean you can just let it train on your machine for a few days, then never have to wait again.

It only takes a while to train, afaik; the actual image generation is fairly fast.
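For what it's worth, you don't even need to train anything yourself: pretrained weights are published, so running it locally is just inference. Here's a minimal sketch, assuming the community min-dalle PyTorch port rather than the official JAX repo; argument names may differ between versions.

```python
# Minimal local-inference sketch using the community min-dalle port.
# Assumes `pip install min-dalle` and a CUDA GPU; weights download on first run.
import torch
from min_dalle import MinDalle

model = MinDalle(
    models_root="./pretrained",  # where checkpoints get cached
    dtype=torch.float16,         # halves VRAM vs. float32
    device="cuda",               # set to "cpu" to fall back to system RAM (much slower)
    is_mega=False,               # smaller checkpoint; is_mega=True needs more VRAM
    is_reusable=True,            # keep the model in memory between prompts
)

image = model.generate_image(
    text="a watercolor painting of a lighthouse at dawn",
    seed=-1,      # -1 = random seed
    grid_size=1,  # 1x1 grid = a single image
)
image.save("out.png")
```

Once the weights are cached, each prompt only takes on the order of seconds to a couple of minutes depending on the GPU, which matches the point above that generation itself is fast.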

I've got a 2080 Ti with 12GB of VRAM, but I do have 32GB of standard memory available.
Could that potentially be enough to get DALL-E working without much of a problem? I'm planning on getting another 32GB of memory soon, so even with the VRAM limitation I'm hoping the extra memory will help, even though DALL-E is GPU-based.

https://www.youtube.com/watch?v=eWpzLIa6v9E (from the comments, it looks like the video is legit)
