System requirements? 24 GB can't gen 256x256 here.

#23
by Teddydj - opened

Hello guys, just asking:
Do you need an H100 to get through the XLabs Sampler? I used flux-fp8.
[screenshot: 24gb.png]

Has anyone got it to work with something like a 4090?
I know it's new, but info is super rare, so I'm trying to get some. (I'm using this workflow: https://civitai.com/models/635271/fluximg2imgipadaptercontrolnet )
[screenshot: Capture d'écran 2024-09-17 105118.png]
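
For a rough sense of why 24 GB is borderline, here is a back-of-the-envelope sketch of the weight footprint alone. The parameter counts (~12B for the FLUX.1 transformer, ~4.7B for the T5-XXL text encoder) are my own assumptions, not numbers from this thread, and activations, the VAE, ControlNet/IP-Adapter weights and ComfyUI overhead all come on top:

```python
# Rough VRAM estimate for the FLUX.1 weights alone (assumed parameter counts).

def weight_gib(params_billion: float, bytes_per_param: float) -> float:
    """Approximate size of a set of weights in GiB."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

flux_transformer_b = 12.0   # FLUX.1 transformer, ~12B params (assumption)
t5_encoder_b = 4.7          # T5-XXL text encoder, ~4.7B params (assumption)

for label, bytes_per_param in [("bf16/fp16", 2.0), ("fp8", 1.0)]:
    total = (weight_gib(flux_transformer_b, bytes_per_param)
             + weight_gib(t5_encoder_b, bytes_per_param))
    print(f"{label}: ~{total:.1f} GiB just for transformer + T5 weights")

# bf16/fp16: ~31.1 GiB -> does not fit in 24 GB without offloading
# fp8:       ~15.6 GiB -> leaves headroom for latents/activations on a 24 GB card
```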

Hi, I use a 3090 Ti and the latent can be set to 1020×1024 when the diffusion model weight dtype is set to fp8. I noticed that GPU memory can be occupied by other programs.
[screenshot: 123.png] For example, about 1 GB of my first GPU is taken up during normal use, so I use another GPU (not connected to the screen), and it works when using fp8.
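
If you want to check what is actually free on each card before queuing the workflow, a quick check like this works (just an illustrative snippet, assuming PyTorch with CUDA is installed):

```python
import torch

# Print free/total memory for every visible CUDA device, to spot cards
# where the desktop or other programs are already holding VRAM.
for i in range(torch.cuda.device_count()):
    free_b, total_b = torch.cuda.mem_get_info(i)
    name = torch.cuda.get_device_name(i)
    print(f"GPU {i} ({name}): {free_b / 1024**3:.1f} GiB free "
          f"of {total_b / 1024**3:.1f} GiB")
```

To keep ComfyUI off the GPU that drives the screen, you can set CUDA_VISIBLE_DEVICES to the other card's index before launching (ComfyUI's --cuda-device flag should do the same thing, if I remember correctly).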
