Progressively increasing consumption of RAM

#25
by 1h2j458f84n3id - opened

I am running an sd-onnx model with a custom Gradio UI on my local machine and have observed that RAM consumption increases with each subsequent generation.
Is there a way to control how much RAM the model uses, or a way to "empty" the pipe after each generation, other than restarting the program?

I generate one picture at a time.

(attached: Figure_1.png)

It's probably the Gradio UI. Definitely not the model.

Hi, I believe this would help:

import gc
import torch

def free_gpu_cache():
    # Release PyTorch's cached GPU memory and run Python's garbage collector
    torch.cuda.empty_cache()
    gc.collect()

I would suggest putting a call to this function inside the "Run for generating images" cell, before the first line of code; it will clean up some of the leftover allocations.
Unfortunately, the error still appears even with this cleanup when the output picture size is larger than 512x512.
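For what it's worth, here is a minimal sketch of how that call could be wired into a Gradio generation handler. The model path and the generate/gr.Interface wiring are assumptions for illustration, not the original custom UI. Since the original question is about system RAM with an ONNX pipeline, the sketch also deletes the reference to the previous result before collecting garbage; torch.cuda.empty_cache() only affects GPU memory.

import gc

import gradio as gr
import torch
from diffusers import OnnxStableDiffusionPipeline

# Placeholder path to a converted Stable Diffusion ONNX model (assumption)
pipe = OnnxStableDiffusionPipeline.from_pretrained(
    "./stable_diffusion_onnx",
    provider="CPUExecutionProvider",
)

def generate(prompt):
    result = pipe(prompt)              # run one generation
    image = result.images[0]
    del result                         # drop references to intermediate outputs
    gc.collect()                       # reclaim Python-side RAM
    if torch.cuda.is_available():
        torch.cuda.empty_cache()       # release cached GPU memory, if any
    return image

demo = gr.Interface(fn=generate, inputs="text", outputs="image")
demo.launch()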
