Fix T-GATE pollution

#8
by adamelliotfields - opened

When pipe.tgate() is called, register_tgate_forward() permanently alters the attention layers in the UNet, which is a stateful change to the pipeline and can't be reverted. In other words, after calling pipe.tgate(), subsequent calls to pipe() behave the same as pipe.tgate(gate_step=0). I tried reloading just the UNet, but that made no difference; the only thing that works is reloading the entire pipeline.

For now, treat gate_step=0 as gate_step=num_inference, which effectively disables T-GATE since it only activates once the current step is greater than the gate step.
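A minimal sketch of that workaround, assuming a tgate-style gate that fires once the current step exceeds gate_step (the helper names here are illustrative, not the actual library API):

```python
def resolve_gate_step(gate_step: int, num_inference_steps: int) -> int:
    """Treat gate_step=0 as num_inference_steps so the gate never fires."""
    return num_inference_steps if gate_step == 0 else gate_step


def gate_active(step: int, gate_step: int) -> bool:
    # T-GATE swaps in cached cross-attention once step > gate_step.
    return step > gate_step


num_inference_steps = 25
gate = resolve_gate_step(0, num_inference_steps)  # -> 25

# No step in the denoising loop exceeds the gate, so T-GATE stays inactive.
assert not any(gate_active(s, gate) for s in range(num_inference_steps))
```

Since the highest step index never exceeds the total step count, the gating condition is never satisfied and the (polluted) T-GATE forward behaves like the original attention path.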

Fixed in b7fd57e.

adamelliotfields changed discussion status to closed
