Support for longer prompts and weighting using custom_pipeline

#29
by skytnt - opened

Now we can input prompts without the 77-token limit and adjust per-token weighting by using custom_pipeline="waifu-research-department/long-prompt-weighting-pipeline".

It requires diffusers>=0.4.0

Check out waifu-research-department/long-prompt-weighting-pipeline for details.
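A minimal usage sketch, assuming the repo follows the standard diffusers custom-pipeline convention (`custom_pipeline=` in `DiffusionPipeline.from_pretrained`) and the `(token:weight)` emphasis syntax used by community long-prompt-weighting pipelines. The base model id and the exact weighting syntax below are assumptions, not confirmed by this PR:

```python
import torch
from diffusers import DiffusionPipeline

# Assumed base model; substitute the checkpoint you actually use.
pipe = DiffusionPipeline.from_pretrained(
    "hakurei/waifu-diffusion",
    custom_pipeline="waifu-research-department/long-prompt-weighting-pipeline",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# Prompts may now exceed 77 CLIP tokens; (word:1.3)-style syntax
# raises a token's weight, [word] lowers it (assumed syntax).
prompt = "masterpiece, best quality, 1girl, (detailed background:1.3), [blurry]"
image = pipe(prompt, guidance_scale=7.5).images[0]
image.save("output.png")
```

Requires a CUDA-capable GPU and downloads the model weights on first run.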

