Hugging Face crash + follow-up

#591
opened by jbilcke-hf (HF staff)

Hello everyone, the platform crashed during the night (see https://twitter.com/huggingface/status/1762954032312639702)

some services used by the AI Comic Factory were not able to restart automatically after the crash, so I'm restarting them manually right now

this should take between 5 and 15 minutes, sorry for the inconvenience!

Status update:

after restarting everything, I notice two issues:

  1. sometimes the LLM fails to render some panels (I'm investigating this Inference API issue with the HF team)
  2. the response time for images is very slow, taking several minutes per generation (I will check with the HF team whether I can increase capacity for SDXL)

(screenshot: Capture d’écran 2024-02-29 à 17.38.42.png)

I will continue to post updates here while I try to find solutions to these issues

jbilcke-hf changed discussion title from Hugging Face crash to Hugging Face crash + follow-up

I'm attempting to use a custom model I trained with images for testing purposes, but the generator is not getting past the "good, we are half way there, hold tight!" step. Any suggestions @jbilcke-hf ?

@CryptoBulliez if you retry today, is it still stuck?

it could be an issue with the LLM generation (zephyr-7b-beta) due to the high usage (plus, a few days ago the whole Hugging Face platform was down).
but normally this shouldn't prevent image generation; if that part fails, it will only create boring panels (containing no interesting story, just the keywords you typed)

in your case, the issue seems to be on the image model side: I assume you are using the Inference API, and it is possible the Inference API is having trouble keeping up with the requests from all users

it could also be a quota issue specific to your user account. Do you have a "PRO" Hugging Face account?
(a PRO account isn't mandatory to use the Inference API, but it comes with higher rate limits)

last question: is your model public or private? normally it shouldn't change anything as long as the API token is correct, but you never know
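if you want to check whether the problem is your token, your model's privacy setting, or the Inference API itself, here's a minimal sketch that calls the public Inference API endpoint (`https://api-inference.huggingface.co/models/<model-id>`) directly, using only the standard library. The model id and the `hf_xxx` token below are placeholders, replace them with your own; the status-code interpretations are the usual HTTP meanings, not something specific to the Comic Factory:

```python
import json
import urllib.error
import urllib.request

API_BASE = "https://api-inference.huggingface.co/models"

def build_inference_request(model_id, token, prompt="a test comic panel"):
    """Build a POST request for the Inference API with a Bearer token."""
    url = f"{API_BASE}/{model_id}"
    data = json.dumps({"inputs": prompt}).encode("utf-8")
    headers = {"Authorization": f"Bearer {token}"}
    return urllib.request.Request(url, data=data, headers=headers, method="POST")

def check_model(model_id, token):
    """Send one request and return the HTTP status code.

    Rough interpretation:
      200 -> token and model are fine, the API answered
      401/403 -> bad token, or a private model the token can't access
      429 -> rate limit / quota (a PRO account raises these limits)
      503 -> the model is loading or the API is overloaded
    """
    req = build_inference_request(model_id, token)
    try:
        with urllib.request.urlopen(req, timeout=120) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code
```

for example, `check_model("your-username/your-custom-model", "hf_xxx")` returning 429 would point at quotas rather than at your model, while 401/403 would point at the token or the model's privacy setting.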

Please return two-page comics
