idk what this is
#3 by Blane187 - opened
This error often occurs with the serverless Inference API.
It still appears even with a Pro subscription.
It does not appear when I call the model from a Space (and probably not from the Endpoint API either).
It seems to show up regardless of model size, and changing the scheduler (sampler) makes no difference.
So I don't know the exact cause for sure; I just assume it appears frequently.
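For reference, here is a minimal sketch of the kind of serverless Inference API call where this error tends to show up. The model id, prompt, and token are placeholders, not the exact setup from this thread:

```python
# Minimal sketch of a text-to-image call through the serverless Inference API.
# Model id, prompt, and token are placeholders; this only illustrates the kind
# of request where the error discussed above appears.
from huggingface_hub import InferenceClient

client = InferenceClient(
    model="stabilityai/stable-diffusion-xl-base-1.0",  # placeholder model id
    token="hf_xxx",  # your token (Pro or free)
)

# Returns a PIL.Image on success; the error would be raised here instead.
image = client.text_to_image("a watercolor landscape at sunset")
image.save("output.png")
```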
If you like, you can use my generation Spaces instead, though they are subject to ZeroGPU's 120-second limit.
https://huggingface.co/spaces/John6666/DiffuseCraftMod
https://huggingface.co/spaces/John6666/votepurchase-multiple-model
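If you want to call one of these Spaces from code rather than the web UI, a rough sketch with the gradio_client library is below. The Space name comes from the links above, but the endpoint names and parameters are whatever the Space actually exposes, so inspect them with view_api() first:

```python
# Sketch of querying a Space programmatically instead of the serverless API.
# Endpoint names and arguments vary per Space; list them before calling predict().
from gradio_client import Client

client = Client("John6666/DiffuseCraftMod")
client.view_api()  # prints the available endpoints and their arguments
```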
Alright, thanks :)