Text Generation
Transformers
PyTorch
English
gpt_neox
Inference Endpoints
text-generation-inference

Inference API not working

#4
by MrlolDev - opened

The Inference API gets stuck at loading the model with an estimated time of 20s, but it never actually finishes loading.
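In case it helps while this is being looked at: a minimal sketch of calling the Inference API with the `wait_for_model` option, which asks the API to hold the request until the model has finished loading instead of returning the "model is loading" response. The model ID and token below are placeholders.

```python
import requests

API_URL = "https://api-inference.huggingface.co/models/<model-id>"  # placeholder model ID
headers = {"Authorization": "Bearer <your-hf-token>"}  # placeholder token

payload = {
    "inputs": "Hello, my name is",
    "options": {
        # wait_for_model keeps the request open until the model is loaded
        # rather than returning a 503 "currently loading" error.
        "wait_for_model": True,
        "use_cache": False,
    },
}

response = requests.post(API_URL, headers=headers, json=payload)
print(response.status_code)
print(response.json())
```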

Any update on this? 🙄

I also get stuck at the loading phase in a Colab notebook. Is there any fix for that?
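For the Colab case, one thing to try is loading the checkpoint directly with Transformers in half precision, in case the hang comes from running out of RAM. This is only a sketch: the model ID is a placeholder, and `device_map="auto"` assumes `accelerate` is installed in the runtime.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "<model-id>"  # placeholder for this repository's model ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Half precision plus automatic device placement reduces peak memory,
# which is often what stalls a free Colab runtime during loading.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

inputs = tokenizer("Hello, my name is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```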
