Inference API endpoint not loading for using with Langchain

#24
by Kamaljp - opened

I have been trying to load this model with the LangChain library, but the inference endpoint is timing out. Could you please look into it? The OpenChatKit demo is working fine.

Of course. Could you paste the entire error message here if possible? Or did it just get stuck at that point?

It just got stuck; no error message.
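Since the call hangs silently, one way to surface the failure is to bypass LangChain and query the Inference API directly with an explicit timeout, so a stalled endpoint raises an exception instead of blocking forever. This is a minimal sketch using only the standard library; the model ID and token are placeholders, not values from this thread:

```python
import json
import urllib.request

# Placeholder URL -- substitute the actual model ID from the repo page.
API_URL = "https://api-inference.huggingface.co/models/<model-id>"

def query_endpoint(prompt: str, token: str, timeout: float = 30.0) -> dict:
    """Send one request to the Inference API with an explicit timeout.

    If the endpoint is stalled, urllib raises a timeout error after
    `timeout` seconds rather than hanging indefinitely.
    """
    req = urllib.request.Request(
        API_URL,
        data=json.dumps({"inputs": prompt}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())
```

If this raises a timeout error, the problem is on the endpoint side (e.g. the model is still loading or the endpoint is down) rather than in the LangChain integration.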