Hosted Inference API returns 500 Internal Server Error

#4 opened by MarkDeSouza

I'm writing to report an issue I hit while using the hosted Inference API for both the T0 and T0pp models: requests, including the provided example prompts, return a 500 Internal Server Error.
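For reference, here is roughly how I'm calling it, with a retry on transient 5xx responses (a minimal sketch; the token and the example prompt are placeholders of mine):

```python
import time

import requests

API_URL = "https://api-inference.huggingface.co/models/bigscience/T0pp"
HEADERS = {"Authorization": "Bearer hf_XXXX"}  # replace with your own access token


def query(payload: dict, retries: int = 3, backoff: float = 2.0) -> dict:
    """POST a payload to the Inference API, retrying on 5xx server errors."""
    for attempt in range(retries):
        response = requests.post(API_URL, headers=HEADERS, json=payload)
        if response.status_code < 500:
            response.raise_for_status()  # raise on 4xx client errors
            return response.json()
        time.sleep(backoff * (attempt + 1))  # simple linear backoff
    response.raise_for_status()  # surface the last 500 to the caller


print(query({"inputs": "Is this review positive or negative? Review: this is the best cast iron skillet you will ever buy"}))
```

Even with the retries, every attempt comes back with a 500.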

Please let me know if I have to pay to use these models :)

BigScience Workshop org

Hi! The team is aware of this and working hard to solve the hardware issue causing it. Sorry for the inconvenience πŸ€—
