500 error

#7
by HTahboub - opened

I cannot run the model API on huggingface and am getting a 500 error when I try.

Same! Would definitely like to experiment via inference API - looks very interesting!

Same, getting a 500 Internal Server error. Do users need to pay for this model?

BigScience Workshop org

Hi! The team is aware of this and working hard to solve the hardware issue causing it. Sorry for the inconvenience πŸ€—

If anyone knows other websites/platforms hosting T0pp in a "playground form", please share :)

I don't think there are any playground forms of T0.

Hi, is the inference API still working? It still shows error 500 for me.

No, it's still not working. Don't know why. It really sucks.

BigScience Workshop org

Hi @aphosic and @Makya ,
Thanks for flagging that the API for the model is down. We have taken note of this and are working on it, although it had fallen through the cracks.
We'll update you as soon as this is back up!
Sorry for the inconvenience. In the meantime, you can run inference on your own hardware as shown in the T0 GitHub repo! And thanks to the tooling in accelerate, you don't even need a large GPU (see https://huggingface.co/blog/accelerate-large-models)!
Victor
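For anyone who wants to try the local route Victor mentions, here is a minimal sketch (not an official snippet) of loading T0pp with accelerate's big-model loading, along the lines of the linked blog post. The prompt string is just an illustrative example; note the full T0pp checkpoint is very large (~42 GB), so this needs substantial disk and RAM.

```python
# Sketch: run T0pp locally using accelerate's big-model loading.
# Requires: pip install transformers accelerate
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

def run_t0pp(prompt: str) -> str:
    tokenizer = AutoTokenizer.from_pretrained("bigscience/T0pp")
    # device_map="auto" lets accelerate shard the model across available
    # GPUs, CPU RAM, and (via offload_folder) disk, so a single large GPU
    # is not required.
    model = AutoModelForSeq2SeqLM.from_pretrained(
        "bigscience/T0pp",
        device_map="auto",
        offload_folder="offload",
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(run_t0pp("Is this review positive or negative? "
                   "Review: this is the best cast iron skillet you will ever buy"))
```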


Thanks Victor.

Yeah, thanks Victor. Can't wait; waiting patiently.

BigScience Workshop org

@michael-newsrx-com The model isn't currently hosted, I'm afraid. You will have to host it yourself for the time being.

See: https://huggingface.co/bigscience/T0pp/discussions/9#63898ccb80134ba508d4f2bc and https://huggingface.co/bigscience/T0pp/discussions/8

christopher changed discussion status to closed

I still have an error 500.

The API is still down, I think.

osanseviero locked this discussion
