disable inference API
#43
opened by olivierdehaene
No description provided.
lhoestq changed pull request status to merged
Hello!
Is this temporary?
No. Bloomz is too expensive for us to run: we need to dedicate 8x A100 80GB GPUs to it while the number of requests it serves is really low.
Bloom suffers from the same issue, but its hardware is sponsored by AzureML.
That's too bad; I'm a student and I was actively working with it. But I'm going to find a way to run the inference locally.
Thank you for your response!
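In case it helps anyone else wanting to run inference locally: here's a minimal sketch using the transformers library with the much smaller bigscience/bloomz-560m checkpoint (my assumption for modest hardware; the full bloomz model needs the 8x A100 80GB setup mentioned above).

```python
# Minimal sketch: local inference with a smaller BLOOMZ checkpoint via transformers.
# Assumes bigscience/bloomz-560m; the full 176B bloomz will not fit on typical hardware.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigscience/bloomz-560m"  # smaller BLOOMZ variant
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint).to(device)

# Example instruction-style prompt (BLOOMZ is instruction-tuned).
prompt = "Translate to English: Je t'aime."
inputs = tokenizer(prompt, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The larger checkpoints follow the same API; swap the checkpoint name if you have the VRAM for them.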
Here's another way to run free inference, albeit a bit slower: https://huggingface.co/bigscience/bloomz/discussions/28
Thank you so much! This seems to be a good alternative!