internal server error: Inference API is down for the bark-small model?

#11
by packmad - opened

Hi,
I tried to use the bark-small model through the Inference API in my code, and the generated audio file is just an empty 1 KB file. The hosted inference widget also finishes with an "internal server error" (URL: https://huggingface.co/suno/bark-small), so overall the service seems to be down at the moment.
Can you fix it?
My code:
import requests

# HUGGINGFACEHUB_API_TOKEN and message are defined elsewhere in my script
API_URL = "https://api-inference.huggingface.co/models/suno/bark-small"
headers = {"Authorization": f"Bearer {HUGGINGFACEHUB_API_TOKEN}"}
payload = {
    "inputs": message
}

# Send the text to the hosted Inference API and write the returned audio bytes to disk
response = requests.post(API_URL, headers=headers, json=payload)
with open('audio.flac', 'wb') as file:
    file.write(response.content)
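
For what it's worth, here is a minimal sketch of the error handling I plan to add once the service is back, so a failed request raises an error instead of silently writing an empty file. It assumes the API returns a JSON body (rather than audio bytes) on failure, and that HUGGINGFACEHUB_API_TOKEN is defined elsewhere:

import requests

API_URL = "https://api-inference.huggingface.co/models/suno/bark-small"
headers = {"Authorization": f"Bearer {HUGGINGFACEHUB_API_TOKEN}"}  # token assumed to be set elsewhere

def text_to_audio(message: str, out_path: str = "audio.flac") -> None:
    response = requests.post(API_URL, headers=headers, json={"inputs": message})
    content_type = response.headers.get("Content-Type", "")
    # Assumption: on failure the API answers with an error status and/or a JSON body
    # instead of raw audio bytes, so surface that instead of writing a broken file.
    if not response.ok or "json" in content_type:
        raise RuntimeError(f"Inference API error {response.status_code}: {response.text}")
    with open(out_path, "wb") as file:
        file.write(response.content)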
