Internal server error

#2
by airsheysr - opened

Whenever I run the Inference API on my local machine and even on Google Colab, I get the following error:
{'error': 'Internal Server Error'}

Please help me figure out how to solve this.

cc @reach-vb - the widget works on the Hub, but the Inference API is reported as broken. Do you know why this might be happening? The Inference API code snippet also appears to be wrong, e.g. the one here: https://huggingface.co/facebook/mms-tts-mrw?text=hey&inference_api=true

Running it returns:

----> 1 output.json()

File ~/venv/lib/python3.9/site-packages/requests/models.py:975, in Response.json(self, **kwargs)
    971     return complexjson.loads(self.text, **kwargs)
    972 except JSONDecodeError as e:
    973     # Catch JSON-related errors and raise as requests.JSONDecodeError
    974     # This aliases json.JSONDecodeError and simplejson.JSONDecodeError
--> 975     raise RequestsJSONDecodeError(e.msg, e.doc, e.pos)

JSONDecodeError: Expecting value: line 1 column 1 (char 0)
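The traceback points at the likely root cause: the generated snippet calls `output.json()`, but a successful TTS call returns raw audio bytes rather than JSON, so only error payloads (like `{'error': 'Internal Server Error'}`) decode cleanly. A minimal sketch of a safer response handler; the helper name and the commented usage are illustrative, not the official API:

```python
import json


def handle_inference_response(content_type: str, body: bytes):
    """Decode an Inference API response.

    Error payloads come back as JSON ({'error': ...}); a successful
    TTS response is raw audio bytes, so calling .json() on it raises
    the JSONDecodeError shown above.
    """
    if "application/json" in content_type:
        payload = json.loads(body)
        if isinstance(payload, dict) and "error" in payload:
            raise RuntimeError(f"Inference API error: {payload['error']}")
        return payload
    # Not JSON: assume raw audio bytes, suitable for writing to a file.
    return body


# Hypothetical usage with requests (URL and headers are placeholders):
# resp = requests.post(API_URL, headers=headers, json={"inputs": "hey"})
# audio = handle_inference_response(
#     resp.headers.get("content-type", ""), resp.content
# )
# with open("speech.flac", "wb") as f:
#     f.write(audio)
```

Checking the `Content-Type` header before decoding avoids the crash regardless of whether the server returns audio or an error object.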

=> should we fix this for all TTA models?

I also get the same error with facebook/mms-tts-eng.

Hey hey! Sorry for the faulty snippet. I'm working on a fix for this!

This should be fixed once this PR is merged: https://github.com/huggingface/hub-docs/pull/1030 πŸ€—
