Spaces: Running on A10G
How to use it as an API #57
opened by tushar310
Hi Team,
This is a wonderful capability, but unfortunately I don't have the capacity to run it on my end, so I will have to use this Space itself. I wanted to know whether I could use it as an API via the Gradio API link at the bottom of the interface. The catch is that I want to use a private repo on HF that holds my custom model. How can I do that via the API? Any guidance? It would be amazing if we could crack this.
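For the private-repo part of the question, a minimal sketch is below. It assumes `gradio_client`'s documented `hf_token` parameter on `Client` for authentication; the endpoint argument names (`model_id`, `q_method`, `private_repo`, `api_name`) are taken from the example call later in this thread, and the model id is a placeholder for your own private repo.

```python
def build_predict_kwargs(model_id: str, q_method: str = "Q5_K_M",
                         private_repo: bool = True) -> dict:
    """Arguments for the Space's /predict endpoint, using the parameter
    names from the example call in this thread. private_repo=True keeps
    the resulting GGUF repo private."""
    return {
        "model_id": model_id,
        "q_method": q_method,
        "private_repo": private_repo,
        "api_name": "/predict",
    }

def quantize_private(model_id: str, hf_token: str):
    # Deferred import so the helper above is usable without gradio_client
    # installed. hf_token lets the client authenticate as you, which is
    # what a private model repo would require.
    from gradio_client import Client
    client = Client("ggml-org/gguf-my-repo", hf_token=hf_token)
    return client.predict(**build_predict_kwargs(model_id))
```

Usage would be `quantize_private("your-org/your-private-model", hf_token="hf_...")`. Note the host's reply at the end of this thread: `gradio_client` support was later removed from this Space, so this is a sketch of the approach, not a working recipe against the current deployment.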
Out of curiosity, what program are you using on the right that displays the API documentation to you?
Thanks
Got this error when using with API.
from gradio_client import Client

client = Client("ggml-org/gguf-my-repo")
result = client.predict(
    model_id="OpenLLM-France/Claire-7B-0.1",
    q_method="Q5_K_M",
    private_repo=False,
    api_name="/predict",
)
print(result)
Loaded as API: https://ggml-org-gguf-my-repo.hf.space ✔
---------------------------------------------------------------------------
AppError                                  Traceback (most recent call last)
<ipython-input-8-56cabcda44cc> in <cell line: 4>()
      3 client = Client("ggml-org/gguf-my-repo")
----> 4 result = client.predict(
      5     model_id="OpenLLM-France/Claire-7B-0.1",
      6     q_method="Q5_K_M",
...
/usr/local/lib/python3.10/dist-packages/gradio_client/client.py in _predict(*data)
   1218 if "error" in result:
   1219     if result["error"] is None:
-> 1220         raise AppError(
   1221             "The upstream Gradio app has raised an exception but has not enabled "
   1222             "verbose error reporting. To enable, set show_error=True in launch()."

AppError: The upstream Gradio app has raised an exception but has not enabled verbose error reporting. To enable, set show_error=True in launch().
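As the traceback says, the Space raised an exception server-side without `show_error=True` in `launch()`, so the client only receives this generic message. A small defensive wrapper like the sketch below keeps such calls from crashing a script; it catches `Exception` broadly because `AppError`'s import path has varied across `gradio_client` versions (an assumption on my part).

```python
def safe_predict(client, **kwargs):
    # Wrap client.predict() so an opaque upstream failure (like the
    # AppError above) is logged and reported as None rather than raised.
    try:
        return client.predict(**kwargs)
    except Exception as exc:  # gradio_client raises AppError here
        print(f"Space call failed: {exc}")
        return None
```

With this, `safe_predict(client, model_id=..., api_name="/predict")` returns `None` on failure instead of aborting, which makes it easier to retry or fall back.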
Hi all, to prioritise the experience of the Space on the Hub, we have removed support for gradio_client.
reach-vb changed discussion status to closed