requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

#49
by Jenad1kr - opened

I get this error when I try to launch with gradio, both in Spaces and locally.

raise RequestsJSONDecodeError(e.msg, e.doc, e.pos)
requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

any idea what could be wrong?

Could you share the full traceback?

@ArthurZ, here it is. I created a new Space.
===== Application Startup at 2023-10-08 21:21:11 =====

/home/user/app/app.py:3: GradioDeprecationWarning: gr.Interface.load() will be deprecated. Use gr.load() instead.
gr.Interface.load("models/mistralai/Mistral-7B-v0.1").launch()
Fetching model from: https://huggingface.co/mistralai/Mistral-7B-v0.1
Running on local URL: http://0.0.0.0:7860

To create a public link, set share=True in launch().
Traceback (most recent call last):
File "/home/user/.local/lib/python3.10/site-packages/requests/models.py", line 971, in json
return complexjson.loads(self.text, **kwargs)
File "/usr/local/lib/python3.10/json/__init__.py", line 346, in loads
return _default_decoder.decode(s)
File "/usr/local/lib/python3.10/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/local/lib/python3.10/json/decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/user/.local/lib/python3.10/site-packages/gradio/queueing.py", line 406, in call_prediction
output = await route_utils.call_process_api(
File "/home/user/.local/lib/python3.10/site-packages/gradio/route_utils.py", line 226, in call_process_api
output = await app.get_blocks().process_api(
File "/home/user/.local/lib/python3.10/site-packages/gradio/blocks.py", line 1554, in process_api
result = await self.call_function(
File "/home/user/.local/lib/python3.10/site-packages/gradio/blocks.py", line 1192, in call_function
prediction = await anyio.to_thread.run_sync(
File "/home/user/.local/lib/python3.10/site-packages/anyio/to_thread.py", line 33, in run_sync
return await get_asynclib().run_sync_in_worker_thread(
File "/home/user/.local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
return await future
File "/home/user/.local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 807, in run
result = context.run(func, *args)
File "/home/user/.local/lib/python3.10/site-packages/gradio/utils.py", line 659, in wrapper
response = f(*args, **kwargs)
File "/home/user/.local/lib/python3.10/site-packages/gradio/external.py", line 415, in query_huggingface_api
errors_json = response.json()
File "/home/user/.local/lib/python3.10/site-packages/requests/models.py", line 975, in json
raise RequestsJSONDecodeError(e.msg, e.doc, e.pos)
requests.exceptions.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
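For what it's worth, this particular message just means the body handed to the JSON parser wasn't JSON at all: the crash happens in gradio's `query_huggingface_api` when it calls `response.json()` on whatever the Inference API sent back, and `json.loads` fails at character 0 if that body is empty or is an HTML error page. A minimal stdlib reproduction of the error itself (the two sample bodies are made up for illustration):

```python
import json

# Non-JSON bodies like these are what response.json() can choke on.
for body in ["", "<html>503 Service Unavailable</html>"]:
    try:
        json.loads(body)
    except json.JSONDecodeError as e:
        # both raise: Expecting value: line 1 column 1 (char 0)
        print(f"{body!r} -> {e}")
```

So the JSONDecodeError is a symptom; the real question is why the API is not returning JSON in the first place.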

Same issue

@ArthurZ Same issue. I used the quick-create option to host a model in a new Space, which generated this Gradio code:

import gradio as gr
import os
import requests

HUGGING_FACE_TOKEN=os.environ.get('HUGGING_FACE_TOKEN', None)

gr.load(name="models/meta-llama/Llama-2-7b-chat-hf", hf_token=HUGGING_FACE_TOKEN).launch()
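One thing worth checking: Llama-2 is a gated model, so if the token hasn't been granted access to it, the Inference API can answer with a non-JSON error body, and gradio's `response.json()` call crashes instead of surfacing the real error. A defensive-parsing sketch that shows the idea (the `parse_api_response` helper below is hypothetical, not part of gradio's API, and the sample bodies are invented):

```python
import json

def parse_api_response(status_code, body):
    """Return parsed JSON, or a readable error dict when the body isn't
    JSON (e.g. an empty body or an HTML page from a gated model)."""
    try:
        return json.loads(body)
    except json.JSONDecodeError:
        return {"error": f"HTTP {status_code}: non-JSON body {body[:120]!r}"}

# A well-formed API reply parses normally:
print(parse_api_response(200, '{"generated_text": "hi"}'))
# A gated-model rejection becomes a readable error instead of a traceback:
print(parse_api_response(403, "<html>Access to this model is restricted</html>"))
```

Inspecting the raw `response.status_code` and `response.text` this way, before assuming JSON, usually reveals the underlying access or loading problem.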

Did you find a solution to this?

Unfortunately, no. If you search Spaces for "Llama-2-7b-chat-hf", you'll see that numerous people have created Spaces the same way, and they all hit errors.
