got error: "Unexpected end of JSON input"

#102
by jietian1 - opened

I dug into the code and found that the detailed error message is:
{"error":"Authorization header is correct, but the token seems invalid"}

but I didn't see any Authorization header being set.


Hi, did you solve this? Regards.

Hugging Chat org

You need to create a .env.local file. Copy MODEL_ENDPOINTS from .env, but replace hf_<access token> with your own access token from https://huggingface.co/settings/tokens
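For reference, a .env.local along these lines should work (a sketch only; the model URL here is a placeholder, and the exact shape of MODEL_ENDPOINTS should be copied from the .env that ships with the repo):

```
# .env.local — hypothetical values; copy the real MODEL_ENDPOINTS entry from .env
MODEL_ENDPOINTS=[{
  "url": "https://api-inference.huggingface.co/models/<model id>",
  "authorization": "Bearer hf_<your access token>",
  "weight": 1
}]
```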

Hi, yes, thank you, that worked. Now I get the error Cannot read properties of undefined (reading 'special'). I set PUBLIC_MAX_INPUT_TOKENS=1000 like you said in another post, but I still get the same error. Also, how can I debug this in Visual Studio Code? Breakpoints in +server.ts don't bind.

Hugging Chat org

Can you pull the latest changes and run npm install again? You should get a more descriptive error.

Yes, it works now, thank you very much. Now I want to debug with breakpoints, but VS Code doesn't hit the breakpoints I set in the code. Do I need a launch configuration in VS Code?
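For anyone else hitting this: a minimal VS Code launch configuration along these lines (a sketch, assuming the dev server is started with npm run dev) should let breakpoints in +server.ts bind:

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Debug chat-ui dev server",
      "type": "node",
      "request": "launch",
      "runtimeExecutable": "npm",
      "runtimeArgs": ["run", "dev"],
      "cwd": "${workspaceFolder}",
      "console": "integratedTerminal"
    }
  ]
}
```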

  1. After sending some messages, it gives an error (maybe it's just the model): Error: Input validation error: inputs tokens + max_new_tokens must be <= 1512. Given: 505 inputs tokens and 1024 max_new_tokens
    at parseGeneratedText (/src/routes/conversation/[id]/+server.ts:160:11)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async saveMessage (/src/routes/conversation/[id]/+server.ts:86:26)
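The arithmetic behind that error: 505 input tokens + 1024 max_new_tokens = 1529, which exceeds the model's 1512-token budget. A hypothetical helper (not part of chat-ui) that clamps the generation budget so a request always fits could look like this:

```typescript
// Assumed total budget, taken from the error message above.
const MODEL_TOKEN_LIMIT = 1512;

// Clamp max_new_tokens so prompt tokens + generated tokens fit the budget.
function clampMaxNewTokens(inputTokens: number, requested: number): number {
  return Math.max(0, Math.min(requested, MODEL_TOKEN_LIMIT - inputTokens));
}

// The failing request: 505 + 1024 = 1529 > 1512, so clamp to 1512 - 505.
console.log(clampMaxNewTokens(505, 1024)); // 1007
```

Lowering PUBLIC_MAX_INPUT_TOKENS has a similar effect from the other side: it bounds how many prompt tokens are sent in the first place.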

Things are fixed, it works fine now.

I believe this comes from the continuous-dialogue ability: to remember context, it saves all messages and resends them with every request, so the token count keeps growing.
Starting a new session resolves the issue.
As I understand it, increasing max_tokens would require more GPU VRAM.
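The growth described above can be sketched as follows (a rough illustration only, assuming ~4 characters per token; chat-ui's real tokenizer counts differently):

```typescript
const TOKEN_LIMIT = 1512;    // total budget from the error above
const MAX_NEW_TOKENS = 1024; // generation budget from the error above

// Very rough heuristic; real tokenizers count tokens differently.
const estimateTokens = (text: string): number => Math.ceil(text.length / 4);

// Returns the first turn at which the accumulated history no longer fits,
// since every turn resends the whole saved conversation as the prompt.
function firstOverflowTurn(messageChars: number): number {
  const history: string[] = [];
  for (let turn = 1; ; turn++) {
    history.push("x".repeat(messageChars));
    if (estimateTokens(history.join("\n")) + MAX_NEW_TOKENS > TOKEN_LIMIT) {
      return turn; // starting a new session resets history to zero
    }
  }
}

console.log(firstOverflowTurn(200)); // 10 — overflows after ten ~50-token turns
```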

jietian1 changed discussion status to closed
