Exceeding max context length with only 8k tokens

#73
by dordonezc - opened

Hey, I am using the model on some documents and keep getting the following error:
"Error code: 400 - {'error': {'message': "This model's maximum context length is 8192 tokens, however you requested 8804 tokens (8804 in your prompt; 0 for the completion). Please reduce your prompt; or completion length.", 'type': 'invalid_request_error', 'param': None, 'code': None}"

How come I get this error if the max context of the model is supposed to be 128k?
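For reference, a minimal sketch of how one might check a prompt's token count against the limit the endpoint reports before sending a request. The model name, tokenizer choice, and 8192 limit are assumptions taken from the error message, not from the model card:

```python
# Minimal sketch: count prompt tokens locally so requests that exceed the
# serving endpoint's reported context window (8192 in the error above) can
# be trimmed or split before the API call. MODEL_NAME is a placeholder.
from transformers import AutoTokenizer

MODEL_NAME = "your-model-name"   # placeholder: whichever checkpoint is being served
SERVER_CONTEXT_LIMIT = 8192      # limit reported by the endpoint's 400 error

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)

def fits_in_context(prompt: str, max_completion_tokens: int = 0) -> bool:
    """Return True if prompt plus requested completion fit the reported limit."""
    prompt_tokens = len(tokenizer.encode(prompt))
    return prompt_tokens + max_completion_tokens <= SERVER_CONTEXT_LIMIT

document_text = "..."  # the document being sent as the prompt
if not fits_in_context(document_text, max_completion_tokens=512):
    print("Prompt too long for this endpoint; shorten or split the document.")
```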

dordonezc changed discussion status to closed
