Running issue

#47
by Vito99 - opened

Whatever I put in the chat box, it always returns either "{'error': {'message': 'This is not a chat model and thus not supported in the v1/chat/completions endpoint. Did you mean to use v1/completions?', 'type': 'invalid_request_error', 'param': 'model', 'code': None}}" or a "token too long" error.
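For context, that first error usually means the selected model is a plain completion model, so requests have to go through the completions endpoint instead of the chat one. A minimal sketch with the official `openai` Python client, where the model name is just a placeholder:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Non-chat models must be called via the completions endpoint,
# not chat.completions; "some-completion-model" is a placeholder name.
response = client.completions.create(
    model="some-completion-model",
    prompt="Hello, how are you?",
    max_tokens=256,
)
print(response.choices[0].text)
```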

The token issue looks like: {'error': {'message': "This model's maximum context length is 2049 tokens, however you requested 4096 tokens (2331 in your prompt; 1765 for the completion). Please reduce your prompt; or completion length.", 'type': 'invalid_request_error', 'param': None, 'code': None}}
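That second error is just the token budget: the prompt (2331 tokens) plus the requested completion (1765 via max_tokens) adds up to 4096, which is more than the model's 2049-token context window. A rough sketch of capping max_tokens to whatever room the prompt leaves, using tiktoken to count prompt tokens (the encoding name is an assumption):

```python
import tiktoken

CONTEXT_LIMIT = 2049  # maximum context length reported in the error


def capped_max_tokens(prompt: str, desired: int = 1765) -> int:
    # "gpt2" encoding is an assumption; older completion models use it.
    enc = tiktoken.get_encoding("gpt2")
    prompt_tokens = len(enc.encode(prompt))
    # Keep prompt + completion within the context window.
    return max(0, min(desired, CONTEXT_LIMIT - prompt_tokens))
```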
