Maximum tokens
#12 by goodboys · opened
Caught exception: Invalid response object from API: '{"object":"error","message":"This model's maximum context length is 2048 tokens. However, you requested 2637 tokens (2125 in the messages, 512 in the completion). Please reduce the length of the messages or completion.","code":40303}' (HTTP response code was 400). Hi, I get this error when calling the model through langchain; with the 32k model it doesn't happen.
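For reference, a minimal sketch (not from this thread) of keeping a LangChain call within a 2048-token window: prompt tokens plus `max_tokens` must stay under the limit, so either lower the completion budget or shorten the messages. The endpoint URL, API key, and model name below are placeholders for a local OpenAI-compatible server, using the older-style LangChain chat-model API:

```python
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

llm = ChatOpenAI(
    openai_api_base="http://localhost:8000/v1",  # placeholder OpenAI-compatible endpoint
    openai_api_key="EMPTY",                      # local servers usually ignore the key
    model_name="local-model",                    # placeholder model name
    max_tokens=256,  # reserve fewer completion tokens than the default 512 that overflowed
)

# The messages themselves must also fit: 2125 prompt tokens already leave no room
# for a completion in a 2048-token window, so long history has to be truncated too.
reply = llm([HumanMessage(content="A short prompt that fits the 2048-token window")])
print(reply.content)
```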
In the py program that loads this model (or wherever the limit is set), it's recommended to increase max_length by at least 10x, e.g. max_length=40960.
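A minimal sketch of where such a max_length might be raised when loading and running the model yourself with transformers; the model name is a placeholder, this is not the exact serving script from this thread, and only a model that actually supports a long context (e.g. a 32k variant) benefits from the larger value:

```python
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "your-org/your-32k-model"  # placeholder model identifier
MAX_LENGTH = 40960                      # the ~10x value suggested above

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME, trust_remote_code=True)
model = AutoModel.from_pretrained(MODEL_NAME, trust_remote_code=True).half().cuda().eval()

inputs = tokenizer("A long prompt ...", return_tensors="pt").to(model.device)
# Cap total length (prompt + completion) at the larger budget instead of 2048.
output_ids = model.generate(**inputs, max_length=MAX_LENGTH)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```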