What is the max token limit on this model?

#179
by RamanSB

By max token limit, I'm referring to the maximum allowed sum of tokens used in the prompt plus tokens generated in the completion response.

I have seen people mention 32k (32,768), but that refers to the context window token limit (which I would consider the input limit). I want to know what the maximum number of output tokens is as well.

Or am I correct in assuming: maximum possible output tokens = 32,768 - tokens used in the prompt?
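For concreteness, here is a rough sketch of the calculation I'm assuming; the model id below is a placeholder, and whether the 32k window really covers prompt + output together is exactly what I'm asking about:

```python
from transformers import AutoTokenizer

CONTEXT_WINDOW = 32_768  # the 32k figure mentioned in this thread

def max_output_tokens(prompt: str, tokenizer) -> int:
    """How many completion tokens would still fit, assuming the context
    window covers prompt + output together (the assumption in question)."""
    prompt_tokens = len(tokenizer(prompt)["input_ids"])
    return max(CONTEXT_WINDOW - prompt_tokens, 0)

# "model-id" is a placeholder for this repo's model id.
tokenizer = AutoTokenizer.from_pretrained("model-id")
print(max_output_tokens("What is the max token limit on this model?", tokenizer))
```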

Please state your source or reference to the value mentioned.
