What's the maximum prompt length the model can take as input?
The maximum sequence length was set to 256 tokens, covering the instruction, the input (if any), and the output combined.
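For reference, here is a minimal sketch of how you could check whether a prompt fits within that 256-token budget, assuming a Hugging Face tokenizer; the model name below is a placeholder, not the actual repository:

```python
from transformers import AutoTokenizer

MAX_LENGTH = 256  # combined budget for instruction + input + output

# "your-model-name" is a placeholder; substitute the actual model repo id.
tokenizer = AutoTokenizer.from_pretrained("your-model-name")

prompt = "Instruction: Summarize the following text.\nInput: ..."
prompt_tokens = tokenizer(prompt)["input_ids"]

# Tokens left for the generated output within the 256-token budget.
remaining_for_output = MAX_LENGTH - len(prompt_tokens)
print(f"Prompt uses {len(prompt_tokens)} tokens; {remaining_for_output} remain for the output.")
```

If the prompt alone approaches 256 tokens, little or no room is left for the model's response, so longer inputs would need to be truncated or shortened.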