
Max prompt length

#6 by ak2023 - opened

What's the maximum prompt length the model can take as an input?

The maximum combined length of the instruction, the input (if any), and the output was set to 256 tokens during training, so prompts should stay within that token budget.
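As a minimal sketch of enforcing that shared 256-token budget, the check below splits on whitespace as a stand-in for the model's real tokenizer (in practice you would count tokens with the BLOOM tokenizer); the function name and limit constant are illustrative, not part of the model's code:

```python
# Shared token budget across instruction, input, and output, per the
# training setup described above. Whitespace split is only a stand-in
# for the model's actual tokenizer.
MAX_TOKENS = 256

def fits_budget(instruction: str, input_text: str, output: str,
                limit: int = MAX_TOKENS) -> bool:
    # Count approximate tokens in each part and compare the sum to the limit.
    total = sum(len(part.split()) for part in (instruction, input_text, output))
    return total <= limit

# A short example that comfortably fits the budget.
print(fits_budget("Summarize the text.", "The quick brown fox jumps.", "A fox jumps."))
```

If the combined length exceeds the budget, the instruction and input would need to be shortened, since the output share of the 256 tokens is consumed at generation time.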
