Max context length/input token length.

#20 · opened by gsaivinay

Hello, could somebody please tell me the maximum token length this model supports for input + generation combined?
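
In case it helps to check this programmatically: a minimal sketch, assuming the model is hosted on the Hugging Face Hub and loadable with `transformers` (the model ID below is a placeholder, not the actual repo name), that reads the configured maximum sequence length. That limit is shared by the prompt and the generated tokens together.

```python
from transformers import AutoConfig, AutoTokenizer

# Placeholder model ID -- substitute the actual repo name.
model_id = "your-org/your-model"

# Most transformer configs expose the context window as
# max_position_embeddings (input + generated tokens share this budget).
config = AutoConfig.from_pretrained(model_id)
print("max_position_embeddings:", getattr(config, "max_position_embeddings", "n/a"))

# The tokenizer may also report model_max_length, which can differ
# from the config value depending on how the repo was set up.
tokenizer = AutoTokenizer.from_pretrained(model_id)
print("tokenizer.model_max_length:", tokenizer.model_max_length)
```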
